4 items tagged "insights"

  • Countering common critics on survey data with market research skills

    Don't be misled by survey data estimating the likelihood of adoption and the relative importance of a choice attribute.

    With some justification, not everyone is a fan of survey data. Periodically, a client or prospective client recites one of two quotes.
    The first quote is from the founder of Apple, Steve Jobs, who said, “People don’t know what they want until you show it to them. That’s why I never rely on market research. Our task is to read things that are not yet on the page.”
    The second quote is from David Ogilvy: “The trouble with market research is that people don’t think what they feel, they don’t say what they think, and they don’t do what they say.”

    Those who don’t want to do data collection use the offensive play, which is the Steve Jobs quote. The play goes like this: 'There is no sense undertaking market research because, after all, people don’t know what they want.' The Ogilvy position, on the other hand, is usually invoked once unpalatable research findings are tabled. The defensive play switches to 'That’s all very well and good, but as we know, respondents don’t do what they say.'

    “People don’t know what they want”

    Rebuttal #1

    Steve Jobs’ perspective is easiest to reconcile. Jobs was talking about new-to-the-world products such as an MP3 player in 2001. He was right in that, if your product innovation is ostensibly ‘new to the world’ and the market has limited ability to conceive its application, then there is not a lot of value to be derived from asking respondents about adoption. Henry Ford was from the same school when he said, “If I had asked people what they wanted, they would have said faster horses.” However, survey data is useful for illuminating attitudes, sizing the market based on core needs, and understanding current alternatives. If Steve Jobs had been talking about an invention that combined a phone with an MP3 player, a camera, and GPS, then perhaps the market could have comprehended such a value-added bundle and more readily estimated adoption. And today, if Apple wanted to understand why users were switching to Samsung, survey data would surely be the best source.

    Rebuttal #2 

    The other counter-argument is that, with respect, you are not Steve Jobs! Every now and then, a visionary entrepreneur comes along whose imagination and resources are unconstrained and who proposes something as frame-breaking as putting “a thousand songs in your pocket.”

    For the rest of us, the less-than-visionary, research is a critical component of optimizing our efforts and investment. However, the rationale for dismissing research insight differs depending on the phase of the project we are in.

    Market research and the 5 stages of grief

    I should tell you, the context in which I usually hear these comments changes depending on where we are in the project. If we are at the end of the project, the comments often reflect the seminal work of Elisabeth Kübler-Ross, whose five stages of grief model ranges from denial and anger to acceptance and action. There are a handful of alerts that tell researchers the client has slipped into denial and anger, including questions like “Who commissioned this research?” and “Where did you get the sample from?” And that old chestnut: “You do realize people don’t always do what they say they will do.” We agree wholeheartedly with the last statement and therefore apply techniques that sidestep that issue.

    When is survey data criticism justifiable?

    As in all fields of life, there is a degree of variability in the quality of market researchers. And as is the case in most (if not all) services, there is an asymmetry of information: the seller is better informed than the buyer, and occasionally the buyer struggles to distinguish poor-quality researchers from good ones. Here are two basic situations where criticism of findings arising from survey data could be justified.

    Situation 1: Estimating the likelihood of adoption.

    The wrong answer is a dichotomous yes or no response.

    The better answer is to apply a Juster Scale. In 1966, Professor F. Thomas Juster argued that verbal intentions were disguised probability statements; why not, then, directly capture the probabilities themselves as expressed by the respondents?

    Using an eleven-point scale ranging from 0 to 10 for likelihood to adopt, Forethought counts only those respondents scoring eight or above as a yes. According to Professor Juster, an eight is ‘very probable’ on the likelihood scale. We have found this to be a remarkably accurate indication of actual in-market adoption.
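
    As an illustration, the counting rule described above can be sketched in a few lines of Python. The function name and sample ratings below are hypothetical, not Forethought's actual tooling:

```python
def estimate_adoption_rate(scores, threshold=8):
    """Share of respondents counted as adopters.

    Following the Juster-style rule described above, only ratings of
    `threshold` (default eight, 'very probable') or higher on the
    0-10 likelihood scale count as a yes.
    """
    if not scores:
        raise ValueError("need at least one response")
    adopters = sum(1 for s in scores if s >= threshold)
    return adopters / len(scores)

# Hypothetical sample of 0-10 likelihood-to-adopt ratings.
responses = [10, 8, 7, 3, 9, 5, 8, 2, 6, 8]
print(f"Estimated adoption rate: {estimate_adoption_rate(responses):.0%}")
```

    With the sample above, five of the ten respondents score eight or higher, so the estimated adoption rate is 50% rather than the much higher figure a simple yes/no question would suggest.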

    Situation 2: Estimating the relative importance of an attribute in choice.

    The classic folly in understanding the relative importance of attributes such as price is to ask for the stated importance of price. Respondents overstate the importance of price, which in turn leads them to overstate their price sensitivity.

    The better approach is to assemble the hypothesized list of reasons for adoption (explanatory variables), including price, and to then ask the respondent to rate each explanatory variable on a 0-10 importance scale. With the right dependent variable, such as value for money, multivariate analytics is then applied to infer the importance. We have found this to be a remarkably accurate indication of the relative importance of an attribute in choice.
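
    A minimal sketch of that derived-importance approach, assuming plain ordinary least squares as the multivariate technique (the source does not specify the exact analytics used); the attribute names and all ratings here are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
attributes = ["price", "quality", "service"]

# Hypothetical 0-10 ratings of each explanatory variable by n respondents.
X = rng.integers(0, 11, size=(n, len(attributes))).astype(float)

# Simulated dependent variable (e.g. a 'value for money' rating) in which
# quality actually drives choice more than price, even though respondents
# might *state* that price matters most.
y = 0.2 * X[:, 0] + 0.6 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 1.0, n)

# Fit ordinary least squares with an intercept; the fitted coefficients,
# not the stated ratings of importance, give the inferred (derived)
# importance of each attribute in choice.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

for name, beta in zip(attributes, coef[1:]):
    print(f"derived importance of {name}: {beta:.2f}")
```

    On this simulated data the regression recovers quality as the dominant driver, illustrating how inference from a dependent variable can contradict what respondents would state directly about price.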

    A third scenario, usually limited to qualitative research for high involvement goods and services (that is, high perceived social risk), is where a social desirability bias creates distortions in stated choice. Similarly, the resulting mismatch between stated and actual can be averted by using inferential analysis to infer importance.

    I am not sure about other research companies, but about one in five Forethought colleagues is a marketing scientist. These folks are responsible for the design and analysis of survey data. About 60% of our colleagues are consultants arranged into category verticals (banking, telecommunications, healthcare, and so on) to maximize domain knowledge.

    My job is working with survey data; I love it! Pretty much the whole purpose of my professional life has been to assist businesses to predict and affect behavioral change as it relates to marketing. The primary vehicle for that prediction has been, and continues to be, survey data. Forethought operates in just two areas of behavioral change: gaining and retaining market share for services businesses. With over 4,000 projects completed, I cannot recall a single instance where our predictive models have been incorrect. Without a doubt, market research insight is a game of skill.

    Author: Ken Roberts, CEO of Forethought

    Source: GreenBook

  • Getting the most out of market research using two speeds

    We are slowly eradicating binary ideologies in the insights industry; yet there is still some hesitation to commit to more than one speed of research concurrently. Is a two-speed market research strategy the key to unleashing our potential?

    One of the big themes consistently running throughout industry discussions is speed. As insight professionals, we are constantly searching for new, innovative, and quick methods of producing high-quality data for stakeholders. As such, market research has developed into a very speed-focussed field.

    The demand for quick insights that inform decisions at the speed of business has risen exponentially and we have met that demand with the dedication befitting our expertise. But for all our focus on quick, agile insight generation, we’re in danger of disregarding the benefits of steady, continuous or long-term research.

    However, we don’t need to divert our thinking from quick-paced research entirely; there is a neatly wrapped solution that allows us to make use of both quick- and slow-paced methodologies at once. A two-speed market research strategy allows us to run research projects in parallel, helping us keep a finger on the pulse of consumer sentiment, respond to unexpected events as they pop up, and future-proof organisations as predictive insights roll in.

    Two speeds of market research

    Binary thinking is slowly being eradicated in the insights industry: it’s neither quantitative nor qualitative, agile or structured, online or in-person market research. Just like in improv, we are fully embracing the word ‘and’ when it comes to designing the right research projects, which, aided by technology, means we can pull off increasingly elaborate and personalised research experiences for every single requirement stakeholders could possibly realise.

    However, there is still a hangover of binary ideology surrounding the scale of research production and the number of projects that can be run concurrently. This is mostly down to budget and time, both of which are tighter than ever after a pandemic; so insight professionals either conduct quick ad-hoc projects reacting to situations and events, or a longer-term, slower project that seeks to gather as much detail as possible.

    Insight teams should look to combine the two in a two-speed research strategy: a hybrid approach that brings together the benefits of both while eradicating the dangers of choosing only one.

    1. Fast, reactive research

    This first type of research is the efficient speed of insight generation we’ve been working towards attaining for a while now. Quick research typically takes the form of surface-level data gathering: responding to a shift in consumer behaviour or a specific event, exploring the early stages of a concept or product, or scanning the market for new opportunities.

    Agile, ad-hoc, or quick-turn solutions are typical examples of fast-paced research strategies, but for these to work best, they need the right participants to really activate the potential and generate the best insights possible to inform stakeholder decisions. As such, this quick research works best when insight teams either have a continuous community or panel of participants on tap, or have a sample provider ready to recruit the right people for the job.

    While all methodologies would fit within this strategy, only a few can do quick research justice; luckily, these are the core methodologies market research revolves around: surveys, focus groups, and quick open polls, the tools we can use to whip up an efficient data-gathering project at a moment’s notice.

    2. Slow, future-oriented research

    Slower-paced research is the complete opposite, but works well in conjunction with fast-paced research. There is a wide range of methodologies we can employ, with any tool we could possibly want to implement: surveys, focus groups, and polls, but also in-depth interviews, diary studies, geo-location studies, eye-tracking tools, and many more techniques that can be combined and used effectively when the luxury of a little more time lets us make the most of the insight generation process.

    Slower-paced, long-term research has become a primary tool for understanding global change, behavioural trends, directional shifts of brands, and other truly large-scale research questions that require in-depth studies to form the deep understanding that informs many important decisions across organisations and industries. This type of research isn’t restricted to trend forecasting or future-proofing; slow-paced projects are great for three research themes in particular:

    • Tracking – for example, sentiment, behaviour, opportunity, and competition.
    • Testing – product, concept, service, user or customer experience.
    • Validation – or the disproving of theories and repelling the temptation of gut decision-making.

    Implementing two-speed strategies

    Instead of two medium-impact projects working one after the other, waiting in a queue until there’s time to dedicate, running them concurrently so they feed into each other creates a higher-impact research strategy: one that enhances the effectiveness of insight teams and boosts insight generation so businesses can make agile decisions at will with confidence.

    But for this to happen, insight teams need the right infrastructure in place: the right technology stack at the ready to implement any project within the two-speed strategy. While we can’t predict every tool we’ll need right at the start, we can always add more to the stack when necessary; having that starting stack there to customise and evolve to fit the projects we conduct will minimise the effort of implementing new research.

    At that point, it’s a case of understanding which projects fit into each stream of the research strategy: continuous research will obviously fit into the slower, longer-term stream, while ad-hoc or spin-off projects will fit into the quicker, time-sensitive stream. Of course, there will be times when one project needs to be split between the two, and we can use the insights from one to better inform the direction and construction of questions in the other; the result is robust, powerful, and accurate insights that are bound to better inform key decisions across the organisation.

    Author: Emily James

    Source: Greenbook Blog

  • MI and knowing your customers, the top 1%

    Most organizations have a good idea who their best customers are. But do you know who your top 1% are? Do you know how they are using technology to gain advantages? Who are their role models? What does this mean for organizations?

    Last year, Don Scheibenreif, Ed Thompson, and Jenny Sussin explored these questions as part of Gartner’s annual Maverick research competition. Maverick research is an incubator for new ideas; it is designed to challenge Gartner’s mainstream research and positions. This helps clients take advantage of trends and insights that could impact their strategy and their organization.

    The research results intentionally contradict prevailing wisdom, namely that the internet and technology are great equalizers. Instead, it is suggested that technology is a key way the wealthiest people gain, retain, and grow their wealth and position, and that this behavior is being imitated by your 1% most valuable customers, often at your expense. Learning how the elite use new technologies as a weapon will prepare CIOs and CX leaders for what is coming next in customer engagement.

    What was found

    • The most successful techniques for and approaches to using technology by billionaires are imitated by the next tier in society, and so on down. This means that what your most valuable 1% learn from the wealthy elite will impact your organization sooner or later.
    • The past 20 years have seen the unequal access to information and technology between supplier and customer shift in favor of the customer. In some cases, particularly the top 1% most valuable, customers now have access to more information and technology than their suppliers and can exert astonishing influence.
    • The top 1% most valuable of your customer base needs more active management, because they are better at using informational, referent and technology expert powers than in the past (historical sources of power amplified in the digital age).

    What organizations can do

    • Investigate how your 1% most valuable customers use referent, informational and expert powers to achieve their own ends. Make proposals to mitigate their negative actions.
    • Position your organization to benefit more from the top 1% most valuable of your customers by
      • Learning who your 1% most valuable customers are and what percentage of revenue, profit and lifetime value they represent
      • Turning your insights into strategy
      • Earning their loyalty while protecting your profitability

    Billionaires have used and will use technology to gain and keep power and wealth, so will the 1% most valuable of your customers. This trend represents the continued erosion of your ability to control the nature of your relationships with your customers. Savvy digital leaders can help their organization to be prepared for the ways in which the 1% most valuable customers can hurt their business.

    Source: Gartner

  • Using data successfully: the role of data democratization

    An effective culture to underpin your strategy

    A business that looks to become truly data-driven knows that employees are more likely to back the wider strategy if they have access to data that helps them do their jobs better. Data democratization and the positive culture it can create is, therefore, critical to the long-term success of any organization.

    According to a recent report, Data Strategy and Culture: Paving the Way to the Cloud, senior decision-makers are confident that they’re opening up access to data sufficiently.

    So do your employees at all levels actually have adequate access to data to boost their decision-making? Does the data at their disposal and how they work with it turn employees into strong advocates for your organization’s data strategy? I seek to address these questions.

    This blog focuses on the crossover between data strategy and deployment decisions. It covers:

    • Why data democratization is critical to developing a positive data culture
    • The main barriers to achieving it
    • What else you can learn

    Data democratization: room for improvement

    Successful organizations identify the key capabilities that are required to execute their data strategy effectively. Infrastructure decisions are an important part of this as any limitations can cause frustration and poor engagement — ultimately, the wrong choice can restrict how well an employee can perform in their role.

    Almost four out of five respondents to our survey say their current IT infrastructure makes it challenging to democratize data in their organization. This is a significant obstacle to be overcome. There are additional barriers, too, such as a lack of relevant data skills or too many new data sources.

    At this point, businesses have to focus on which deployment model best meets their needs. On the topic of data democratization, many will naturally think of the benefits the cloud can bring. The right deployment model allows for data sharing in a secure and cost-effective manner across all levels and departments. It allows people, and therefore the company, to perform at their best.

    Don’t limit your potential

    Despite the importance of this, almost half (46%) of respondents to our latest research believe that the democratization of data isn’t feasible for them.

    This could be a big risk. If your technology infrastructure doesn’t allow you to open up access to data across the whole business, you’re stopping your organization from becoming truly data-driven. This could ultimately mean that insights can’t be gathered quickly enough, projects stall, and a competitive edge is lost.

    Make the data work

    There is a clear need for organizations to carefully consider which deployment option gives them the freedom needed to effectively open up access to data. Yet, the story doesn’t end once a decision has been made.

    Teams must constantly monitor whether employees are able to work with the data at their disposal effectively. Can they get the insights they need from the data? Is there an ambition to increase the spread of data democratization within the organization?

    So when it comes to optimizing the success of your data strategy, data democratization is a key step in the process — and your company needs to get it right.

    The report investigates all of the key points raised in this blog and explains how developing a positive data culture starts with data democratization. This is the point when you secure your employees’ backing of the project. Only then are you truly ready to choose the right deployment model.

    Author: Mathias Golombek

    Source: Dataversity
