9 items tagged "insights"

  • 5 Tips to support marketing the best way possible with insights

    When both insights and marketing work together effectively, everyone benefits. But it’s not always easy to get there. Here are five ways to be a better insights partner to your marketing team, straight from PepsiCo's Global CMO.

    For any great partnership to work, both parties need an empathetic understanding of each other’s wants, needs and general perspectives on life. Without this understanding, you’ll end up with a relationship drenched in confusion, animosity or resentment. (Think about a marriage: both spouses need to understand where the other is coming from, otherwise they’ll end up assuming ill intent whenever they’re not on the same page!)

    One great partnership you find in large organizations is the one between insights and marketing. These teams work hand-in-hand to grow great brands and serve customer needs. When both insights and marketing are working together effectively, everyone benefits (especially the consumer).

    As someone in an insights role, consider your own relationship with your marketing counterparts. Is there room for improvement there?

    At this year’s Virtual Insight Summit, Zappi President Ryan Barry sat down with PepsiCo’s Global CMO, Jennifer Saenz. (Jennifer is also the President of Pepsi’s food business. Rather than complaining about how she’s extremely busy with two full-time jobs, Jennifer claims both roles balance her and make her better overall!)

    A lot of great tips about how insights can be a better partner to marketing came out of that conversation. Let’s dive into them here!

    1. Know the basics of marketing

    "It can be dangerous as you get pulled toward new technologies, you almost forget the basics…It’s a lot harder to get those technologies to work for you if you forget the 'why' behind what you want to achieve."

    As an insights professional, you probably know a lot about data analysis, system 1 and system 2 thinking, qual and quant research methodologies, and so much more. But are you familiar with the basics of marketing that your marketing counterparts live every day? Trends come and go, but the basics don’t change.

    What is the marketing team responsible for day-to-day? What are they trying to accomplish? What tools do they use? How are they measured? Having the answers to these questions is an important first step to being a better insights partner with marketing — or anyone, really.

    Jennifer recommends reading the book How Brands Grow by Byron Sharp (which we’re also strong proponents of at Zappi!) to learn more about the challenges marketers are facing and the levers they can pull to make an impact.

    We would also recommend simply having a conversation with your marketing partners, if you haven’t already done so. You can learn a lot about someone’s world just by opening up that dialogue.

    2. Translate knowledge for impact

    “I think the best insights partner is focused on helping us make sure that the knowledge we’re creating is really intended to create an impact. That’s the most important thing the insights partner can bring.”

    Jennifer cautions that you never want to get into the habit of creating knowledge for the sake of it. The best insights partners are those who can translate knowledge for impact.

    Any information that makes someone respond with “ok, that’s cool” is not really effective. But data that makes your audience sit up and say “wow, we need to do X now!” — that’s the dream. That’s how you know you’ve really made an impact with your work.

    It’s up to you to translate what you’ve learned in your research into a form someone can use. Everyone is busy, and no one is suffering from a shortage of data. Give your stakeholders the SO WHAT from your data so they won’t miss it.

    Give some thought to your insights career. How many times do you feel like someone was able to immediately identify the actions they should take based on research you shared? How many “holy shit” moments have you been responsible for? If that number isn’t as high as you’d like, take a few extra moments each time you deliver results to think about the SO WHAT. How can you synthesize your learnings so the person consuming them knows exactly what they need to do next?

    When you can do that every time you deliver insights (and do it quickly, as Jennifer notes that the time it takes to go from data to action is compressing all the time), you’ll make a big impact on your organization.

    3. Democratize insights

    “When you can democratize that information and understanding, all of a sudden the amount of impact you can make increases because so many people have access to it.”

    Ultimately, it’s up to you to make sure the right people have access to the right data to make the right decisions.

    “Democratization” of data is a bit of a buzzword, one that can evoke strong emotions from insights professionals. But it really just means making data available to as many people as possible. When you democratize insights, suddenly every marketer has the benefit of that data right at their fingertips whenever they need it. It can significantly grow the impact of your work.

    This is, of course, easier said than done. PepsiCo has invested in this area because it’s important to both insights and marketing teams. PepsiCo built an internal tool that gives everyone access to customer data and standardized research methods. While a lot of work went into this, Jennifer says that it has certainly been worth it because:

    1. With standardized tools and a common language to use, PepsiCo has more effective cross-functional conversations

    2. The work that the marketing and insights teams are doing now has greater visibility and credibility within the company

    3. PepsiCo’s marketing is getting better and better!

    4. Reflect back to move forward

    “If you're testing to just get to ‘yes,’ you're only looking forward. But if you're testing to learn, you are taking a moment to be reflective, understand the ‘why’ behind it, and then take that ‘why’ and pour it into all the work you do moving forward. It's a very different mindset when you're testing to learn versus testing to ‘check the box.’”

    For too long, insights teams have been tasked with “checking a box” or assigning a grade to an idea at some point in the development process. They give marketers a red or green light and move on to the next project.

    Those days are long gone.

    Jennifer’s view is that you do the work to uncover whether you should move forward with an idea, but that’s just the start. You need to reflect backwards regularly to learn as you go. Reflecting backwards helps your business lean into what it has done well and improve what it hasn’t. In other words, you can use what you’ve learned in the past to shape decisions in the future.

    You can always just answer the questions you’re asked (like, will consumers like this ad?) but you’ll take your partnership to the next level if you can always provide the WHY behind that answer (why do consumers like this ad?) and look at how those trends are evolving over time to predict the future.

    Jennifer notes that this requires a different mindset. It reflects a shift in the role of insights from order taker to true strategic partner. And isn’t that what we’re all aiming for?

    5. Act as an owner of your company

    “If you see an outage, you have to speak up. You have to share that with people. I wouldn’t hold it in and let it fester. Because you can make an impact.”

    It’s easy to feel like your voice doesn’t matter in a big organization. That one person can’t effect change. So if you see something that could be improved (what Jennifer calls an “outage”), it may feel like it’s not worth trying to do something about it.

    But Jennifer vehemently disagrees with that idea. She suggests you act like an owner of your company and call something out when you see it (this was actually a common refrain in our lineup of C-level feature speakers).

    You have a unique perspective that executives in your company don’t have, and they often won’t know there’s a problem or opportunity unless it’s brought to their attention.

    Jennifer recommends not just calling something out, but also looking a step or two ahead. You may not have the ability to solve the problem on your own, but what steps can you take to address it? What recommendations can you offer? Be a part of that solution.

    Final thoughts

    At the end of the day, you’ll be a better insights partner to marketers if you help marketing do their job better. Work together to make your marketing stronger. That’s really what all of these points come down to.

    Author: Katie Sweet

    Source: Zappi

  • Countering common criticisms of survey data with market research skills

    Don't be misled by survey data when estimating the likelihood of adoption or the relative importance of a choice attribute.

    With some justification, not everyone is a fan of survey data. Periodically, a client or prospective client recites one of two quotes.
    The first quote is from the founder of Apple, Steve Jobs, who said: “People don’t know what they want until you show it to them. That’s why I never rely on market research. Our task is to read things that are not yet on the page.”
    The second quote is from David Ogilvy: “The trouble with market research is that people don’t think what they feel, they don’t say what they think, and they don’t do what they say.”

    Those who don’t want to do data collection use the offensive play, which is the quote from Steve Jobs. The play goes like this: 'there is no sense undertaking market research because after all, people don’t know what they want'. On the other hand, support for the Ogilvy position is usually invoked once the unpalatable research findings are tabled. The defensive play switches to 'that’s all very well and good, but as we know, respondents don’t do what they say'.

    “People don’t know what they want”

    Rebuttal #1

    Steve Jobs’ perspective is the easiest to reconcile. Jobs was talking about new-to-the-world products such as an MP3 player in 2001. He was right in that if your product innovation is ostensibly ‘new to the world’ and the market has limited ability to conceive its application, then there is not a lot of value to be derived from asking respondents about adoption. Henry Ford was from the same school when he said, “If I had asked people what they wanted, they would have said faster horses.” However, survey data is useful for illuminating attitudes, sizing the market based on core needs, and understanding current alternatives. If Steve Jobs had been talking about an invention that was a phone incorporating an MP3 player, a camera, and GPS, then perhaps the market could have comprehended such a value-added bundle and more readily estimated adoption. And today, if Apple wanted to understand why users were switching to Samsung, then survey data is surely the best source of data.

    Rebuttal #2 

    The other counter-argument is that, with respect, you are not Steve Jobs! Every now and then, a visionary entrepreneur comes along whose imagination and resources are unconstrained, and who proposes something as frame-breaking as putting “a thousand songs in your pocket.”

    For the rest of us – less-than-visionaries – research is a critical component of optimizing our efforts and investment. However, the rationale for removing insight from the process differs depending on what phase of the research project we are in.

    Market research and the 5 stages of grief

    I should tell you, the context in which I usually hear these comments changes depending on where we are in the project. If we are at the end of the project, then the comments often reflect the seminal work by Elisabeth Kübler-Ross. Her five stages of grief model ranges from denial and anger to acceptance and action. There are a handful of alerts researchers receive when the client has slipped into denial and anger. Some of them include questions like “Who commissioned this research? Where did you get the sample from?” And that old chestnut, “You do realize people don’t always do what they say they will do.” We would agree wholeheartedly with the last statement and therefore apply techniques that sidestep that issue.

    When is survey data criticism justifiable?

    As in all fields of life, there is a degree of variability in the quality of market researchers. And as is the case in most (if not all) services, there is an asymmetry of information: the seller is better informed than the buyer, and occasionally the buyer struggles to distinguish poor-quality researchers from good ones. Here are two basic situations where criticism of findings arising from survey data could be justified.

    Situation 1: Estimating the likelihood of adoption

    The wrong answer is a dichotomous yes or no response.

    The better answer is to apply a Juster Scale. In 1966, Professor F. Thomas Juster argued that verbal intentions were disguised probability statements; therefore, why not directly capture the probabilities themselves as expressed by the respondents?

    Using an eleven-point scale ranging from 0 to 10 for likelihood to adopt, Forethought counts only those respondents scoring eight or above as a yes. According to Professor Juster, an eight is ‘very probable’ on the likelihood scale. We have found this to be a remarkably accurate indicator of actual in-market adoption.
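    To make the mechanics concrete, here is a minimal sketch of that top-box calculation. The 0-10 scale and the eight-or-above cutoff follow the description above; the function name and sample responses are purely illustrative.

    ```python
    def juster_adoption_rate(scores, cutoff=8):
        """Share of respondents whose stated likelihood is 'very probable' or higher."""
        if not scores:
            raise ValueError("no responses to analyze")
        adopters = sum(1 for s in scores if s >= cutoff)
        return adopters / len(scores)

    # Hypothetical 0-10 Juster-scale responses from ten respondents.
    responses = [10, 8, 3, 9, 5, 7, 8, 2, 6, 9]
    print(f"Estimated adoption: {juster_adoption_rate(responses):.0%}")  # -> 50%
    ```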

    Situation 2: Estimating the relative importance of an attribute in choice

    The classic folly in understanding the relative importance of attributes, such as price, is to ask for the stated importance of price. Respondents then do two things: they overstate the importance of price, which in turn leads them to also overstate their price sensitivity.

    The better approach is to assemble the hypothesized list of reasons for adoption (explanatory variables), including price, and then ask the respondent to rate each explanatory variable on a 0-10 importance scale. With the right dependent variable, such as value for money, multivariate analytics is then applied to infer importance. We have found this to be a remarkably accurate indication of the relative importance of an attribute in choice.
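    As an illustration of that inference step, the sketch below regresses a simulated value-for-money rating on 0-10 attribute ratings and reads the standardized coefficients as relative importance. Plain least-squares regression stands in here for whatever multivariate technique a practitioner would actually choose, and all of the data is simulated.

    ```python
    import numpy as np

    # Simulate 0-10 respondent ratings for three explanatory variables.
    rng = np.random.default_rng(0)
    n = 500
    price = rng.integers(0, 11, n).astype(float)
    quality = rng.integers(0, 11, n).astype(float)
    service = rng.integers(0, 11, n).astype(float)
    # Simulated dependent variable: quality matters most, price least.
    value_for_money = 0.2 * price + 0.6 * quality + 0.4 * service + rng.normal(0, 1, n)

    X = np.column_stack([price, quality, service])
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize predictors
    yz = (value_for_money - value_for_money.mean()) / value_for_money.std()
    betas, *_ = np.linalg.lstsq(Xz, yz, rcond=None)   # standardized coefficients
    importance = np.abs(betas) / np.abs(betas).sum()  # normalized importance shares
    for name, share in zip(["price", "quality", "service"], importance):
        print(f"{name}: {share:.0%}")                 # roughly 17% / 50% / 33%
    ```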

    A third scenario, usually limited to qualitative research for high involvement goods and services (that is, high perceived social risk), is where a social desirability bias creates distortions in stated choice. Similarly, the resulting mismatch between stated and actual can be averted by using inferential analysis to infer importance.

    I am not sure about other research companies, but about one in five Forethought colleagues is a marketing scientist. These folks are responsible for the design and analysis of survey data. About 60% of our colleagues are consultants arranged into category verticals (banking, telecommunications, healthcare, and so on) to maximize domain knowledge.

    My job is working with survey data; I love it! Pretty much the whole purpose of my professional life has been to assist businesses to predict and affect behavioral change as it relates to marketing. The primary vehicle for that prediction has been, and continues to be, survey data. Forethought operates in just two areas of behavioral change: gaining and retaining market share for services businesses. With over 4,000 projects completed, I cannot recall a single instance where our predictive models have been incorrect. Without a doubt, market research insight is a game of skill.

    Author: Ken Roberts, CEO of Forethought

    Source: GreenBook

  • Getting the most out of market research using two speeds

    We are slowly eradicating binary ideologies in the insights industry; yet there is still some hesitation to commit to more than one speed of research concurrently. Is a two-speed market research strategy the key to unleashing our potential?

    One of the big themes consistently running through industry discussions is speed. As insight professionals, we are constantly searching for new, innovative, and quick methods of producing high-quality data for stakeholders. As such, market research has developed into a very speed-focussed field.

    The demand for quick insights that inform decisions at the speed of business has risen exponentially and we have met that demand with the dedication befitting our expertise. But for all our focus on quick, agile insight generation, we’re in danger of disregarding the benefits of steady, continuous or long-term research.

    However, we don’t need to divert our thinking from quick-paced research entirely; there is a neatly-wrapped solution that allows us to make use of both quick- and slow-paced methodologies at once. The two-speed market research strategy allows us to run research projects in parallel, helping us keep our finger on the pulse of consumer sentiment, respond to unexpected events when they pop up, and future-proof organisations as predictive insights roll in.

    Two speeds of market research

    Binary thinking is slowly being eradicated in the insights industry: it’s neither quantitative nor qualitative, agile or structured, online or in-person market research. Just like in improv, we are fully embracing the word ‘and’ when it comes to designing the right research projects, which, aided by technology, means we can pull off increasingly elaborate and personalised research experiences for every single requirement stakeholders could possibly realise.

    However, there is still a hangover of binary ideology surrounding the scale of research production and the number of projects that can be run concurrently. This is mostly down to budget and time, especially after a pandemic when both are tighter than ever; so insight professionals either conduct quick ad-hoc projects that react to situations and events, or a longer-term, slower project that seeks to gather as much detail as possible.

    Insight teams should be looking to combine the two in a two-speed research strategy: a hybrid approach that brings together the benefits of both while eradicating the dangers of choosing only one.

    1. Fast, reactive research

    This first type of research is the efficient speed of insight generation we’ve been working towards attaining for a while now. Quick research typically takes the form of surface-level data gathering, responding to a shift in consumer behaviour or a specific event, exploring the new beginnings of a concept or product, or exploring the market for new opportunities.

    Agile, ad-hoc, or quick-turn solutions are typical examples of fast-paced research strategies, but for these to work best, they need the right participants to really activate the potential and generate the best insights possible to inform stakeholder decisions. As such, this quick research works best when insight teams either have a continuous community or panel of participants on tap, or have a sample provider ready to recruit the right people for the job.

    While all methodologies would fit within this strategy, there are only a few that would be able to do quick research justice; luckily these are core methodologies that market research revolves around: surveys, focus groups, quick and open polls, etc., the tools we can use to whip up a very efficient data gathering project at a moment’s notice.

    2. Slow, future-oriented research

    Slower-paced research is the complete opposite, but works well in conjunction with fast-paced research. There are a wide range of methodologies we can employ, with any tool we could possibly want to implement: surveys, focus groups, and polls, but also in-depth interviews, diary studies, geo-location studies, eye-tracking tools, and so many more techniques can be combined and used effectively with the luxury of a little more time to really make the most of the insight generation process.

    Slower-paced, long-term research has become a primary tool for understanding global change, behavioural trends, the directional shifts of brands, and other truly large-scale research questions that require in-depth studies to form the deep understanding that will inform many important decisions across organisations and industries. This type of research isn’t restricted to trend forecasting or future-proofing, though; slow-paced projects are great for three research themes in particular:

    • Tracking – for example, sentiment, behaviour, opportunity, and competition.
    • Testing – product, concept, service, user or customer experience.
    • Validation – or the disproving of theories and repelling the temptation of gut decision-making.

    Implementing two-speed strategies

    Instead of two medium-impact projects working one after the other, waiting in a queue until there’s time to dedicate to them, running both concurrently so they feed into each other creates a new level of high-impact research strategy: one that enhances the effectiveness of insight teams and boosts insight generation so businesses can make agile decisions with confidence.

    But for this to happen, insight teams need the right infrastructure in place: the right technology stack at the ready to implement any project within the two-speed strategy. While we can’t predict every tool we’ll need right at the start, we can always add more to the stack when necessary; having that starting stack there to customise and evolve to fit the projects we conduct will help minimise the effort of implementing new research.

    At that point, it’s a case of understanding which projects fit into each stream of the research strategy: continuous research will obviously fit into the slower, longer-term stream, while ad-hoc or spin-off research projects will fit into the quicker, time-sensitive stream. Of course, there will be times when one project needs to be split between the two, and we can use the insights from one to better inform the direction and construction of questions in the other; the result will be robust, powerful, and accurate insights that are bound to better inform key decisions across the organisation.

    Author: Emily James

    Source: Greenbook Blog

  • MI and knowing your customers, the top 1%

    Most organizations have a good idea who their best customers are. But do you know who your top 1% are? Do you know how they are using technology to gain advantages? Who are their role models? What does this mean for organizations?

    Last year, Don Scheibenreif, Ed Thompson, and Jenny Sussin explored these questions as part of Gartner’s annual Maverick research competition. Maverick research is an incubator for new ideas, designed to challenge Gartner’s mainstream research and positions. This helps clients take advantage of trends and insights that could impact their strategy and their organization.

    The research results intentionally contradict prevailing wisdom, namely that the internet and technology are great equalizers. Instead, the research suggests that technology is a key way the wealthiest people gain, retain and grow their wealth and position, and that this is being imitated by your 1% most valuable customers, often at your expense. Learning how the elite use new technologies as a weapon will prepare CIOs and CX leaders for what is coming next in customer engagement.

    What was found

    • The most successful techniques for and approaches to using technology by billionaires are imitated by the next tier in society, and so on down. This means that what your most valuable 1% learn from the wealthy elite will impact your organization sooner or later.
    • The past 20 years have seen the unequal access to information and technology between supplier and customer shift in favor of the customer. In some cases, particularly the top 1% most valuable, customers now have access to more information and technology than their suppliers and can exert astonishing influence.
    • The top 1% most valuable of your customer base needs more active management, because they are better at using informational, referent and technology expert powers than in the past (historical sources of power amplified in the digital age).

    What organizations can do

    • Investigate how your 1% most valuable customers use referent, informational and expert powers to achieve their own ends. Make proposals to mitigate their negative actions.
    • Position your organization to benefit more from the top 1% most valuable of your customers by:
      • Learning who your 1% most valuable customers are and what percentage of revenue, profit and lifetime value they represent
      • Turning your insights into strategy
      • Earning their loyalty while protecting your profitability

    Billionaires have used and will continue to use technology to gain and keep power and wealth; so will the 1% most valuable of your customers. This trend represents the continued erosion of your ability to control the nature of your relationships with your customers. Savvy digital leaders can help their organization prepare for the ways in which the 1% most valuable customers can hurt their business.

    Source: Gartner

  • Moving business intelligence, data and insights forward with graph technology

    Hospitals are one of the best examples of the complexities of unstructured data. Between physicians’ notes in EHRs, emails, text files, photos, videos, and other files, the majority of patient data cannot be read by machines. Research firm IDC estimates that upwards of 80% of data will be unstructured, with global data growing from 33 zettabytes in 2018 to 175 zettabytes by 2025. This example demonstrates the huge challenge of dealing with unstructured data and analyzing it when it is stored across disparate systems. Health care is just one prominent example of a sector awash with unstructured information that could hold critical clinical and business insights. That’s where graph technology comes in.

    The (Unstructured) Data Deluge 

    Graphs are one way to contextualize and explain data. Graphs themselves can be particularly large, with data sizes of 10 to 100 terabytes. Graph approaches have proved particularly beneficial when data is large, continually evolving, and rife with high-value relationships.

    Knowledge graphs, which make connections between seemingly disparate sources to provide specific business insights, have been in existence for many decades. Historically, knowledge graphs have been associated with search engines such as Google, which use them to enhance and hasten search results, and with social media networks such as LinkedIn and Facebook, which use them to understand their users and surface relevant content (including relevant ads and common friend connections).

    In recent years, graph computing companies have proliferated, with the benefits of graph databases, analytics, and AI trickling down from the big tech titans to a slew of organizations and industries. Gartner predicts that by 2025, graph technologies will be used in 80% of data and analytics innovations, up from a mere 10% in 2021. This raises an obvious question: given graph technology’s long legacy, why is it ballooning in demand and popularity now?

    Barriers to Data Insights 

    One barrier to embracing graphs has been earlier approaches to gleaning insights from unstructured datasets. Traditional graph databases aimed to address concerns with relational databases but were not built with analytics in mind. This meant that organizations hit performance limitations when traversing massive knowledge graphs or processing queries at low latency and scale.

    Another barrier has been the lack of standardization in graph technology, which has resulted in high costs for any organization looking to move from one legacy database to another. Today, the industry still needs to make strides to cultivate the right tools and open standards, such as providing common libraries to allow data scientists to easily process large-scale graphs. 

    From Niche to Norm 

    For data-forward organizations, there are a few key solutions. A unified platform, which combines graph query, graph analytics, graph mining, and responsive graph AI, can offer unparalleled flexibility and scalability in understanding massive datasets. Such a platform can bring together disparate systems and reduce time to insight – or how long it takes for an analysis of the data to produce actionable feedback. This enables a faster sharing of those insights to facilitate faster decision-making and foster innovation. The rate of insight is important because industries that rely on graph computing can overtly benefit from real-time intelligence, such as monitoring network traffic for suspicious activity and alerting teams when any such activity is discovered. 

    For virtually every industry – from financial services to health care and pharmaceuticals – analytics and intelligence are only as good as the ability to truly understand and action the vast amounts of data on hand. Beyond a unified platform, another option is to create metadata on top of the disparate systems – a “data lakehouse” – and then build a knowledge graph on top of that. In this case, the metadata serves to extract information from the disparate systems of the data lakehouse and unify it into a knowledge graph that can provide actionable insights.
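    As a toy illustration of that unification idea, the sketch below loads records drawn from two hypothetical systems into a single graph and traverses it for multi-hop connections. It assumes the networkx library is available; every entity and relationship is invented.

    ```python
    import networkx as nx

    G = nx.DiGraph()

    # Records extracted from two disparate systems, e.g. an EHR and a billing system.
    G.add_edge("Dr. Smith", "Patient 42", relation="treated")
    G.add_edge("Patient 42", "Medication X", relation="prescribed")
    G.add_edge("Medication X", "Supplier Y", relation="supplied_by")
    G.add_edge("Patient 42", "Claim 7781", relation="billed_under")

    # A multi-hop question the unified graph can answer directly:
    # which suppliers are reachable from a given physician's patients?
    for _, patient in G.out_edges("Dr. Smith"):
        for node in nx.descendants(G, patient):
            if any(d.get("relation") == "supplied_by"
                   for _, _, d in G.in_edges(node, data=True)):
                print(f"{patient} connects to {node}")  # -> Patient 42 connects to Supplier Y
    ```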

    As organizations continue to experience an exponential rise in data, more enterprises will organically amass graphs with billions of nodes and hundreds of billions of edges. The most data-forward organizations will have to build scalable systems that not only reduce time to data insights but also address the underlying complexities of unstructured data and legacy architectures.

    Author: Farshid Sabet

    Source: Dataversity

  • Seven insights summarizing the Industrial IoT market

    The industrial Internet of Things (IIoT) is projected to become a large-scale, high-growth market that will have a transformative impact on a broad range of industry sectors, from healthcare and retail to automotive and transport.

    Here is a brief overview of the key factors to know about this important market, based on expert insights from the report Industrial Internet of Things Market Outlook and Forecasts 2021-2028 by Mind Commerce, a specialized market research firm focused on digital technologies and the telecommunications industry.

    1. What Is industrial IoT?

    IIoT is considered the fourth industrial revolution, with the power to radically change industrial operational processes from start to finish. Earlier revolutions included mechanization (1784), moving assembly lines (1923), and computerization and the foundation of the Internet (1969).

    “Just as industrial production was transformed by steam power in the nineteenth century and electricity in the early twentieth century, so will industry be profoundly transformed by IIoT and related initiatives such as Industry 4.0 (the so called fourth wave of technological advancement in industry) in the early part of this century,” according to Mind Commerce.

    Industrial IoT offers a vision of the future based on intelligent manufacturing. Today machines are designed to operate in sync with each other in a production line. In the future, machines will communicate and coordinate with each other within smart factories, drawing on the Internet, interconnected sensors, and other cutting-edge technologies to enhance production and take industrial automation to the next level.

    2. Industrial IoT Market Growth

    The global industrial IoT market is expected to near $1,119.4 billion by 2028, increasing at a 17.0% compound annual growth rate (CAGR) from 2021 to 2028.
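    As a rough sanity check, the arithmetic below shows what that forecast implies for the starting market size, assuming growth compounds over the seven annual periods from 2021 to 2028 (the report’s actual 2021 base figure is not quoted here).

    ```python
    # Implied base size for a $1,119.4B market in 2028 growing at 17.0% CAGR.
    target_2028 = 1119.4            # $ billions
    cagr = 0.17
    periods = 2028 - 2021           # seven years of compounding
    implied_2021 = target_2028 / (1 + cagr) ** periods
    print(f"Implied 2021 market size: ${implied_2021:.0f}B")  # roughly $373B
    ```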

    Software is the largest market segment, accounting for 44% of the total market in 2021. Europe is the largest region, with 41% of the total market in 2021.

    3. IIoT Technologies

    Industrial IoT will rely on a powerful blend of interrelated technologies such as artificial intelligence, augmented reality, data analytics, machines-as-a-service, robotics, self-driving vehicles, self-organizing production, sensor-driven computing, and virtual reality.

    4. Industrial IoT Market Drivers

    IIoT will help manufacturers optimize their operations by providing real-time monitoring and the ability to conduct predictive maintenance. Companies that leverage IIoT effectively will have the tools and systems to track asset performance, predict failures, increase efficiencies, and reduce unplanned downtime.  

    “Successful companies will be those that understand how and where IoT technologies and solutions will drive opportunities for operational improvements, new and enhanced products and services, as well as completely new business models,” according to Mind Commerce.

    5. IIoT Market Challenges

    While industrial IoT holds great promise, challenges remain. Industrial IoT lacks standardization — some manufacturers rely on a small set of proprietary, incompatible technologies.

    Data security and privacy breaches are another serious concern. Because IoT devices carry out physical tasks, a data breach could have grave consequences.

    6. IIoT Industry Verticals

    IIoT solutions are poised to transform a variety of industry verticals including automotive and transportation, cargo and logistics, healthcare, manufacturing, oil and gas, smart cities, and utilities.

    7. Top Market Players

    Key IIoT companies include ABB, Accenture, AGT International, ARM Holdings, ATOS, B+B SmartWorx, Bosch, C3, Inc., Cisco Systems Inc., Digi International, Echelon Corporation, Elecsys Corporation, General Electric, Hitachi, IBM, Oracle, PTC, Real Time Innovation, Rockwell Automation, SAP, Sensata Technologies, Siemens, Wind River, Worldsensing, and Wovyn LLC.

    Author: Sarah Schmidt

    Source: Market Research Blog

  • Social intelligence: social media data as a means for market intelligence

    To be successful as researchers, we need to know our customers. We seek to identify our customers’ needs, desires, and opinions about our products, brands, and competitors. In doing so, we understand better how to tailor our products and services to meet our customers’ needs.

    Traditional market research identifies subgroups within a target audience through direct engagement, whether in person or via email. Researchers can explicitly define and verify an individual’s association with a given audience. It’s an effective approach, but it can be expensive, time-consuming, and difficult to execute.

    Where traditional market research and social media meet

    At Microsoft, the Research + Insights (R+I) team focuses on answering the same business questions as sought through traditional market research. However, instead of using standard methods, we utilize the vast set of available data in social. Over the past year, we asked questions like: What if we could leverage public social media feeds to understand what specific audiences people belong to? What messages might be most likely to have a positive impact? What if we could align social market research methods to traditional market research methods to drive a greater understanding of the customer in more dynamic ways?

    Microsoft’s R+I team has developed a method for identifying and grouping social media users based on their online behavior and comments. This is a “listening-based” approach that can be automated to sift through massive amounts of social media traffic and metadata to create business-relevant segments and to find relevant insights.

    How a “listening-based” approach to social intelligence works

    For most social users, their profile encompasses their varying identities. A single profile could perhaps represent a marketing professional, a mother, and someone who likes to travel and dine out. Within this profile, that individual is apt to identify her interests and offer commentary on each of them. A Twitter profile bio, for instance, may contain the line: “Professional marketer, mom of two kids, can’t wait for next trip to France.” Chances are, this individual makes social posts relevant to each of those three personae, as well as a variety of other topics.

    The advantage of our approach is that our audience members self-identify, then engage in natural and organic conversations in a way that can’t be replicated through traditional market research. By utilizing biographical word tagging in social media, we strive to achieve a minimum inclusion accuracy of 80 percent for audience membership and conversation relevancy. This means that when we analyze 100 profiles in a social media grouping, at least 80 of those profiles are relevant to the grouping. For topic inclusion, we apply a similar methodology. This ensures our data is clean before we analyze it at the aggregate level.

    We can expand on the biographical word tagging, using the frequency and relevancy of the terms mentioned, to bucket groups for increased accuracy. For example, take software developers. By assigning points based on mentions of developer languages (1 point), developer tools (2 points), developer conferences (3 points), and developer domains (5 points), we can identify group members and segment them into amateur or professional developers, with those amassing the most points seen as the “most active” developers (see the sketch below). This methodology is just one example of how weighted behavior across specific audiences can be leveraged to create meaningful groupings that are then used to distill business-relevant insights.
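    Here is a minimal sketch of that point scoring. The point values (languages 1, tools 2, conferences 3, domains 5) follow the example above; the keyword lists, sample bio, and bucket thresholds are invented for illustration.

    ```python
    import re

    WEIGHTED_TERMS = {
        1: {"python", "c++", "javascript"},      # developer languages
        2: {"vscode", "git", "docker"},          # developer tools
        3: {"pycon", "kubecon"},                 # developer conferences
        5: {"github.com", "stackoverflow.com"},  # developer domains
    }

    def developer_score(text: str) -> int:
        """Sum points for every weighted term appearing in a profile or post."""
        tokens = set(re.findall(r"[\w.+#]+", text.lower()))
        return sum(points
                   for points, terms in WEIGHTED_TERMS.items()
                   for term in terms if term in tokens)

    def bucket(score: int) -> str:
        """Map a raw score to an illustrative segment."""
        if score >= 8:
            return "professional developer"
        if score >= 3:
            return "amateur developer"
        return "outside the developer segment"

    bio = "Speaking at PyCon! Python and Docker fan, code at github.com"
    score = developer_score(bio)
    print(score, "->", bucket(score))  # 11 -> professional developer
    ```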

    [Figure: example output analyzing what resonates with self-identified Microsoft employees and partners vs. the general population, and themes in the hybrid work discussion.]

    We can perform similar group identification among commercial, educational, and consumer customers, as well as Microsoft-centric customers (such as partners or Xbox fans). We can also infer group identification without direct self-identification. We know, for instance, that 70 percent of millennials follow Dan Price, the CEO of Gravity Payments. So if we come across an individual who follows Dan Price, alongside other social behavioral markers scored similarly to the developer example above, we can infer that he or she is likely a millennial. This is a broad example, but you get the idea.

    Customer privacy is at the core of what we do. When we analyze user conversations, we do so at the aggregate level, not the individual level, to ensure personally identifiable information (PII) is not reported. It is built into our best practices to scrub author names and usernames from the records when we report on these audience segments.

    What’s next for social insights?

    The most important piece is using our initial data to understand common characteristics in our audiences and to isolate common social traits within each audience. With that, we can broaden our groups by understanding the relevancy of behavior by the audience and then scoring (or weighting) those behaviors.

    Once we’re able to group a certain number of identified behaviors, our understanding of the audience becomes more accurate. We’ll also be able to include individuals who don’t self-identify as part of our target audience. If they follow patterns similar to people who do self-identify, then we can include them within a group. At the end of the day, all of this is done to better understand how to address the needs of our customers in order to continue to empower them to achieve more.

    Authors: Allie Webster & Justin Schoen

    Source: Greenbook

  • The striking similarities between presenting analytics and telling jokes

    Everyone is familiar with the age-old adage that if you must explain a joke after you tell it, then the joke is a flop. The same principle holds when you put data in front of a live audience, whether with a table, a graph, or a chart. This blog will clarify what at first seems an unlikely comedic connection.

    The link between comedy and analytics

    No matter how funny a joke may be, it will not be funny if someone does not immediately understand what it is that makes the joke funny. Once explained, the person may logically understand why the joke is funny, but they will not experience the humor in the same way they would have if they had gotten the joke on their own. Somehow, the humor is only truly felt if you “get” the joke both immediately and on your own.

    When I discuss this topic during a session, I often ask my audiences how charts and graphs are like jokes. Over time, I have received several good answers beyond the one I am looking for. Some of the legitimate ways that audience members have tied jokes to charts and graphs include:

    a) Most are bad

    b) Few people are good at delivering them

    c) The best ones are simple

    d) Context can heavily influence audience reception

    e) If you have to explain it, you’ve failed

    All of those are true, but for this blog, we are going to focus on answer e). Just as you’ve failed in your humor if you have to explain your joke, you’ve failed in your analytics presentation if you have to provide an explanation for your charts and graphs.

    Why simplicity matters

    Whatever format your data is presented in, it is important that it is easy for your audience to comprehend the core information and the point you are making about that information very quickly and with limited effort. If you achieve this, then the audience will remain focused on the narrative and context that you provide to support the chart or graph. This is important for several reasons:

    1. When you are presenting, you want people listening to you and the story you are telling. You do not want them struggling to understand the data projected on the screen
    2. The more an audience struggles to understand what you are showing them, the more they lose interest and the lower your credibility goes
    3. People trust experts that they understand. Want to be trusted? Then be understood!
    4. People walk away impressed and thinking highly of a presentation if the information provided was clear and easy to comprehend
    5. Technical experts have a reputation for being hard to understand, so if you can surprise the audience by making things simple, you will have a win

    As you develop a presentation, always force yourself to look at what you have drafted through the eyes of the audience it is intended for. What may seem obvious and simplistic to you as an expert may not be perceived the same way by an audience that lacks your expertise and experience. You are used to looking at complex measures and comparing them on the fly. Your audience won’t be as comfortable with that as you are and will need to have information provided at a level that they can easily absorb.

    Make your audience want to attend another show

    People will not go see a comedian a second time if many of the comedian’s jokes are hard to understand because that takes the fun out of the show. Similarly, if you spend a lot of time explaining your charts and graphs, the audience will not be inclined to come to another presentation of yours (at least not happily).

    With a little effort and attention, you can create a presentation that includes compelling and effective charts and graphs while also enabling your audience to easily follow along. To do this you must always remember that as with a joke, if you must explain a chart or graph for people to get your point, then you have failed.

    Author: Bill Franks

    Source: Datafloq

  • Using data successfully: the role of data democratization

    An effective culture to underpin your strategy

    A business that looks to become truly data-driven knows that employees are more likely to back the wider strategy if they have access to data that helps them do their jobs better. Data democratization and the positive culture it can create is, therefore, critical to the long-term success of any organization.

    According to a recent report, Data Strategy and Culture: Paving the Way to the Cloud, senior decision-makers are confident that they’re opening up access to data sufficiently.

    So do your employees at all levels actually have adequate access to data to boost their decision-making? Does the data at their disposal and how they work with it turn employees into strong advocates for your organization’s data strategy? I seek to address these questions.

    This blog focuses on the crossover between data strategy and deployment decisions. It covers:

    • Why data democratization is critical to developing a positive data culture
    • What the main barriers to it are
    • What else you can learn

    Data democratization: room for improvement

    Successful organizations identify the key capabilities that are required to execute their data strategy effectively. Infrastructure decisions are an important part of this as any limitations can cause frustration and poor engagement — ultimately, the wrong choice can restrict how well an employee can perform in their role.

    Almost four out of five respondents to our survey say their current IT infrastructure makes it challenging to democratize data in their organization. This is a significant obstacle to be overcome. There are additional barriers, too, such as a lack of relevant data skills or too many new data sources.

    At this point, businesses have to focus on which deployment model best meets their needs. On the topic of data democratization, many will naturally think of the benefits the cloud can bring. The right deployment model allows for data sharing in a secure and cost-effective manner across all levels and departments. It allows people, and therefore the company, to perform at their best.

    Don’t limit your potential

    Despite the importance of this, almost half (46%) of respondents to our latest research believe that the democratization of data isn’t feasible for them.

    This could be a big risk. If your technology infrastructure doesn’t allow you to open up access to data across the whole business, you’re stopping your organization from becoming truly data-driven. This could ultimately mean that insights can’t be gathered quickly enough, projects stall, and a competitive edge is lost.

    Make the data work

    There is a clear need for organizations to carefully consider which deployment option gives them the freedom needed to effectively open up access to data. Yet, the story doesn’t end once a decision has been made.

    Teams must constantly monitor whether employees are able to work with the data at their disposal effectively. Can they get the insights they need from the data? Is there an ambition to increase the spread of data democratization within the organization?

    So when it comes to optimizing the success of your data strategy, data democratization is a key step in the process — and your company needs to get it right.

    The report investigates all of the key points raised in this blog and explains how developing a positive data culture starts with data democratization. This is the point when you secure your employees’ backing of the project. Only then are you truly ready to choose the right deployment model.

    Author: Mathias Golombek

    Source: Dataversity
