
  • Just Using Big Data Isn’t Enough Anymore

Big Data has quickly become an established fact for Fortune 1000 firms — such is the conclusion of a Big Data executive survey that my firm has conducted for the past four years.

    The survey gathers perspectives from a small but influential group of executives — chief information officers, chief data officers, and senior business and technology leaders of Fortune 1000 firms. Key industry segments are heavily represented — financial services, where data is plentiful and data investments are substantial, and life sciences, where data usage is rapidly emerging. Among the findings:

    • 63% of firms now report having Big Data in production in 2015, up from just 5% in 2012
    • 63% of firms reported that they expect to invest more than $10 million in Big Data by 2017, up from 24% in 2012
    • 54% of firms say they have appointed a Chief Data Officer, up from 12% in 2012
    • 70% of firms report that Big Data is of critical importance to their firms, up from 21% in 2012
    • At the top end of the investment scale, 27% of firms say they will invest more than $50 million in Big Data by 2017, up from 5% in 2015

    Four years ago, organizations and executives were struggling to understand the opportunity and business impact of Big Data. While many executives loathed the term, others were apostles of the belief that data-driven analysis could transform business decision-making. Now, we have arrived at a new juncture: Big Data is emerging as a corporate standard, and the focus is rapidly shifting to the results it produces and the business capabilities it enables. When the internet was a new phenomenon, we’d say “I am going to surf the World Wide Web” – now, we just do it. We are entering that same phase of maturity with Big Data.

    So, how can executives prepare to realize value from their Big Data investments?

    Develop the right metrics.

    While a majority of Fortune 1000 firms report implementing Big Data capabilities, few firms have shown how they will derive business value over time from these often substantial investments. When I discuss this with executives, they often point out that the lack of highly developed metrics is a function both of the relative immaturity of Big Data implementations and of where in the organization sponsorship for Big Data originated and where it currently reports. Organizations in which the executive responsible for data reports to the Chief Financial Officer are more likely to have developed precise financial measurements early on.

    Another issue with measuring the effectiveness of Big Data initiatives has been the difficulty of defining and isolating their costs. Big Data has been praised for the agility it brings to organizations, because of the iterative process by which they can load data, identify correlations and patterns, and then load more data that appears to be highly indicative. By following this approach, organizations can learn through trial and error. This poses a challenge to early measurement because most organizations have engaged in at least a few false starts while honing Big Data environments to suit their needs. Due to immature processes and inefficiencies, initial investments of time and effort have sometimes been larger than anticipated. These costs can be expected to level off as experience and efficiencies are brought to bear.

    Identify opportunities for innovation.

    Innovation continues to be a source of promise for Big Data. The speed and agility it permits lend themselves to discovery environments such as life sciences R&D and target marketing activities within financial services. Success stories of Big-Data-enabled innovation remain relatively few at this stage. To date, most Big Data accomplishments have involved operational cost savings or enabling the analysis of larger and more diverse data sets.

    For example, financial firms have been able to enhance credit risk capabilities through the ability to process seven years of customer credit transactions in the same amount of time that it previously took to process a single year, resulting in much greater credit precision and lower risk of credit fraud. Yet, these remain largely back-office operations; they don’t change the customer experience or disrupt traditional ways of doing business. A few forward-thinking financial services firms have made a commitment to funding Big Data Labs and Centers of Excellence. Companies across industry segments would benefit from making similar investments. But funding won’t be enough; innovating with Big Data will require boldness and imagination as well.

    Prepare for cultural and business change.

    Though some large firms have invested in optimizing existing infrastructure to match the speed and cost benefits offered by Big Data, new tools and approaches are displacing whole data ecosystems. A new generation of data professionals is now emerging. They have grown up using statistical techniques and tools like Hadoop and R, and as they enter the workplace in greater numbers, traditional approaches to data management and analytics will give way to these new techniques.

    When I began advising Fortune 1000 firms on data and analytics strategies nearly two decades ago, I assumed that 95% of what was needed would be technical advice. The reality has been the opposite. The vast majority of the challenges companies struggle with as they operationalize Big Data are related to people, not technology: issues like organizational alignment, business process and adoption, and change management. Companies must take the long view and recognize that businesses cannot successfully adopt Big Data without cultural change.

    Source: Harvard Business Review

  • What to use for your next market research? Online panel or programmatic sample?

    Technology is driving growth across industries, creating space for unconventional ideas and technological innovations that infiltrate traditional models and disrupt the status quo. Learn how to decide between a programmatic sample and an online panel for your next research study.

    Companies unable to pivot find themselves in the fight of their lives. Peer-to-peer ride-sharing services such as Uber and Lyft, for example, have wounded the taxi industry, and entertainment streaming services like Netflix and Hulu are slowly sending linear TV to an early grave. Technology’s profound impact on industry in general, and on service industries like the online sample industry in particular, will only intensify as emerging technologies like AI and machine learning inch toward mass adoption.

    But let’s turn the clock back to the mid-2000s and talk about the technology that upended the way people communicate: social media. Myspace, Hi5, and Facebook were in their infancy at that time, and Google launched Gmail, strategically providing millions of users with free personal email addresses to access its suite of services, including the now-defunct Google+. What most people don’t realize is that social networks and the easy access to user email addresses changed the market research industry forever, and here’s how.

    In this corner: Old School 'Online Panel' 

    Online panels are communities of individuals sourced by sample providers to take online surveys. This business model is cost-effective for market researchers as thousands of people can be interviewed in a fraction of the time for a fraction of the cost. Social media provides an alternative way to collect a sample by accessing broader online communities.

    There are many benefits to traditional online panels. Sample providers can create more niche communities because they have more control over who enters the community. Prospective respondents undergo a series of quality control checks to determine whether they are a good fit; those who pass muster are accepted into the panel. The most effective panels provide live customer service to respondents, typically by phone with a real person on the other end to answer the respondent’s questions. In successful cases, the panelist moves from the panel to the best-matched survey (double opt-in) and typically takes 5-10 surveys per year. This process improves data quality.
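    The screening-and-enrollment flow described above can be sketched roughly as a simple gate. This is a minimal illustration only; the check names, fields, and thresholds are hypothetical assumptions, not any provider’s actual quality-control rules:

    ```python
    # Hypothetical sketch of a panel's double opt-in screening flow.
    # Field names and checks are illustrative assumptions, not any
    # sample provider's real quality-control process.

    def passes_quality_checks(respondent: dict) -> bool:
        """Run a series of quality control checks on a prospective panelist."""
        checks = [
            respondent.get("email_verified", False),        # confirmed the signup email
            respondent.get("profile_complete", False),      # filled in demographics
            not respondent.get("duplicate_account", True),  # no duplicate detected
        ]
        return all(checks)

    def enroll(respondent: dict, panel: list) -> bool:
        """Double opt-in: accept into the panel only after all checks pass."""
        if passes_quality_checks(respondent):
            panel.append(respondent)
            return True
        return False

    panel: list = []
    candidate = {"email_verified": True, "profile_complete": True,
                 "duplicate_account": False}
    enroll(candidate, panel)
    print(len(panel))  # 1
    ```

    The point of the gate is that only vetted respondents ever reach a survey, which is what underpins the data-quality advantage of the panel model.
    
    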

    The problem with traditional online panels is keeping up with demand. The sample company must have bid managers and project managers available to process client requests 24 hours a day, which erodes the cost savings previously mentioned.

    Now, let’s fast forward to the mid-2010s.

    In this corner: New Age 'Programmatic Sample'

    By this time, the internet is faster, Wi-Fi is ubiquitous, and artificial intelligence is making our devices smarter. Technology is cheaper and more accessible, and the term 'programmatic' embeds itself in industry vernacular. A new business model emerges for the online sample industry: programmatic sampling, built on routers and APIs, which has since become the preferred model among sample companies.

    A programmatic sample platform connects to and tracks user behavior across multiple internet mediums, from websites and social media to affiliate sites and online panels, with data captured through an API (Application Programming Interface). This new technology aggregates millions of people from across the internet and places them at the ready for the next available online survey with a few clicks of a mouse.

    The benefits of a programmatic sample are straightforward. It can run 24 hours a day, seven days a week, 365 days a year, all over the world, with minimal or no intervention from project managers or bid managers.

    But programmatic has its challenges as well. Interaction with respondents is greatly diminished or disappears entirely, and respondents are given the blanket label of 'online traffic'. The online traffic that enters the routers is less controlled, and the respondent’s survey experience is less than ideal: they encounter multiple survey windows before reaching the actual survey. The double opt-in model is lost, becoming more of a quintuple opt-in model, which some respondents dislike. As a result, most respondents in this model take only one or two surveys in a lifetime and then leave the platform forever.

    However, the router keeps the wheels turning and consistently generates new online traffic (online respondents). Like a Ferris wheel, people hop on and hop off, but a programmatic sample keeps adding new people and continues to do so until it is turned off.
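    The router behavior described above, continuously admitting fresh traffic and matching each respondent to an open survey or bouncing them off the platform, can be sketched as follows. All names, fields, and the single-criterion matching rule are illustrative assumptions, not any vendor’s actual routing logic:

    ```python
    # Minimal sketch of a programmatic sample router: traffic keeps
    # arriving and is matched to the first open survey it qualifies for.
    # Survey definitions and the matching rule are hypothetical.
    from collections import deque

    surveys = [
        {"id": "S1", "target_country": "US", "slots": 2},
        {"id": "S2", "target_country": "MX", "slots": 1},
    ]

    def route(respondent: dict):
        """Send a respondent to the first open survey they qualify for."""
        for survey in surveys:
            if survey["slots"] > 0 and respondent["country"] == survey["target_country"]:
                survey["slots"] -= 1
                return survey["id"]
        return None  # no match: the respondent bounces off the platform

    # Traffic arrives continuously; most respondents pass through once.
    traffic = deque([{"country": "US"}, {"country": "MX"},
                     {"country": "US"}, {"country": "BR"}])
    assigned = [route(r) for r in traffic]
    print(assigned)  # ['S1', 'S2', 'S1', None]
    ```

    Unlike the panel model, nothing here retains or nurtures the respondent: once routed (or rejected), they simply leave, and the router relies on a constant stream of new arrivals.
    
    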

    And the winner is?

    It’s hard to say; the match favors a tie. Some believe that traditional online panels are dying due to the high cost of maintaining them, yet this model offers better data quality and access to niche sample sectors such as U.S. Hispanics and other minority groups. The issue is speed and sample volume, where the programmatic sample model excels. Perhaps the best bet is for sample companies to offer a hybrid of both models. But even then, as technology evolves and clients feel more pressure to produce relevant marketing campaigns that generate positive ROI, programmatic sampling still has the advantage. It will be interesting to watch how technology transforms the sample industry over the next ten or so years.

    Author: Art Padilla

    Source: Greenbook Blog
