9 items tagged "Decision Making"

  • 3 Strategic Questions the Media Industry’s Future Depends On

    There is no question that the media industry is experiencing dramatic disruption on many fronts—in the way it creates content, distributes content to consumers, and monetizes audiences. These changes are driven by seismic shifts in consumer behavior and an explosion of both consumer- and B2B-facing technologies. The disruption reveals itself in the fast growth of newer content brands like Refinery29 and Vice, the increased use of technologies like Outbrain and Taboola to drive traffic, and the growth of programmatic approaches to advertising revenue. As we reflect on disruption across the industry, there are key strategic questions, all of which fundamentally consider balance:

    • What is the right balance between humans and technology across the full media and advertising ecosystem?
    • How do we maximize our creativity as an industry while integrating data-driven approaches?
    • When and how do we shift our businesses from legacy operating models to ones that better reflect the future?

    It is these tensions that now shape the most important considerations for advertisers, their agencies, and the media companies that convene audiences at scale.

    Humans and Automation

    The tension between human-driven and technology-driven capabilities is often miscast, positioning automation as a threat to the people that drive our industry. However, the more strategic opportunity is to enable humans to do what they do best and leverage technology to drive processes that are best served either by highly repeatable algorithmic tasks or by analytical complexity that surpasses the capacity of the human brain. If we draw the line carefully between these complementary approaches, we can unleash the talent in our organizations and apply humans to areas of growth and competitive differentiation. The grounding principle is: let humans do what humans do best, and let technology do what technology does best.

    Approaches to content, distribution, and monetization across the media industry all afford opportunities to explore the nuances of blending people and automation:

    Content and creative. In the content arena, long-form, quality journalism depends on the highest-caliber talent in reporting and editing. However, new technologies like CrowdTangle are better suited to spotting trends from social media to inform reporting and to identifying the optimal promotional mix for a news organization. National Public Radio and Upworthy are just two media organizations using CrowdTangle to power fast-moving social media trend analysis and news curation. Advertising also offers opportunities to blend humans and automation. On the one hand, the recent creative from Nike celebrating “losers” could only have come from the raw ideation of the best creative minds. At the automated end of the spectrum, by contrast, hyper-targeted 1:1 digital campaigns may benefit not only from precision in finding consumers but perhaps also from dynamic creative strategies matching multiple creative options with precise audience targets, an approach that can only be executed at scale through technology, via tools like CPXi’s AdReady.

    Distribution strategies. In traditional marketing communications, a well-informed brief will shape a human-driven strategy and insight-guided planning process. Smart cross-functional teams sift through ideas on the best ways to find and influence consumers to embrace a particular perception or take a specific action. While that human ideation is still critical, the inputs get exponentially richer with the right use of data and technology. The transparency and volume of social media interactions, for example, enable us to look past traditional demographic or psychographic characteristics to find clusters of consumers or conversations that are defined by data science to have mathematical density and importance as real communities. Execution against such sophisticated targeting strategies is guided by human insight but also requires powerful data analysis and technology. In any data- or technology-driven process, if people don’t connect the dots between different parts of the strategy, add judgment and context to analyses, and help frame the questions that data enables us to answer, we will not achieve the right outcome. The balance is subtle and sophisticated.
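
    One illustrative way to make that idea of mathematically dense communities concrete is community detection on an interaction graph. Below is a toy sketch using the networkx library's modularity-based algorithm; the people and follower edges are invented for the example, and a real analysis would build the graph from social media data.

    ```python
    # Sketch: finding dense conversation clusters in a small social graph.
    # The edge list is hypothetical; real edges would come from a social API.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    edges = [
        ("ana", "ben"), ("ana", "cal"), ("ben", "cal"),  # one tight cluster
        ("dev", "eli"), ("dev", "fay"), ("eli", "fay"),  # another
        ("cal", "dev"),                                  # a weak bridge between them
    ]
    graph = nx.Graph(edges)

    # Modularity-based detection surfaces groups that are denser internally
    # than their connections to the rest of the graph.
    for i, community in enumerate(greedy_modularity_communities(graph), start=1):
        print(f"community {i}: {sorted(community)}")
    ```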

    Monetization. Traditionally, marketplaces for media were largely created by people. Sales people from media companies, agency account teams serving brands, and the clients themselves connected demand with supply. Briefs from clients informed RFPs from agencies, which informed responses and pitches from media companies. Of course, at this moment, particularly for any media that is digital, supply and demand can intersect in real time via bidding in a range of auctions and exchanges. Programmatic technologies allow us to perform a match between placement and price in a highly dynamic, high-volume environment, as a complement to the people-driven processes. To declare that programmatic will become the entire marketplace of the future is too extreme and undervalues the balance between humans and technology. The future demands a balance between big ideas like sponsorships and branded entertainment that can only be developed through conversation and human ideation, and highly efficient media amplification strategies that can best be executed via technology.

    The more sophisticated our approach to balancing humans and technology, the more likely it is that we can simultaneously unleash the creativity and intelligence of our teams while enabling them to get more impact out of the ideas they create through scalable technologies.

    Creativity and Data

    Creativity sits at the core of brand stewardship, advertising, and content creation. Chief marketing officers and their teams contemplate and shape the brand attributes that best define the relationship between a product or service and its customers. Creative agencies unfurl their best ideas to make advertising memorable while informing or entertaining audiences. And at media companies, journalists, photographers, video producers, and illustrators bring their talent, skills, and experience to shaping stories and features, large and small. But the increased availability of data, and perhaps more importantly, the ability to derive meaningful insights from it, provide new opportunities to inform our creative ideas and to measure their impact.

    A willingness to embrace data strategies as part of a creative process can become a point of differentiation and advantage:

    Content creation. Whether we are shaping important news stories as journalists or producing award-winning advertising creative, storytelling is the means by which we connect messages to audiences in resonant, meaningful ways. Historically, content creation was an exclusively human process but careful blending of technology into the mix can drive even greater editorial or advertising success. Think of the story-building inspiration of a mood board, which is a collage of visual stimuli that evoke the essence, tone, identity, and intent of a potential advertising campaign hoping to reach a specific target audience. By contrast, contemplate the possibility of monitoring a data-defined cluster of that same target audience to evaluate the visual media (think Instagram, Pinterest, memes) that they might be sharing in real time on Twitter. A world of transparent social media engagement offers powerful new sources of insight into the content that most readily engages communities of customers. This data-driven approach yields a dynamic mood board algorithmically calculated based on tweet and retweet volumes within the target audience cluster (full credit to Scale Model at Betaworks for this concept). Data-driven processes cannot replace human creativity and judgment, but they can be a rich complement.
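
    To make the dynamic mood board idea concrete, here is a minimal sketch that ranks the visual posts shared within a data-defined audience cluster by original shares plus retweets. The cluster membership, posts, and weighting are all hypothetical stand-ins for what a social media API would supply.

    ```python
    # Sketch: a "dynamic mood board" as a ranking of visual posts by share
    # volume inside a target audience cluster. All data here is invented.
    from collections import Counter

    cluster_members = {"ana", "ben", "cal"}  # the data-defined audience cluster
    posts = [
        {"user": "ana", "image": "img_beach.jpg", "retweets": 40},
        {"user": "ben", "image": "img_beach.jpg", "retweets": 25},
        {"user": "cal", "image": "img_city.jpg", "retweets": 10},
        {"user": "zoe", "image": "img_cats.jpg", "retweets": 90},  # outside cluster: ignored
    ]

    volume = Counter()
    for post in posts:
        if post["user"] in cluster_members:
            # Weight each image by one original share plus its retweet count.
            volume[post["image"]] += 1 + post["retweets"]

    # The top-N images form the mood board for this cluster right now.
    for image, score in volume.most_common(3):
        print(image, score)
    ```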

    Marketing strategies. A sharp creative mind can generate ideas to engage audiences by bringing stories to life. Revlon’s Times Square billboard, which projects real-time images of people gathered below on the street over the tagline “Love Is On” (also displayed online), would not have surfaced but for creatives who connected strategy with a means to bring the brand to life. It is one of countless examples where the sheer power of human ideas defines success. But as channels and platforms proliferate, it becomes less feasible to see creativity as the sole factor in deciding how to impact audiences—from media-mix models to precise digital targeting approaches. As the options continue to multiply, data becomes an objective means to evaluate potential strategies across paid, owned, and earned channels. And, instead of traditional demographic breaks dominating the media choices, data reveals more dynamic and meaningful views of audience segmentation to elicit true engagement. Still, while data can offer a starting point for more nuanced views of clustering, it is human judgment that allows us to discern which data deserves a keener focus.

    Measurement. Data is used most robustly for measuring the results and impact of engagement strategies. Whether the metric is outcome-based, like sales, leads, and traffic, or more qualitative, like brand perception and lift, data is widely used to understand the effect of campaigns and to develop audiences. And yet, measurement is perhaps the arena most challenged on the metrics front. The fragmentation of digital platforms has fostered a lack of consistent standards, and many of the most innovative experiences rely solely on proprietary publisher-owned metrics for reporting. Impact measurement only becomes more complex as ideas are executed across platforms. And most measurement scenarios do not offer a seamless view across paid, owned, and earned data sets. Yet even as better, more consistent measurement emerges, data will only take us so far in the journey to understand impact. Thoughtful analytics are best married to human judgment to derive insight laden with broader context. Ultimately, human judgment is best poised to truly understand the more subtle dimensions of brand equity and influence.

    Data frees up time previously spent on the manual parsing and review of the many marketing, communications, and media options, offering new opportunities to apply human creativity to bigger ideas that capture audiences’ imaginations.

    Future Transformation

    The future requires change on a massive scale for most organizations, and the best approach involves leadership’s embracing the complexity not only of developing the right strategy but also of executing it with deep attention to the details that matter. Operating models can shift but require a conscious approach to a range of issues, including organizational structure, workflows, technology platforms, and overall change management. The product mix can be re-architected to rely increasingly on newer and high-growth offerings, but not before buyers are ready to embrace the new opportunities beyond experimentation. The challenge is to lead the marketplace and be sure new supply connects with demand in real time. And from a financial perspective, investment decisions and revenue expectations require careful forecasting and pacing against expectations to understand the multifaceted shift from legacy business lines to newer ones.

    Disruption of the media industry often feels like a brute force, moving quickly and without discretion. However, the way we must respond as participants in the ecosystem is quite the opposite, requiring judicious, nuanced approaches. The critical concept is to balance the tensions to drive powerful results.

  • Business Intelligence still hot…

    Business Intelligence outdated? Nothing could be further from the truth, as the Heliview conference ‘Decision making by smart technologies’, held last Tuesday in the Brabanthallen in Den Bosch, demonstrated.

    Two hundred client organizations listened to presentations by, among others, Rick van der Lans, Peter Jager, Frank de Nijs and Arent van ‘t Spijker. Alongside the familiar message, there was also plenty of news to hear in Den Bosch.

    New technologies make far more possible. Social media and modern big data technology enable organizations to extract much more value from their data. How organizations should do so is often still a challenge. Applying the technology is not a goal in itself; the point is to produce added value for the organization, whether by optimizing processes or by serving customers better through product development. In the extreme, data can even be the engine behind new business concepts or models. A precondition, however, is a clear business vision (whether or not itself produced through intelligent use of data and information), which is important to prevent millions from being squandered on unfocused technology investments.

    For the attendees, some of what they heard was familiar, but some of it also confronted them with themselves. One thing is certain: the role of data and information in doing business intelligently is far from played out. Business Intelligence is alive and well.

    30 JANUARY 2015

  • Data access: the key to better decision making


    When employees have better access to data, they end up making better decisions.

    Companies across sectors are already well in the habit of collecting relevant historical and business data to make projections and forecast the unknown future. They’re collecting this data at such a scale that 'big data' has become a buzzword technology. They want lots of it because they want an edge wherever they can get it. Who wouldn’t?

    But it’s not only the quantity and quality of the data a company collects that play a pivotal role in how that company moves forward, it’s also a question of access. When businesses democratize access to that data such that it’s accessible to workers throughout a hierarchy (and those workers end up actually interacting with it), it increases the quality of decisions made on lower rungs of the ladder. Those decisions end up being more often data-informed, and data is power.

    But that’s easier said than done lately. Businesses have no issue collecting data nowadays, but they do tend to keep it cordoned off.

    Data sticks to the top of a business hierarchy

    A business’s C-suite (often with help from a technical data science team) makes the big-picture decisions that guide the company’s overall development. This means the employees using data to inform a chosen course of action (like last year’s revenue versus this year’s revenue, or a certain client’s most common order) are either highly ranked within the company, or are wonky data specialists. Data lives behind a velvet rope, so to speak.

    But this data would be eminently useful to people throughout an organization, regardless of their rank or tenure. Such a level of access would make it more likely that data guides every decision, and that would lead to more desirable business outcomes over time. It might even overtly motivate employees by subtly reinforcing the idea that results are tracked and measured.

    Data tends not to trickle down to the appropriate sources

    Who better to have a clear view of the business landscape than the employees who work the front lines every day? What would change if disparate employees scattered throughout an organization suddenly had access to actionable data points? These are the people positioned to actually make a tweak or optimization from the get-go. Whoever comes up with a data-informed strategy for the way forward, these are the people who actually implement it. But an organization-level awareness of an actionable data point doesn’t necessarily equate to action.

    As previously established, data has a high center of gravity. It is managerial food for thought on the way to designing and executing longer-term business strategies.

    But when companies change their culture around access to data and make it easy for everyone to interact with data, they make every worker think like such a strategist.

    By the time a piece of data reaches an appropriate source, it’s not necessarily in a form he or she can interact with or understand

    As much as managers might like to think otherwise, there are people in their organization thinking in less than granular terms. They aren’t necessarily thinking about the costs their actions may or may not be imposing on the company, and they don’t think about the overall bottom line. That’s why it’s important that data be in a form that people can use and understand, because it doesn’t always reach them that way.

    Getting data into a usable, understandable form happens by preserving connections between departments and avoiding disconnects.

    There seems to be a big data disconnect at the intersection of engineering and product development

    This intersection is where a business’s technical prowess meets its ability to design a great product. While the two pursuits are clearly related to one another on the way to great product design, it’s rare for one person to excel at both.

    The people who design groundbreaking machine learning algorithms aren’t necessarily the people who design a groundbreaking consumer product, and vice versa. They need each other’s help to understand each other.

    But data is the shared language that makes understanding possible. Not everyone has years of data science training, not everyone has business leadership experience, but even people doing menial things can still benefit from great access to data. Coming across the year’s growth goal, for example, might trigger a needle-moving idea from someone on how to actually get there. Great things happen when employees build a shared understanding of the raw numbers that drive everything they do.

    Businesses already collect so much data in the course of their day-to-day operations. But they could start using that data more effectively by bringing it out from behind the curtain and giving employees across the board easy ways to access and interact with it. The motivation for doing so should be clear: when more people think about the same problem in the same terms, that problem is more likely to be solved.

    All they need is access to the data that makes it possible.

    Author: Simone Di Somma

    Source: Insidebigdata

  • Data integration applied to BI: making data useful for decision making


    In this technology-driven world, the influx of data can seem overwhelming if it is not properly utilized. With data coming in from so many different sources, the only way to extract real insights from these raw inputs is through integration.

    Properly integrated data has a trickle-down effect on all business processes, such as sales, vendor acquisition, customer management, and business intelligence. Implementing this level of integration enables businesses to make continuous improvements to their products and services.

    Business intelligence (BI) is one of the most significant data integration use cases. An effective BI process incorporates everything from predictive analytics to reporting and operations management. But this sort of comprehensive analytics framework requires integrated enterprise data to identify process inefficiencies, missed opportunities, and other improvement areas.

    What complicates BI integration?

    Given that enterprise information comes from different sources in varying formats and often contains inconsistencies, duplicates, and errors, users must ensure that quality issues identified during the data extraction process do not propagate to their end results. Left unchecked, these quality issues compromise the integrity and accuracy of reporting, which in turn degrades decision making and leads to further inefficiencies across business processes.

    Creating well-defined integration processes that not only consolidate data but standardize it for consistency and quality can make high-quality data readily available for decision making.
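
    As a small illustration of what such a standardization step can look like, the sketch below normalizes email and date formats from two hypothetical sources and drops duplicates before the data reaches the BI layer; the field names, date formats, and choice of email as the dedup key are all assumptions for the example.

    ```python
    # Sketch: standardizing and de-duplicating customer records pulled from
    # two hypothetical sources before they reach the BI layer.
    from datetime import datetime

    customers = [
        {"email": "Jo@Example.com ", "joined": "2021-03-05", "source": "crm"},
        {"email": "jo@example.com", "joined": "05/03/2021", "source": "billing"},
    ]

    def standardize(record):
        # Normalize the fields the two sources disagree on.
        record = dict(record)
        record["email"] = record["email"].strip().lower()
        raw = record["joined"]
        fmt = "%Y-%m-%d" if "-" in raw else "%d/%m/%Y"  # assumed source formats
        record["joined"] = datetime.strptime(raw, fmt).date().isoformat()
        return record

    seen, clean = set(), []
    for record in map(standardize, customers):
        if record["email"] not in seen:  # email as the assumed unique key
            seen.add(record["email"])
            clean.append(record)

    print(clean)  # one consistent record instead of two conflicting ones
    ```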

    Streamlining BI integration: best practices

    Raw data becomes valuable when transformed into analytics-ready, actionable information. By bringing disparate formats together into a unified data repository, an integrated BI system offers better visibility into enterprise assets and greater efficiency.

    Successful BI initiatives therefore combine an effective integration strategy with an effective analytics strategy. The best practices below can help you make the most of both:

    Document a BI strategy

    Every business has a reporting process in place. Before implementing a new BI strategy, it’s important to evaluate existing systems to identify the areas that need improvement. Based on that information, you can design a new strategy, which can include several components depending on your specific business structure. However, the major ones that cannot be ignored include the following:

    • Narrow down the data source channels essential for your reporting process. This may consist of stakeholder or departmental information from databases, files, or web sources.
    • The purpose of BI tools is to track business KPIs with supporting data. Identifying the custom KPIs for your organization is therefore imperative in presenting a broad picture of your business growth and losses.
    • Set a format for reporting: visual or textual. Based on your preferences and the input sources, you can select a vendor for the BI system. (A minimal sketch of how these components can be documented follows this list.)
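
    As an illustration only, a strategy along these lines can be captured as a small, reviewable configuration. Every source name, KPI formula, and vendor field below is hypothetical.

    ```python
    # Sketch: the three strategy components above as a reviewable config.
    BI_STRATEGY = {
        "sources": ["postgres:orders", "salesforce:accounts", "csv:ad_spend"],
        "kpis": {
            "monthly_recurring_revenue": "sum(subscriptions.amount)",
            "churn_rate": "lost_customers / customers_at_start",
        },
        "reporting": {"format": "visual", "vendor": "tbd"},
    }

    for name, formula in BI_STRATEGY["kpis"].items():
        print(f"KPI {name} <- {formula}")
    ```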

    Set up data integration tools

    The integration stage of the entire process will be time-consuming. You can go about it in two ways:

    • Opt for the manual approach, where you rely on your developers and IT team to develop a BI architecture for your custom requirements.
    • The simpler and faster approach would be to buy an enterprise-ready integration solution from the market. These solutions extract data from different sources using built-in connectors, transform it into the required format, and load it into the destination system that is connected to BI tools. Several data integration solutions offer out-of-the-box connectivity to BI tools, so purchasing one can serve the dual purpose of integration and reporting (see the sketch after this list).
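
    The sketch below illustrates the flow such a tool automates; the connector and warehouse objects are in-memory stand-ins invented for the example, not any vendor's actual API.

    ```python
    # Sketch: the extract -> transform -> load flow an integration tool runs
    # on your behalf, with stand-in source and destination objects.
    class Connector:  # stands in for a built-in SaaS source connector
        def fetch_records(self):
            yield {"id": 1, "amount_cents": 1999}
            yield {"id": 2, "amount_cents": 550}

    class Warehouse:  # stands in for the destination the BI tool reads
        def insert_rows(self, table, rows):
            print(f"loaded {len(rows)} rows into {table}")

    def transform(records):
        for r in records:
            r["amount_usd"] = round(r["amount_cents"] / 100, 2)  # required format
            yield r

    Warehouse().insert_rows("billing_events", list(transform(Connector().fetch_records())))
    ```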

    Factor in data security

    Setting up security measures before implementing BI is imperative in protecting your information assets against data breaches. By configuring authorization or authentication protocols and outlining procedures to carry out secure data processes, you can control access to data sets.
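
    As a minimal illustration of dataset-level access control, the sketch below checks a role against a policy table before a read. The roles, datasets, and policy model are invented for the example and not tied to any particular product.

    ```python
    # Sketch: a dataset-level authorization check with an illustrative policy.
    ACCESS_POLICY = {
        "finance_analyst": {"revenue", "invoices"},
        "support_agent": {"tickets"},
    }

    def can_read(role, dataset):
        return dataset in ACCESS_POLICY.get(role, set())

    assert can_read("finance_analyst", "revenue")
    assert not can_read("support_agent", "revenue")  # denied by policy
    ```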

    BI is no longer a privilege reserved for enterprises; it’s a necessity that enables organizations to stay ahead of the competition and optimize decision-making.

    Identifying the challenges in their reporting journey and implementing the best practices mentioned above will help organizations leverage BI’s capabilities and become data-focused.

    Author: Ibrahim Surani

    Source: Dataversity

  • Finance forced to modernize by digitalization


    Driven by technology and growing transparency requirements, finance is becoming the data hub of the organization. How does that work in practice? Three CFOs offer a look behind the scenes.

    The finance function is modernizing. With real-time analytics, data available anytime and anywhere, new collaboration tools, and a considerable dose of adaptability, it is taking on the constantly changing circumstances in which the company must prove its added value to its stakeholders. 'In practically every industry the revenue model is under pressure,' says Robert van de Graaf, a CFO with broad experience in the financial sector. 'And everywhere, people feel the need to formulate answers to that. It is, after all, a matter of "to disrupt or to be disrupted".' That finance plays a leading role in that process is, in Van de Graaf's eyes, 'logical'. 'Finance is, after all, the guardian of the continuity of the company and of the business model that goes with it.'

    Data head start

    The sensational examples, Uber and Airbnb threatening entire industries, naturally attract the most attention, but the changes are not always immediately drastic and visible. Which, in Van de Graaf's view, takes nothing away from the need to confront them. 'In many industries companies still have a data head start over their customers, but in three to five years that will no longer be the case. Think of the impact of the Internet of Things in the home, or of scanners with which you can assess your own health. If you wait until that moment arrives, it is too late. You have to start answering now the question of what that development will mean for your company.'

    In its role as a driver of that process, finance must step out of its comfort zone, Van de Graaf believes. 'I would like to see finance encourage short-cycle projects. Don't demand a business case up front; agree instead, for example, that within three months it will be clear whether or not a customer need has been tapped. Keep the teams small at first, because that fosters creativity.'

    Volume decline

    It is a process that furnishings company Koninklijke Ahrend knows from experience. 'The real estate crisis of 2008 halved the European market for office furnishing and thus had a major impact on this company's revenue,' says CFO Rolf Verspuij. The effects of trends such as digitalization then came on top of that. 'Flexible and remote working caused a further decline in volume. Sales of cabinets and workstations were what kept the company afloat, but that era was clearly coming to an end.'

    In 2012 the company therefore changed course. Finance 'naturally' had a leading role in that change operation, says Verspuij, who joined Ahrend in early 2012. 'Ultimately it comes down to meeting financial targets. For every component of the new business model, you have to determine whether, and to what extent, it contributes to the result.' Moreover, the new course required adapting information systems to provide more insight into performance.

    'Flexibility and adaptability have become very important for our organization,' says Verspuij, 'because we now distinguish ourselves through market focus. Ahrend used to be more or less a production house: we developed and produced a fine product and sold it in high volumes. Now the motto is listening to the market. We then respond with new concepts and all kinds of smart furnishing solutions.'

    Registration system

    In doing so, Ahrend now makes use of digitalization itself: if you can't beat them, join them. 'For instance, we make life easy for the facility manager by equipping furniture with advanced technologies that enable usage registration and the creation of overview. Management, maintenance, and cost control can then be carried out more efficiently.' It is also becoming interesting to apply innovations that make use of the mobile phone.

    Thanks to all these changes, Ahrend's revenue has risen again, even above its 2007 level, partly through a number of acquisitions last year. The changes also make a visibly positive contribution to results. 'We are not there yet,' says Verspuij, 'but big steps have been taken.'

    Interplay

    At HVC Groep, a waste, energy, and raw materials company, market shifts have likewise led to digitalization of the product and service portfolio; HVC is fitting waste containers with chips, for example. The availability of data is a driving factor in broadening the information provision by the finance function, says CFO Ingrid Tigchelaar. 'And there is clearly an interplay between supply and demand. The call for transparency and the technological possibilities for data collection and analysis reinforce each other.' HVC Groep is owned by public authorities and in that sense is a 'classic' utility. 'That means numerous stakeholders who want ever more information about the organization's performance and operations,' says Tigchelaar. 'Transparency has become a societal norm.'

    She is currently working on the transformation needed to meet that norm. 'At its core, HVC Groep is a continuously operating process technology company. We were already used to collecting a great deal of data about operations, mainly to monitor the continuity and safety of our business processes. However, that data was always used internally only. Making it suitable for other stakeholders requires a quality upgrade; the outside world simply sets different quality requirements for that information. With all the consequences that entails for the ordering, organization, reporting, and disclosure of that data.'

    Reliability

    Characteristic of the way finance is developing, says Tigchelaar, is that financial and non-financial information are becoming ever more intertwined. 'In this type of company the challenges do not lie in the processing of financial data; that is in good order. What matters is being able to connect that financial data to all the other data that is important for operations. A single source of information: that enormously increases its reliability.'

    So what data are we talking about? Tigchelaar cites the amount of recycled waste as an example. 'Agreements on this have been made with the national government in the so-called VANG targets. For instance, 75 percent of waste must be recycled by 2020. These targets have been adopted by local authorities, and they want to be accountable to their citizens. That means that we, as a collector and processor of waste, must provide information about it; this is also laid down in the service agreements with our stakeholders.'

    Another example is the recently concluded fire and accident insurance. 'Here, good registration of all incidents in the company is of great importance. We were already doing that, but only for internal use. Showing external stakeholders that we are in control places extra demands on the collection and processing of the relevant data.'

    Source: www.fd.nl

  • How a Video Game Helped People Make Better Decisions

     

    Researchers in recent years have exhaustively catalogued and chronicled the biases that affect our decisions. We all know the havoc that biased decisions can wreak. From misguided beliefs about the side effects of vaccinating our children, to failures in analysis by our intelligence community, biases in decision making contribute to problems in business, public policy, medicine, law, education, and private life.

    Researchers have also long searched for ways to train people to reduce bias and improve their general decision making ability – with little success. Traditional training, designed to debias and improve decision-making, is effective in specific domains such as firefighting, chess, or weather forecasting. But even experts in such areas fail to apply what they’ve learned to new areas. Weather forecasters, for instance, are highly accurate when predicting the chance of rain, but they are just as likely as untrained novices to show bias when making other kinds of probability estimates, such as estimating how many of their answers to basic trivia questions are correct.

    Because training designed to improve general decision making abilities has not previously been effective, most efforts to debias people have focused on two techniques. The first is changing the incentives that influence a decision. Taxing soda, for example, in the hopes that the increased cost will dissuade people from buying it. The second approach involves changing the way information for various choices is presented or choices are made, such as adding calorie information to fast-food menus or offering salad as the default side order to entrées instead of French fries. However, these methods are not always effective, and even when they are, they affect only specific decisions, not decision-makers’ ability to make less biased decisions in other situations.

    My research collaborators and I wondered if an interactive training exercise might effectively debias decision-makers. (The team included Boston University’s Haewon Yoon, City University London’s Irene Scopelliti, Leidos’ Carl W. Symborski, Creative Technologies, Inc.’s James H. Korris and Karim Kassam, a former assistant professor at Carnegie Mellon University.) So we spent the past four years developing two interactive, “serious” computer games to see if they might substantially reduce game players’ susceptibility to cognitive bias.

    There was scant evidence that this kind of one-shot training intervention could be effective, and we thought our chances of success were slim. But, as we report in a paper just published in Policy Insights from the Behavioral and Brain Sciences, the interactive games not only reduced game players’ susceptibility to biases immediately; those reductions also persisted for several weeks. Participants who played one of our games, each of which took about 60 minutes to complete, showed a large immediate reduction in their commission of the biases (by more than 31%), and showed a large reduction (by more than 23%) at least two months later.

    The games target six well-known cognitive biases. Though these biases were chosen for their relevance to intelligence analysis, they affect all kinds of decisions made by professionals in business, policy, medicine, and education as well. They include:

    • Bias blind spot – seeing yourself as less susceptible to biases than other people
    • Confirmation bias – collecting and evaluating evidence that confirms the theory you are testing
    • Fundamental attribution error – unduly attributing someone’s behavior to enduring aspects of that person’s disposition rather than to the circumstance in which the person was placed
    • Anchoring – relying too heavily on the first piece of information considered when making a judgment
    • Projection – assuming that other people think the same way we do
    • Representativeness – relying on some simple and often misleading rules when estimating the probability of uncertain events

    We ran two experiments. In the first experiment, involving 243 adult participants, one group watched a 30-minute video, “Unbiasing Your Biases,” commissioned by the program sponsor, the Intelligence Advanced Research Projects Activity (IARPA), a U.S. research agency under the Director of National Intelligence. The video first defined heuristics – information-processing shortcuts that produce fast and efficient, though not necessarily accurate, decisions. The video then explained how heuristics can sometimes lead to incorrect inferences. Then, bias blind spot, confirmation bias, and fundamental attribution error were described and strategies to mitigate them were presented.

    Another group played a computer game, “Missing: The Pursuit of Terry Hughes,” designed by our research team to elicit and mitigate the same three cognitive biases. Game players make decisions and judgments throughout the game as they search for Terry Hughes – their missing neighbor. At the end of each level of the game, participants received personalized feedback about how biased they were during game play. They were given a chance to practice and they were taught strategies to reduce their propensity to commit each of the biases.

    We measured how much each participant committed the three biases before and after the game or the video. In the first experiment, both the game and the video were effective, but the game was more effective than the video. Playing the game reduced the three biases by about 46% immediately and 35% over the long term. Watching the video reduced the three biases by about 19% immediately and 20% over the long term.

    In a second experiment, involving 238 adult participants, one group watched the video “Unbiasing Your Biases 2” to address anchoring, projection, and representativeness. Another group played the computer detective game “Missing: The Final Secret,” in which they were to exonerate their employer of a criminal charge and uncover the criminal activity of her accusers. Along the way, players made decisions that tested their propensity to commit anchoring, projection, and representativeness. After each level of the game, their commission of those biases was measured and players were provided with personalized feedback, practice, and mitigation strategies.

    Again, the game was more effective than the video. Playing the game reduced the three biases by about 32% immediately and 24% over the long term. Watching the video reduced the three biases by about 25% immediately and 19% over the long term.

    The games, which were specifically designed to debias intelligence analysts, are being deployed in training academies in the U.S. intelligence services. But because this approach affects the decision maker rather than specific decisions, such games can be effective in many contexts and decisions – and with lasting effect. (A commercial version of the games is in production.)

    Games are also attractive because once such approaches are developed, the marginal costs of debiasing many additional people are minimal. As this and other recent work suggests, such interactive training is a promising addition to the growing suite of techniques that improve judgment and reduce the costly mistakes that result from biased decision making.

    Source: http://www.scoop.it/t/strategy-and-competitive-intelligencebig

     

  • How modern ETL processes lead to better decision making


    We live in a world of data: there’s more of it than ever before, in a ceaselessly expanding array of forms and locations. Dealing with data is your window into the ways data teams are tackling the challenges of this new world to help their companies and their customers thrive.

    In recent years we’ve seen data become vastly more available to businesses. This is mostly due to the rise of data warehouses, decrease in cost for data storage, and modern ETL tools that make storing and accessing data more approachable than ever before. This has allowed companies to become more and more data driven in all areas of their business. In fact, being data driven has become ubiquitous and imperative to survival in today’s climate. This article will discuss at a high level how modern businesses are leveraging new technology to ingest a wider variety of data sources. 

    As of 2019, according to Blissfully’s 2020 SaaS trends report, companies in all phases of business growth make use of many SaaS applications. Smaller companies (0-50 employees) use an average of 102 SaaS applications, while mid-sized ones (101-250 employees) use about 137. Enterprises, meanwhile, can have as many as 288 SaaS applications in use across the entire organization!

    These SaaS products range from CRMs to marketing automation apps to billing software to help desk tools, etc. These applications can log thousands to millions of rows of data made available via API, much of which can hold valuable insight for businesses. Add to this the company’s own internally-gathered product data and it’s clear that even a small or medium-sized business can do a lot with the data at its disposal. In this article, we’ll cover what the ETL process is, why it’s important to powering data-driven businesses, and how the rise of modern ETL tools has changed the ETL process forever.

    The ETL process: foundation of data-driven businesses

    Every company wants every team within their business to make smarter, data-driven decisions. Customer support teams look at trends in support tickets or do text analysis on conversations to understand where they can provide better onboarding and documentation. Marketing teams want better visibility into their ad performance across different platforms and the ROI on their spend. Product and engineering teams dig into productivity metrics or bug reports to help them better prioritize their resources. 

    The ETL process empowers all these different teams to get the information they need to understand and perform their jobs better. Businesses ingest data from a wide array of sources through the ETL process, which stands for Extract, Transform, Load. The prepared data is then available for analysis and use by the various teams who need it, as well as for advanced analytics, embedding into applications, and use for other data monetization efforts. Whatever you want to do with data, you have to ETL it first. 

    Explaining the traditional ETL process

    Traditionally, the extraction step of the process involves batch-pulling the data from the source via files, application database, or an API. In the transformation step, the data is cleaned and modeled so that it’s fit for analysis and ingestible by the data warehouse. Finally, the cleaned tables are loaded into the data warehouse for analysis. 

    This traditional ETL process is very difficult to scale. It often requires full-time data engineers to develop and maintain the scripts that keep the data flowing. This is because the data providers often make changes to their schemas or APIs, which then break the scripts that power the ETL process. Every time there’s a change, the data engineers scramble to update their scripts to accommodate them, resulting in downtime. With businesses now needing to ingest data from so many disparate (often fast-moving) data sources, maintaining ETL scripts for each one is not scalable. 
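
    To ground this, here is a minimal sketch of such a hand-rolled batch job. The source fetch and field names are hypothetical, and the comment marks the kind of provider-side change that breaks these scripts.

    ```python
    # Sketch: a hand-maintained batch ETL job of the kind teams write per
    # data source. fetch_orders stands in for a file, database, or API pull.
    def fetch_orders():
        return [{"id": 1, "total": "19.99"}, {"id": 2, "total": "5.50"}]

    def run_batch():
        rows = []
        for item in fetch_orders():  # extract
            rows.append({
                "order_id": item["id"],
                # Transform: clean and model for the warehouse schema. If the
                # provider renames "total" in a schema change, this raises
                # KeyError and the pipeline is down until the script is fixed.
                "total_usd": float(item["total"]),
            })
        print(f"loading {len(rows)} rows")  # load (stand-in for a bulk insert)

    run_batch()
    ```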

    Better living through ETL tools: the modern ETL process

    The modern ETL process follows a slightly different order of operations, dubbed ELT. This new process arose as a result of the introduction of tools to update the ETL process, as well as the rise of modern data warehouses with relatively low storage costs.

    Today, ETL tools do the heavy lifting for you. They have integrations for many of the major SaaS applications, and have teams of engineers who maintain those integrations, taking the pressure off of your in-house data team. These ETL tools are built to connect to most major data warehouses, allowing businesses to plug in their apps on one end and their warehouse on the other, while the ETL tools do the rest.

    Users can usually control orchestration via simple drop-down selections within the apps, alleviating the need to stand up your own servers or EC2 boxes or to build DAGs to run on platforms like Airflow. ETL tools also typically offer more robust options for appending new data incrementally, or for updating only new and modified rows, which allows for more frequent loads and closer-to-real-time data for the business. With this simplified process for making data available for analysis, data teams can focus on finding new applications for data to generate value for the business.
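
    One common pattern behind such incremental loads is a watermark on an updated-at column: each run picks up only rows changed since the last sync. The sketch below illustrates the idea with invented rows and in-memory state.

    ```python
    # Sketch: incremental loading with an "updated_at" watermark, so only
    # new and modified rows are appended on each run.
    from datetime import datetime, timezone

    state = {"last_synced": datetime(2024, 1, 1, tzinfo=timezone.utc)}

    source_rows = [
        {"id": 1, "updated_at": datetime(2023, 12, 30, tzinfo=timezone.utc)},
        {"id": 2, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
    ]

    def incremental_sync():
        new_rows = [r for r in source_rows if r["updated_at"] > state["last_synced"]]
        if new_rows:
            print(f"appending {len(new_rows)} changed rows")  # load step
            state["last_synced"] = max(r["updated_at"] for r in new_rows)

    incremental_sync()  # picks up only row 2; frequent runs keep data fresh
    ```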

    The ETL process and data warehouses

    Data warehouses are the present and future of data and analytics. Storage costs on data warehouses have drastically reduced in recent years, which allows businesses to load as many raw data sources as possible without the same concerns they might have had before.

    Today, data teams can ingest raw data before transforming it, allowing them to do the transformations in the warehouse instead of in a separate staging area. With the increased availability of data, and with SQL as a common language for accessing it, the business gains more flexibility in leveraging its data to make the right decisions.
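
    As a small illustration of transforming inside the warehouse, the sketch below lands raw rows first and then models them with SQL. sqlite3 stands in for a cloud warehouse here, and the table and column names are hypothetical.

    ```python
    # Sketch: the "T" in ELT, run as SQL inside the warehouse rather than
    # in a separate staging area.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 1999), (2, 550)])

    # Model raw data into an analytics-ready table with plain SQL.
    conn.execute("""
        CREATE TABLE orders_clean AS
        SELECT id, amount_cents / 100.0 AS amount_usd
        FROM raw_orders
    """)
    print(conn.execute("SELECT * FROM orders_clean").fetchall())
    ```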

    The modern ETL process: delivering better results, faster

    Under the traditional ETL process, as data and processing requirements grew, the chance that on-premise data warehouses would fail grew as well. When this happened, IT would have to swoop in to fix the issue, which usually meant adding more hardware.

    The modern ETL process in today’s data warehouses sidesteps this issue by offloading the compute resource management to the cloud data warehouse. Many cloud data warehouses offer compute scaling that allows for dynamic scaling when needs spike. This lets data teams maintain scalable performance even as they hold growing numbers of computationally expensive data models and ingest more large data sources. The decreased cost of compute power, along with compute scaling in cloud data warehouses, allows data teams to efficiently scale resources up or down to suit their needs and better ensure no downtime. The bottom line is that, instead of having your in-house data and/or IT team fretting over your data storage and computing issues, you can offload that pretty much completely to the data warehouse provider.

    Data teams can then build tests on top of their cloud data warehouse to monitor their data sources for quality, freshness, and so on, giving them quicker, more proactive visibility into any problems with their data pipelines.
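
    A freshness test can be as simple as comparing the latest load timestamp against a threshold, as in the hypothetical sketch below; real teams often run such checks through a testing framework on top of the warehouse.

    ```python
    # Sketch: a freshness check over a warehouse table. The 24-hour
    # threshold and the timestamp source are illustrative.
    from datetime import datetime, timedelta, timezone

    def check_freshness(latest_loaded_at, max_lag=timedelta(hours=24)):
        lag = datetime.now(timezone.utc) - latest_loaded_at
        if lag > max_lag:
            raise RuntimeError(f"table is stale: last load was {lag} ago")
        return lag

    # latest_loaded_at would come from e.g. SELECT max(loaded_at) FROM orders.
    print(check_freshness(datetime.now(timezone.utc) - timedelta(hours=2)))
    ```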

    From ETL to ELT and beyond

    Data, analytics, and BI have radically evolved since their inception. We are leaps and bounds beyond Excel tables and on-prem-centric data sources. Cloud-native data warehouses, cloud-native architecture in analytics and BI platforms, and embedded analytics powered by these systems have redefined what it means to be truly data-driven in our modern age.

    The ETL process has been updated and can now deliver insights from a wide array of datasets, which helps companies and teams of all kinds make smarter decisions, faster. It also opens the doors for advanced analytics, next-level data monetization, and much more. Whatever you’re building with your data, a modern ELT setup will help you get more from your data, more easily.

    While we have seen a large migration to data warehouses, looking to the future, we might see even another evolution that involves data lakes being more widely used as the initial destination for all raw data sources, offering even further cost benefits. Additionally, new tools are starting to take form that allow for transformation within these data lakes, thus continuing the evolution of tools and processes within the data pipeline following the path of ETL to ELT.

    Author: Adam Luba

    Source: Sisense

  • Talend: A personal data lovestory


    When I was in my mid-twenties, I thought I had it all. I had just recently graduated from a top law school, passed the California Bar Exam, and was working as a junior associate at a prestigious San Francisco law firm. Three short years later, I had turned my back on law and embarked on a career in the technology field, which, after many twists and turns, including stints as an analyst at Gartner and positions at a number of start-ups (some of which were actually somewhat successful) and some of the world’s largest companies (Dell and EMC), has landed me at my current position on Talend’s product marketing team.

    Over the years, I have been asked many times why I left the practice of law. My usual answer has always been what you would expect. Quality of life (i.e. no time left for a personal life), office politics (need to cozy up to the right partners to advance), and an unhealthy dislike for billable hours (who wants to document and charge for every minute of every day?) were some of my go-to responses. But now that I have been working at Talend for more than half a year, I have realized that the true reason went much deeper than that. Let me try to explain.

    Talend provides data integration, quality and management solutions to organizations of all sizes, from smaller companies to some of the world’s largest enterprises. Our number one goal is to make sure that organizations have all the data they need to make the right decisions and take the right actions, whether it is to have more compelling engagements with customers, develop better products, make better strategic decisions or make more efficient and cost-effective operational decisions. And I believe in this goal. When you think about it, this is the exact opposite of what a lawyer does.

    A lawyer’s job (and I am speaking from the perspective of a trial lawyer, which is what I did) is to limit the data, evidence in legal parlance, that is used by the ultimate decision maker (whether it is a jury or a judge) as much as possible to what favors your client’s side. Through a variety of motions before trial and objections during trial (think of terms like hearsay, prejudicial, or irrelevant that you have heard in numerous TV shows or movies), lawyers try to limit the data or evidence that can be considered in making the ultimate decision.

    While this seems to work fine in an adversarial situation, think what it would be like if your business decisions were made the same way. What if a company decided to develop one product over another because the product development team for the chosen product was able to limit what the other team could share with the executive decision makers? Or if a decision to expand into a new territory was made based on incomplete market data from all regions?

    I have always been a data head deep down. In college, my favorite class (and my highest grade) was statistics. Looking back on it, I think I realized at a sub-conscious level that limiting or hiding data was not what I wanted to do for a living. That’s why I find it so appropriate that I ultimately ended up at Talend, a company whose goal is the opposite.

    If you are guilty of being as data driven as I am and want to ensure that you have all the information you need to make the right decisions and take the right actions, consider how your organization can benefit from improved data transparency and data access.

    Author: Jason Tolu

    Source: Talend

  • The human impact of data literacy


    What if I told you only 32% of business executives said that they’re able to create measurable value from data, and just 27% said their data and analytics projects produce actionable insights? Let me put it another way: How excited would you be if I said I made you some chocolate chip cookies, but I only put in 32% of the required sugar and 27% of the required flour?

    I sure hope you wouldn’t eat those cookies. The cookies would be underprepared and not correctly baked with all the necessary ingredients for tasty success. To make an analogy, there are companies creating data and analytics (think: making cookies) without the necessary cultural and organizational ingredients to derive the greatest value from their creations.

    To help others better understand how data literacy – properly and programmatically implemented – can encourage organizations to use these needed ingredients, I recently co-presented a webinar with Martha Bennett, VP and Principal Analyst, from Forrester, and Rishi Muchhala, Manager of Enterprise Intelligence, from Nemours Children’s Health System. The webinar had thousands of attendees, and we received many good questions. I’ve formulated them and provided detailed answers below.

    Question topic 1: What about the data culture of an organization?

    This was a recurring theme in each of the questions that were asked and for good reason. The number one obstacle to data literacy success has nothing to do with data, technology or the software you deploy; it has everything to do with your culture and the people in your organization. Now, how many of you reading this think changing a culture is easy? If so, trust me – it’s not.

    Changing a culture is definitely not easy. It involves changing the DNA of an organization, so that people embrace – not just accept – data. This means data fluency, data literacy, analytical competence and data mentoring must be encouraged and reinforced at multiple touchpoints throughout the organization. Part of the solution is convincing people at all levels that data is empowering.

    Question topic 2: What are key areas to focus on in a data literacy program?

    This question is very large in scope, and you could get lost trying to address all facets of a data literacy program. Below are a few key areas a data literacy program should concentrate on.

    • Leadership – For any data literacy program to succeed, it must have leadership buy-in. The leaders of any organization set the tone and agenda for cultural change, marking how to measure it, conveying its progress and extolling its virtues.
    • Tailored learning – Remember that each individual is at his or her own data literacy stage, and we cannot expect a program to succeed if we try to fit everyone into the same puzzle space. One size does not fit all – people learn at different speeds in different ways, and you should provide for differing learning experiences that nurture data literacy growth across that spectrum.
    • Curiosity, creativity and critical thinking – Work hard to foster the '3 Cs of Data Literacy', which form the foundational pillars of nearly all data literacy programs. People should have a strong desire to know and understand, as well as engage in divergent and novel thinking. This is more likely to occur when the tenets of such thinking are embedded in every part of a data literacy program.

    Mind you: I am not recommending that everyone go back to school, study statistics and so forth. But, I am saying we need a culture that encourages the questioning and challenging of assumptions.

    Question topic 3: Who should lead the data literacy effort in the company?

    This is another great question. I have been approached by people who wonder if a grassroots movement among the employee base is the key to data literacy success. I have been approached by people who wonder if it is the executive team that leads the charge. The short answer is both.

    In order for your data literacy program to succeed, you must have leadership and executive buy-in. By having buy-in from the executive team, you ensure the workforce understands the company is behind the data literacy initiative. Then, create excitement through grassroots work and data literacy evangelists. These two techniques help organizations drive a holistic and inclusive approach to data literacy.

    Conclusion

    The human impact of data literacy cannot be overemphasized. A workforce and society empowered by data leads to smarter, better-informed decision making, which makes us less prone to errors, groupthink and orthodoxy. This means we will be more open to challenging others’ practices that are not supported by evidence and also more accepting of data-based feedback that challenges our own approaches. In short, as a society, increased data literacy can only help us grow, as professionals and people, enriching and deepening our perspectives.

    Author: Jordan Morrow

    Source: Qlik
