25 items tagged "predictive analytics"

  • 10 Big Data Trends for 2017

    Infogix, a leader in helping companies provide end-to-end data analysis across the enterprise, today highlighted the top 10 data trends it foresees as strategic for most organizations in 2017.
     
    “This year’s trends examine the evolving ways enterprises can realize better business value with big data and how improving business intelligence can help transform organization processes and the customer experience (CX),” said Sumit Nijhawan, CEO and President of Infogix. “Business executives are demanding better data management for compliance and increased confidence to steer the business, more rapid adoption of big data and innovative and transformative data analytic technologies.”
     
    The top 10 data trends for 2017 were assembled by a panel of Infogix senior executives. The key trends include:
     
    1.    The Proliferation of Big Data
        Proliferation of big data has made it crucial to analyze data quickly to gain valuable insight.
        Organizations must turn the terabytes of big data that go unused, classified as dark data, into usable data.
        Big data has not yet yielded the substantial results that organizations require to develop new insights for new, innovative offerings and derive a competitive advantage.
     
    2.    The Use of Big Data to Improve CX
        Using big data to improve CX when moving from legacy to vendor systems, during M&A, and with core system upgrades.
        Analyzing data with self-service flexibility to quickly harness insights about leading trends, along with competitive insight into new customer acquisition growth opportunities.
        Using big data to better understand customers in order to improve top line revenue through cross-sell/upsell or remove risk of lost revenue by reducing churn.
     
    3.    Wider Adoption of Hadoop
        More and more organizations will adopt Hadoop and other big data stores; in turn, vendors will rapidly introduce new, innovative Hadoop solutions.
        With Hadoop in place, organizations will be able to crunch large amounts of data using advanced analytics to find nuggets of valuable information for making profitable decisions.
     
    4.    Hello to Predictive Analytics
        Precisely predict future behaviors and events to improve profitability.
        Make a leap in rapidly improving fraud detection to minimize revenue risk exposure and improve operational excellence (for a concrete illustration, see the sketch after this list).
     
    5.    More Focus on Cloud-Based Data Analytics
        Moving data analytics to the cloud accelerates adoption of the latest capabilities to turn data into action.
        Cut costs in ongoing maintenance and operations by moving data analytics to the cloud.
     
    6.    The Move toward Informatics and the Ability to Identify the Value of Data
        Use informatics to help integrate the collection, analysis and visualization of complex data to derive revenue and efficiency value from that data.
        Tap an underused resource – data – to increase business performance.
     
    7.    Achieving Maximum Business Intelligence with Data Virtualization
        Data virtualization unlocks what is hidden within large data sets.
        Graphic data virtualization allows organizations to retrieve and manipulate data on the fly regardless of how the data is formatted or where it is located.
     
    8.    Convergence of IoT, the Cloud, Big Data, and Cybersecurity
        Convergence of data management technologies such as data quality, data preparation, data analytics, data integration and more.
        As we become ever more reliant on smart devices, interconnectivity and machine learning will become even more important for protecting these assets from cybersecurity threats.
     
    9.    Improving Digital Channel Optimization and the Omnichannel Experience
        Balancing traditional and digital channels to connect with customers in their preferred channel.
        Continuously looking for innovative ways to enhance CX across channels to achieve a competitive advantage.
     
    10.    Self-Service Data Preparation and Analytics to Improve Efficiency
        Self-service data preparation tools shorten time to value, enabling organizations to prepare data regardless of type, whether structured, semi-structured or unstructured.
        Decreased reliance on development teams to massage the data, with more self-service capabilities that give power to the user and, in turn, improve operational efficiency.
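
    To make trend 4 concrete, here is a minimal, hypothetical sketch of a fraud-detection model of the kind described above. Everything in it – features, labels, thresholds – is invented for illustration and is not from the Infogix piece.

    ```python
    # Minimal sketch, not Infogix's method: score transactions for fraud
    # risk with a classifier trained on labeled history (all data invented).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    # Assumed features per transaction: amount, hour of day, merchant risk.
    X = np.column_stack([
        rng.gamma(2, 50, 1000),        # amount
        rng.integers(0, 24, 1000),     # hour of day
        rng.random(1000),              # merchant risk score
    ])
    # Toy labels: pretend past investigations flagged large, risky purchases.
    y = ((X[:, 0] > 150) & (X[:, 2] > 0.8)).astype(int)

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    new_tx = [[420.0, 3.0, 0.9]]       # an incoming transaction to score
    print("fraud probability:", clf.predict_proba(new_tx)[0][1])
    ```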
     
    “Every year we see more data being generated than ever before and organizations across all industries struggle with its trustworthiness and quality. We believe the technology trends of cloud, predictive analysis and big data will not only help organizations deal with the vast amount of data, but help enterprises address today’s business challenges,” said Nijhawan. “However, before these trends lead to the next wave of business, it’s critical that organizations understand that the success is predicated upon data integrity.”
     
    Source: dzone.com, November 20, 2016
  • A Shortcut Guide to Machine Learning and AI in The Enterprise


    Predictive analytics / machine learning / artificial intelligence is a hot topic – what’s it about?

    Using algorithms to help make better decisions has been the “next big thing in analytics” for over 25 years. It has been used in key areas such as fraud detection the entire time. But it’s now become a full-throated mainstream business meme that features in every enterprise software keynote — even as the industry battles over what to call it.

    It appears that terms like Data Mining, Predictive Analytics, and Advanced Analytics are considered too geeky or old for industry marketers and headline writers. The term Cognitive Computing seemed to be poised to win, but IBM’s strong association with the term may have backfired — journalists and analysts want to use language that is independent of any particular company. Currently, the growing consensus seems to be to use Machine Learning when talking about the technology and Artificial Intelligence when talking about the business uses.

    Whatever we call it, it’s generally proposed in two different forms: either as an extension to existing platforms for data analysts, or as new embedded functionality in diverse business applications such as sales lead scoring, marketing optimization, sorting HR resumes, or financial invoice matching.

    Why is it taking off now, and what’s changing?

    Artificial intelligence is now taking off because there’s a lot more data available, along with affordable, powerful systems to crunch through it all. It’s also much easier to get access to powerful algorithm-based software, in the form of open-source products or embedded as a service in enterprise platforms.

    Organizations today have also become more comfortable with manipulating business data, with a new generation of business analysts aspiring to become “citizen data scientists.” Enterprises can take their traditional analytics to the next level using these new tools.

    However, we’re now at the “Peak of Inflated Expectations” for these technologies according to Gartner’s Hype Cycle — we will soon see articles pushing back on the more exaggerated claims. Over the next few years, we will find out the limitations of these technologies even as they start bringing real-world benefits.

    What are the longer-term implications?

    First, easier-to-use predictive analytics engines are blurring the gap between “everyday analytics” and the data science team. A “factory” approach to creating, deploying, and maintaining predictive models means data scientists can have greater impact. And sophisticated business users can now access some of the power of these algorithms without having to become data scientists themselves.
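
    A minimal sketch of what such a “factory” loop might look like, assuming scikit-learn and invented data – the segment names, features and labels below are placeholders, and the random labels mean the holdout scores hover near chance:

    ```python
    # Sketch of a model "factory": one loop trains, evaluates and registers
    # a model per business segment instead of hand-building each one.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    # Hypothetical labeled data for three business segments.
    segments = {name: (rng.normal(0, 1, (300, 4)), rng.integers(0, 2, 300))
                for name in ["retail", "telecom", "banking"]}

    registry = {}  # stand-in for a real model store
    for name, (X, y) in segments.items():
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = LogisticRegression().fit(X_tr, y_tr)
        registry[name] = {"model": model, "holdout_acc": model.score(X_te, y_te)}

    for name, entry in registry.items():
        print(name, round(entry["holdout_acc"], 2))
    ```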

    Second, every business application will include some predictive functionality, automating any areas where there are “repeatable decisions.” It is hard to think of a business process that could not be improved in this way, with big implications in terms of both efficiency and white-collar employment.

    Third, applications will use these algorithms on themselves to create “self-improving” platforms that get easier to use and more powerful over time (akin to how each new semi-autonomous-driving Tesla car can learn something new and pass it on to the rest of the fleet).

    Fourth, over time, business processes, applications, and workflows may have to be rethought. If algorithms are available as a core part of business platforms, we can provide people with new paths through typical business questions such as “What’s happening now? What do I need to know? What do you recommend? What should I always do? What can I expect to happen? What can I avoid? What do I need to do right now?”

    Fifth, implementing all the above will involve deep and worrying moral questions in terms of data privacy and allowing algorithms to make decisions that affect people and society. There will undoubtedly be many scandals and missteps before the right rules and practices are in place.

    What first steps should companies be taking in this area?
    As usual, the barriers to business benefit are more likely to be cultural than technical.

    Above all, organizations need to make sure they have the right technical expertise to navigate the confusion of new vendor offers, the right business knowledge to know where best to apply them, and the awareness that their technology choices may have unforeseen moral implications.

    Source: timoelliot.com, October 24, 2016

     

  • Big Data Predictions for 2016

    A roundup of big data and analytics predictions and pontifications from several industry prognosticators.

    At the end of each year, PR folks from different companies in the analytics industry send me predictions from their executives on what the next year holds. This year, I received a total of 60 predictions from a record 17 companies. I can't laundry-list them all, but I can and did put them in a spreadsheet (irony acknowledged) to determine the broad categories many of them fall into. And the biggest of those categories provide a nice structure for discussing many of the predictions in the batch.

    Predictions streaming in
    MapR CEO John Schroeder, whose company just added its own MapR Streams component to its Hadoop distribution, says "Converged Approaches [will] Become Mainstream" in 2016. By "converged," Schroeder is alluding to the simultaneous use of operational and analytical technologies. He explains that "this convergence speeds the 'data to action' cycle for organizations and removes the time lag between analytics and business impact."

    The so-called "Lambda Architecture" focuses on this same combination of transactional and analytical processing, though MapR would likely point out that a "converged" architecture co-locates the technologies and avoids Lambda's approach of tying the separate technologies together.
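
    For readers who have not met the pattern, here is a toy sketch of the Lambda idea in Python. The metric and numbers are invented, and a real deployment would use separate batch and stream processors rather than two dictionaries:

    ```python
    # Toy illustration of the Lambda pattern: a complete-but-stale batch
    # view merged with a fresh-but-partial speed-layer view at query time.
    batch_view = {"clicks": 10_000}   # recomputed periodically over history
    speed_view = {"clicks": 42}       # increments since the last batch run

    def query(metric: str) -> int:
        # Serving layer: combine both views so results are complete and fresh.
        return batch_view.get(metric, 0) + speed_view.get(metric, 0)

    print(query("clicks"))  # -> 10042
    ```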

    Whether integrated or converged, Phu Hoang, the CEO of DataTorrent, predicts 2016 will bring an ROI focus to streaming technologies, which he summarizes as "greater enterprise adoption of streaming analytics with quantified results." Hoang explains that "while lots of companies have already accepted that real-time streaming is valuable, we'll see users looking to take it one step further to quantify their streaming use cases."

    Which industries will take charge here? Hoang says "FinTech, AdTech and Telco lead the way in streaming analytics." That makes sense, but I think heavy industry is, and will be, in a leadership position here as well.

    In fact, some in the industry believe that just about everyone will formulate a streaming data strategy next year. One of those is Anand Venugopal of Impetus Technologies, whom I spoke with earlier this month. Venugopal, in fact, feels that we are within two years of streaming data being looked upon as just another data source.

    Internet of predicted things
    It probably won't shock you that the Internet of Things (IoT) was a big theme in this year's round of predictions. Quentin Gallivan, Pentaho's CEO, frames the thoughts nicely with this observation: "Internet of Things is getting real!" Adam Wray, CEO at Basho, quips that "organizations will be seeking database solutions that are optimized for the different types of IoT data." That might sound a bit self-serving, but Wray justifies this by reasoning that this will be driven by the need to "make managing the mix of data types less operationally complex." That sounds fair to me.

    Snehal Antani, CTO at Splunk, predicts that "Industrial IoT will fundamentally disrupt the asset intelligence industry." Suresh Vasudevan, the CEO of Nimble Storage, proclaims "in 2016 the IoT invades the datacenter." That may be, but IoT technologies are far from standardized, and that's a barrier to entry for the datacenter. Maybe that's why the folks at DataArt say "the IoT industry will [see] a year of competition, as platforms strive for supremacy." Maybe the data center invasion will come in 2017, then.

    Otto Berkes, CTO at CA Technologies, asserts that "Bitcoin-born Blockchain shows it can be the storage of choice for sensors and IoT." I hardly fancy myself an expert on blockchain technology, so I asked CA for a little more explanation around this one. A gracious reply came back, explaining that "IoT devices using this approach can transact directly and securely with each other...such a peer-to-peer configuration can eliminate potential bottlenecks and vulnerabilities." That helped a bit, and it incidentally shines a light on just how early-stage IoT technology still is, with respect to security and distributed processing efficiencies.

    Growing up
    Though admittedly broad, the category with the most predictions centered on the theme of value and maturity in Big Data products supplanting the fascination with new features and products. Essentially, value and maturity are proxies for the enterprise-readiness of Big Data platforms.

    Pentaho's Gallivan says that "the cool stuff is getting ready for prime time." MapR's Schroeder predicts "Shiny Object Syndrome Gives Way to Increased Focus on Fundamental Value," and qualifies that by saying "...companies will increasingly recognize the attraction of software that results in business impact, rather than focusing on raw big data technologies." In a related item, Schroeder predicts "Markets Experience a Flight to Quality," further stating that "...investors and organizations will turn away from volatile companies that have frequently pivoted in their business models."

    Sean Ma, Trifacta's Director of Product Management, looking at the manageability and tooling side of maturity, predicts that "Increasing the amount of deployments will force vendors to focus their efforts on building and marketing management tools." He adds: "Much of the capabilities in these tools...will need to replicate functionality in analogous tools from the enterprise data warehouse space, specifically in the metadata management and workflow orchestration." That's a pretty bold prediction, and Ma's confidence in it may indicate that Trifacta has something planned in this space. But even if not, he's absolutely right that this functionality is needed in the Big Data world. In terms of manageability, Big Data tooling needs to achieve not just parity with data warehousing and BI tools, but needs to surpass that level.

    The folks at Signals say "Technology is Rising to the Occasion" and explain that "advances in artificial intelligence and an understanding [of] how people work with data is easing the collaboration between humans and machines necessary to find meaning in big data." I'm not sure if that is a prediction, or just wishful thinking, but it certainly is the way things ought to be. With all the advances we've made in analyzing data using machine learning and intelligence, we've left sifting through the output a largely manual process.

    Finally, Mike Maciag, the COO at AltiScale, asserts this forward-looking headline: "Industry standards for Hadoop solidify." Maciag backs up his assertion by pointing to the Open Data Platform initiative (ODPi) and its work to standardize Hadoop distributions across vendors. ODPi was originally anchored by Hortonworks, with numerous other companies, including AltiScale, IBM and Pivotal, jumping on board. The organization is now managed under the auspices of the Linux Foundation.

    Artificial flavor
    Artificial Intelligence (AI) and Machine Learning (ML) figured prominently in this year's predictions as well. Splunk's Antani reasons that "Machine learning will drastically reduce the time spent analyzing and escalating events among organizations." But Lukas Biewald, Founder and CEO of Crowdflower insists that "machines will automate parts of jobs -- not entire jobs." These two predictions are not actually contradictory. I offer both of them, though, to point out that AI can be a tool without being a threat.

    Be that as it may, Biewald also asserts that "AI will significantly change the business models of companies today." He expands on this by saying "legacy companies that aren't very profitable and possess large data sets may become more valuable and attractive acquisition targets than ever." In other words, if companies found gold in their patent portfolios previously, they may find more in their data sets, as other companies acquire them to further their efforts in AI, ML and predictive modeling.

    And more
    These four categories were the biggest among all the predictions, but not the only ones, to be sure. Predictions around cloud, self-service, flash storage and the increasing prominence of the Chief Data Officer were in the mix as well. A number of predictions stood on their own too, speaking to issues ranging from salaries for Hadoop admins to open source, open data and container technology.

    What's clear from almost all the predictions, though, is that the market is starting to take basic big data technology as a given, and is looking towards next-generation integration, functionality, intelligence, manageability and stability. This implies that customers will demand certain baseline data and analytics functionality as part of most technology solutions going forward. And that's a great sign for everyone involved in Big Data.

    Source: ZDNet

     

  • Business Intelligence in 3PL: Mining the Value of Data

    In today’s business world, “information” is a renewable resource and virtually a product in itself. Business intelligence technology enables businesses to capture historical, current and predictive views of their operations, incorporating such functions as reporting, real-time analytics, data and process mining, performance management, predictive analytics, and more. Thus, information in its various forms and locations possesses genuine inherent value.
     
    In the real world of warehousing, the availability of detailed, up-to-the-minute information on virtually every item in the operators’ custody, from inbound dock to delivery site, leads to greater efficiency in every area it touches. Logic suggests that greater profitability ensues.
     
    Three areas of 3PL operations seem to benefit most from savings opportunities identified through business intelligence solutions: labor, inventory, and analytics.
    In the first case, business intelligence tools can help determine the best use of the workforce, monitoring its activity to assure maximum effective deployment. The result: potentially major jumps in efficiency, dramatic reductions in downtime, and healthy increases in productivity and billable labor.
     
    In terms of inventory management, the metrics obtainable through business intelligence can stem inventory inaccuracies that would have resulted in thousands of dollars in annual losses, while also reducing write-offs.
     
    Analytics through business intelligence tools can also accelerate the availability of information, as well as provide the optimal means of presentation relative to the type of user. One such example is the tracking of real-time status of work load by room or warehouse areas; supervisors can leverage real-time data to re-assign resources to where they are needed in order to balance workloads and meet shipping times. A well-conceived business intelligence tool can locate and report on a single item within seconds and a couple of clicks.
     
    Extending the Value
    The value of business intelligence tools is definitely not confined to the product storage areas.
     
    With automatically analyzed information available in a dashboard presentation, users – whether in the office or on the warehouse floor – can view the results of their queries/searches in a variety of selectable formats, choosing the presentation based on its usefulness for a given purpose. Examples:
    • Status checks can help identify operational choke points, such as if/when/where an order has been held up too long; if carrier wait-times are too long; and/or if certain employees have been inactive for too long.
    • Order fulfillment dashboards can monitor orders as they progress through the picking, staging and loading processes, while also identifying problem areas in case of stalled processes.
    • Supervisors walking the floor with handheld devices can both encourage team performance and, at the same time, help assure efficient dock-side activity. Office and operations management are able to monitor key metrics in real-time, as well as track budget projections against actual performance data.
    • Customer service personnel can call up business intelligence information to assure that service levels are being maintained or, if not, institute measures to restore them.
    • And beyond the warehouse walls, sales representatives in the field can access mined and interpreted data via mobile devices in order to provide their customers with detailed information on such matters as order fill rates, on-time shipments, sales and order volumes, inventory turnover, and more.
    Thus, well-designed business intelligence tools not only can assemble and process both structured and unstructured information from sources across the logistics enterprise, but can deliver it “intelligently” – that is, optimized for the person(s) consuming it. These might include frontline operators (warehouse and clerical personnel), front line management (supervisors and managers), and executives.
     
    The Power of Necessity
    Chris Brennan, Director of Innovation at Halls Warehouse Corp., South Plainfield N.J., deals with all of these issues as he helps manage the information environment for the company’s eight facilities. Moreover, as president of the HighJump 3PL User Group, he strives to foster collective industry efforts to cope with the trends and issues of the information age as it applies to warehousing and distribution.
     
    “Even as little as 25 years ago, business intelligence was a completely different art,” Brennan has noted. “The tools of the trade were essentially networks of relationships through which members kept each other apprised of trends and happenings. Still today, the power of mutual benefit drives information flow, but now the enormous volume of data available to provide intelligence and drive decision making forces the question: Where do I begin?”
     
    Brennan has taken a leading role in answering his own question, drawing on the experience and insights of peers as well as the support of HighJump’s Enterprise 3PL division to bring Big Data down to size:
     
    “Business intelligence isn’t just about gathering the data,” he noted, “it’s about getting a group of people with varying levels of background and comfort to understand the data and act upon it. Some managers can glance at a dashboard and glean everything they need to know, but others may recoil at a large amount of data. An ideal BI solution has to relay information to a diverse group of people and present challenges for them to think through.”
     
    Source: logisticviewpoints.com, December 6, 2016
  • Data as an ingredient on the road to digital maturity

    On 21 January, Stéphane Hamel visited the High Tech Campus in Eindhoven: a prime opportunity for a solid dose of inspiration from one of the world’s most prominent thinkers in digital analytics. At digital maturity day 2016 (#DMD2016), Hamel walked through the Digital Analytics Maturity model.

    Imperfect data

    According to Stéphane Hamel, the difference between a good analyst and an excellent one is this: the excellent analyst manages to reach decisions or useful advice even from imperfect data. “Data will never be perfect; knowing how bad the data is is essential. If you know 5 or 10% is bad, there is no problem,” says Hamel.

    Analytics = Context + Data + Creativity

    Analytics sounds like a field for data geeks and nerds. That image is wrong: beyond the data itself, recognizing the context in which the data was collected, and creativity in interpreting it, are essential. To understand data you have to come out from behind your laptop or PC. Only by bringing the world “out there” into your analysis can you, as a data analyst, arrive at meaningful insights and recommendations.

    Hamel gives an example from the lecture hall: when a group of students was shown the 2010 dataset from Save the Children, some assumed the tenfold increase in website traffic was due to a campaign, or to chance. The real cause was the earthquake in Haiti.

    Digital Maturity Assessment

    The Digital Maturity Assessment model was developed from the digital transformations of hundreds of companies worldwide. Based on that experience, Stéphane knows which challenges companies must overcome on the road to digital leadership.

    [Figure: Digital Analytics Maturity model, Stéphane Hamel]

    You can of course use this model to benchmark your own organization against other companies. According to Hamel, however, the real value lies in “benchmarking yourself against yourself.” In short, it helps you start the conversation internally. If you are switching tools for the third time, you are the problem, not the technology.

    Hamel prefers a consistent score across the five criteria of this Digital Maturity Assessment model: better a two overall than outliers up or down. The factor that usually scores weakest is “process.”

    This criterion covers the way data collection, analysis and interpretation are carried out. Often the process itself is not badly designed at all; rather, data analysts struggle to explain to colleagues or the management team which steps they took. Hamel therefore stresses: “you need a digital culture, not a digital strategy.”

    Embrace the folks in IT

    Give IT the chance to really help you. Not by saying “run this” or “fix that,” but by asking IT to solve a problem together with you. Hamel therefore sees digital analysts above all as change agents, not as dusty data professionals. It is precisely that shift in approach and role that means we will soon no longer speak of digital analytics, but simply of “analytics.”

    Data is the raw material of my craft

    Hamel’s favorite motto, “data is the raw material of my craft,” refers to the craftsmanship and passion Stéphane Hamel brings to the field of digital analytics. His hunger to make a difference in digital analytics was first sparked during a board meeting. Hamel sat in that meeting as the “IT guy” and was not taken seriously when he tried to use data to pinpoint business problems and opportunities.

    That spurred Hamel, with his boss’s support, to pursue an MBA. With results: he finished in the top 5 percent of all students. Since then he has operated at the intersection of data and business processes, including in the stock exchange world and the insurance industry.

    Digital is the great absentee in education

    Hamel’s highly impressive career includes recognition as one of the world’s few Certified Web Analysts, a “Most Influential Industry Contributor” award from the Digital Analytics Association, and co-management of the largest Google+ community on Google Analytics. Yet Hamel considers his greatest achievement to be shedding the “IT guy” label.

    His biggest ambition for the near future is to write a textbook on digital analytics. A great deal of information is available digitally, but much content is still missing in offline form. Precisely because other speakers at #DMD16 also pointed to the lagging level of digital skills in Dutch higher education, I asked Hamel what tips he has for Dutch educators.

    Fundamentally, Hamel argues, the “digital” component should run much more as a common thread through the curriculum. Students should be encouraged to enrich the content with their own examples. In that way, ever better content emerges through co-creation between teachers, authors and students.

    The promise of big data and marketing automation

    Hamel certainly sees the added value of marketing automation in B2B, where the relationship with customers and prospects is more personal. Marketing automation is sometimes deployed wrongly, however, with email used to create the impression of a personal, human dialogue. Hamel: “I still believe in genuine, human interaction. There is a limit to how you can leverage marketing automation.”


    The biggest obstacle to a successful introduction of marketing automation is therefore the maturity of the organization. As long as that maturity is insufficient, a software package will above all remain a cost item. A culture change has to take place so that the organization sees the software as a necessary precondition for executing its strategy.

    Hamel is equally level-headed about the promise of big data. All too often he hears inside companies: “We need Big Data!” His answer: “No, you don’t need big data, you need solutions. As long as it does the job, I’m happy.”

    Source: Marketingfacts

  • The 5 promises of big data

    Big data is a phenomenon that resists easy definition. Many will have heard of the 3 V’s: volume, velocity and variety. In short, big data is about large volumes, high speed (real time) and varied/unstructured data. Depending on the organization, however, big data has many faces.

    To analyze how big data can best be integrated into a company, it is important first to have a clear picture of what big data actually offers. This is best summarized in the following five promises:


    1. Predictive: Big data generates predictive results that say something about the future of your organization or the outcome of a concrete action;
    2. Actionable results: Big data makes it possible to act directly on findings, without human intervention;
    3. Real time: The new speed standards let you respond immediately to new situations;
    4. Adaptive: A well-designed model continuously and automatically adapts as situations and relationships change;
    5. Scalable: Processing and storage capacity scale linearly, so you can respond flexibly to new demands.

    These five big data promises can only be realized by deploying three big data disciplines/roles: the big data scientist, the big data engineer and the big data infrastructure specialist.

    Predictive

    In a classic Business Intelligence environment, reports are generated about the current state of the business. With big data, however, the conversation is not about the past or the present but about predictive analytics.

    Predictive reporting becomes possible when the data scientist applies pattern-recognition techniques to historical data and works the patterns found into a model. The model can then load the history and, based on current events/transactions, extrapolate the patterns into the future. In this way a manager can shift from reactive management to anticipatory management.
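
    As a hedged illustration of that shift from reactive to anticipatory management, the sketch below fits a deliberately trivial trend model on invented historical order volumes and extrapolates it forward; a real data scientist would apply far richer pattern-recognition techniques:

    ```python
    # Minimal forecasting sketch on invented data: learn a pattern from
    # history, then extrapolate it to upcoming periods.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    weeks = np.arange(104).reshape(-1, 1)          # two years of history
    orders = 500 + 3.2 * weeks.ravel() + rng.normal(0, 25, 104)

    model = LinearRegression().fit(weeks, orders)  # learn the trend
    future = np.arange(104, 117).reshape(-1, 1)    # the coming quarter
    print(model.predict(future).round())           # anticipate, don't react
    ```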

    Actionable results

    Actionable results arise when findings from the data scientist’s models are translated directly into decisions in business processes. The data engineer builds the integration, and the data scientist ensures the model delivers its output in the right format. The promise of actionable results is thus partly fulfilled by the big data specialists, but the largest part depends on the attitude of the management team.

    Management’s task is to adopt a new way of steering. It no longer steers the micro-processes themselves, but the models that automate those processes. For example, the decision is no longer when each machine should be maintained, but which risk margins the deciding model may apply to optimize maintenance costs.

    Real time

    Big data is often associated with huge volumes, terabytes of data to be processed. But the “big” in big data depends entirely on the speed dimension. Processing 10 TB of data in an hour is big data, but processing 500 MB is also big data if the requirement is that it happen within two hundred milliseconds. Real-time processing lives in that latter high-speed domain. There is no golden rule, but one typically speaks of real time when the response time is within five hundred milliseconds. Achieving such speeds requires a combination of all three big data disciplines.

    The big data infrastructure specialist’s task is to optimize the storage and reading of data. Speed is gained by structuring the data exactly the way the model reads it. In doing so, all flexibility in the data is given up in order to consume it as fast as possible from a single perspective.

    The big data engineer achieves this by optimizing the speed of the links between data sources and consumers, offering those links in a distributed format. A theoretically unlimited number of resources can then be switched on to distribute the data, and every doubling of resources doubles capacity. It is also up to the big data engineer to convert the models the data scientist develops into a format that isolates all of the model’s sub-analyses and distributes them as widely as possible across the available resources. Data scientists often work in programming languages such as R and MATLAB, which are ideal for exploring the data and candidate models. These languages do not lend themselves well to distributed processing, however, so the big data engineer must often work with the data scientist to translate the prototype model into a production-grade programming language such as Java or Scala.
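
    The isolate-and-distribute idea can be sketched in a few lines. This toy uses Python’s standard library rather than the Java/Scala production stack mentioned above, and the workload is a placeholder:

    ```python
    # Sketch: isolate a model's sub-analyses and fan them out over workers,
    # so adding workers adds capacity roughly linearly.
    from concurrent.futures import ProcessPoolExecutor

    def sub_analysis(partition):
        # Placeholder for one isolated piece of the model's work,
        # e.g. scoring a single partition of transactions.
        return sum(x * x for x in partition)

    if __name__ == "__main__":
        partitions = [range(i * 10_000, (i + 1) * 10_000) for i in range(8)]
        with ProcessPoolExecutor(max_workers=4) as pool:
            results = list(pool.map(sub_analysis, partitions))
        print(sum(results))
    ```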

    The data scientist, as discussed, provides the models and thus the logic of the data processing. To operate in real time, it is this person’s task to keep the complexity of the data processing below the exponential level. An optimal result therefore requires the three disciplines to work together.

    Adaptive

    We can speak of an adaptive environment - also called machine learning or artificial intelligence - when the intelligence of that environment autonomously adapts to new developments in the domain being modeled. To make this possible, the model must have accumulated enough experience to learn on its own. The more information available about the domain throughout its history, the broader the experience we can build on.

    Scalable

    Scalability is achieved when processing capacity is theoretically unlimited as more computers are added: if you need four times the capacity, you add four times as many computers; if you need a thousand times more, you add a thousand computers. This sounds simple, but until recently such cooperation between computers was a very complex undertaking.

    Each discipline plays a role in making and keeping big data solutions scalable. The big data infrastructure specialist takes care of the scalability of reading, writing and storing data, the big data engineer of consuming and producing data, and the big data scientist of the intelligent processing of the data.

    Big data, big deal?

    To exploit the full possibilities of big data, it is therefore essential to bring in a multidisciplinary team. That may sound as if very large investments are required up front, but big data also offers the opportunity to start small. A data scientist can run the various analyses on a laptop or a local server, creating a number of short-term wins for your organization with minimal investment. Once the added value of big data is clear, it is a relatively small step to put a big data environment into production and steer your own organization in a data-driven way.

    Source: Computable

  • Five Mistakes That Can Kill Analytics Projects

    Launching an effective digital analytics strategy is a must-do to understand your customers. But many organizations are still trying to figure out how to get business value from expensive analytics programs. Here are 5 common analytics mistakes that can kill any predictive analytics effort.

    Why predictive analytics projects fail


    Predictive Analytics is becoming the next big buzzword in the industry. But according to Mike Le, co-founder and chief operating officer at CB/I Digital in New York, implementing an effective digital analytics strategy has proven to be very challenging for many organizations. “First, the knowledge and expertise required to set up and analyze digital analytics programs is complicated,” Le notes. “Second, the investment for the tools and such required expertise could be high. Third, many clients see unclear returns from such analytics programs. Learning to avoid common analytics mistakes will help you save a lot of resources and focus on the core metrics and factors that can drive your business ahead.” Here are 5 common mistakes that Le says cause many predictive analytics projects to fail.

    Mistake 1: Starting digital analytics without a goal

    “The first challenge of digital analytics is knowing what metrics to track, and what value to get out of them,” Le says. “As a result, we see too many web businesses that don’t have basic conversion tracking set up, or can’t link the business results with the factors that drive those results. This problem happens because these companies don’t set a specific goal for their analytics. When you do not know what to ask, you cannot know what you'll get. The purpose of analytics is to understand and to optimize. Every analytics program should answer specific business questions and concerns. If your goal is to maximize online sales, naturally you’ll want to track the order volume, cost-per-order, conversion rate and average order value. If you want to optimize your digital product, you’ll want to track how users interact with your product, the usage frequency and the churn rate of people leaving the site. When you know your goal, the path becomes clear.”
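
    For instance, here is a minimal sketch of the core metrics Le names for an online-sales goal – all numbers below are hypothetical:

    ```python
    # Toy computation of the core online-sales metrics from raw inputs.
    sessions = 12_400                        # visits in the period (assumed)
    order_values = [52.0, 18.5, 99.9, 43.0]  # sample order values (assumed)
    ad_spend = 600.0                         # total acquisition cost (assumed)

    order_volume = len(order_values)
    conversion_rate = order_volume / sessions
    cost_per_order = ad_spend / order_volume
    average_order_value = sum(order_values) / order_volume

    print(f"orders={order_volume}  conversion={conversion_rate:.2%}  "
          f"CPO=${cost_per_order:.2f}  AOV=${average_order_value:.2f}")
    ```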

    Mistake 2: Ignoring core metrics to chase noise

    “When you have advanced analytics tools and strong computational power, it’s tempting to capture every data point possible to ‘get a better understanding’ and ‘make the most of the tool,’” Le explains. “However, following too many metrics may dilute your focus on the core metrics that reveal the pressing needs of the business. I've seen digital campaigns that fail to convert new users, but the managers still set up advanced tracking programs to understand user behaviors in order to serve them better. When you cannot acquire new users, your targeting could be wrong, your messaging could be wrong or there may even be no market for your product - those problems are much bigger to solve than trying to understand your user engagement. Therefore, it would be a waste of time and resources to chase fancy data and insights while the fundamental metrics are overlooked. Make sure you always stay focused on the most important business metrics before looking broader.”

    Mistake 3: Choosing overkill analytics tools

    “When selecting analytics tools, many clients tend to believe that more advanced and expensive tools can give deeper insights and solve their problems better,” Le says. “Advanced analytics tools may offer more sophisticated analytic capabilities than some fundamental tracking tools. But whether your business needs all those capabilities is a different story. That's why the decision to select an analytics tool should be based on your analytics goals and business needs, not on how advanced the tools are. There’s no need to invest a lot of money in big analytics tools and a team of experts for an analytics program when some advanced features of free tools like Google Analytics can already give you the answers you need.”

    Mistake 4: Creating beautiful reports with little business value

    “Many times you see reports that simply present a bunch of numbers exported from tools, or state some ‘insights’ that have little relevance to the business goal,” Le notes. “This problem is so common in the analytics world because a lot of people create reports for the sake of reporting. They don’t think about why those reports should exist, what questions they answer and how those reports can add value to the business. Any report must be created to answer a business concern. Any metrics that do not help answer business questions should be left out. Making sense of data is hard. Asking the right questions early will help.”

    Mistake 5: Failing to detect tracking errors

    “Tracking errors can be devastating to businesses, because they produce unreliable data and misleading analysis,” Le cautions. “But many companies do not have the skills to set up tracking properly, and worse, to detect tracking issues when they happen. There are many things that can go wrong, such as a developer mistakenly removing the tracking pixels, transferring incorrect values, the tracking code firing unstably or multiple times, wrong tracking-rule logic, etc. The difference could be so subtle that the reports look normal, or are only wrong in certain scenarios. Tracking errors easily go undetected because detecting them takes a mix of marketing and tech skills. Marketing teams usually don’t understand how tracking works, and development teams often don’t know what ‘correct’ means. To tackle this problem, you should frequently check your data accuracy and look for unusual signs in reports. Analysts should take an extra step to learn the technical aspects of tracking, so they can better sense problems and raise smart questions for the technical team when the data looks suspicious.”
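
    One practical way to “look for unusual signs in reports” is an automated sanity check. The sketch below is an assumption, not Le’s method: it flags days whose tracked event counts deviate sharply from a trailing baseline.

    ```python
    # Flag suspicious days in tracked event counts: a sudden collapse often
    # means a tracking error (e.g. a removed pixel), not a real trend.
    import statistics

    daily_pageviews = [980, 1010, 995, 1023, 990, 15, 1002]  # index 5 looks broken

    def flag_outliers(series, window=5, threshold=3.0):
        flags = []
        for i in range(window, len(series)):
            base = series[i - window:i]
            mu, sigma = statistics.mean(base), statistics.pstdev(base)
            if sigma and abs(series[i] - mu) / sigma > threshold:
                flags.append(i)
        return flags

    print(flag_outliers(daily_pageviews))  # -> [5]
    ```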

    Author: Mike Le

    Source: Information Management

  • How do data-driven organizations really set themselves apart?


    You hear it often in boardrooms: we want to be a data-driven organization. We want to get going with IoT, (predictive) analytics or location-based services. And yes, those are sexy applications. But what are the real business drivers? Those often remain underexposed. Research shows in which areas organizations with high “data maturity” are ahead of the pack.

    SAS surveyed almost 600 decision makers and, based on the answers, was able to divide the respondents into three groups: the frontrunners, a middle group and the laggards. That gives a clear view of how the frontrunners distinguish themselves from the laggards.

    The first thing that stands out is the proactive attitude. Frontrunners free up budget to replace old processes and systems and invest in the challenge of data integration. There is also a culture of “continuous improvement”: these companies are constantly and actively looking for ways to improve. This is in contrast to the laggards, who only want to invest in improvements once they know exactly what the ROI will be.

    The frontrunners most often replace their old systems with open-source data platforms, with Hadoop by far the most popular platform. Besides technology, these companies also invest more in cleaning up data. They have good processes in place to ensure that data is up to date and of the right quality for its intended use. And the governance of these processes is also better than at the companies that lag behind (read here about raising the ROI on data and IT).

    Frontrunners also invest more in talent. 73 percent of these companies have a dedicated data team staffed with their own people. The laggards more often have either no data team at all or a team filled with external people. Frontrunners also invest more in recruiting and selecting qualified staff. As a result, “only” 38 percent of the frontrunners face a shortage of internal skills, versus 62 percent of the laggards.

    All of this means the frontrunners are better prepared for the GDPR, which takes effect in 2018.

    They are better able to name the risks attached to a data-driven strategy, and they have taken measures to cover or reduce those risks.

    For many organizations, the arrival of the GDPR is a reason to invest in a sound data strategy. But it is not the only reason. Companies with high data maturity can:

    • answer complicated questions faster
    • make decisions faster
    • innovate and grow faster
    • improve the customer experience
    • realize growth in revenue and market share
    • achieve a shorter time-to-market for new products and services
    • optimize business processes
    • produce better strategic plans and reports

    All the more reason, then, to truly invest in data governance and data management rather than merely proclaiming that your organization is data driven. After all, 90 percent of those surveyed consider themselves data driven, but the reality is unfortunately less rosy.

    Interested in the full research results?
    Download the report ‘How data-driven organisations are winning’ here.

     

    Source: Rein Mertens (SAS)

    In: www.Analyticstoday.nl

  • How to create value with predictive analytics and data mining

    The growing volume of data brings with it a flood of questions. The main question is what we can do with that data: how can better services be offered and risks avoided? Unfortunately, at most companies that question goes unanswered. How can companies add value to data and move on to predictive analytics, machine learning and decision management?

    Predictive analytics: the crystal ball for the business

    Data mining makes hidden patterns in data visible, which in turn makes it possible to predict the future. Companies, scientists and governments have been using such methods for decades to derive insights about future situations from data. Modern companies use data mining and predictive analytics to, among other things, detect fraud, prevent cyber incidents and optimize inventory management. Through an iterative analytical process they bring together the data, the exploration of that data, and the application of the new insights drawn from it.

    Data mining: the business in the lead

    Decision management ensures that these insights are converted into actions in the operational process. The question is how to shape that process within a company. It always starts with a question from the business and ends with an evaluation of the actions. What this Analytical Life Cycle looks like, and which questions are relevant per industry, can be read in Data Mining From A to Z: How to Discover Insights and Drive Better Opportunities.

     

    Besides this model, which shows how your company can deploy the process, the paper digs deeper into the role of data mining in the research stage. Working this out further via the step-by-step plan below allows even more value to be extracted from data (a toy end-to-end pass follows the list).

    1. Turn the business question into an analytical hypothesis

    2. Prepare the data for data mining

    3. Explore the data

    4. Put the data into a model
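
    Here is that toy end-to-end pass in Python. The churn hypothesis, data and features are invented for illustration and are not from the paper:

    ```python
    # Four steps on toy data: hypothesis -> preparation -> exploration -> model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # 1. Hypothesis: customers with many support tickets are likelier to churn.
    rng = np.random.default_rng(1)
    tickets = rng.poisson(2, 200)
    churned = (tickets + rng.normal(0, 1, 200) > 3).astype(int)

    # 2. Prepare: shape the feature the way the model expects.
    X = tickets.reshape(-1, 1).astype(float)

    # 3. Explore: a quick check of the suspected relationship.
    print("mean tickets, churned vs. not:",
          X[churned == 1].mean(), X[churned == 0].mean())

    # 4. Model: fit and inspect the direction of the effect.
    clf = LogisticRegression().fit(X, churned)
    print("coefficient:", clf.coef_[0][0])  # positive supports the hypothesis
    ```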

    Want to know how your company, too, can use data to answer tomorrow’s questions and deliver better service? Then download “Data Mining From A to Z: How to Discover Insights and Drive Better Opportunities.”

  • How to Sell Your C-suite on Advanced Analytics

    But just because businesses are more open to exploring analytics and executives are dropping data science buzzwords in meetings doesn’t mean you don’t still have to sell your C-suite on investing in such technology. Analytics done right with the best tools and a skilled staff can get extremely expensive, and your C-suite isn’t just going to write you a blank check, especially if you can’t communicate how this investment will positively impact the bottom line.

    As the co-founder of Soothsayer Analytics, which applies artificial intelligence to build analytics tools for companies, Christopher Dole has experienced firsthand how difficult it can be to sell senior leadership on the ROI of advanced analytics.

    Since the founding of his company two years ago, he has continued to hone his pitch on prescriptive analytics, and he’s learned what information C-suite executives look for both before and after the launch of an analytics platform. He listed four pieces of advice for how to not only pitch an analytics program, but also ensure its continued success after its launch.

    Do your homework

    Prior to even scheduling a meeting with senior leadership, you must first arm yourself with the answers to every question that might get thrown your way.

    “I would definitely plan on meeting with any relevant colleagues, peers, or other internal stakeholders about issues and opportunities that they’d like to address,” said Dole. “And once you have some ideas you should also, in advance, meet with your data team and identify any relevant data — preferably data that’s clean and comprehensive — so then when you’re actually in front of the C-suite or board you can start by clearly defining where you’re currently at in the analytics journey, whether it’s the descriptive, diagnostic, predictive, or prescriptive level. If leadership says that your company is already doing analytics, yet they can’t predict what will happen or what can be done to perturb it, then they aren’t really doing analytics, and you should clearly articulate that.”

    It’s also important during your research to find examples of other companies’ experience with analytics solutions similar to the ones you’re proposing.

    “Talk about the value it created for them,” said Dole. “So, for example, if you’re starting on an analytics initiative and you’re a telecom provider, talk about how a competitor tapped into their stream of customer data to reduce churn and provide millions of dollars per year of savings.” When generating a list of examples, he said, try to focus more on instances that generated revenue or prevented losses as opposed to reduced waste. “Making money is often seen as sexier than saving money.”

    Start with the low hanging fruit

    If you’re just starting out in the analytics game, it may be tempting to ramp up a state-of-the-art program. But it’s actually more important to get some early wins by capturing the low-hanging fruit.

    “If possible, start with a larger problem that can be easily split into sub projects,” said Dole. “For instance, if you decide to focus on customer understanding, start with scientific customer segmentation. That way, once you know who your customers are, you can start to solve other problems that would require that understanding as a foundation anyway, whether it’s identifying opportunities for cross-sell and upsell, predicting and preventing churn, or forecasting customer lifetime value. These quick wins can typically be achieved within 12 weeks.”
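
    A first pass at such “scientific customer segmentation” might look like the sketch below, with invented features and data (this is a generic k-means illustration, not Soothsayer’s actual approach):

    ```python
    # Toy customer segmentation: cluster customers on spend and frequency.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    # Assumed features per customer: annual spend, orders per year.
    customers = np.column_stack([
        rng.gamma(2.0, 150.0, 500),   # spend
        rng.poisson(6, 500),          # order frequency
    ])

    X = StandardScaler().fit_transform(customers)  # keep units comparable
    segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

    for s in range(4):
        grp = customers[segments == s]
        print(f"segment {s}: n={len(grp)}, avg spend={grp[:, 0].mean():.0f}")
    ```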

    Set the proper expectations

    It can be incredibly tempting to hype the potential payoff of analytics, but overselling it can result in the C-suite viewing outcomes as failures when they would otherwise be considered wins.

    “It may be a month or two before any snippets of insight can be garnered, so it’s important that they are patient during the process,” said Dole. “A lot of what a data scientist is doing is identifying, collecting and compiling clean data into usable formats, and this can often take up to 60 percent of their time. Make sure they understand that a properly structured analytics project typically provides as much as a 13x ROI. There are many steps to achieving this, and everyone needs to be aligned on the ultimate goal.”


    Keep it simple

    Above all, you should keep it simple, stupid. It’s all too easy for a data scientist to get bogged down in technical jargon and respond to questions with arcane answers.

    “Use rich visualizations when possible because it’s much easier to understand a graphic than an equation or complex model,” said Dole. “Remove as much of the math and science as possible and just focus on the insights and the value that it’s going to create as well as all of the potential to expand upon it.”

    Source: Information Management, 2016, Simon Owens

  • Insights from Dresner Advisory Services’ 2016 The Internet of Things and Business Intelligence Market Study

    • Sales and strategic planning teams see IoT as the most valuable.
    • IoT advocates are 3X as likely to consider big data critical to the success of their initiatives & programs.
    • Amazon and Cloudera are the highest-ranked big data distributions, followed by Hortonworks and MapR.
    • Apache Spark MLlib is the best-known technology in the nascent machine learning landscape today.

    These and many other excellent insights are from Dresner Advisory Services’ 2016 The Internet of Things and Business Intelligence Market Study published last month. What makes this study noteworthy is the depth of analysis and insights the Dresner analyst team delivers regarding the intersection of big data and the Internet of Things (IoT), big data adoption, analytics, and big data distributions. The report also provides an analysis of Cloud Business Intelligence (BI) feature requirements, architecture, and security insights. IoT adoption is thoroughly covered in the study, with a key finding being that large organizations or enterprises are the strongest catalyst of IoT adoption and use. Mature BI programs are also strong advocates or adopters of IoT and as a result experience greater BI success. IoT advocates are defined as those respondents that rated IoT as either critical or very important to their initiatives and strategies.

    Key takeaways of the study include the following:

    • Sales and strategic planning see IoT as the most valuable today. The combined rankings of IoT as critical and very important are highest for sales, strategic planning and the Business Intelligence (BI) Competency Centers. Sales ranking IoT so highly is indicative of how a wide spectrum of companies, from start-ups to large-scale enterprises, is attempting to launch business models and derive revenue from IoT. Strategic planning’s prioritization of IoT is also driven by a long-term focus on how to capitalize on the technology’s inherent strengths in providing greater contextual intelligence, insight, and potential data-as-a-service business models.

    [Figure: IoT importance by function]

    • Biotechnology, consulting, and advertising are the industries that consider IoT most important. Adoption of IoT across a wide variety of industries is happening today, with significant results being delivered in manufacturing, distribution (including asset management), logistics, supply chain management, and marketing. Still, the study found that most industries do not yet see IoT as important, with biotechnology the notable exception.

    [Figure: IoT importance by industry]

    • Location intelligence, mobile device support, in-memory analysis, and integration with operational systems are the four areas that most differentiate IoT advocates’ interests and focus. Compared to the overall sample of respondents, IoT advocates have significantly more in-depth areas of focus than the broader respondent base. These four areas show a practical, pragmatic mindset regarding how IoT can contribute greater process efficiency and revenue and integrate effectively with existing systems.

    [Figure: IoT advocates’ areas of focus]

    • An organization’s ability to manage big data analytics is critically important to its success or failure with IoT. IoT advocates are 3X as likely to consider big data critical, and 2X as likely to consider it very important. The study also found that IoT advocates see IoT as a core justification for investing in and implementing big data analytics and architectures.

    [Figure: Importance of big data]

    • Data warehouse optimization, customer/social analysis, and IoT are the top three big data use cases organizations are pursuing today, according to the study. Data warehouse optimization is considered critical or very important to 50% of respondents, making this use case the most dominant in the study. Large-scale organizations are adopting big data to better aggregate, analyze and take action on the massive amount of data they generate daily to drive better decisions. One of the foundational findings of the study is that large-scale enterprises are driving the adoption of IoT, which is consistent with the use case analysis provided in the graphic below.

    [Figure: Big data use cases]

    • IoT advocates are significantly above average in their use of advanced and predictive analytics today. The group of IoT advocates identified in the survey is 50% more likely to be current users of advanced and predictive analytics apps as well. The study also found that advanced analytics users tend to be the most sophisticated and confident BI audience in an organization and see IoT data as ideal for interpretation using advanced analytics apps and techniques.

    [Figure: Advanced and predictive analytics adoption]

    • Business intelligence experts, business analysts and statisticians/data scientists are the greatest early adopters of advanced and predictive analytics. More than 60% of each of these three groups of professionals uses analytics often, which could be interpreted as more than 50% of their working time.

    [Figure: Users of advanced and predictive analytics]

    • Relational database support, open client connectors (ODBC, JDBC) and automatic upgrades are the three most important architectural features for cloud BI apps today. Connectors and integration options for on-premises applications and data (ERP, CRM, and SCM) are considered more important than cloud application and database connection options. Multitenancy is considered unimportant by the majority of respondents; one factor contributing to this is the assumption that multitenancy is managed as part of the enterprise cloud platform.

    [Figure: Cloud BI architectural requirements]

    • MapReduce and Spark are the two best-known and most important big data infrastructure technologies, according to respondents: 48% believe MapReduce is important and 42% believe Spark is. The study also found that all other categories of big data infrastructure are considered less important, as the graphic below illustrates.

    [Figure: Big data infrastructure]

     Forbes, 4 October 2016

  • Integration Will Accelerate Internet Of Things, Industrial Analytics Growth In 2017

    • Enabling real-time integration across on-premise and cloud platforms often involves integrating SAP, Salesforce, third-party and legacy systems. 2017 will be a break-out year for real-time integration between SAP, Salesforce, and third-party systems in support of the Internet of Things and Industrial Analytics.
    • McKinsey Global Institute predicts that the Internet of Things (IoT) will generate up to $11T in value to the global economy by 2025.
    • Predictive and prescriptive maintenance of machines (79%), customer/marketing related analytics (77%) and analysis of product usage in the field (76%) are the top three applications of Industrial Analytics in the next 1 to 3 years.

    Real-Time Integration Is the Cornerstone Of Industrial Analytics

    Industrial Analytics (IA) describes the collection, analysis and usage of data generated in industrial operations and throughout the entire product lifecycle, applicable to any company that is manufacturing and selling physical products. It involves traditional methods of data capture and statistical modeling. Enabling integration across legacy, third-party, Salesforce and SAP systems is one of the most foundational technologies that Industrial Analytics relies on today and will in the future. Real-time integration is essential for enabling connectivity between Internet of Things (IoT) devices, in addition to enabling improved methods for analyzing and interpreting data. One of the most innovative companies in this area is enosiX, a leading global provider of Salesforce and SAP integration applications and solutions. They’re an interesting startup to watch and have successfully deployed their integration solutions at Bunn, Techtronic Industries, YETI Coolers and other leading companies globally.

    A study has recently been published that highlights just how foundational integration will be to Industrial Analytics and IoT. You can download the Industrial-Analytics-Report-2016-2017.pdf. The study was initiated and governed by the Digital Analytics Association e.V. Germany (DAAG), which runs a professional working group on the topic of Industrial Analytics, and research firm IoT Analytics GmbH was selected to conduct it. Interviews with 151 analytics professionals and decision-makers in industrial companies were completed as part of the study. Hewlett-Packard Enterprise and the data science service companies Comma Soft and Kiana Systems sponsored the research. All research and analysis steps required for the study, including interviewing respondents, data gathering, data analysis and interpretation, were conducted by IoT Analytics GmbH. Please see page 52 of the study for the methodology.

    Key Takeaways:

    • With real-time integration, organizations will be able to increase revenue (33.1%), increase customer satisfaction (22.1%) and increase product quality (11%) using Industrial Analytics. The majority of industrial organizations see Industrial Analytics as a catalyst for future revenue growth, not primarily as a means of cost reduction. Upgrading existing products, changing the business model of existing products, and creating new business models are three typical approaches companies are taking to generate revenue from Industrial Analytics. Integration is the fuel that will drive Industrial Analytics in 2017 and beyond.

    [Figure: Biggest benefits of Industrial Analytics]

    • For many manufacturers, the more pervasive their real-time SAP integration is, the more effective their IoT and Industrial Analytics strategies will be. Manufacturers adopting this approach to integration will be able to attain predictive and prescriptive maintenance of their machines (79%), the most important application of Industrial Analytics in the next 1 – 3 years. Customer/marketing-related analytics (77%) and analysis of product usage in the field (76%) are the second- and third-most important. The following graphic provides an overview of the 13 most important applications of Industrial Analytics.

    [Figure: Most important applications of Industrial Analytics]

    • 68% of decision-makers have a company-wide data analytics strategy, 46% have a dedicated organizational unit and only 30% have completed actual projects, further underscoring the enabling role of integration in their analytics and IoT strategies. The study found that out of the remaining 70% of industrial organizations, the majority of firms have ongoing projects in the prototyping phase.
      [Figure: Data analytics strategy]
    • Business Intelligence (BI) tools, Predictive Analytics tools and Advanced Analytics Platforms will be pivotal to enabling industrial data analysis in the next five years. Business Intelligence Tools such as SAP Business Objects will increase in importance to industrial manufacturing leaders from 39% to 77% in the next five years. Predictive Analytics tools such as HPE Haven Predictive Analytics will increase from 32% to 69%. The role of spreadsheets used for industrial data analytics is expected to decline (i.e., 27% think it is important in 5 years vs. 54% today).

    [Figure: Advanced analytics and BI tools]

    • The Industrial Analytics technology stack is designed to scale based on the integration of legacy systems, industrial automation apps, and MES and SCADA systems, combined with sensor-based data. IoT Analytics GmbH defines the technology stack as four components: data sources, necessary infrastructure, analytics tools, and applications. The following graphic illustrates the technology stack and underscores how essential integration is to realizing the vision of Industrial Analytics.

    [Figure: Industrial Analytics technology stack]

    • Industrial Internet of Things (IIoT) and Industry 4.0 will rely on real-time integration to enable an era of shop-floor smart sensors that can make autonomous decisions and trade-offs regarding manufacturing execution. IoT Analytics GmbH predicts this will lead to smart processes and smart products that communicate within production environments and learn from their decisions, improving performance over time. The study suggests that Manufacturing Execution System (MES) agents will be vertically integrated into higher level enterprise planning and product change management processes so that these organizations can synchronously orchestrate the flow of data, rather than go through each layer individually.


    Source: business2community.com, 19 December 2016

  • Is Predictive Analytics the future of BI? Or is it something totally different?

    M. Zaman already stated this view a few years ago: the market is witnessing an unprecedented shift in business intelligence (BI), largely because of technological innovation and increasing business needs. The latest shift in the BI market is the move from traditional analytics to predictive analytics. Although predictive analytics belongs to the BI family, it is emerging as a distinct new software sector.

    We can ask ourselves whether predictive analytics is a new variant of BI or something entirely new. In essence it is important to understand both the commonalities and the differences. It is therefore largely a matter of definition whether predictive analytics belongs to the BI family or not. Let’s focus on the differences.

    Traditional analytical tools claim to offer a real 360° view of the enterprise or business, but they analyze only historical data—data about what has already happened. Traditional analytics help organizations gain insight into what went right and what went wrong in decision-making.

    However, past and present insight and trend information are not enough to be competitive in business. Business organizations need to know more about the future, in particular about future trends, patterns, and customer behavior, in order to understand the market better.

    Predictive analytics is used to determine the probable future outcome of an event or the likelihood of a situation occurring. It is the branch of data mining concerned with the prediction of future probabilities and trends. Predictive analytics is used to automatically analyze large amounts of data with different variables; it includes clustering, decision trees, market basket analysis, regression modeling, neural nets, genetic algorithms, text mining, hypothesis testing, decision analytics, and more. Many of these techniques are not in common use in traditional BI.

    The core element of predictive analytics is the predictor, a variable that can be measured for an individual or entity to predict future behavior. Multiple predictors are combined into a predictive model, which, when subjected to analysis, can be used to forecast future probabilities with an acceptable level of reliability. In predictive modeling, data is collected, a statistical model is formulated, predictions are made, and the model is validated (or revised) as additional data become available.
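    To make this concrete, here is a minimal sketch of combining several predictors into a predictive model and validating it on held-out data, assuming Python with scikit-learn; the customer attributes and all figures are invented for illustration:

    ```python
    # Minimal sketch: combine predictors into a model, predict, validate.
    # All data is synthetic; attribute names are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 1000
    # Three hypothetical predictors measured per customer
    tenure_years = rng.uniform(0, 10, n)
    monthly_spend = rng.normal(50, 15, n)
    complaints = rng.poisson(0.5, n)
    X = np.column_stack([tenure_years, monthly_spend, complaints])
    # Synthetic "future behavior": churn odds rise with complaints, fall with tenure
    logit = 0.8 * complaints - 0.3 * tenure_years + 0.01 * monthly_spend
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    # Formulate the model, make predictions, validate on held-out data
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print("holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
    ```

    The holdout score is the validation step the paragraph above describes: as new data arrives, the model is re-checked and revised.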

    Predictive analytics combines business knowledge and statistical techniques, applied to business data, to produce insights. These insights help organizations understand how people behave as customers, buyers, sellers, distributors, etc.

    Multiple related predictive models can produce good insights for strategic company decisions, such as where to explore new markets, acquisitions, and retention; where to find up-selling and cross-selling opportunities; and which areas can be improved in security and fraud detection. Predictive analytics indicates not only what to do, but also how and when to do it, and can explain what-if scenarios.

  • Making Content Marketing Work


    Speeds & feeds. “Hero” shots. Print ads. Product placement. Really expensive TV advertisements featuring celebrity endorsements.

    Pitching a product and service back when those phrases dominated marketing and advertising discussions seems very quaint today.

    In an era where the incumbent media companies are seeing their audiences fragment across a host of different devices and online sites (including the online versions of the incumbent media providers), those old school techniques are losing their juice.

    Consumers no longer want a spec sheet or product description that tells them what the product or service is — they want to be shown what the product or service can do for them. And they want to see how other actual people — just like them — use the product or service.

    As if that weren’t tough enough, today’s consumers can spot inauthentic pitches from a mile away. They will happily share your lack of authenticity with millions of their closest friends on Facebook, Twitter and beyond.

    Content marketing has emerged in the past three years as a practice that allows marketers to balance richer, deeper information, or content, about their products with delivering it authentically.

    Like so many things in life, describing what content marketing is, and what it can accomplish, is way easier than actually doing content marketing successfully.

    In one of Gartner’s earlier docs on content marketing, my colleague Jake Sorofman exhorted marketers to “think like publishers.” Sound advice, but many marketers find that difficult. To date, while many marketers are getting much better at sourcing and distributing the kinds of content elements they need, measuring content marketing’s contribution is not easy. But it can be done.

    Using content analytics gives content marketers insight into how their efforts are being received by consumers, providing the kind of objective measures that previous generations of marketers dreamed of having. Jake’s most recent research round-up on content marketing has some timely examples of companies that have wrestled with the content marketing challenge and are realizing the value of not merely finding, creating and distributing content; they are also focusing on using all the tools available to amplify their efforts. The story about IKEA’s work in the area is particularly interesting.

    Yep, times have changed, and marketing is a much more complex field than it used to be. Digital, content, social and mobile marketer are, for the most part, job titles that didn’t exist 15 years ago. The good news is that the tools and techniques those new job titles require are increasingly available.

    By Mike McGuire | April 6, 2015 |

  • Predictive analytics in customer surveys: closing the gap between data and action

    Customer surveys are a conduit to the voice of the customer (VoC). However, simply capturing survey data is no longer enough to achieve better results.

    When used appropriately, customer surveys can help companies more effectively identify new markets with the most potential for success, create a data-driven pricing strategy, and gauge customer satisfaction. However, capturing survey data is only the first step.

    Companies must analyze and act on survey data to achieve their goals. This is where predictive analytics comes into the picture. As illustrated in Figure 1, companies using predictive analytics to process survey data achieve far superior results across several key performance indicators (KPIs), compared to those without this technology. 

    Since happy customers are more likely to maintain or increase their spend with a business, growth in customer lifetime value among predictive analytics users signals improvement in customer satisfaction rates. Similarly, companies using this technology also attain 4.6 times the annual increase in overall sales team attainment of quota, compared to non-users. This correlation indicates that predictive analytics can help companies convert survey data into top-line revenue growth.

    Use of predictive analytics to forecast and predict the likelihood of certain events, such as potential sales or changes in customer satisfaction, requires companies to have a comprehensive view of customer and operational data. Most organizations don’t struggle with a lack of survey data given the wealth of insights they glean through the activities noted above. Instead, they are challenged with putting this data to good use. Indeed, findings from Aberdeen’s May 2016 study, CEM Executive's Agenda 2016: Aligning the Business Around the Customer, show that only 15% of companies are fully satisfied with their ability to use survey data in customer experience programs.

    How to Use Predictive Analytics to Maximize Your Performance

    Data shows that Best-in-Class firms (see sidebar) are 20% more likely to be fully satisfied with their use of survey data when conducting customer conversations. A closer look at these organizations reveals a 59% greater adoption rate of predictive analytics, compared to All Others (35% vs. 22%).

    For any organization not currently using predictive analytics to analyze survey data, this technology holds the key to significant performance improvements. As such, we see that with a mere 35% adoption rate, many top performers could use predictive analytics to do even better.

    One mistake companies make when adopting new technologies is assuming that simply deploying the technology will result in sudden – and recurring – performance improvements. The situation is no different with predictive analytics. The fact of the matter is, if an organization is looking to increase customer lifetime value or profit margins, it must design and execute a well-crafted strategy for utilizing predictive analytics in conjunction with customer surveys.

    On a high level, predictive analytics can be used in two ways:

    1. Systematic analysis: Organizations can establish an analytics program to measure and manage survey data on a regular basis. These programs are aimed at accomplishing certain goals, such as gauging customer satisfaction levels at regular intervals to correlate changes in customer satisfaction rates with changes in the marketplace and overall business activities.

    2. Ad-hoc analysis: Companies can also analyze survey data on an as-needed basis. For example, a company could conduct a one-time analysis of the potential customer spend in a new market to decide whether to enter that market.

    It’s important to note that companies can use both systematic and ad-hoc analysis. Systematic analysis allows organizations to continuously monitor their progress towards ongoing performance goals, such as improving customer satisfaction. Ad-hoc analysis, on the other hand, allows companies to use the same analytical capabilities to answer specific questions as they arise.

    Having outlined the two general ways companies use predictive analytics, it’s also important to share the two general types of processes that can be used to produce such analysis:

    1. Statistical analysis: Predictive analytics can provide decision-makers across the business with insights into hidden trends and correlations. For example, companies conducting statistical analysis can identify how use of certain customer interaction channels (e.g. web, email, or social media) correlates with customer satisfaction rates as revealed through surveys. This, in turn, allows companies to identify which channels work best in meeting (and exceeding) the needs of target clientele.

    2. Modeling: This second type breaks into two sub-categories:

    1. Forecasting: Companies can use historical and real-time survey data to forecast the likelihood of certain outcomes. For example, a company curious about the potential sales uplift from a new market would survey potential buyers in the area and ask about their intent to buy and preferred price points. The forecasting capability of their predictive analytics platform would then allow the company to forecast potential sales numbers.

    2. Predicting: This refers to analyzing historical and real-time survey data to estimate a specific result that might have already happened, might be happening currently, or will happen in the future. For example, an organization might decide to build a model that helps identify customer spend in a specific market. This might start with developing a model of past sales results, where the model produces results similar to those the company actually observed. Having ensured the accuracy of the model, the organization can then use it to predict current and future sales based on changes in the factors built into the same predictive model.

      The difference between forecasting and predicting is that the former only looks at future events or values, whereas the latter can look at future, current or historical events when building models. Also, the former relies on already available past data (e.g. snow blower purchases) to make forecasts, whereas the latter allows companies to predict a certain outcome, in this case snow blower purchases, by looking at related factors influencing this result, including recent temperatures, changes in average income, and others.
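      The distinction can be made concrete in a few lines. The sketch below contrasts the two approaches on the snow blower example, assuming Python with scikit-learn; every figure is invented:

      ```python
      # Sketch: forecasting from past values vs. predicting from related factors.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      years = np.arange(2010, 2017)
      purchases = np.array([120, 135, 128, 150, 160, 155, 170])  # hypothetical units

      # Forecasting: extrapolate the purchase series itself into the future
      trend = LinearRegression().fit(years.reshape(-1, 1), purchases)
      print("2017 forecast:", trend.predict([[2017]])[0])

      # Predicting: model the outcome from related factors (snowfall, average
      # income; both synthetic), then apply the model to any period
      snowfall = np.array([30, 42, 35, 50, 55, 48, 60])   # inches, hypothetical
      income = np.array([50, 51, 52, 52, 53, 54, 55])     # $k, hypothetical
      X = np.column_stack([snowfall, income])
      factor_model = LinearRegression().fit(X, purchases)
      print("predicted purchases at 45in snow, $56k income:",
            factor_model.predict([[45, 56]])[0])
      ```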

     

    Conclusions:

    Companies have many ways to capture survey data; however, only 15% are fully satisfied with their ability to use this data. Predictive analytics helps companies alleviate this challenge by answering business questions designed to improve performance results.

    However, it’s imperative to remember that the statistical insights gleaned through predictive analytics, as well as the models predictive analytics can produce, will only yield results if companies act on the intelligence thus acquired. Don’t overlook the importance of coupling analysis and action. If you are planning to invest in this technology (or have already invested but seek to improve your results), we recommend that you make bridging the gap between data and action a key priority for your business. 

    Author: Omer Minkara

    Source: white paper Aberdeen Group (sponsored by IBM)

  • Predictive modelling in Market Intelligence is hot

    [Figure: Global predictive analytics market]

    Market intelligence is still an underexposed function in companies. How often do companies have an accurate and up-to-date picture of exactly how large their market is? And of whether it is growing or shrinking?

    B2C companies can buy expensive reports from the information brokers of this world, at considerable cost. And if they are lucky enough that segmentations relevant to them were used, that can indeed pay off. B2B companies face a much bigger challenge: market data is usually not commercially available and has to be produced (with or without the help of B2C data), which makes market data even more expensive for these companies.

    Moreover, the discussion above only concerns data about market size and market value: the basics, you could say. Data about competitors, market shares, product developments and market-defining trends is at least as relevant for setting a sound course and for making tactical decisions on purchasing, pricing and distribution.

    Yet there are possibilities! Even with scarce data it is possible to reconstruct market data. The starting point: if we search the markets for which we do have data for predictive variables, other market data can be approximated or estimated. This form of statistical reconstruction of market data often proves more reliable than surveys or expert panels. The technique is being applied more and more in market intelligence, so data science is making its entrance in this field as well.
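    As a rough illustration of this statistical reconstruction, the sketch below fits a regression on markets where the size is known and uses it to estimate a market where it is not. The predictor variables (GDP, population) are assumptions and all figures are invented; Python with scikit-learn is assumed:

    ```python
    # Sketch: reconstructing unknown market sizes from predictive variables.
    # Markets with known size train a regression; the fit then estimates
    # markets for which no data is commercially available. Figures invented.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Predictors per market: GDP ($bn) and population (millions)
    known_X = np.array([[800, 17], [450, 11], [3900, 83], [2700, 67]])
    known_market_size = np.array([240, 130, 1100, 790])  # market value, $m

    model = LinearRegression().fit(known_X, known_market_size)

    # Estimate a market we have no report for
    unknown_X = np.array([[1400, 47]])
    print("estimated market size ($m):", model.predict(unknown_X)[0])
    ```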

    Once this is commonplace, the step to forecasting markets is not far off. That question is being asked more and more: can we also map what the market will look like in 5 or perhaps even 10 years? It can be done! And the quality of those forecasts keeps improving, and with it their use. Market intelligence only gets more interesting as a result, and the competitive game more exciting.

    Source: Hammer, market intelligence

    http://www.hammer-intel.com

     

     

  • Retailers are using big data for better marketing

    Today, customer expectations are growing by leaps and bounds, and the credit goes to technology, which has given them ample choice. Retailers are leaving no stone unturned to provide a better shopping experience, adopting analytical tools to catch up with the changing expectations of consumers. Durjoy Patranabish, Senior Vice President, Blueocean Market Intelligence, spoke with Dataquest about the role of analytics in the retail sector.

    How are retailers using big data analytics to drive real business value?
    The idea of data creating business value is not new; however, the effective use of data is becoming the basis of competition. Retailers are using big data analytics to make a variety of intelligent decisions that help delight customers and increase sales.

    These decisions range from assessing the market, targeting the right segment, and forecasting demand to product planning and localizing promotions. Advanced analytics solutions such as inventory analysis, price point optimization, market basket analysis, cross-sell/up-sell analytics, and real-time sales analytics can be built using techniques like clustering, segmentation, and forecasting. Retailers have now realized the importance of big data and are using it to draw useful insights and manage the customer journey.

    How can advanced clustering techniques be used to predict purchasing behaviors in targeted marketing campaigns?
    Advanced clustering techniques can be used to group customers based on their historical purchase behavior, giving retailers a better definition of customer segmentation on the basis of similar purchases. The resulting clusters can be used to characterize different customer groups, enabling retailers to advertise and offer promotions to these targeted groups. Beyond characterization, clustering allows retailers to predict the buying patterns of new customers based on the profiles generated. Advanced clustering techniques can build a 3D model of the clusters based on key business metrics, such as orders placed, frequency of orders, items ordered or variation in prices. This business relevance makes it easier for decision makers to identify the problematic clusters that force retailers to use more resources to attain a targeted outcome. They can then focus their marketing and operational efforts on the right clusters to enable optimum utilization of resources.
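    A minimal sketch of this clustering step, assuming Python with scikit-learn; the three purchase metrics and all data are invented:

    ```python
    # Sketch: group customers on historical purchase behavior with k-means,
    # then profile each cluster. Features and data are hypothetical.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    n = 500
    # orders placed, order frequency (per month), price variation (basket std)
    features = np.column_stack([
        rng.poisson(12, n),
        rng.uniform(0.5, 8, n),
        rng.gamma(2.0, 10.0, n),
    ])

    # Scale so no single metric dominates the distance computation
    scaled = StandardScaler().fit_transform(features)
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=1).fit(scaled)

    # Profile clusters by their mean metrics to spot the "problematic" ones
    for k in range(4):
        members = features[kmeans.labels_ == k]
        print(f"cluster {k}: n={len(members)}, means={members.mean(axis=0).round(2)}")
    ```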

    What trends are boosting the big data analytics space?

    Some of the trends in the analytics space are:


    1. The need for an integrated, scalable, and distributed data store as a single repository will give rise to the growth of data lakes. This will also increase the need for data governance.
    2. Cloud-based big data analytics solutions are expected to grow three times more quickly than spending on on-premises solutions.
    3. Deep learning, which combines machine learning and artificial intelligence to uncover relationships and patterns within various data sources without needing specific models or programming instructions, will emerge.
    4. The explosion of data coming from the Internet of Things will accelerate real-time and streaming analytics, requiring data scientists to sift through data in search of repeatable patterns that can be developed into event processing models.
    5. The analytics industry will become data agnostic, with analytics solutions focused on people and machines rather than on structured versus unstructured data.
    6. Data will become an asset that organizations can monetize by selling it or providing value-added content.

    What are your views on ‘Big Data for Better Marketing’? How can retailers use analytics tools to stay ahead of their competitors?

    Whether it is to provide a smarter shopping experience that influences the purchase decisions of customers to drive additional revenue, or to deliver tailor made relevant real-time offers to customers, big data offers a lot of opportunities for retailers to stay ahead of the competition.


    • Personalized shopping experience: Data can be analyzed to create detailed customer profiles that can be used for micro-segmentation and to offer a personalized shopping experience. A 360-degree customer view informs retailers how best to contact their customers and recommend products based on their likes and shopping patterns. Sentiment analysis can tell retailers how customers perceive their actions, commercials, and the products they have on offer. Analyzing what is being said online provides additional insight into what customers are really looking for and enables retailers to optimize their assortments to local needs and wishes.
    • Demand forecasting: Retailers can predict future demand using various data sets such as web browsing patterns, buying patterns, enterprise data, social media sentiment, weather data, and news and event information to predict the next hot items in coming seasons. Using this information, retailers can stock up and deliver the right products in the right amounts to the right channels and regions. An accurate demand forecast not only helps retailers optimize their inventory and improve just-in-time delivery, but also optimizes in-store staffing, bringing down costs.
    • Innovative optimization: Customer demand, competitor activity, and relevant news and events can be used to create models that automatically synchronize pricing with inventory levels, demand and the competition. Big data can also enable retailers to optimize floor plans and find revenue optimization possibilities.
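    The demand forecasting idea above can be made concrete in a few lines. This sketch regresses weekly sales on two exogenous signals, assuming Python with scikit-learn; all data is synthetic:

    ```python
    # Sketch: demand forecasting from multiple data sets (weekly sales plus
    # weather and social sentiment signals, all synthetic).
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    weeks = 52
    temperature = rng.normal(15, 8, weeks)   # degrees C
    sentiment = rng.uniform(-1, 1, weeks)    # social media score
    units = 200 + 4 * temperature + 60 * sentiment + rng.normal(0, 10, weeks)

    X = np.column_stack([temperature, sentiment])
    model = LinearRegression().fit(X, units)

    # Forecast next week's demand given expected weather and sentiment
    print("expected units:", model.predict([[22, 0.4]])[0].round())
    ```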

    Source: DataQuest

  • The vision of IBM on Analytics


    IBM’s Vision user conference brings together customers who use its software for financial and sales performance management (FPM and SPM, respectively) as well as governance, risk management and compliance (GRC). Analytics is a technology that can enhance each of these activities. The recent conference and many of its sessions highlighted IBM’s growing emphasis on making more sophisticated analytics easier to use by – and therefore more useful to – general business users and their organizations. The shift is important because the IT industry has spent a quarter of a century trying to make enterprise reporting (that is, descriptive analytics) suitable for an average individual to use with limited training. Today the market for reporting, dashboards and performance management software is saturated and largely a commodity, so the software industry – and IBM in particular – is turning its attention to the next frontier: predictive and prescriptive analytics. Prescriptive analytics holds particular promise for IBM’s analytics portfolio.

    The three basic types of analytics – descriptive, predictive and prescriptive – often are portrayed as a hierarchy, with descriptive analytics at the bottom and predictive and prescriptive (often referred to as “advanced analytics”) on the next two rungs. Descriptive analytics is like a rear-view mirror on an organization’s performance. This category includes variance and ratio analyses, dashboards and scorecards, among others. Continual refinement has enabled the software industry to largely succeed in making descriptive analytics an easy-to-use mainstream product (even though desktop spreadsheets remain the tool of choice). Today, companies in general and finance departments in particular handle basic analyses well, although they are not as effective as they could be. Our research on next-generation finance analytics shows, for example, that most financial analysts (68%) spend the largest amount of their time in the data preparation phases while a relatively small percentage (28%) use the bulk of their time to do what they are supposed to be doing: analysis. We find that this problem is mainly the result of issues with data, process and training.

    The upward shift in focus to the next levels of business analytics was a common theme throughout the Vision conference. This emphasis reflects a key element of IBM’s product strategy: to achieve a competitive advantage by making it easy for most individuals to use advanced analytics with limited training and without an advanced degree in statistics or a related discipline.


    The objective in using predictive analytics is to improve an organization’s ability to determine what’s likely to happen under certain circumstances with greater accuracy. It is used for four main functions:

    • Forecasting – enabling more nuanced projections by using multiple factors (such as weather and movable holidays for retail sales)
    • Alerting – when results differ materially from forecast values
    • Simulation – understanding the range of possible outcomes under different circumstances
    • Modeling – understanding the range of impacts of a single factor.
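    Of these four functions, alerting is the simplest to make concrete. The sketch below flags periods where actuals deviate materially from forecast values; the 10% threshold and all figures are invented:

    ```python
    # Sketch: the "alerting" function - flag periods where actual results
    # differ materially from forecast values. Threshold and data invented.
    forecast = [100, 110, 120, 130, 140]
    actual   = [ 98, 112, 145, 131, 118]
    THRESHOLD = 0.10  # alert when deviation exceeds 10% of forecast

    for period, (f, a) in enumerate(zip(forecast, actual), start=1):
        deviation = (a - f) / f
        if abs(deviation) > THRESHOLD:
            print(f"period {period}: actual {a} deviates {deviation:+.0%} from forecast {f}")
    ```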

    Our research on next-generation business planning finds that despite its potential to improve the business value of planning, only one in five companies uses predictive analytics extensively in its planning processes.

    Predictive analytics can be useful for every facet of a business and especially for finance, sales and risk management. It can help these functions achieve greater accuracy in sales or operational plans, financial budgets and forecasts. The process of using it can identify the most important drivers of outcomes from historical data, which can support more effective modeling. Because plans and forecasts are rarely 100 percent accurate, a predictive model can support timely alerts when outcomes are significantly different from what was projected, enabling organizations to better understand the reasons for a disparity and to react to issues or opportunities sooner. When used for simulations, predictive models can give executives and managers deeper understanding of the range of potential outcomes and their most important drivers.

    Prescriptive analytics, the highest level, help guide decision-makers to make the best choice to achieve strategic or tactical objectives under a specified set of circumstances. The term is most widely applied to two areas:

    • Optimization – determining the best choice by taking into account often conflicting business objectives or other forms of trade-offs while factoring in business constraints – for example, determining the best price to offer customers based on their characteristics. This helps businesses achieve the best balance of potential revenue and profitability, or helps farmers find the least costly mix of animal feeds to achieve weight objectives.
    • Stochastic Optimization – determining the best option as above but with random variables such as a commodity price, an interest rate or sales uplift. Financial institutions often use this form of prescriptive analytics to understand how to structure fixed income portfolios to achieve an optimal trade-off between return and risk.
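    The feed-mix example above is a classic linear program. The following sketch solves a toy version with SciPy's linprog; every cost and nutritional figure is invented:

    ```python
    # Sketch: least-cost feed mix as a linear program via scipy.optimize.linprog.
    # Costs and nutrition values are invented for illustration.
    from scipy.optimize import linprog

    # Two feeds; minimize cost = 0.30*x1 + 0.45*x2 (per kg)
    cost = [0.30, 0.45]
    # Nutritional constraints: protein and energy must meet minimums.
    # linprog uses <=, so "at least" rows are multiplied by -1.
    A_ub = [[-0.12, -0.22],   # protein content per kg of each feed
            [-9.0, -7.5]]     # energy (MJ) per kg of each feed
    b_ub = [-3.0,             # at least 3 kg of protein in the mix
            -180.0]           # at least 180 MJ of energy
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("kg of each feed:", res.x.round(2), "total cost:", round(res.fun, 2))
    ```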

    General purpose software packages for predictive and prescriptive analytics have existed for decades, but they were designed for expert users, not the trained rank-and-file. However, some applications that employ optimization for a specific purpose have been developed for nonexpert business users. For example, price and revenue optimization software, which I have written about, is used in multiple industries. Over the past few years, IBM has been making progress in improving ease of use of general purpose predictive and prescriptive analytics. These improvements were on display at Vision. One of the company’s major initiatives in this area is Watson Analytics. It is designed to simplify the process of gathering a set of data, exploring it for meaning and importance, and generating graphics and storyboards to convey the discoveries. Along the way, the system can evaluate the overall suitability of the data the user has assembled for creating useful analyses and assist general business users in exploring its meaning. IBM offers a free version that individuals can use on relatively small data sets as a test drive. Watson is a cognitive analytics system, which means it is by nature a work in progress. Through experience and feedback it learns various things including terminologies, analytical methods and the nuances of data structures. As such it will become more powerful as more people use it for a wider range of uses because of the system’s ability to “learn” rather than rely on a specific set of rules and logic.

    Broader use of optimization is the next frontier for business software vendors. Created and used appropriately, optimization models can deliver deep insights into the best available options and strategies more easily, accurately, consistently and effectively than conventional alternatives. Optimization eliminates individual biases, flawed conventional wisdom and the need to run ongoing iterations to arrive at the seemingly best solution. Optimization is at the heart of network management and price and revenue optimization, to name two common application categories. Dozens of optimization applications (including ILOG, which IBM acquired) are available, but they are aimed at expert users.

    IBM’s objective is to make such prescriptive analytics useful to a wider audience. It plans to infuse optimization capabilities into all of its analytical applications. Optimization can be used on a scale from large to small. Large-scale optimization supports strategic breakthroughs or major shifts in business models. Yet there also are many more ways that optimization techniques embedded in a business application – micro-optimization – can be applied to business. In sales, for example, it can be applied to territory assignments taking into account multiple factors. In addition to making a fair distribution of total revenue potential, it can factor in other characteristics such as the size or profitability of the accounts, a maximum or minimum number of buying units and travel requirements for the sales representative. For operations, optimization can juggle maintenance downtime schedules. It can be applied to long-range planning to allocate R&D investments or capital outlays. In strategic finance it can be used to determine an optimal capital structure where future interest rates, tax rates and the cost of equity capital are uncertain.

    Along the way IBM also is trying to make optimization more accessible to expert users. Not every company or department needs or can afford a full suite of software and hardware to create applications that employ optimization. For them, IBM recently announced Decision Optimization on Cloud (DOcloud), which provides this capability as a cloud-based service; it also broadens the usability of IBM ILOG CPLEX Optimizer. This service can be especially useful to operations research professionals and other expert users. Developers can create custom applications that embed optimization to prescribe the best solution without having to install any software. They can use it to create and compare multiple plans and understand the impacts of various trade-offs between plans. The DOcloud service also provides data analysis and visualization, scenario management and collaborative planning capabilities. One example given by IBM is a hospital that uses it to manage its operating room (OR) scheduling. ORs are capital-intensive facilities with high opportunity costs; that is, they handle procedures that utilize specific individuals and different combinations of classes of specialists. Procedures also have different degrees of time flexibility. Without using an optimization engine to take account of all the variables and constraints, crafting a schedule is time-consuming. And since “optimal” solutions to business problems are fleeting, an embedded optimization engine enables an organization to replan and reschedule quickly to speed up decision cycles.

    Businesses are on the threshold of a new era in their use of analytics for planning and decision support. However, numerous barriers still exist that will slow widespread adoption of more effective business practices that take full advantage of the potential that technology offers. Data issues and a lack of awareness of the potential to use more advanced analytics are two important ones. Companies that want to lead in the use of advanced analytics need leadership that focuses on exploiting technology to achieve a competitive advantage.

    Author: Robert Kugel

  • Using Business Analytics to improve your business

    business analytics new

    I have often loudly advocated that enterprise performance management and corporate performance management are the integration of dozens of methods, like strategy maps, key performance indicator (KPI) scorecards, customer profitability analysis, risk management and process improvement.

    But I have insisted that each method requires embedded analytics of all flavors, and predictive analytics especially is needed. Predictive analytics anticipates the future with reduced uncertainty, enabling organizations to be proactive with decisions rather than reactive after the fact, when it may be too late.

    A practical example is analytics embedded in strategy maps, the visualization of an executive team’s causally linked strategic objectives. Statistical correlation analysis can be applied among influencing and influenced KPIs. Organizations struggle with identifying which KPIs are most relevant to measure and then determining the best target for each measure.

    Software from business analytics vendors can now calculate the strength or weakness of causal relationships among the KPIs using correlation analysis and display them visually, such as with the thickness or colors of the connecting arrows in a strategy map. This can validate the quality of KPIs selected. It creates a scientific laboratory for strategy management.
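    A minimal sketch of this kind of KPI correlation analysis in Python, with three synthetic KPI time series standing in for strategy-map objectives; the KPI names and data are invented:

    ```python
    # Sketch: score the strength of causal links in a strategy map by
    # correlating KPI time series. KPIs and values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(3)
    months = 24
    training_hours = rng.normal(40, 5, months)                 # influencing KPI
    quality_score = 0.6 * training_hours + rng.normal(0, 3, months)
    customer_sat = 0.5 * quality_score + rng.normal(0, 4, months)

    kpis = {"training_hours": training_hours,
            "quality_score": quality_score,
            "customer_sat": customer_sat}

    # Pairwise correlations suggest which arrows in the map to draw thickest
    names = list(kpis)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = np.corrcoef(kpis[a], kpis[b])[0, 1]
            print(f"{a} -> {b}: r = {r:+.2f}")
    ```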

     

    Using the example of professional baseball, an evolving application of business analytics relates to dynamic home stadium ticket prices to optimize revenues. The San Francisco Giants experiment with mathematical equations that weigh ticket sales data, weather forecasts, upcoming pitching matchups and other variables to help decide whether the team should incrementally raise or lower prices right up until game day. The revenue from a seat in a baseball stadium is immediately perishable after the game is played. So any extra available seat sold at any price directly drops to the bottom line as additional profit.

    Another baseball analytics example involves predicting player injuries, which are increasing at an alarming rate. Using an actuarial approach similar to the insurance industry’s, the Los Angeles Dodgers’ director of medical services and head athletic trainer, Stan Conte, has been refining a mathematical formula designed to help the Dodgers avoid players who spend their days in the training room and not on the ball field. A player on the injured reserve list is expensive in terms of the missed opportunity of their play and the extra cost of replacing them. Conte has compiled 15 years of data plus medical records to test hypotheses that predict the chances a player will be injured and why.

    Greater statistical analysis is yet to come. The New York Times has reported on new technology that could shift previously hard-to-quantify baseball debates such as the rangiest shortstop or the quickest center fielder from the realm of argument to mathematical equations. A camera and associated software records the precise speed and location of the ball and every player on the field. It dynamically digitizes everything, allowing a treasure trove of new statistics to analyze. Which right fielders charge the ball quickest and then throw the ball the hardest and most accurately? Guesswork and opinion will give way to fact-based measures. 

    An obsession for baseball statistics

    Gerald W. Scully was an economist best known for his article, “Pay and Performance in Major League Baseball,” published in The American Economic Review in December 1974. The article described a method of determining the contribution of individual players to the performance of their teams. He used statistical measures like slugging percentage for hitters and the strikeout-to-walk ratio for pitchers, and devised a complex formula for determining team revenue that involved a team’s won-lost percentage and the market characteristics of its home stadium, among other factors.

    The Society for American Baseball Research (www.sabr.org), of which I have been a member since the mid-1980s, includes arguably the most obsessive “sabermetrics” fanatics. As a result of hard efforts to reconstruct detailed box scores of every baseball game ever played, and load them into accessible databases, SABR members continue to examine daily every imaginable angle of the game. Bill James, one of SABR’s pioneers and author of The Bill James Baseball Abstract, first published in 1977, is revered as a top authority of baseball analytics.

    How does an organization create a culture of metrics and analytics? Since it is nearing baseball’s World Series time, consider the community of baseball: its managers, team owners, scouts, players and fans. With better information and analysis of that information, baseball teams perform better – they win!

    Legendary baseball manager Connie Mack’s 3,731 career victories stand as one of the most unbreakable records in baseball. Mack won nine pennants and five World Series titles in a career that spanned the first half of the 20th century. One way he gained an advantage over his contemporaries was by understanding which player skills and metrics contributed most to winning. He was ahead of his time in favoring hitting power and on-base percentage over a high batting average and speed – an idea that would later become the standard throughout the sport.

    The 2003 book about the business of baseball, Moneyball, describes the depth of analytics that general managers like Billy Beane of the Oakland Athletics apply to selecting the best players, plus batter and pitcher tactics based on game conditions: the score, the inning, the number of outs, and the runners on base.

    More recently, the relatively young general manager of the Boston Red Sox, Theo Epstein (who is now with the Chicago Cubs), assured himself of legendary status for how he applied statistics to help overcome the Curse of the Bambino – supposedly originating when the team sold Babe Ruth in 1920 to the New York Yankees – to finally defeat their arch-rival Yankees in 2004 and win a World Series. It ended Boston’s 86-year drought – since 1918 – without a World Series title.

    Author: Gary Cokins

    Source: Information Management

  • Visualization, analytics and machine learning - Are they fads, or fashions?

    I was recently a presenter in the financial planning and analysis (FP&A) track at an analytics conference where a speaker in one of the customer marketing tracks said something that stimulated my thinking. He said, “Just because something is shiny and new or is now the ‘in’ thing, it doesn’t mean it works for everyone.”

    That got me to thinking about some of the new ideas and innovations that organizations are being exposed to and experimenting with. Are they fads and new fashions or something that will more permanently stick? Let’s discuss a few of them:

    Dashboards

    Visualization software is all the rage. Your mother said to you when you were a child, “Looks are not everything.” Well, she was wrong. Viewing table data visually, as in a bar histogram, enables people to quickly grasp information with perspective. But be cautious. Yes, it might be nice to import your table data from your spreadsheets and display it in a dashboard! Won’t that be fun? It may be fun, but what are the unintended consequences of reporting performance measures as a dial or barometer?

    A concern I have is that measures reported in isolation from other measures provide little to no context as to why the measure is being reported and what “drives” it. Ideally, dashboard measures should have some cause-and-effect relationship with key performance indicators (KPIs), which should be derived from a strategy map and reported in a balanced scorecard.

    KPIs are defined as monitoring the progress toward accomplishing the 15-25 strategic objective boxes in a strategy map defined by the executive team. The strategy map provides the context from which the dashboard performance indicators (PIs) can be tested and validated for their alignment with the executive team’s strategy.

    Business analytics

    Talk about something that is “hot.” Who has not heard the terms Big Data and business analytics? If you raised your hand, then I am honored to apparently be the first blogger you have ever read. Business analytics is definitely the next managerial wave. I am biased towards it because my 1971 university degree was in industrial engineering and operations research. I love looking at statistics. So do television sports fans, who are now provided “stats” for teams and players in football, baseball, golf and every other kind of televised sport. But the peril of business analytics is that it needs to serve a purpose for problem solving or seeking opportunities.

    The analytics thought leader James Taylor advises, “Work backwards with the end in mind.” That is, know why you are applying analytics. Experienced analysts typically start with a hypothesis to prove or disprove. They don’t apply analytics as if searching for a diamond in a coal mine, and they don’t flog the data until it confesses the truth. Instead, they first speculate that two or more things are related or that some underlying behavior is driving a pattern seen in various data.

    Machine learning and cognitive software

    There are an increasing number of articles and blogs with this theme related to artificial intelligence – the robots are coming and they will replace jobs. Here is my take. Many executives, managers, and organizations underestimate how soon they will be affected and the severity of the impact. This means that many organizations are unprepared for the effects of digital disruption and may pay the price through lower competitive performance and lost business. Thus it is important to recognize not only the speed of digital disruption, but also the opportunities and risks that it brings, so that the organization can adjust and re-skill its employees to add value.

    Organizations that embrace a “digital disruptor” way of thinking will gain a competitive edge. Digitization will create new products and services for new markets providing potentially substantial returns for investors in these new business models. Organizations must either “disrupt” or “be disrupted”. Companies often fail to recognize disruptive threats until it is too late. And even if they do, they fail to act boldly and quickly enough. Embracing “digital transformation” is their recourse for protection.

    Fads or fashions?

    Are these fads and fashions or the real deal? Are managers attracted to them as shiny new toys they must have on their resumes for their next bigger job and employer? My belief is that these three “hot” managerial methods and tools are essential. But they need to be thought through, properly designed and customized, not just slapped in willy-nilly as shiny new toys.

    Source: Gary Cokins (Information Management)


     
     
  • Predictive models for more efficient and more effective campaigns


     

    Who do you target to increase conversion or reduce churn? Across which channels do you divide your marketing budget? Do you develop an expensive brochure or go for a cheaper channel such as e-mail? These are questions you ask as a marketer when setting up a new campaign. A predictive model provides the answers.

    The five W’s of marketing

    When setting up a new campaign, the five W’s are often used:

    • Why do I want to run this campaign?
    • Who do I want to target?
    • What should I offer them?
    • Where should I approach them?
    • When should I approach them?

    In practice it remains difficult to give a good, well-founded answer to all of these questions. The why is usually clear: it is mostly about generating sales or preventing churn. But who you should target to increase conversion or reduce churn, with which message and through which channel, is not always obvious. A shame, because you want to set up your campaign as effectively and efficiently as possible.

    In practice

    Take a company that sells products and services to consumers on a subscription or contract basis. Think of a lease car, a mobile phone subscription, an energy contract or an insurance policy. At some point the contract expires and the question arises: will the customer (tacitly) renew the contract, or walk over to the competition?

    You could approach every customer whose contract is expiring and ask them to please remain a customer. But history shows that only 20% of these customers actually cancel. The result is that you:

    • incur unnecessary costs: 100,000 brochures or phone calls cost considerably more than 20,000;
    • wake sleeping dogs, with the risk that customers think, “Hey, I can terminate my contract; let me see whether there is a nice offer elsewhere”;
    • bother your customers with unnecessary communication and end up in the spam folder from then on.

    To prevent this, you want to know which customers have an elevated risk of cancelling their contract. In this situation it is very sensible to deploy a predictive model and use it to determine your target group.

    The model: identify your high-risk customers

    With a predictive model you estimate the probability that a customer whose contract is expiring will actually cancel. As input for the model you use as much available information as possible, such as (historical) customer, marketing, sales and/or web data.

    First, you identify the customers whose contracts expired in the past and check whether they cancelled.
    Then you use the model to determine which (customer) characteristics correlate with eventually cancelling or not. Concretely, these are attributes such as gender, income, the number of years someone has been a customer, and whether they also purchase other products.

    The model then assigns a weight to each selected characteristic. These weights are projected onto the customers whose contracts expire in the coming period. This gives you, per customer, the probability that they will leave you within a given period.
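    As an illustration of these steps, here is a minimal sketch using logistic regression from scikit-learn. The file names and column names are hypothetical stand-ins for the customer data described above; the fitted coefficients play the role of the weights per characteristic.

        # Sketch of the churn model described above (hypothetical files/columns).
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        history = pd.read_csv("expired_contracts.csv")          # past expirations
        features = ["gender", "income", "years_customer", "has_other_products"]
        X = pd.get_dummies(history[features], drop_first=True)  # encode categoricals
        y = history["cancelled"]                                # 1 = contract cancelled

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        print("holdout accuracy:", model.score(X_test, y_test))

        # Apply the learned weights to contracts expiring soon: one churn
        # probability per customer.
        soon = pd.get_dummies(pd.read_csv("expiring_soon.csv")[features],
                              drop_first=True).reindex(columns=X.columns, fill_value=0)
        churn_probability = model.predict_proba(soon)[:, 1]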

    Create insights for content and channel

    Now that you know who the high-risk customers are, you can define the target group for the campaign. So you know Who to approach. With a more advanced duration model you can also determine When to approach someone.

    You then use the model’s output to divide your target group into different risk segments. That way you decide, per segment, through Which marketing channel to approach them. Customers you run the highest risk of losing you want to approach personally, with a good offer. Customers with a lower risk can then be approached through a cheaper channel such as e-mail. This way you deploy your marketing budget in a targeted manner.
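    A small sketch of that segmentation step, with illustrative thresholds and channels; both are assumptions to be tuned per campaign, not fixed rules.

        # Split scored customers into risk tiers and assign a channel per tier.
        import pandas as pd

        scored = pd.DataFrame({
            "customer_id": [101, 102, 103, 104],
            "churn_probability": [0.82, 0.55, 0.31, 0.08],  # model output
        })

        scored["risk_tier"] = pd.cut(scored["churn_probability"],
                                     bins=[0.0, 0.4, 0.7, 1.0],
                                     labels=["low", "medium", "high"])
        channel = {"high": "personal call + offer", "medium": "direct mail",
                   "low": "e-mail"}
        scored["channel"] = scored["risk_tier"].map(channel)
        print(scored)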

    What should the message of your campaign be? You determine that by mapping the characteristics of your target group. Are they young or old people? Are they relatively often women? Have they been customers for a long time? This information feeds into the content of your campaign. Because you tailor the message to your audience, your campaign results improve!

    The benefits of predictive models

    Whether it concerns campaigns aimed at prospects, upsell or churn, using predictive models makes your campaigns more efficient and more effective. They help you fill in the W’s and ultimately result in a higher return on investment. Predictive models:

    • help you determine your target group (Who)
    • offer guidance for effective campaign content (What)
    • lead to more efficient use of your marketing budget across the marketing channels (Where)
    • can estimate at which moment you should approach someone (When)

     

    By: Olivier Tanis

    Source: 2Organize.nl

  • Why a Business Data Scientist programme


    Every organisation that changes is searching. That search often concerns data: how can we apply data better? How can we find new applications for data? Do we even have the right data? What should we do with data science and big data? How can we use data to make better decisions, and thus perform better?

    Organisations must find an answer to these questions, partly under pressure from a market that keeps developing and from changing competition. Data thereby takes a central place in operations, and organisations thus become 'data driven'.

    Naturally, this requires 'data science': the environments and skills to dissect and analyse data and to translate it into models, advice and decisions.

    We designed the Business Data Scientist programme because no company will become successful with tools and techniques alone. It is precisely the business data scientist who forms the bridge between data science and the change taking place in organisations.

    Too often, organisations put the emphasis on the technology (Hadoop? Spark? A data lake? Should we learn R?). To be successful with data you also need other instruments. Business models, business analysis and strategy formation help to formulate the right questions and set goals. Soft skills and change-management skills make those goals visible to sponsors and stakeholders. Knowledge of data science, architecture, methods and organisational models provides the insight to fit data science into an organisation. Vision and leadership are needed to make data science work in an organisation. Our goal is to show participants all of this. The programme is designed to bring all these aspects together and to provide usable instruments.

    What do I like most about this programme? Constantly building the bridge to: what do we do now; how do you put the theory into practice? Each piece of theory is translated into a practical application in the case study. And that is the first step towards achieving success with data science in your own work, team, department, division or organisation.

    Want to know more? Interested? On 28 November there is a theme evening about the Business Data Scientist in Utrecht. You can register via the Radboud Management Academy!

    This blog appeared on www.businessdecision.nl.

    Author: Alex Aalberts

     

     
  • What Can Retailers Do To Elude Extinction?

    Here's what you didn't learn in school about the disruption affecting retail today. A recent article by consultant Chris H. Petersen, "Seven disruptive trends that will kill the 'dinosaurs of retail'", discussed the fate of "25 retail dinosaurs that vanished in the last 25 years", which was the subject of an Entrepreneur article. Those retailers included giants such as Circuit City, CompUSA, Blockbuster, Borders, and Tower Records, companies which literally dominated their category or channel. Others named in the article were retail innovators in their own right until new disruptors outgunned them. The point is that neither longevity, size, nor specialization guarantees retail survival today. So how can today's retailers avoid being extinguished by current disruptive innovations?

    Disruptive innovation refers to any enhanced or completely new technology that replaces and disrupts an existing technology, rendering it obsolete. (Picture how we went from the Model T to the KIA; from giant mainframes to personal computers; or from fixed-line telephones to cellphones/smartphones).

    Disruptive innovation is described by Harvard Business professor Clayton Christensen as a process by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves up market, eventually displacing established competitors.

    Today's major disruptive retail trends have led to the rise of the consumer, the rise of technology to help retailers best serve the consumer while wrestling with competitive forces, and the demise of "the old way of doing business."

    I. The Consumer.

    Evolving, innovative, disruptive technology has led to consumer-dominated behavior that reaches across many channels. As we know, today's consumer now shops any time and everywhere, using a variety of helpful tools.

    The consumer is capable of having a personal, seamless experience across their entire shopping journey to explore, evaluate and purchase, tempered by how retailers DO business, provide service, deal with their competition, etc.

    * The consumer journey starts online, although stores remain a destination for experience.

    What can retailers do? The successful retailer of the future needs to not only master online and offline, but how to connect with the consumer across many touch points, especially social media.

    * Mobile juggernaut. The latest stats show that there are now more cell phones in use than people on this planet, and smartphones now exceed 4.5 billion. Mobile is already the majority and will be the preferred screen for shopping.

    What can retailers do? Retail survivors must optimize for mobile engagement, and also broadcast offers and connect with consumers wherever they are. The store of the future will have beacons not only to connect with shoppers, but also to track traffic via mobile.

    * Stock availability / Virtual aisle / Endless shelf. More than 50 percent of consumers expect to shop online and see if stock is available in store.

    Omnichannel consumers now fully realize that stores can't begin to stock every model, style and color. While consumers can see hundreds if not thousands of products in store, they know that there are millions online.

    What can retailers do? The survivors are literally creating a seamless experience between online, store and mobile apps so the consumer can "have it their way" anywhere, anytime.

    * Consumer experience still rules. Consumer experience still needs to come down to senses: Tactile, visual, and psychological.

    What can retailers do? Virtual dressing rooms, better in-store experiences, and adoption of new disruptive technology to address and satisfy these issues.

    * Personalization of products and services.

    What can retailers do? New survivors are emerging with "mass personalization" opportunities to custom tailor your clothes or curate your personal wardrobe assortment and send it to you.

    * Social Connections and the influence of the opinions of others. Social has become a primary source of research and validation on what to buy. Today's consumers are 14 times more likely to believe the advice of a friend than an ad.

    What can retailers do? Today's major brands are giving much more attention to and spending more dollars on social media than traditional media.

    II. Technology

    Disruptors share the common purpose to create businesses, products and services that are better -- usually less expensive and always more creative, useful, impactful -- and scalable.

    What can retailers do? Put disruptive technology solutions into use as soon as possible, such as price and assortment intelligence, behavioral economics, customer experience analytics, predictive analytics, and more, to help understand, meet, and outgun the competition and serve the customer.

    A Note on Predictive Analytics.

    Dr. Christensen describes predictive analytics as "the ability to look at data from the past in order to succeed in new ways in the future." Predictive analytics solutions, which give retailers the capability to forecast consumer purchase trends in order to sell the most products at the best prices at any given time, are coming on strong.

    Bottom Line For Your Bottom Line

    There's never been a time of more disruptive change in retail. Retailers who are the most adaptable to change -- and not the strongest nor the most intelligent of the species -- will be the ones to survive.

    It's a case of keeping yourself on top of the tsunami of change through the mastery of today's and tomorrow's new disruptive technologies.

    *Thanks to Chris H. Petersen, PhD, CEO of Integrated Marketing Solutions, a strategic consultant who specializes in retail, leadership, marketing, and measurement.

    Source: upstreamcommerce.com, February 8, 2015

  • What is Predictive Intelligence and how it’s set to change marketing in 2016

    Explaining how modelling of marketing outcomes can let you make smarter marketing decisions

    As 2016 gets under way we're seeing more discussion of the applications of Predictive Intelligence. It's a nascent field, but one that is gaining popularity fast, and for some very good reasons, which we will discuss in more detail in this article. We'll start by explaining what precisely Predictive Intelligence is, then provide some hard stats on its impact in the marketing world so far, and finish by explaining how we feel it's set to shape marketing in 2016 and beyond.

    What Is Predictive Intelligence?

    Despite the buzz surrounding Predictive Intelligence, many still don’t know what it actually is, so here is our definition. Predictive Intelligence is often used interchangeably with terms like Predictive Recommendation, Predictive Marketing and Predictive Analytics. Although there are some minor differences between the terms, broadly speaking they all essentially mean the same thing.

    Our definition of predictive intelligence for marketing is:

    "Predictive Intelligence is the process of first collecting data on consumers and potential consumers’ behaviours/actions from a variety of sources and potentially combining with profile data about their characteristics.

    This data is then distilled and interpreted, often automatically, by sophisticated algorithms, from which a set of predictions are made, and based on these, rules are developed to deliver relevant communications and offers to consumers to persuade them to engage with a business to meet its goals".

    You can see that, because of the three-step process of analysis, interpretation and implementing rules for automated communications, a single-sentence definition is difficult to devise! But we hope this shows the essence of Predictive Marketing.

    McKinsey view it as applying mathematical models to best predict the probability of an outcome. They cite a customer relationship management example that uses models to estimate the likelihood of a customer changing providers, known as 'churn'. Other examples use sources including everything from CRM data and marketing data to structured data such as click-through rates or engagement levels.
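    As a concrete illustration (our notation, not McKinsey's), the workhorse for this kind of churn estimate is the logistic model, which turns a weighted combination of customer attributes into a probability between 0 and 1:

        \[
        P(\text{churn} \mid x) = \frac{1}{1 + e^{-(\beta_0 + \beta_1 x_1 + \cdots + \beta_n x_n)}}
        \]

    Here x_1, ..., x_n are attributes such as tenure, usage or engagement levels, and the weights \beta_i are estimated from historical data on customers who did or did not switch providers.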

    The relevant actions carried out on the basis of this distilled and interpreted data are predicting and then executing the optimum marketing message (e.g. image-based vs text-heavy, formal vs informal) to specific customers/potential customers, across the optimum marketing channel(s) (e.g. social media vs email), at the optimum time(s) (e.g. morning vs afternoon), in order to achieve your company's marketing goals -- usually higher engagement and/or sales. In summary, you are communicating in a way that is simultaneously most relevant to and preferred by the customers/potential customers and most likely to result in you achieving your marketing goal(s).

    Essentially, you set what the marketing goal is and the Predictive Intelligence algorithms will then make good use of the collected data to find the optimum way of achieving it. Predictive Intelligence aims to deliver content based on customer needs, essentially tailoring the experience for the person receiving the information. Predictive Intelligence, empowered by data, thus begins to usher in truly personalised one-to-one marketing communication that is aligned with a company's marketing goals.
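    A minimal sketch of that goal-driven selection, under the assumption that a trained model can score any (customer, channel, time) combination; the scores below are invented placeholders for such a model's output:

        # For each customer, pick the (channel, time) pair with the highest
        # predicted probability of engagement.
        from itertools import product

        channels = ["email", "social"]
        times = ["morning", "afternoon"]

        def predicted_engagement(customer, channel, time):
            # Placeholder for model.predict_proba(features(customer, channel, time)).
            toy_scores = {("email", "morning"): 0.12, ("email", "afternoon"): 0.09,
                          ("social", "morning"): 0.07, ("social", "afternoon"): 0.15}
            return toy_scores[(channel, time)]

        for customer in ["cust_001", "cust_002"]:
            best = max(product(channels, times),
                       key=lambda ct: predicted_engagement(customer, *ct))
            print(customer, "->", best)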

    Some stats and examples showing the value of predictive intelligence

    While we’re sure all the above sounds great to you, understandably, there is nothing more convincing than some cold hard stats on how Predictive Intelligence is actually performing. So without further ado, check out the below.

    As mentioned on IdioPlatform.com, research firm Aberdeen Group conducted an in-depth survey for their paper titled "Predictive Analytics In Financial Services", for which they interviewed 123 financial services companies. They uncovered that the companies utilising Predictive Analytics typically achieved an 11 per cent increase in the number of clients they secured in the previous 12 months. Further, those companies saw a 10 per cent increase in the number of new client opportunities identified, in comparison to those that had not utilised Predictive Analytics. Pretty impressive.

    Additionally, a Forbes Insights survey of 306 execs from companies with $20 million or more in annual revenue found that of the companies that have been carrying out predictive marketing initiatives for at least 2 years, 86% of them have “increased return on investment as a result of their predictive marketing”.

    Finally, a study by toneapi.com modelled the correlation between the emotions expressed in an airline's communications and the subsequent click-through rates. Based on this insight, toneapi.com was able to use the model to predict how the airline could increase its click-through rates by appealing to certain emotions that would generate more interest from its customers.
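    The shape of that analysis can be sketched in a few lines; the emotion scores and click-through rates below are invented for illustration, since the article does not publish toneapi.com's data or method:

        # Correlate an emotion score per message with its click-through rate,
        # then use a simple linear fit to extrapolate.
        import numpy as np

        emotion = np.array([0.10, 0.40, 0.55, 0.70, 0.85])    # intensity per message
        ctr = np.array([0.021, 0.028, 0.033, 0.039, 0.044])   # observed CTRs

        r = np.corrcoef(emotion, ctr)[0, 1]
        slope, intercept = np.polyfit(emotion, ctr, 1)
        print(f"correlation: {r:.2f}")
        print(f"predicted CTR at emotion=0.9: {slope * 0.9 + intercept:.3f}")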

    In summary, Predictive Intelligence drives marked improvements across marketing channels.

    The Emotional Connection

    Initially one of the big advantages of algorithmic Predictive Intelligence was the removal of emotion from the equation; human feelings and mood played little part in the decision as the computers choose the best course of action based on hard data. Now, as processing speeds and storage increase and the analysis of unstructured data improves we are seeing companies move into a more fluid form of Predictive Intelligence based around sentiment and emotion. The driver for this is that emotional analysis of text can help drive understanding of the dynamics that are causing key effects. These insights can be then used to optimise the content to match these emotions creating a more iterative and action orientated approach to marketing campaigns.

    These companies look at the emotions which motivate behaviour and utilize technology to predict and improve results. Toneapi analyses freeform text content (such as emails, press releases and brand copy) for emotional impact and then offers up suggestions for improvements. Likewise, Motista studies have shown that "emotion is the most predictive driver of customer behavior"; the company brings together big data and social science to increase profitability.

    Looking To 2016 And Beyond

    Up until now Predictive Intelligence has seen most action in the B2B world. B2B companies have been using it to crunch colossal amounts of data on customer/potential customer behaviours from a variety of sources. They have then been using it to automatically draw insights from this data based on a set of signals in order to score their leads, identify the most attractive leads earlier on and uncover new high value leads previously unseen. Crucially, Predictive Intelligence has then allowed the B2B companies to tailor the marketing approach and messaging depending on the customer/potential customer’s actions/behaviours (signals) across the different areas where the data has been collected.

    We believe that in 2016 we're going to see more of the above, and the process is going to become even more refined and sophisticated within the B2B sector. We also feel 2016 is the year we see Predictive Intelligence move more and more into the B2C world, especially now that its frequent use across industry sectors in B2B has proven its effectiveness and given the approach some real credibility. And we expect more interest in Predictive Intelligence around emotion analytics, free-form text, unstructured data and behavioural social sciences.

    Additionally, unlike even a couple of years ago, there are now quite a few smaller Predictive Intelligence companies on the market in addition to the big names like IBM or Salesforce. Many of these companies offer intuitive, easy-to-understand, well-designed and well-priced cloud-based Predictive Intelligence software packages. This greatly lowers the barrier to entry for small-to-medium businesses (SMBs). It allows them to dip their toes into the world of Predictive Intelligence and test the waters with little risk or friction, or, if they wish, jump straight into the deep end and reap the rewards.

    Thus a whole new world has opened up to the SMB -- a world that not too long ago was reserved mostly for the large corporations that could afford the expensive Predictive Analytics software (which was the only real choice) or that had budgets big enough to hire data scientists to crunch the data and draw the insights on which to base their predictions.

    Conclusion

    We hope this article has gone some way towards demystifying the phrase "Predictive Intelligence". Further, we hope we have communicated the immense benefits to be reaped if Predictive Intelligence is executed properly: higher engagement, higher click-through rates, higher conversion rates and emotional impact. Predictive Intelligence has already seen some real traction in the B2B world, and we believe 2016 will mark the year that B2C companies and SMBs in general adopt Predictive Intelligence in a meaningful way -- some dipping their toes in and giving it a try, others jumping straight into the deep end and really embracing it.

    Source: Smart Insights

  • Why predictive analytics will be big in 2015

    As the big data analytics sector rapidly matures, companies will be increasingly asking what they can get out of the technology this year. And one of the big trends for 2015 will be companies aiming to move beyond a reactive approach to data to a more proactive strategy.

    At the heart of this will be predictive analytics, which has been tipped by several commentators as one of the key developments in the industry for this year. This will be in response to increased demands to tackle the issue of problems and opportunities being spotted too late.

    For instance, it was noted by Tech Vibes writer Christopher Surdak that often, by the time businesses have extracted, transformed and loaded the relevant data using traditional technologies, the chance has passed - and this can defeat the purposes of many investments. He said: "There's no value in identifying what customers are doing every minute of the day if you can't respond predictively and proactively."

    He therefore stated that a top priority for organisations in 2015 will be to re-engineer their big data environments to enable information streams to be accessed, analysed and shared in real time. Benefits of this will include increased revenue, better productivity for knowledge workers, and lower costs.

    A report by TDWI Research, cited by Forbes, identified five key reasons why companies are looking to invest in predictive analytics. These are to predict trends, understand customers, improve business performance, drive strategic decision-making, and predict behaviour.

    Forbes highlights the telco sector as one that particularly stands to benefit from what predictive and real-time analytics can provide. Firms in this sector are looking for new ways to stand out from the crowd in order to retain customers, and these tools can help them better understand customers, and therefore serve them more effectively.

    One example the publication cited was Cox Communications, which turned to predictive analytics to identify business drivers for growth and then pinpoint existing and prospective customers to cultivate new offerings. The firm also wanted to answer tough questions about why customers would choose them over a competitor - or vice versa - as well as what type of customer is likely to buy a specific product.

    With predictive analytics, the company was able to put more campaigns into the field, as well as measure the effectiveness of different offers and marketing techniques to different customer segments, Forbes said. As a result, recent campaigns have generated an 18 per cent increase in customer responses.

    The fact predictive analytics can be engaged to tackle a wide range of issues and challenges will be a key factor in the growth of the sector in 2015 - with every department from sales and marketing to human resources being able to derive benefits from the technology.

    For instance, predictive tools can be very useful to HR professionals by enabling them to spot employees who are at risk of leaving a company, as well as better identifying prospective hires who will be a good fit for an organisation.

    Speaking to Tech Target, vice-president and principal analyst at Constellation Research Holger Mueller said: "One of the first areas that vendors have tackled [with predictive analytics] was around 'flight risk', or determining if a valued employee could leave a company. These days we see them used more for recruiting and selecting highly-skilled candidates for the right position."

    Source: Kognitio, 16 January 2015
