4 items tagged "predictive intelligence"

  • Forrester's 14 tools for Predictive Analytics

    Forrester Research has identified 14 leading tools for predictive analytics and machine learning. Vendors included in the research firm's assessment were: Alpine Data, Angoss, Dataiku, Domino Data Lab, FICO, H2O.ai, IBM, KNIME, Microsoft, RapidMiner, Salford Systems, SAP, SAS, and Statistica.

    Each of these vendors has a comprehensive, differentiated predictive analytics and machine learning (PAML) solution; a standalone PAML solution; an install base and revenue history; and motivated client inquiries.

    The leaders

    Seven predictive analytics and machine learning products were selected as leaders among the 14 evaluated.

    Angoss KnowledgeSTUDIO
    “Angoss KnowledgeSEEKER is a must-have for data science teams that wish to use beautiful and comprehensive visual tools to build decision and strategy trees,” according to Forrester. “It also offers KnowledgeSTUDIO for building models, InsightOPTIMIZER for numerical optimization, and KnowledgeMANAGER for model management. Angoss recently added a coding environment that allows data scientists to use programming languages including R, Python, and the language of SAS. It also has some integration with Hadoop and Apache Spark.”

    FICO Decision Management Suite
    “FICO’s extensive real-world experience has led to a solution that focuses on the needs of the chief data scientists as well as the rank-and-file data scientists in a large organization,” Forrester says. “Chief data scientists at mature enterprises demand three key things from data scientists: 1) explainable models, 2) accurate models, and 3) decision management. FICO’s Decision Management Suite encompasses the end-to-end capabilities needed to create, deploy, and monitor models for use in complex, consequential enterprise decisions. FICO needs to expand the number of algorithms it supports to compete more broadly.”

    IBM InfoSphere and SPSS Modeler
    “SPSS is still the core of IBM’s data science platform, but IBM is launching projects such as SystemML from its investments in its Spark Technology Center,” Forrester says. “IBM has also introduced the Data Science Experience for data science coders, which provides quick cloud provisioning of open source Jupyter and/or RStudio notebooks with a Spark cluster on the back end to run data pipelines and train models. SPSS is a good fit for data scientists who want the productivity afforded by methods encapsulated in operators.”

    KNIME Analytics Platform
    “KNIME is not a big company, but it has a big community of contributors who continually push the platform forward with capabilities such as bioinformatics and image processing,” according to Forrester. “The KNIME Analytics Platform is free to download and use and includes over 1,000 analytical and model building operators. The vendor funds its ongoing operations by offering commercial extensions such as the KNIME Server for sharing workflows, advanced security, and remote execution of model building workflows. Maybe KNIME was smart for not taking gobs of venture funding during the big data rush.”

    RapidMiner Platform
    “RapidMiner invested heavily to revamp its visual interface, making it the most concise and fluid that we have seen in this evaluation,” Forrester explains. “It also has a comprehensive set of operators that encapsulate a wide range of data prep, analytical, and modeling functionality to increase productivity of data scientists. RapidMiner is open source and has a community that contributes to its growing list of operators. RapidMiner Studio is free to download and use for up to 10,000 data rows, with tiered pricing for more than 10,000 rows of data.”

    SAP BusinessObjects Predictive Analytics, SAP HANA SPS
    “SAP offers comprehensive data science tools to build models, but it is also the biggest enterprise application company on the planet,” Forrester notes. “This puts SAP in a unique position to create tools that allow business users with no data science knowledge to use data-scientist-created models in applications. SAP’s solution offers the data tools that enterprise data scientists expect, but it also offers distinguished automation tools to train models. The solution has plenty of room to grow into its existing applications customer base, but its dependence on SAP’s HANA data platform will limit its attractiveness to non-SAP customers.”

    SAS Analytics Suite
    “SAS is unifying its comprehensive portfolio of data science solutions under SAS Visual Suite,” Forrester notes. “It brings together world-class data prep, visualization, data analysis, model building, and model deployment. This unified tooling approach provides a consistent user experience that data scientists need to build even the most sophisticated models. SAS’s vision for data science is not limited to innovation in tools. It has been quick to jump on new, promising analytical methods across multiple disciplines, such as statistics, econometrics, optimization, machine learning, deep learning, and natural language interaction.” 

    Strong performers

    Five of the 14 predictive analytics and machine learning products evaluated were determined to be strong performers.

    Alpine Data Chorus
    “Data scientists spend an inordinate amount of time preparing data and conversing with business stakeholders compared with the time they spend on building valuable models,” Forrester says. “Alpine Data’s visual tool provides data engineers, data scientists, and business stakeholders with the capabilities they need to divide and conquer the work of building models. Data engineers can use the tool to prep data. Data scientists and business stakeholders can communicate using built-in collaboration features.”

    Dataiku DSS
    “A haiku is a Japanese form of poetry of 17 syllables — concise and evocative if done well,” Forrester explains. “That’s Dataiku’s guiding inspiration — to offer a data science platform that lets coders use a notebook when they must, but use visual tools to build workflows when productivity is at a premium. Dataiku is one of the new venture-funded startups that aim to be a well-rounded alternative to the long-time market competitors by offering a notebook experience embedded in a visual experience. With better model management capabilities, Dataiku is poised to challenge the leaders posthaste.”

    H2O.ai H2O
    “H2O.ai is best known for developing open source, cluster-distributed machine learning algorithms at a time (2011) when big data demanded them but no one else had them,” Forrester explains. “To say H2O.ai is an algorithm company today is an understatement. It also offers Sparkling Water to create, manage, and run workflows on Apache Spark and Steam to deploy models. Further, it offers Flow — a notebook-like experience similar to Jupyter. The company recently announced Deep Water, an amalgamated distribution of the open source deep-learning libraries Caffe, MXNet, and TensorFlow.”

    Microsoft Cortana Intelligence Suite, Microsoft R Server
    “Microsoft offers Microsoft R for data scientists who wish to code in the R programming language supported by callable cluster-distributed algorithms,” Forrester explains. “It also offers Azure Machine Learning to data scientists who want a more traditional visual development tool. There is no reason why enterprises cannot enjoy both. Microsoft’s machine learning cloud services offer pretrained models for tasks such as image labeling, voice recognition, and natural language processing that allow developers with no data science knowledge to use them in applications.”

    Statistica
    “Statistica was founded in 1984 as StatSoft and acquired by Dell in 2014 as part of Dell’s focus on building an enterprise software portfolio,” Forrester says. “It is now part of the newly relaunched Quest Software. The Statistica solution is based on a data science workbench that has a rich set of algorithms and data prep tools that are especially relevant for manufacturers and scientific use cases. Statistica was a non-participating vendor in this evaluation.”

    The contenders

    Two of the evaluated predictive analytics and machine learning products were viewed as contenders.

    Domino Data Lab Domino
    “Domino Data Lab’s solution aims to package the most popular open source coding tools and libraries and provide a unifying interface for teams of data science coders,” Forrester explains. “But this approach also has a drawback: Many of the features critical to enterprises, such as model management and advanced workbench tools, lack open source options. We don’t think Domino can wait for the open source community to add critical enterprise features.”

    Salford Systems SPM (Salford Predictive Modeler) software suite
    “Salford Systems is adored by its community of customers, large and small, for its implementation of specific methods including CART, MARS, Random Forests, and TreeNet,” according to Forrester. “Most other vendor solutions have one or more of these methods, but Salford claims that its methods are the best because they are implemented by their inventors — including Jerome Friedman, a professor of statistics at Stanford University. Salford provides a workbench tool for modeling and has added automation scenarios to improve data science productivity. Its focus on creating the most accurate models has come at the expense of adding features like model management and big data analysis that a broader range of enterprise customers need.”

    Author: Bob Violino

  • Using Business Analytics to improve your business


    I have often loudly advocated that enterprise performance management, or corporate performance management, is the integration of dozens of methods, such as strategy maps, key performance indicator (KPI) scorecards, customer profitability analysis, risk management, and process improvement.

    But I have insisted that each method requires embedded analytics of all flavors, and predictive analytics especially is needed. Predictive analytics anticipates the future with reduced uncertainty, enabling proactive decisions rather than reactions after the fact, when it may be too late.

    A practical example is analytics embedded in strategy maps, the visualization of an executive team’s causally linked strategic objectives. Statistical correlation analysis can be applied among influencing and influenced KPIs. Organizations struggle with identifying which KPIs are most relevant to measure and then determining the best target for each measure.

    Software from business analytics vendors can now calculate the strength or weakness of causal relationships among the KPIs using correlation analysis and display them visually, such as with the thickness or colors of the connecting arrows in a strategy map. This can validate the quality of KPIs selected. It creates a scientific laboratory for strategy management.
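
    To make the idea concrete, here is a minimal sketch of such a correlation analysis, assuming hypothetical monthly KPI series; the KPI names, weights, and the arrow-width mapping are invented for illustration and are not taken from any vendor's product.

    ```python
    import numpy as np

    # Hypothetical monthly KPI series (24 months); names, weights, and noise
    # are invented purely to illustrate the technique.
    rng = np.random.default_rng(0)
    training_hours = rng.normal(40, 5, 24)                      # influencing KPI
    satisfaction = 0.6 * training_hours + rng.normal(0, 2, 24)  # influenced KPI
    retention = 0.8 * satisfaction + rng.normal(0, 3, 24)       # influenced KPI

    def arrow_weight(x, y):
        """Pearson correlation between two KPI series, mapped to a line width
        for the connecting arrow in a strategy map (1 = weak, 5 = strong)."""
        r = np.corrcoef(x, y)[0, 1]
        return r, 1 + 4 * abs(r)

    for label, (x, y) in {
        "training hours -> satisfaction": (training_hours, satisfaction),
        "satisfaction -> retention": (satisfaction, retention),
    }.items():
        r, width = arrow_weight(x, y)
        print(f"{label}: r = {r:+.2f}, arrow width = {width:.1f}")
    ```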

     

    Using the example of professional baseball, an evolving application of business analytics is dynamically pricing home stadium tickets to optimize revenue. The San Francisco Giants experiment with mathematical equations that weigh ticket sales data, weather forecasts, upcoming pitching matchups, and other variables to help decide whether the team should incrementally raise or lower prices right up until game day. The revenue from a seat in a baseball stadium is immediately perishable after the game is played, so any extra seat sold at any price drops directly to the bottom line as additional profit.
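
    A toy sketch of how such incremental price adjustment might work; the demand model, its coefficients, and the thresholds below are all invented for illustration, not the Giants' actual equations.

    ```python
    # Toy dynamic-pricing sketch: estimate demand for remaining seats from a
    # few game-day signals, then nudge the price up or down. All coefficients
    # and feature values are invented for illustration.
    def expected_demand(price, weather_score, pitcher_draw, days_to_game):
        base = 5000 - 120 * price          # demand falls as price rises
        return max(0.0, base + 800 * weather_score + 1200 * pitcher_draw
                         - 150 * days_to_game)

    def adjust_price(current_price, seats_left, **signals):
        """Raise price if projected demand exceeds inventory, cut it otherwise.
        Unsold seats are worthless after first pitch, so err toward selling."""
        demand = expected_demand(current_price, **signals)
        if demand > 1.1 * seats_left:
            return round(current_price * 1.05, 2)   # scarce: raise 5%
        if demand < 0.9 * seats_left:
            return round(current_price * 0.95, 2)   # surplus: cut 5%
        return current_price

    print(adjust_price(30.0, 4000, weather_score=0.8, pitcher_draw=1.0,
                       days_to_game=2))
    ```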

    Another baseball analytics example involves predicting player injuries, which are increasing at an alarming rate. Using an actuarial approach similar to the insurance industry’s, the Los Angeles Dodgers’ director of medical services and head athletic trainer, Stan Conte, has been refining a mathematical formula designed to help the Dodgers avoid players who spend their days in the training room rather than on the ball field. A player on the injured reserve list is expensive in terms of both the missed opportunity from their play and the extra cost to replace them. Conte has compiled 15 years of data plus medical records to test hypotheses that predict the chances a player will be injured and why.
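
    Conte's actual formula has not been published, but an actuarial-style risk score is commonly expressed as a logistic model. Here is a minimal sketch with invented features and weights.

    ```python
    import math

    # Illustrative injury-risk score in the spirit of an actuarial model; the
    # real Dodgers formula is not public, and these weights are invented.
    def injury_probability(age, days_injured_last_3yrs, position_risk):
        z = -5.0 + 0.08 * age + 0.01 * days_injured_last_3yrs + 1.2 * position_risk
        return 1 / (1 + math.exp(-z))   # logistic link, as in actuarial scoring

    # A 33-year-old pitcher (high-risk position) with 120 prior injured days:
    p = injury_probability(age=33, days_injured_last_3yrs=120, position_risk=1.0)
    print(f"estimated season injury risk: {p:.0%}")
    ```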

    Greater statistical analysis is yet to come. The New York Times has reported on new technology that could shift previously hard-to-quantify baseball debates, such as who is the rangiest shortstop or the quickest center fielder, from the realm of argument to mathematical equations. A camera and associated software record the precise speed and location of the ball and every player on the field, dynamically digitizing everything and creating a treasure trove of new statistics to analyze. Which right fielders charge the ball quickest and then throw it the hardest and most accurately? Guesswork and opinion will give way to fact-based measures.

    An obsession for baseball statistics

    Gerald W. Scully was an economist best known for his article “Pay and Performance in Major League Baseball,” published in The American Economic Review in December 1974. The article described a method of determining the contribution of individual players to the performance of their teams. He used statistical measures like slugging percentage for hitters and the strikeout-to-walk ratio for pitchers, and devised a complex formula for determining team revenue that involved a team’s won-lost percentage and the market characteristics of its home stadium, among other factors.

    The Society for American Baseball Research (www.sabr.org), of which I have been a member since the mid-1980s, includes arguably the most obsessive “sabermetrics” fanatics. As a result of hard efforts to reconstruct detailed box scores of every baseball game ever played, and load them into accessible databases, SABR members continue to examine daily every imaginable angle of the game. Bill James, one of SABR’s pioneers and author of The Bill James Baseball Abstract, first published in 1977, is revered as a top authority of baseball analytics.

    How does an organization create a culture of metrics and analytics? Since baseball’s World Series is nearing, consider the community of baseball as an example, including its managers, team owners, scouts, players, and fans. With better information and analysis of that information, baseball teams perform better – they win!

    Legendary baseball manager Connie Mack’s 3,731 career victories stand as one of the most unbreakable records in baseball. Mack won nine pennants and five World Series titles in a career that spanned the first half of the 20th century. One way he gained an advantage over his contemporaries was by understanding which player skills and metrics contributed most to winning. He was ahead of his time in favoring power hitters and high on-base-percentage players over those with high batting averages and speed – an idea that would later become the standard throughout the sport.

    The 2003 book about the business of baseball, Moneyball, describes the depth of analytics that general managers like Billy Beane of the Oakland Athletics apply to selecting the best players, plus batter and pitcher tactics based on game conditions such as the score, inning, number of outs, and runners on base.

    More recently, the relatively young general manager of the Boston Red Sox, Theo Epstein (who is now with the Chicago Cubs), assured himself of legendary status for how he applied statistics to help overcome the Curse of the Bambino – supposedly originating when the team sold Babe Ruth in 1920 to the New York Yankees – to finally defeat their arch-rival Yankees in 2004 and win a World Series. It ended Boston’s 86-year drought – since 1918 – without a World Series title.

    Author: Gary Cokins

    Source: Information Management

  • What is Predictive Intelligence and how it’s set to change marketing in 2016

    Explaining how modelling of marketing outcomes can let you make smarter marketing decisions

    As 2016 gets under way, we're seeing more discussion of the applications of Predictive Intelligence. It’s a nascent field, but one that is gaining popularity fast, and for some very good reasons, which we will discuss in more detail in this article. We’ll start by explaining precisely what Predictive Intelligence is, then provide some hard stats on its impact in the marketing world so far, and finish by explaining how we feel it’s set to shape marketing in 2016 and beyond.

    What Is Predictive Intelligence?

    Despite the buzz surrounding Predictive Intelligence, many still don’t know what it actually is, so here is our definition. Predictive Intelligence is often used interchangeably with terms like Predictive Recommendation, Predictive Marketing and Predictive Analytics. Although there are some minor differences between the terms, broadly speaking they all essentially mean the same thing.

    Our definition of predictive intelligence for marketing is:

    "Predictive Intelligence is the process of first collecting data on consumers and potential consumers’ behaviours/actions from a variety of sources and potentially combining with profile data about their characteristics.

    This data is then distilled and interpreted, often automatically, by sophisticated algorithms, from which a set of predictions are made, and based on these, rules are developed to deliver relevant communications and offers to consumers to persuade them to engage with a business to meet its goals".
    You can see that because of the three-step process of analysis, interpretation and implementing rules for automated communications, a single sentence definition is difficult to devise! But, we hope this shows the essence of Predictive Marketing.
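
    As a concrete illustration of the three steps, here is a minimal sketch in Python; the events, weights, and the 0.5 threshold are invented, and the scoring function stands in for a trained model.

    ```python
    # Minimal sketch of the three-step loop described above: collect
    # behavioural events, score them with a (here trivial) model, then apply
    # a rule to pick the communication. Event fields and the 0.5 threshold
    # are invented for illustration.
    events = [
        {"user": "a", "pages_viewed": 12, "cart_adds": 2, "emails_opened": 5},
        {"user": "b", "pages_viewed": 1,  "cart_adds": 0, "emails_opened": 0},
    ]

    def purchase_propensity(e):
        """Stand-in for a trained model: a weighted score squashed into [0, 1]."""
        score = (0.05 * e["pages_viewed"] + 0.2 * e["cart_adds"]
                 + 0.05 * e["emails_opened"])
        return min(1.0, score)

    for e in events:                     # step 2: predict; step 3: apply rules
        p = purchase_propensity(e)
        offer = "send discount voucher" if p >= 0.5 else "send awareness newsletter"
        print(e["user"], round(p, 2), offer)
    ```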

    McKinsey views it as applying mathematical models to best predict the probability of an outcome. They cite a customer relationship management example using models to estimate the likelihood of a customer changing providers, known as ‘churn’. Other examples use sources including everything from CRM data and marketing data to structured data such as click-through rates or engagement levels.

    The relevant actions carried out on this distilled and interpreted data are predicting and then executing the optimum marketing message (e.g. image-based vs. text-heavy, formal vs. informal) to specific customers/potential customers, across the optimum marketing channel(s) (e.g. social media vs. email), at the optimum time(s) (e.g. morning vs. afternoon), in order to achieve your company’s marketing goals – usually higher engagement and/or sales. In summary, you are communicating in the way that is simultaneously most relevant to and preferred by the customers/potential customers and most likely to result in you achieving your marketing goal(s).

    Essentially, you set the marketing goal and the Predictive Intelligence algorithms then make good use of the collected data to find the optimum way of achieving it. Predictive Intelligence aims to deliver content based on customer needs, essentially tailoring the experience for the person receiving the information. Predictive Intelligence, empowered by data, thus begins to usher in truly personalised one-to-one marketing communication that is aligned with a company’s marketing goals.
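
    A minimal sketch of that final selection step: score each message/channel/time combination with a predicted engagement probability and pick the best. The probability table is invented; in practice each entry would come from a model trained on past responses.

    ```python
    # Pick the message/channel/time combination with the highest predicted
    # engagement for one customer. All probabilities below are invented.
    predicted_engagement = {
        ("image-led", "email", "morning"): 0.11,
        ("image-led", "social", "afternoon"): 0.19,
        ("text-heavy", "email", "morning"): 0.16,
        ("text-heavy", "social", "afternoon"): 0.08,
    }

    message, channel, time_of_day = max(predicted_engagement,
                                        key=predicted_engagement.get)
    print(f"send the {message} creative via {channel} in the {time_of_day}: "
          f"p(engage) = {predicted_engagement[(message, channel, time_of_day)]:.0%}")
    ```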

    Some stats and examples showing the value of predictive intelligence

    While we’re sure all the above sounds great to you, understandably, there is nothing more convincing than some cold hard stats on how Predictive Intelligence is actually performing. So without further ado, check out the below.

    As mentioned on IdioPlatform.com, research firm Aberdeen Group conducted an in-depth survey for its paper “Predictive Analytics in Financial Services”, interviewing 123 financial services companies. It found that companies utilising Predictive Analytics typically achieved an 11 per cent increase in the number of clients secured over the previous 12 months, and a 10 per cent increase in the number of new client opportunities identified, compared with those not utilising Predictive Analytics. Pretty impressive.

    Additionally, a Forbes Insights survey of 306 execs from companies with $20 million or more in annual revenue found that of the companies that have been carrying out predictive marketing initiatives for at least 2 years, 86% of them have “increased return on investment as a result of their predictive marketing”.

    Finally, a study by toneapi.com modelled the correlation between the emotions expressed in an airline’s communications and the subsequent click-through rates. Based on this insight, toneapi.com was able to use the model to predict how the airline could increase its click-through rates by appealing to certain emotions that would generate more interest from its customers.
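
    toneapi.com's internals are not public, but the kind of analysis described can be illustrated with an ordinary least-squares fit of click-through rate against emotion scores; all the data below are invented.

    ```python
    import numpy as np

    # Hypothetical illustration: fit click-through rate against emotion
    # scores of past campaigns, then predict the effect of dialling the
    # emotions up. Columns of X: [joy, anticipation] per email; y: observed CTR.
    X = np.array([[0.2, 0.1], [0.5, 0.3], [0.7, 0.6], [0.4, 0.8], [0.9, 0.5]])
    y = np.array([0.021, 0.034, 0.048, 0.041, 0.055])

    A = np.column_stack([X, np.ones(len(X))])       # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares fit

    new_copy = np.array([0.8, 0.7, 1.0])            # more joy and anticipation
    print(f"predicted CTR for the new copy: {new_copy @ coef:.1%}")
    ```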

    In summary, Predictive Intelligence drives marked improvements across marketing channels.

    The Emotional Connection

    Initially, one of the big advantages of algorithmic Predictive Intelligence was the removal of emotion from the equation; human feelings and mood played little part in the decision, as the computers chose the best course of action based on hard data. Now, as processing speeds and storage increase and the analysis of unstructured data improves, we are seeing companies move into a more fluid form of Predictive Intelligence based around sentiment and emotion. The driver for this is that emotional analysis of text can help drive understanding of the dynamics that are causing key effects. These insights can then be used to optimise the content to match these emotions, creating a more iterative and action-oriented approach to marketing campaigns.

    These companies look at the emotions that motivate behaviour and utilize technology to predict and improve results. Toneapi analyses freeform text content (such as emails, press releases, and brand copy) for emotional impact and then offers suggestions for improvements. Likewise, Motista's studies have shown that “emotion is the most predictive driver of customer behavior”; the company brings together big data and social science to increase profitability.

    Looking To 2016 And Beyond

    Up until now, Predictive Intelligence has seen the most action in the B2B world. B2B companies have been using it to crunch colossal amounts of data on customer/potential customer behaviours from a variety of sources. They have then been using it to automatically draw insights from this data based on a set of signals in order to score their leads, identify the most attractive leads earlier on, and uncover new high-value leads previously unseen. Crucially, Predictive Intelligence has then allowed B2B companies to tailor their marketing approach and messaging depending on the customer's/potential customer's actions/behaviours (signals) across the different areas where the data has been collected.
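
    A minimal sketch of the signal-based lead scoring described above; the signals, weights, and the 50-point "sales-ready" threshold are all invented for illustration.

    ```python
    # Sketch of signal-based B2B lead scoring: each observed behaviour adds
    # points, and high-scoring leads are routed to sales. All signals,
    # weights, and the cut-off are invented.
    SIGNAL_WEIGHTS = {
        "visited_pricing_page": 25,
        "downloaded_whitepaper": 15,
        "attended_webinar": 20,
        "opened_last_3_emails": 10,
        "title_is_decision_maker": 30,
    }

    def score_lead(signals: set) -> int:
        """Sum the weights of the signals this lead has exhibited."""
        return sum(w for s, w in SIGNAL_WEIGHTS.items() if s in signals)

    lead = {"visited_pricing_page", "attended_webinar", "opened_last_3_emails"}
    score = score_lead(lead)
    print(score, "-> route to sales" if score >= 50 else "-> keep nurturing")
    ```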

    We believe that in 2016 we’re going to see more of the above, and the process is going to become even more refined and sophisticated within the B2B sector. We also feel 2016 is the year we see Predictive Intelligence move more and more into the B2C world, especially now that its frequent use across industry sectors in B2B has proven its effectiveness and given the approach some real credibility. And we expect more interest in Predictive Intelligence around emotion analytics, free-form text, unstructured data, and behavioural social sciences.

    Additionally, now, unlike even a couple of years ago, there are quite a few smaller Predictive Intelligence companies on the market in addition to big names like IBM and Salesforce. Many of these companies offer intuitive, easy-to-understand, well-designed, and well-priced cloud-based Predictive Intelligence software packages. This greatly lowers the barrier to entry for small-to-medium businesses (SMBs). It allows them to dip their toes into the world of Predictive Intelligence and test the waters with little risk or friction, or, if they wish, jump straight into the deep end and reap the rewards.

    Thus a whole new world has opened up to the SMB – a world that not too long ago was reserved mostly for large corporations that could afford the expensive Predictive Analytics software (which was the only real choice) or that had budgets big enough to hire data scientists to crunch data and draw insights on which to base their predictions.

    Conclusion

    We hope this article has gone some way towards demystifying the phrase “Predictive Intelligence”. Further, we hope we have communicated the immense benefits to be reaped if Predictive Intelligence is executed properly: higher engagement, higher click-through rates, higher conversion rates, and greater emotional impact. Predictive Intelligence has already seen some real traction in the B2B world, and we believe 2016 will mark the year that B2C companies and SMBs in general adopt Predictive Intelligence in a meaningful way, some dipping their toes in and giving it a try and others jumping straight into the deep end and really embracing it.

    Source: Smart Insights

  • White Rabbit Trends: Understanding what is industry-led and what is consumer-led

    Decoding which consumer-led trends will last as opposed to which are doomed to a short shelf life is challenging. Jason Partridge shares tips on how to keep your head above water and still make the most of consumer-led trends.

    Following the industry vs. the consumer perspective

    When it comes to driving innovation in an organization, there is only one certainty. And that is uncertainty. Because as we all work to decode the beliefs and values that shape consumer behaviors, we are faced with more change, more complicated demands, and more market volatility than ever before. 

    Culture is constantly and more rapidly evolving 

    Yet we are tasked to do more with less. And the expectation is that our ability to anticipate the future needs of consumers will be critical to driving growth for our organizations. As a result, many innovation and insight teams are rethinking how they approach understanding consumers. We are all looking for ways to be more agile, more efficient, and more accurate. Which is why it is more important than ever to ask ourselves: is our organization following 'White Rabbit Trends'?

    A White Rabbit Trend is a trend that’s been created by looking at consumers’ emerging needs solely through an industry lens

    It begins with an assumption driven by our industry experience. For example, say we were a bakery. We would only look at consumers’ needs in the context of baked goods, assuming that trends outside this context are not relevant.

    As you can imagine, this leads to the identification of trends that fit within a preconceived framework that is familiar and similar to patterns identified by our organization in the past. And while not always the case, a White Rabbit Trend is often something we have seen embraced by others in our industry, seen in the media, or identified in an industry report. White Rabbit Trends are often macro-trends that others are reporting on, and we gravitate to them because we are reacting to changes in our category. In other words, we look for incremental ways to innovate by following the industry, because we worry we are late for the big party.

    The problem with White Rabbit Trends is they are often what everyone in our industry is talking about. But the allure of popularity puts you at the risk of parity. And the baggage that comes with an industry perspective is that it can often skew what is truly relevant to consumers. The industry reality is rarely the consumer reality. 

    Compare this to a consumer-led approach

    In a consumer-led approach, we do not assume what is relevant to a consumer; we let the consumer reveal what is relevant to them. As a bakery, the goal may be to arrive at innovation for baked goods. But by starting more broadly, understanding the consumer’s relationship with food in general and the emerging needs that are increasing in importance, we may arrive at needs that can be expressed through baked goods but have not yet entered the baked-goods marketplace.

    This type of insight cannot be found in industry reports or by walking down a store aisle. But it is critical if we want to lead the market, versus following a white rabbit trend down a hole, that only leads to a crowded marketplace. 

    Let’s use another example. Take the trend of the ketogenic diet. 

    There was a time when keto was emerging in culture, and those who identified it early enough were able to capitalize on it. After all, identifying a trend early means you can align your brand and your organization with something that will gain power over time. It is not just about being first. It is about carving out and owning a cultural artifact early, so you can reap the rewards as more people commit to it and it gains social capital.

    But that time has passed for keto. CNN claimed this craze had well and truly entered the mainstream in 2018. Yet if you look at industry news, trend reports, and product launches happening right now, the industry is still pushing the keto narrative.

    But keto is a white rabbit trend. If you look at the culture of keto, it is not just a trend that has reached the mainstream; consumers are already starting to find it limiting. So it will either evolve or disappear. But if you are tasked with innovating to steal share from those that have already invested in keto, you need to identify the emerging, consumer-led trend that is relevant to a large enough portion of the population and growing in relevance in consumer culture. Growth is key.

    Our prediction is that the companies that do not fall down the keto rabbit hole, and instead recognize that it isn’t about the diet but about an LCHF (low carb, high fat) lifestyle, will catch the next wave of what consumers are looking for when it comes to food.

    When we look at a trend that we are thinking of acting on, we need to ask ourselves a couple of simple but important questions to determine if it is a consumer-led trend.

    First, where does it sit in culture? Is it still a part of the early majority, or has it moved too close to the mainstream for your organization to gain a competitive advantage?

    Second, while it can’t be too big a trend, it also cannot be too small. Is it relevant to a large enough tribe or microculture of consumers to prove that it is an emerging movement that will likely grow?

    Third, was it revealed to us by consumers naturally, or did we 'force it out of them'? Did our bias impact the identification of the trend, or was it a 'eureka' moment where the consumer revealed a need that was both unmet and emotionally meaningful to them?

    If not, it may be an industry-led trend.  

    This subtle but critical nuance in understanding the difference between what is industry-led versus consumer-led can impact everything. And it can be immensely powerful in helping us understand whether we renovate or create something net new, how we package a product, right down to whom we target the product at launch.

    The reality is our jobs are hard. We are tasked with predicting the future and trying to find a way to be certain that it will work in a world where 95% of new products fail. 

    The question we need to ask ourselves is: will we find more certainty by looking at what the industry has done in the past? Or should we be seeking out new ways to decode what consumers are doing in the present?

    Author: Jason Partridge

    Source: Greenbook Blog
