35 items tagged "analytics"

  • 5 Guidelines that keep your business' analytics app working optimally

    One of the key challenges faced by organizations deploying an enterprise-wide analytics solution is the maintenance and upgrading of its applications. Most organizations follow an agile development methodology that entails frequent releases with new content, as well as routine upgrades, patches, fixes, and security updates.

    Depending on the complexity of the application, you need to invest a significant amount of time, energy, and manpower to ensure that none of the existing reports, dashboards, or underlying data is adversely impacted by any of these maintenance tasks. Any degradation in the performance or accuracy of these applications not only reflects poorly on the system administrators; it also erodes confidence in the analytics solution, ultimately hurting user adoption and business value throughout the organization.

    Hence, it is critical for system administrators to ensure that the application and the data within it remain consistent and reliable for end users, irrespective of the ongoing maintenance tasks they have to perform on the system.

    A typical testing methodology adopted by most organizations involves manual testing and 'eye-balling' of a subset of reports and data after major maintenance tasks such as patches and updates. Organizations with more resources may create custom test scripts and automate certain parts of the testing and QA process.

    Upgrades are typically more involved and take a lot more time and testing to ensure consistency. When your analytics application grows to thousands of users and tens of thousands of reports and dashboards, it is usually cost prohibitive to test every single report for every user. Hence, automation of this testing process is critical to the long-term success of an analytics application.

    Here are five things to keep in mind when automating testing of analytics applications:

    1. Real-world applications

    Make sure that tests are run on as many real-world production applications as possible. Testing on just one or a handful of sample environments is not ideal and can lead to unforeseen issues when deploying the update or an upgrade. The applications on which tests are run need to be representative of real-world applications that your users or customers will be using.

    2. Replica of live production system

    Ensure that there is no impact to the actual live production system at the time of testing. To run a series of tests at any time of the day, you need a replica of the production system with the same hardware and software, but in an isolated environment that is as similar to the production system as possible. This way, as your users report new issues, you can analyze them and assess their impact by running tests in a separate environment so system performance for users is not affected by the ongoing testing. Using a cloud platform makes it easier to quickly provision a replicated environment for testing purposes.

    3. Platform approach to testing

    It is really important to design the automated testing system as a platform for running a variety of tests, rather than creating disjointed automation scripts for different scenarios. The testing process also needs to evolve whenever it fails to identify certain issues. With a single platform, you can achieve economies of scale and optimize and share the testing infrastructure across multiple scenarios and applications.

    4. Granularity of test execution data

    Test results should not be simply binary in terms of pass or fail. Irrespective of whether an application passes or fails a particular test, it is important to capture detailed statistics and information from every step of the testing process. This will help you identify and anticipate future issues and fine-tune the testing process.
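
    As a minimal sketch of this idea (in Python, with hypothetical step and report names), a test harness can record status, timing, and error details for every step instead of returning a single pass/fail flag:

```python
import time

def failing_check():
    # Hypothetical data check that fails after an upgrade
    raise ValueError("totals differ by 0.8%")

def run_test(steps):
    """Run named test steps, capturing granular results for each one."""
    results = []
    for name, step in steps:
        start = time.perf_counter()
        try:
            detail, status, error = step(), "pass", None
        except Exception as exc:  # record the failure, then keep testing
            detail, status, error = None, "fail", str(exc)
        results.append({
            "step": name,
            "status": status,
            "duration_s": round(time.perf_counter() - start, 4),
            "detail": detail,
            "error": error,
        })
    return results

# Hypothetical checks on a single report after a maintenance task
report = run_test([
    ("load_dashboard", lambda: "200 OK"),
    ("row_count_check", lambda: 1042),
    ("total_revenue_check", failing_check),
])
```

    Even the failed step yields a timing figure and an error message, which is exactly the kind of detail that helps diagnose regressions later.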

    5. Analysis of test results 

    Depending on the complexity of the testing process, the analysis of test results can be a full-fledged analytics application in itself. The test results should be stored in an optimized format (for example, in a data warehouse) that makes it easy to analyze in detail to gain further insights into the application performance. This will also help analyze historical system test results and monitor the performance over a period of time.
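
    As an illustration (a sketch using Python's built-in sqlite3 as a stand-in for a real data warehouse, with hypothetical table and column names), storing per-step results in a queryable format makes this kind of historical analysis straightforward:

```python
import sqlite3

# In-memory stand-in for the test-results warehouse described above
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE test_results (
        run_id      TEXT,
        report_name TEXT,
        step        TEXT,
        status      TEXT,
        duration_s  REAL,
        run_date    TEXT
    )
""")
rows = [
    ("run-001", "sales_dashboard", "load", "pass", 1.8, "2019-05-01"),
    ("run-001", "sales_dashboard", "row_count", "pass", 0.4, "2019-05-01"),
    ("run-002", "sales_dashboard", "load", "pass", 2.9, "2019-05-08"),
    ("run-002", "sales_dashboard", "row_count", "fail", 0.5, "2019-05-08"),
]
conn.executemany("INSERT INTO test_results VALUES (?, ?, ?, ?, ?, ?)", rows)

# Historical view: pass rate and average step duration per run
summary = conn.execute("""
    SELECT run_id,
           AVG(CASE WHEN status = 'pass' THEN 1.0 ELSE 0.0 END) AS pass_rate,
           AVG(duration_s) AS avg_duration
    FROM test_results
    GROUP BY run_id
    ORDER BY run_id
""").fetchall()
```

    Queries like this one surface trends (a dropping pass rate, creeping durations) that individual test runs cannot show.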


    With the ever-increasing importance of analytics and the use of mobile devices at an all-time high, an optimally functioning analytics app is valuable for any business. These apps should remain unaffected by necessary processes like updates, testing, and maintenance in order to keep working optimally.

    That's why it's crucial that your business always keeps the guidelines mentioned above in mind. Keep improving your applications, especially the ones connected to your business' analytics solution, but never let these improvements negatively affect the use of the app!

    Author: Pradyut Bafna

    Source: MicroStrategy

  • 6 signals that help you recognize the failure of competitors

    On August 5, 2018, retail giant Toys R Us officially closed its doors after 70 years of business. The company’s financial troubles were already well known, having filed for Chapter 11 bankruptcy in September of 2017, but the decision to close its remaining 807 brick-and-mortar locations still shocked many who are familiar with the toy industry. Other retailers, like Payless, Wet Seal, and RadioShack, had managed to successfully exit bankruptcy during the same year, and many believed Toys R Us was on the right track. Still, there were indications that Toys R Us wasn’t going to recover its losses, long before the decision to shut down was made public.

    Knowing in advance that a competitor is failing can provide numerous opportunities for your business to grow. On the other hand, getting caught off guard puts your company at a disadvantage, making it harder to move quickly and fill the void left behind.

    These are 6 warning signs that your competition might be going under: 

    1. CFO turnover: 

    According to a study by recruiting firm Korn/Ferry International, which surveyed 1,000 companies, only 4% of businesses saw their new CFO leave within a year of taking the job, and only 15% lost a CFO within two years of hiring. The main reasons CFOs cited for leaving were operational problems and a fear of harming their own careers if/when the company went under on their watch. So, if a company can’t seem to hang on to a CFO, it could stem from bigger problems behind the scenes.

    2. Insider sales: 

    In a similar vein, insider selling at public companies can indicate that the company’s future isn’t looking bright. Executives sell stock all the time, often for reasons unrelated to the company’s performance. But when multiple executives dump their stock at the same time, it’s worth investigating why.

    3. Slashing budgets: 

    At Toys R Us, workers across the country reported that cleanings, remodels, and repairs were being put off, sometimes indefinitely, months before the retailer went under. While that information may be hard to obtain, you can identify when expenses, such as marketing budgets, have been slashed. Are they missing from tradeshows? Have their Google pay-per-click ads stopped showing up? Has their display advertising dried up? These are only some of the signals that marketing budgets took a cut.

    4. Unpublicized website changes: 

    Failing companies may quietly update their websites to reflect internal changes that they don't want the public to notice. If you’re tracking your competitors’ websites, keep an eye out for missing product pages, removed executive profiles, and dropped partnerships. Even a general decrease in press releases, events, and everyday communication can signal that problems are mounting inside the company.
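
    As a rough sketch of what such tracking can look like (Python, with hypothetical URLs and page snapshots; a real monitor would fetch the pages on a schedule), comparing fingerprints of successive crawls surfaces removed, added, or silently changed pages:

```python
import hashlib

def fingerprint(page_html: str) -> str:
    """Stable fingerprint of a page snapshot, for change detection."""
    return hashlib.sha256(page_html.encode("utf-8")).hexdigest()

def detect_changes(old_snapshots: dict, new_snapshots: dict) -> dict:
    """Compare two crawls of a competitor's site (url -> html)."""
    old_urls, new_urls = set(old_snapshots), set(new_snapshots)
    return {
        "removed": sorted(old_urls - new_urls),
        "added": sorted(new_urls - old_urls),
        "changed": sorted(
            u for u in old_urls & new_urls
            if fingerprint(old_snapshots[u]) != fingerprint(new_snapshots[u])
        ),
    }

# Hypothetical snapshots from two weekly crawls
last_week = {
    "/products/widget": "<h1>Widget</h1>",
    "/team/cfo": "<h1>Jane Doe, CFO</h1>",
}
this_week = {
    "/products/widget": "<h1>Widget - discontinued</h1>",
}
report = detect_changes(last_week, this_week)
```

    A vanished executive profile or a quietly edited product page then shows up in the report instead of going unnoticed.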

    5. Unexpected pivots: 

    To attract customers and appease investors, failing companies may try to switch up their offerings when the end seems nigh. In 2017, Toys R Us started positioning itself as an interactive playground where kids could test out new toys, in an effort to distinguish itself from competitors like Amazon, Target, and Walmart. Borders, the defunct book chain, took the opposite approach by rolling out an ebook service a year prior to closing, in a last-ditch attempt to keep up with Amazon’s Kindle. Both companies made dramatic shifts away from their core businesses in their final days, with equally unsuccessful results.

    6. Layoffs and buyouts: 

    Companies struggling to stay afloat may try to cut expenses by cutting staff. Layoffs are a pretty obvious signal that a business can’t pay its workers, but executive buyouts can also be an overlooked sign of trouble. Senior managers and executives are usually highly compensated, and cutting a few of those high salaries and generous pensions might indicate that a company is looking to scale back expenditures at the cost of long-term stability and leadership.


    When analyzing your competition, it is always important to keep in mind what developments at competitors mean for your business. Do you notice one or more of the 6 signals mentioned above that a competitor is failing? That may pose opportunities for your business. But it can also mean that your competitor is failing because of something that is a threat to your business as well.

    Always keep your competitive intelligence up to date in order to grasp opportunities or defend your company against potential threats. Knowing your competition helps you stay ahead of it.

    Source: CI Radar


  • Better analytics must address cloud computing's remaining challenges


    Without proper analytics in place, many cloud services customers are wasting resources, struggling with compliance and suffering from outages and unexpected costs, according to a new study from Forrester Research.

    The study, sponsored by enterprise cloud hosting provider iLand, shows that all of the 275 IT decision makers and senior business executives surveyed in the United States, United Kingdom and Singapore said they’ve experienced at least one negative financial or operational impact due to missing or hidden metadata. These negative business impacts include outages, wasted resources, unexpected costs and challenges reporting to management.

    “Companies aren’t just using the cloud—they depend on it,” the report says. “Nevertheless, cloud providers fail to keep cloud users happy. As companies expand their use of cloud services, they need to be confident that their cloud providers aren’t holding anything back, and are committed to their success.”

    Other findings of the report are that overall, cloud customer satisfaction is low, with a strong sentiment that providers don’t respond to customers’ needs. More than half of the respondents said their provider does not understand their company’s needs or care about their success.

    Forrester recommends that companies considering cloud services evaluate the native tools delivered by the cloud platform to ensure they deliver visibility, alerting, and analytics; demand clarity about compliance data, on-call experts, and straightforward processes from their cloud provider; and look for a cloud with onboarding and support teams staffed by experts.

    Author: Bob Violino

    Source: Information Management

  • Chinachem: successful use of SAP software in the Hong Kong property market

    According to a January story in the South China Morning Post, Hong Kong has topped the table as the world’s most expensive housing market for the 9th straight year. That sounds like good news for property developers in the area. But, according to the same story, prices of Hong Kong homes also decreased by 7.2% over the last four months.

    What the news really shows is that the property market can be volatile. Combined with construction times that run into multiple years and billion-dollar capital investments, that makes property development an extremely challenging industry.

    Few of Hong Kong’s developers are more aware of that than the Chinachem Group. While Chinachem began its life in agricultural projects and chemicals, it has developed into one of Hong Kong’s most famous property companies over the years, through prosperous times and through tough times. Recently, Chinachem was able to win a big land parcel in one of Hong Kong’s upmarket suburbs after surveyors cut their valuation by 10 per cent, another sign of softening property prices.

    However, in an industry that is often very traditional in its execution, it is not just prices that are putting property businesses under increasing competitive pressure. The digital explosion is also having a huge effect. As Chinachem’s Executive Director and Chief Executive Officer, Donald Choi, points out: technology is changing how companies in every industry are organized and run. And Chinachem isn’t any different.

    Changing times

    Hong Kong has been lucky in a way, especially in the property market, which has been a long-term growth market. But complacency can be a killer.

    Chinachem’s Head of Group IT, Joseph Cagliarini, believes that the lesson to be learned from a truly global brand like Kodak, which went bankrupt because the world changed from film to digital photography, cannot be overlooked. Instead, he calls for a relentless pursuit of technology to make sure Chinachem is not only listening to its customers, but able to respond appropriately.

    Different companies are at different stages of embracing digital transformation and technology. Anticipating what is required and strategizing change, Chinachem has turned its eyes to a new generation of milestones, and embarked on a journey to become an intelligent business.

    For the long-time property developer, that change starts with (real-time) data. Like many companies, Chinachem didn’t have a central view of its operations. So, all of its business units operated autonomously to some extent. That created a mountain of manual processes, and many separate systems containing valuable information.

    In October 2018, Chinachem selected a comprehensive suite of SAP software and cloud solutions to drive operational efficiency across finance and HR operations for its corporate group and hotels, in order to help drive long-term efficiencies and growth. SAP is also providing Chinachem with solutions to help drive rapid innovation and increase the strategic value of human resources.

    Once the solutions are fully implemented, Chinachem will enjoy a variety of benefits, including real-time updates on financial performance that will optimize their finance processes. This includes everything from planning and analysis to period end close and treasury management.

    Long-term plans

    Other key features also support the group’s long-term objectives, such as enhancing financial planning and analysis, accelerating planning cycles, increasing profitability, and making finance functions more efficient. Chinachem is now able to accelerate the building and deployment of apps and extensions that engage employees in new ways. This will allow HR to be flexible and innovative without compromising the organization’s core HR processes.

    In addition, Chinachem’s hotels can personalize their end-to-end HR landscape, creating an outstanding, seamless and secure user experience. The group can also leverage data from SAP solutions to make insightful business decisions that will have lasting impact.

    Customers are still king

    Chinachem’s journey also involves adapting to changing customers who now live on multiple platforms, both online and offline.

    With the right technology and software, Chinachem will be able to monitor customer behavior and, therefore, respond to customer needs without actually being asked. Executive Director of Chinachem, Donald Choi, believes that advanced data analytics could be the key to this. Not to replace offline experiences, but to be at all the right places at the right time.

    In an ever-changing and increasingly digital world, a comprehensive suite of SAP software and cloud solutions may not be the final answer for all of Chinachem’s needs. However, as Donald Choi says, “it is a good starting point for this journey.”

    Author: Terence Leung

    Source: SAP

  • Competencies and capabilities for success with (big) data analytics


    Big data analytics is of great value to companies across all industries. This value arises, among other things, from a sharper focus on the customer and from improved processes. Yet it is not easy to extract this value right away. Many organizations underestimate the costs, complexity, and competencies required to get to that point.

    Big data analytics

    Big data analytics helps analyze data sets that are generally much larger and more varied than the data types found in traditional business intelligence or data warehouse environments. The goal of big data analytics is to uncover hidden patterns, unknown correlations, market trends, customer preferences, and other informative business insights.

    Why is achieving success with big data difficult?

    Achieving success with big data is not a given. Many organizations struggle with deploying big data in several respects. The following aspects can be distinguished:

    • Big data analytics is seen as a technology project rather than as a transformation that takes place on several fronts within the organization.
    • The vendor ecosystem is fragmented and changes rapidly.
    • New technologies and architectures demand new skills from users.
  • The challenge of bringing structure to unstructured data

    The world is collecting more and more data, and at an alarmingly increasing rate. From the beginning of civilization until about 2003, humanity produced some 5 exabytes of data. We now produce that amount every two days. 90 percent of all data has been generated in the past two years.

    There is nothing wrong with data as such, but the problem is that a large part of it is unstructured. This ‘dark data’ already accounts for about four fifths of the total data mountain. And that is where the real problems begin.


    Unstructured data is unusable. You don't know what it contains, what its structure is, or how much of the information in it may be important. How can you comply with the requirements of the new privacy legislation if you don't even know what information your data contains? It may include sensitive information, meaning you could be breaking the law without being aware of it. Until a leak occurs and all the data ends up on the street. And how can you comply with the Wet openbaarheid van bestuur (the Dutch freedom of information act) and, soon, the Wet open overheid (the Dutch open government act) if you don't know where to find the information? The GDPR obliges you to destroy personal data when the person concerned requests it. But if you don't know where to find that data, you are left empty-handed.


    Picture your data as an iceberg. The largest part lies under water: you don't see it. What sticks out above the water is the critical information you use daily and that is needed to keep your organization running. Directly below the surface lies a large part that was once critical. It was used and then stored, never to be touched again: redundant, obsolete, and trivial, in short ROT.

    The largest part of the mountain lies below that again: the ‘dark data’, collected by people, machines, and all kinds of work processes. You have no idea what is hiding in that dark part. It is data collected by sensors, footage from security cameras, and many, many documents from long, long ago.

    New insights

    You can of course ignore it; after all, you don't need it for your daily workflow. But that dark data may well contain valuable information that can be used to make the organization's processes run more smoothly, or to enable new applications. By combining data from the mountain with other data, for example, you can suddenly gain new insights on which policy can be based: information-driven policy.

    A digital tamer

    If all plans and every policy measure can be underpinned with hard data from the data mountain, then we have found the holy grail. The quality of government services will improve by leaps and bounds, and there will be new impulses for safety, enforcement, maintenance, and debt assistance, to name just a few policy areas.

    That will probably remain an unattainable ideal, but we can take significant steps in the right direction. Working digitally means constantly adapting, reordering, and migrating. Taming digital information requires a digital tamer: a management environment that brings structure and responds to the continuous changes that digitization brings.

    Source: Managementbase

  • Effective data analysis methods in 10 steps

    In this data-rich age, understanding how to analyze and extract true meaning from the digital insights available to our business is one of the primary drivers of success.

    Despite the colossal volume of data we create every day, a mere 0.5% is actually analyzed and used for data discovery, improvement, and intelligence. While that may not seem like much, considering the amount of digital information we have at our fingertips, half a percent still accounts for a huge amount of data.

    With so much data and so little time, knowing how to collect, curate, organize, and make sense of all of this potentially business-boosting information can be a minefield, but online data analysis is the solution.

    To help you understand the potential of analysis and how you can use it to enhance your business practices, we will answer a host of important analytical questions. Not only will we explore data analysis methods and techniques, but we’ll also look at different types of data analysis while demonstrating how to do data analysis in the real world with a 10-step blueprint for success.

    What is a data analysis method?

    Data analysis methods focus on strategic approaches to taking raw data, mining for insights that are relevant to a business’s primary goals, and drilling down into this information to transform metrics, facts, and figures into initiatives that benefit improvement.

    There are various methods for data analysis, largely based on two core areas: quantitative data analysis methods and data analysis methods in qualitative research.

    Gaining a better understanding of different data analysis techniques and methods, in quantitative research as well as qualitative insights, will give your information analyzing efforts a more clearly defined direction, so it’s worth taking the time to allow this particular knowledge to sink in.

    Now that we’ve answered the question ‘what is data analysis?’ and considered the different types of data analysis methods, it’s time to dig deeper into how to do data analysis by working through these 10 essential elements.

    1. Collaborate your needs

    Before you begin to analyze your data or drill down into any analysis techniques, it’s crucial to sit down collaboratively with all key stakeholders within your organization, decide on your primary campaign or strategic goals, and gain a fundamental understanding of the types of insights that will best benefit your progress or provide you with the level of vision you need to evolve your organization.

    2. Establish your questions

    Once you’ve outlined your core objectives, you should consider which questions will need answering to help you achieve your mission. This is one of the most important steps in data analytics as it will shape the very foundations of your success.

    To help you ask the right things and ensure your data works for you, you have to ask the right data analysis questions.

    3. Harvest your data

    After giving your data analytics methodology real direction and knowing which questions need answering to extract optimum value from the information available to your organization, you should decide on your most valuable data sources and start collecting your insights, the most fundamental of all data analysis techniques.

    4. Set your KPIs

    Once you’ve set your data sources, started to gather the raw data you consider to potentially offer value, and established clearcut questions you want your insights to answer, you need to set a host of key performance indicators (KPIs) that will help you track, measure, and shape your progress in a number of key areas.

    KPIs are critical to both data analysis methods in qualitative research and data analysis methods in quantitative research. This is one of the primary methods of analyzing data you certainly shouldn’t overlook.

    To help you set the best possible KPIs for your initiatives and activities, explore our collection of key performance indicator examples.

    5. Omit useless data

    Having defined your mission and bestowed your data analysis techniques and methods with true purpose, you should explore the raw data you’ve collected from all sources and use your KPIs as a reference for chopping out any information you deem to be useless.

    Trimming the informational fat is one of the most crucial steps of data analysis as it will allow you to focus your analytical efforts and squeeze every drop of value from the remaining ‘lean’ information.

    Any stats, facts, figures, or metrics that don’t align with your business goals or fit with your KPI management strategies should be eliminated from the equation.
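
    As a trivial illustration (Python, with hypothetical metric names), this trimming step amounts to filtering the raw data against the KPI list your stakeholders agreed on:

```python
# Hypothetical raw metrics gathered from several sources
raw_metrics = {
    "monthly_active_users": 18250,
    "avg_session_minutes": 7.4,
    "office_coffee_consumed": 312,   # not tied to any business goal
    "churn_rate": 0.031,
    "logo_color_votes": 87,          # not tied to any business goal
}

# KPIs agreed on during the earlier steps
kpis = {"monthly_active_users", "avg_session_minutes", "churn_rate"}

# Keep only the 'lean' information that maps to a KPI
lean_metrics = {k: v for k, v in raw_metrics.items() if k in kpis}
```

    Everything left in `lean_metrics` earns its place by answering one of the questions established in step 2.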

    6. Conduct statistical analysis

    One of the most pivotal steps of data analysis methods is statistical analysis.

    This analysis method focuses on aspects including cluster, cohort, regression, factor, and neural networks and will ultimately give your data analysis methodology a more logical direction.

    Here is a quick glossary of these vital statistical analysis terms for your reference:

    • Cluster: The action of grouping a set of elements in a way that said elements are more similar (in a particular sense) to each other than to those in other groups, hence the term ‘cluster’.
    • Cohort: A subset of behavioral analytics that takes insights from a given data set (e.g. a web application or CMS) and instead of looking at everything as one wider unit, each element is broken down into related groups.
    • Regression: A definitive set of statistical processes centered on estimating the relationships among particular variables to gain a deeper understanding of particular trends or patterns.
    • Factor: A statistical practice utilized to describe variability among observed, correlated variables in terms of a potentially lower number of unobserved variables called ‘factors’. The aim here is to uncover independent latent variables.
    • Neural networks: A form of machine learning that models complex relationships in data through layers of interconnected ‘neurons’; the field is too broad to summarize in a single definition.
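
    To make one of these terms concrete, here is a minimal sketch of regression in pure Python (with made-up data): an ordinary least-squares fit of a straight line to paired observations.

```python
def linear_regression(xs, ys):
    """Ordinary least-squares fit of y = a + b*x for paired observations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    b = cov_xy / var_x          # slope: change in y per unit of x
    a = mean_y - b * mean_x     # intercept: predicted y at x = 0
    return a, b

# Hypothetical data: marketing spend (k$) vs. monthly signups
spend   = [1, 2, 3, 4, 5]
signups = [120, 150, 185, 210, 240]
intercept, slope = linear_regression(spend, signups)
```

    The fitted slope estimates how many extra signups each additional unit of spend is associated with, which is exactly the kind of trend insight the glossary entry describes.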

    7. Build a data management roadmap

    While (at this point) this particular step is optional (you will have already gained a wealth of insight and formed a fairly sound strategy by now), creating a data governance roadmap will help your data analysis methods and techniques become successful on a more sustainable basis. These roadmaps, if developed properly, are also built so they can be tweaked and scaled over time.

    Invest ample time in developing a roadmap that will help you store, manage, and handle your data internally, and you will make your analysis techniques all the more fluid and functional.

    8. Integrate technology

    There are many ways to analyze data, but one of the most vital aspects of analytical success in a business context is integrating the right decision support software and technology.

    Robust analysis platforms will not only allow you to pull critical data from your most valuable sources while working with dynamic KPIs that offer you actionable insights; they will also present the information in a digestible, visual, interactive format from one central, live dashboard. A data analytics methodology you can count on.

    By integrating the right technology for your statistical method data analysis and core data analytics methodology, you’ll avoid fragmenting your insights, saving you time and effort while allowing you to enjoy the maximum value from your business’s most valuable insights.

    9. Answer your questions

    By considering each of the above efforts, working with the right technology, and fostering a cohesive internal culture where everyone buys into the different ways to analyze data as well as the power of digital intelligence, you will swiftly start to answer your most important, burning business questions.

    10. Visualize your data

    Arguably, the best way to make your data analysis concepts accessible across the organization is through data visualization. Online data visualization is a powerful tool, as it lets you tell a story with your metrics, allowing users across the business to extract meaningful insights that aid business evolution. It also covers all the different ways to analyze data.

    The purpose of data analysis is to make your entire organization more informed and intelligent, and with the right platform or dashboard, this can be simpler than you think.

    Data analysis in the big data environment

    Big data is invaluable to today’s businesses, and by using different methods for data analysis, it’s possible to view your data in a way that can help you turn insight into positive action.

    To inspire your efforts and put the importance of big data into context, here are some insights that could prove helpful: facts that will help shape your big data analysis techniques.

    • By 2020, around 1.7 megabytes of new information will be generated every second for every single person on the planet.
    • A 10% boost in data accessibility will result in more than $65 million extra net income for your average Fortune 1000 company.
    • 90% of the world’s big data was created in the past three years.
    • According to Accenture, 79% of notable business executives agree that companies that fail to embrace big data will lose their competitive position and could face extinction. Moreover, 83% of business execs have implemented big data projects to gain a competitive edge.

    Data analysis concepts may come in many forms, but fundamentally, any solid data analysis methodology will help to make your business more streamlined, cohesive, insightful and successful than ever before.

    Author: Sandra Durcevic

    Source: Datapine

  • Embedded Analytics facilitates strategic choices

    The adoption rate of embedded analytics among business users is twice that of traditional business intelligence (BI) tools, according to the fourth annual State of Embedded Analytics report by Logi Analytics.

    The report, which studied how organizations embed business intelligence and analytics inside their software applications, indicates that embedded analytics continues to improve user satisfaction and increase end-user adoption of analytic tools.

    “The report shows that demand for self-service analytics is expanding beyond data analysts to everyday users, who need to monitor and measure key performance indicators,” said Brian Brinkmann, vice president at Logi Analytics. “If organizations want to see these users be successful, they need to offer analytics within the business applications they are using every day.”

    When users are forced to leave their preferred business applications to conduct analysis, they are less likely to use that analytics tool. The report found that 43% of application users leverage embedded analytics regularly, which is double the user adoption rate of traditional BI tools reported in the 2015 State of Self-Service Report.

    Application providers say they expect the adoption rate of embedded analytics to increase to 52% within two years.

    The report notes that for the first time, embedded analytics has become a top priority for executive management. More than one third (36%) of application providers say the CEO or executive management team is the primary driver of their decision to embed.

    Author: Bob Violino

    Source: Information Management


  • Forrester: The developments around enterprise BI platforms for 2019

    Information technology keeps moving forward at an ever-increasing pace. Business intelligence (BI) technology isn’t falling behind and keeps constantly evolving. BI vendors can no longer be categorized as:

    • IT-focused and enterprise-scalable vs. business-user-focused BI platforms mostly going after departmental and line of business use cases. All formerly IT-focused BI vendors have moved squarely into the business-user-focused territory. And most BI vendors that originally architected their platforms for ease of use, often sacrificing scalability, have introduced large-enterprise scalability features, technology, and architecture.
    • On-premises vs. cloud BI platforms. All formerly on-premises-only BI vendors now have cloud deployment options. 
    • Data visualization platforms. Forrester no longer considers data visualization a separate, distinct market segment. Rather, it is now a table-stakes capability of all BI and analytics platforms.

    Going forward (but no guarantees as the market will surely grow, mature, and morph again next year), Forrester will segment enterprise BI platforms into the following three categories:

    • Client-managed enterprise BI platforms. In this segment, clients are fully responsible for deploying their private instance of the BI software. They may choose to install it on-premises, in a public cloud, or hosted by a vendor. The client is ultimately responsible for the timing of upgrades and other software platform management decisions. Organizations that want to retain control over software upgrades and fixes should consider vendors in this category.
    • Vendor-managed enterprise BI platforms. In this segment, clients do not deploy but subscribe to software. A vendor maintains a single software instance and partitions it for logical private instances for each client. All clients are on the same software version, and all get the same continuous upgrades. Clients have no control over upgrades or other decisions. Organizations that are ready to completely shift software management responsibilities to the vendor should consider this category. Organizations must also be willing to use software deployed in a public cloud, as software in this category does not run on-premises.
    • In-data-lake enterprise BI platforms. These BI platforms (app server, metadata server, etc.) run entirely inside data lake clusters and do not move data (including result sets) out of clusters. Organizations that are mostly looking for a BI platform to analyze terabytes of data stored in data lakes, especially for detail-level (versus aggregate) analysis, should consider vendors in this category.

    Author: Boris Evelson

    Source: Forrester

  • Healthcare analytics and the opportunities to improve patient care

    Healthcare: everyone needs it, it’s a rapidly technologizing industry, and it produces immense amounts of data every day.

    To get a sense of where analytics fit into this vital market, Sisense interviewed Hamza Jap-Tjong, CEO and Co-Founder of GeriMedica Inzicht, a GeriMedica subsidiary. GeriMedica is a multi-disciplinary electronic medical record (EMR) company servicing the elderly care market and as such, their SaaS platform is filled with data of all kinds. Recently, they rolled out analytics that practitioners could use to improve the quality of care (versus the prior main use case in healthcare analytics, which was done by the billing and finance departments). This helps keep practitioners focused on helping patients instead of spending (wasting) hours in a software product. Hamza opened up about the state of healthcare analytics, how it can improve care for patients, and where the industry is going.

    The state of healthcare analytics

    As previously mentioned, the healthcare industry creates tons of data every day from a wide array of sources.

    'I think tons of data might be an understatement', says Hamza, citing a Stamford study. 'They were talking about data on the scale of exabytes (an exabyte equals a billion gigabytes). Where does all that data come from? Fitbits, iPhones, fitness devices on your person… healthcare data is scattered everywhere: not only treatment plans and records created by practitioners, but also stored in machines (X-rays, photographs, etc.)'.

    Data is the new oil, but without the right tools, the insights locked in the data can’t help anyone. At present, few healthcare organizations (let alone frontline practitioners) are taking advantage of the data at their disposal to improve patient care. Moreover, these teams are dealing with amounts of information so vast that they are impossible to make sense of without help (like from a BI or analytics platform). They can’t combine these datasets to gain a complete picture without help, either. Current software offerings, even if they have some analytical capabilities for the data that they capture, often can’t mash it up with other datasets.

    'In my opinion, we could really improve the data gathering', says Hamza. 'As well as the way we use that data to improve patient care. What we know is that when you look at doctors, nurses, physical therapists, everybody close to care processes and patients, is hankering for data and insights and analytics and we see that at the moment there isn’t a tool that is good enough or easy enough for them to use to gain the insights that they are looking for'.

    Additionally, the current generation of medical software has a high barrier to entry/learning curve when it comes to getting useful insights out. All these obstacles prevent caregivers from helping clients as much as they might be able to with analytics that are easier to use.

    Improving patient care (and improving analytics for practitioners)

    Analytics and insight-mining systems have huge potential to improve patient care. Again, healthcare data is too massive for humans to handle unaided. However, there is hope: Hamza mentioned that AI systems were already being used in medical settings to aggregate research and present an array of options to practitioners without them having to dig through numerous sources themselves.

    'Doctors or nurses usually don't work nine-to-five. They work long shifts and their whole mindset is focused on solving mysteries and helping the patients. They don't have time to scour through all kinds of tables and numbers. They want an easy-to-understand dashboard that tells a story from A to Z in one glance and answers their question'.

    This is a huge opportunity for software and analytics companies to help improve patient care and user experience. Integrating easy-to-understand dashboards and analytics tools within medical software lowers the barrier to entry and serves up insights that practitioners can use to make better decisions. The next step is also giving clinicians the right tools to build their own dashboards to answer their own questions.

    The future of healthcare analytics

    Many healthcare providers might not know how much analytics could be improving their work and the care they give their patients. But they certainly know that they’re spending a lot of time gathering information and putting it into systems (and, again, that they have a ton of data). This is slowly changing today and will only accelerate as time goes on. The realization of how much a powerful analytics and BI system could help them with data gathering, insight harvesting, and providing better care will drive more organizations to start using a software’s analytics capabilities as a factor in their future buying decisions.

    Additionally, just serving up insights won’t be enough. As analytics becomes more mainstream, users will want the power to dig into data themselves, perform ad hoc analyses, and design their own dashboards. With the right tools and training, even frontline users like doctors and nurses can be empowered to create their own dashboards to answer the questions that matter most to them.

    'We have doctors who are designers', says Hamza. 'They are designing their own dashboards using our entire dataset, combining millions of rows and records to get the answers that they are looking for'.

    Builders are everywhere. Just as the healthcare space is shifting away from only using analytics in financial departments and putting insights into the hands of frontline practitioners, the right tools democratize the ability to create new dashboards and even interactive analytics widgets, empowering anyone within an organization to get the answers and build the tools they need. Like many other industries, healthcare has to go through a technological transformation.

    Creating better experiences

    When it comes to the true purpose of healthcare analytics, Hamza summed it up perfectly:

    'In the end, it’s all about helping end users create a better experience'.

    The staggering volume of data that the healthcare industry creates presents a huge opportunity for analytics to find patterns and insights and improve the lives of patients. As datasets become more massive and the analytical questions become more challenging, healthcare teams will rely more and more on the analytics embedded within their EMR systems and other software. This will lead them to start weighing the presence (or lack thereof) and quality of those analytics when making buying decisions. Software companies that understand this will build solutions that answer questions and save lives; the ones that don’t might end up flatlining.

    Author: Jack Cieslak

    Source: Sisense

  • How the smart city can become a dumb idea

    Digital tracking systems: the ‘smart city’ is a promise to improve liveability, including in Dutch cities. But recent research reveals major pitfalls. Will an algorithm soon determine which neighbourhood residents pose a danger?


    Hudson Yards is currently still the largest construction site in Manhattan. At first glance, the first skeletons of its skyscrapers look no different from all the others in New York. But this is no ordinary neighbourhood being built.

    “We are going to measure air quality with sensors on every corner”, says Constantine Kontokosta, one of the project’s directors. “We will also continuously map temperature, light levels, and noise.” He also researches the project at CUSP, a New York University institute; he wants to combine the data with data from municipal complaint lines and social services, and possibly from energy companies as well.

    “We are going to track residents’ social media posts so we can measure their sentiment, and collect location data from mobile phones via wifi networks so we can see how they move through the neighbourhood.”

    From next year, when the first residents move in, Hudson Yards is to become the world’s first quantified community: a neighbourhood in which everything and everyone is measured at every moment of the day. “For the first time, we can analyse a neighbourhood in real time.” With the data, the city government can, for example, intervene quickly in cases of noise nuisance.

    In time, he also wants to collect data on residents’ physical activity, health, and weight, for example from wearables such as wristbands. “Entirely voluntary, of course.” And all data will be stored anonymised, he assures. Hudson Yards will become one of the most expensive neighbourhoods in the city; the developers expect strong demand for such a quantified life.

    Almost nowhere goes as far as New York, but all over the world citizens are being mapped digitally, often under the name ‘smart city’. Rotterdam is launching a ‘smart city approach’ this autumn. Since last year, Amsterdam has had a chief technology officer (CTO) who runs all kinds of experiments, from placing sensors for crowd measurements in bicycle parking facilities to the automated monitoring of crowds during events. Even the municipality of Midden-Drenthe has its own smart city app.

    “The promises are big: the smart city can deliver a great deal”, says Jorrit de Jong, academic director of Harvard University’s government innovation programme. Far more information becomes available on which municipalities can base their policies. “But there are also potential problems. And those are not always thought through equally well.”


    According to construction companies Related and Oxford Properties, the new Hudson Yards district is the largest construction project ever on the New York island of Manhattan. It carries a total price tag of more than 20 billion dollars (over 17 billion euros), and 16 towers with residential, retail, and office space are rising there.

    The builders are working with New York University to turn it into the first quantified community: a neighbourhood in which information is collected and analysed in all sorts of ways. The resulting insights can be used by the city government and by the project developers.

    These are the four potential problems:

    1 The smart city gets hacked

    Parts of smart cities are already hacked regularly: there are plenty of examples of ‘smart’ street lighting being switched off remotely by unauthorised parties and of traffic lights being manipulated by hackers. The systems of power plants and even lock gates have also been broken into more than once. The more sensors, infrastructure, and control systems get an internet connection, the more of them become vulnerable. “Not all small municipalities have the capacity to arrange security equally well”, says De Jong.

    And besides the physical systems, the data itself must also be well protected. Data from smart energy meters, for example, can reveal when someone is at home. Handy for burglars. Health data is an extra sensitive subject, for instance because it can be worth a lot to insurers. “The security and privacy protection of that data is absolutely essential”, says Kontokosta. But even he acknowledges that no system is 100 percent watertight.

    Beyond security, another important question is: who owns the data? One solution comes from Estonia, which in many areas is considered a frontrunner in the digitalisation of government. There, data about a citizen remains the property of that citizen. “That can make it more transparent what happens to the data”, says Kontokosta. But it is technically complicated to separate data about different persons from one another. For now, it is the municipalities or the operators of the sensors who end up holding the data.

    2 The algorithms become uncontrollable

    To discover patterns in the mountain of data, algorithms are needed. They search the data on the basis of a fixed set of instructions. The instructions programmed into such an algorithm determine the outcome. But what if those instructions are ethically unacceptable? There are already many examples of this.

    Besides being a researcher, De Jong is also director of a Harvard project in three American cities. There his team uses data to tackle problems with buildings where there is a lot of crime. The result? By comparing data on those buildings with buildings that went off the rails in the past, it is often possible to predict that problems will arise. “One of the predictive factors, for example, is a resident not paying his taxes.”

    Other data with predictive value include, for example, data on noise levels in the neighbourhood, data from the police, from the benefits agency, and other personal information about the residents. On the basis of such data and the weighing the algorithm makes, police officers, social workers, and civil servants can intervene more quickly when things threaten to go wrong.

    The algorithm thus steers decisions that can have a major impact on residents. De Jong: “It is very important that such an algorithm is sound.” Especially because factors such as residents’ ethnicity, age, or income level can potentially also play a role. “Then you quickly end up with ethnic profiling and other unconstitutional matters, for example.”

    Ger Baron, Amsterdam’s CTO, recognises this. He is developing a system that uses large amounts of data to help emergency workers assess how safe certain neighbourhoods are. The city wants this because firefighters and ambulance staff are sometimes attacked. The system works with colour codes: if a house or neighbourhood is home to people who fit the profile of people with violent behaviour, the code turns red and, for example, extra security personnel go along.

    How do you ensure that the algorithm determining that coding remains neutral, and does not, for example, wrongly turn red for houses simply because immigrants live there? “It could well be that the colour code is more often negative in low-income neighbourhoods, for example”, says Baron. It all depends on the criteria on which the algorithm decides.

    “Politicians and civil servants need to ask more often: what is that algorithm based on?”, says De Jong. Politicians read draft legislation before approving it; they should now also more often grasp the functional design of software applications and understand the algorithms. According to De Jong, administrators often lack the technical knowledge to properly assess the technology. “That need not be a problem, as long as they keep asking questions and gather information thoroughly, as they also do with draft legislation.”
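    The neutrality concern above can be made concrete with a small sketch. This is a hypothetical illustration, not the Amsterdam system: a toy scoring rule in which average neighbourhood income acts as a proxy variable, so two neighbourhoods with identical incident histories receive different colour codes.

```python
# Hypothetical illustration (not any real system): a toy risk score
# whose inputs decide the colour code a responder would see.
def risk_code(incidents_last_year: int, avg_income: int) -> str:
    """Return a colour code for a neighbourhood (illustrative thresholds)."""
    score = incidents_last_year * 2
    # Including income makes it a proxy variable: low-income neighbourhoods
    # are penalised even with an identical incident history.
    if avg_income < 25_000:
        score += 3
    return "red" if score >= 5 else "green"

# Same incident history, different codes, purely because of the income proxy:
print(risk_code(incidents_last_year=1, avg_income=20_000))  # red
print(risk_code(incidents_last_year=1, avg_income=40_000))  # green
```

    Asking "what is that algorithm based on?" amounts to auditing exactly these hard-coded criteria and thresholds.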

    3 Companies gain too much influence

    Amsterdam and Eindhoven both present themselves as an urban living lab: an urban laboratory for new technologies. That sounds like the language of technology companies. And it is. In Eindhoven, the French company Atos funds large-scale experiments. Ger Baron of Amsterdam says he is approached “daily” by companies that want to try out new technologies on Amsterdammers. He turns down the vast majority.

    Tech companies such as Samsung, Microsoft, IBM, and Alphabet (the former Google) have discovered the smart city and are closely involved in many projects, including in the Netherlands. In itself it is not strange that they take part, since the innovations come from those companies too.

    “But the interests of companies are certainly not always the interests of citizens”, says Baron. They sometimes hold different views on the ownership of the data, or on the transparency of projects. For city councils, openness and accountability are essential, but companies sometimes want more secrecy so as not to tip off competitors.

    The more municipalities take on smart city projects, the greater the risk that inexpert civil servants let themselves be seduced by technology companies.


    4 The government botches the IT

    At their core, smart city projects are simply IT projects. IT and government are not always a happy combination; projects costing millions frequently fail. If that happens with a smart city, municipalities are moreover collecting all kinds of sensitive information they can then do nothing with.

    De Jong of Harvard researched what determines whether a smart city project becomes a success or not. Three factors turn out to be crucial. The first is the ability of municipalities to collaborate, both internally and externally. Smart cities often revolve around partnerships with technology companies and cooperation between municipal departments. That often proves difficult.

    Kontokosta finds this recognisable. “Shared access to data in particular can cause problems: you rarely see the water policy bureau talking properly with the bureau that collects energy data, for example. And the combination of government bureaucracy with fast-moving technology companies does not always run smoothly either.” Baron runs into the same things in Amsterdam: “We really need to arrange that much better.”

    The second factor from the research is the ability of municipalities to set clear goals and attach measurable criteria to them. At present, thinking often starts from the technology: we have Twitter and all kinds of sensors, what should we do with them? Starting from citizens’ problems proves more productive: we have a problem with abandoned buildings, which technology can we use to solve it?

    Third, a municipality must be able to analyse all the data. That requires expertise municipalities do not always have, both Kontokosta and Baron confirm. One solution is hiring more data scientists, but they are in demand everywhere right now, including at companies with deeper pockets than municipalities. “Municipalities will have to compete harder for those people”, says De Jong.

    Meanwhile, the experiments push ahead rapidly. This autumn, telecom company KPN is launching a special network in The Hague and Rotterdam designed to connect all kinds of sensors with one another. KPN hopes this will mean the definitive breakthrough for smart city applications in the Netherlands, traffic that KPN would be happy to carry.

    Ger Baron also has many more short-term smart city plans for Amsterdam, and Kontokosta does not intend to stop at Hudson Yards. He and his team are already working on other quantified community projects: in a poorer neighbourhood of Brooklyn and in Manhattan’s financial district. There you can measure quite different things than in Hudson Yards.


    Source: NRC, 17 October 2015


  • How do data-driven organisations really distinguish themselves?

    You often hear it in boardrooms: we want to be a data-driven organisation. We want to get started with IoT, (predictive) analytics, or location-based services. And yes, those are sexy applications. But what are the real business drivers? They often remain underexposed. Research shows the areas in which organisations with a high level of ‘data maturity’ lead the way.

    SAS surveyed almost 600 decision-makers and, based on the answers, was able to divide the respondents into three groups: the frontrunners, a middle group, and the laggards. This gives a clear view of how the frontrunners distinguish themselves from the laggards.

    The first thing that stands out is their proactive attitude. Frontrunners free up budget to replace old processes and systems and invest in the challenge of data integration. There is also a culture of ‘continuous improvement’. These companies are constantly and actively looking for opportunities to improve. This in contrast to the laggards, who only want to invest in improvements once they know exactly what the ROI will be.

    The frontrunners most often replace their old systems with open source data platforms, with Hadoop by far the most popular platform. Besides technology, these companies also invest more in cleaning up data. They have set up sound processes to ensure that data is up to date and of the right quality for its intended use. And the governance of these processes is also better than in the companies that lag behind (read here about increasing the ROI on data and IT).

    Frontrunners also invest more in talent. 73 percent of these companies have a dedicated data team staffed with their own people. The laggards more often have either no data team at all or a team filled with external people. Frontrunners also invest more in recruiting and selecting qualified staff. As a result, ‘only’ 38 percent of the frontrunners experience a shortage of internal skills, compared with 62 percent of the laggards.

    All this means the frontrunners are better prepared for the GDPR, which takes effect in 2018.

    They are better able to identify the risks attached to a data-driven strategy, and they have taken measures to cover or reduce those risks.

    For many organisations, the arrival of the GDPR is a reason to invest in a sound data strategy. But it is not the only reason. Companies with a high level of data maturity can:

    • answer complex questions faster
    • make decisions faster
    • innovate and grow faster
    • improve the customer experience
    • realise growth in revenue and market share
    • achieve a shorter time-to-market for new products and services
    • optimise business processes
    • produce better strategic plans and reports

    Every reason, then, to truly invest in data governance and data management, and not merely to proclaim that your organisation is data-driven. After all, 90 percent of respondents consider themselves data-driven, but the reality is unfortunately less rosy.

    Interested in the full research results?
    Download the report ‘How data-driven organisations are winning’ here.


    Source: Rein Mertens (SAS)

    In: www.Analyticstoday.nl

  • How Big Data is changing the business landscape

    Big Data is increasingly being used by prominent companies to outpace the competition. Established companies and start-ups alike are embracing data-focused strategies to get ahead.

    In healthcare, clinical data can be reviewed to inform treatment decisions using big data algorithms that work on aggregated individual data sets to detect nuances in subpopulations so rare that they are not readily apparent in small samples.

    Banking and retail have been early adopters of Big Data-based strategies. Increasingly, other industries are utilizing Big Data like that from sensors embedded in their products to determine how they are actually used in the real world.

    Big Data is useful not just for its scale but also for its real-time and high-frequency nature that enables real-time testing of business strategies. While creating new growth opportunities for existing companies, it is also creating entirely new categories of companies that capture and analyse industry data about products and services, buyers and suppliers, consumer preferences and intent.


    What can Big Data analytics do for you?

    * Optimise operations

    The advent of advanced analytics, coupled with high-end computing hardware, has made it possible for organizations to analyse data more comprehensively and frequently.

    Analytics can help organisations answer new questions about business operations and advance decision-making, mitigate risks and uncover insights that may prove to be valuable to the organisation. Most organisations are sitting upon heaps of transactional data. Increasingly, they are discovering and developing the capability to collect and utilise this mass of data to conduct controlled experiments to make better management decisions.
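    A controlled experiment of the kind mentioned above can be as simple as comparing an outcome between a test group and a control group. A minimal sketch follows; all figures are invented for illustration, and a real analysis would also test for statistical significance.

```python
# Toy controlled experiment: compare conversion between a control group
# and a test group that saw a changed checkout flow (numbers invented).
control = {"visitors": 1000, "conversions": 50}   # 5.0% baseline
test    = {"visitors": 1000, "conversions": 65}   # 6.5% with the change

def conversion_rate(group: dict) -> float:
    return group["conversions"] / group["visitors"]

# Relative lift of the test group over the control group.
lift = conversion_rate(test) / conversion_rate(control) - 1
print(f"control: {conversion_rate(control):.1%}")
print(f"test:    {conversion_rate(test):.1%}")
print(f"relative lift: {lift:.0%}")
```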

    * React faster

    Big Data analytics allows organisations to make and execute better business decisions in very little time. Big Data and analytics tools allow users to work with data without going through complicated technical steps. This kind of abstraction allows data to be mined for specific purposes.

    * Improve the quality of services

    Big Data analytics leads to generation of real business value by combining analysis, data and processing. The ability to include more data, run deeper analysis on it and deliver faster answers has the potential to improve services. Big Data allows ever-narrower segmentation of customers and, therefore, much more precisely tailored products or services. Big Data analytics helps organizations capitalize on a wider array of new data sources, capture data in flight, analyse all the data instead of sample subsets, apply more sophisticated analytics to it and get answers in minutes that formerly took hours or days.
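    The ever-narrower segmentation described above can be sketched as layering additional attributes onto a coarse segment. All customer records, field names, and thresholds below are hypothetical.

```python
# Illustrative sketch: each extra attribute splits a coarse segment
# into narrower ones, enabling more precisely tailored offers.
customers = [
    {"id": 1, "spend": 900, "visits": 12, "channel": "mobile"},
    {"id": 2, "spend": 900, "visits": 2,  "channel": "store"},
    {"id": 3, "spend": 120, "visits": 14, "channel": "mobile"},
]

def segment(c: dict) -> str:
    # Coarse segmentation on a single attribute...
    value = "high" if c["spend"] >= 500 else "low"
    # ...refined with a behavioural attribute and a channel attribute.
    freq = "frequent" if c["visits"] >= 10 else "occasional"
    return f"{value}-value / {freq} / {c['channel']}"

for c in customers:
    print(c["id"], segment(c))
```

    One attribute yields two segments; three attributes already yield up to eight, which is the mechanism behind "ever-narrower" targeting.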

    * Deliver relevant, focussed customer communications

    Mobile technologies can now track where customers are at any point in time, whether they're browsing mobile websites, and what they're looking at or buying. Marketers can now serve customised messaging to their customers. They can also inform just a sample of people who responded to an ad in the past or run test strategies on a small sample.

    Where is the gap?

    Data is more than merely figures in a database. Data in the form of text, audio and video files can deliver valuable insights when analysed with the right tools. Much of this happens using natural language processing tools, which are vital to text mining, sentiment analysis, clinical language and named entity recognition efforts. As Big Data analytics tools continue to mature, more and more organisations are realizing the competitive advantage of being a data-driven enterprise.
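    The sentiment analysis mentioned above can be illustrated with a minimal lexicon-based scorer. The word lists and scoring rule here are invented for illustration; production systems use far richer lexicons or trained models.

```python
# Minimal lexicon-based sentiment scoring: count positive and negative
# words and return the difference (word lists are illustrative only).
POSITIVE = {"great", "fast", "reliable", "helpful"}
NEGATIVE = {"slow", "broken", "poor", "confusing"}

def sentiment(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("Great product but the app is slow and confusing"))  # -1
```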

    Social media sites have identified opportunities to generate revenue from the data they collect by selling ads based on an individual user's interests. This lets companies target specific sets of individuals that fit an ideal client or prospect profile. The breakthrough technology of our time is undeniably Big Data and building a data science and analytics capability is imperative for every enterprise.

    A successful Big Data initiative, then, can require a significant cultural transformation in an organisation. In addition to building the right infrastructure, recruiting the right talent ranks among the most important investments an organization can make in its Big Data initiative. Having the right people in place will ensure that the right questions are asked - and that the right insights are extracted from the data that's available. Data professionals are in short supply and are being quickly snapped up by top firms.

    Source: The Economic Times

  • How to act now to be successful in the future? Digital business models

    How to act now to be successful in the future? Digital business models

    Digital business models created around data are producing a winner-take-all market, not a digital divide. That’s why leaders need to “stop doing analytics for analytics’ sake, focus on the business problem, and define and ask the big questions of your data,” warns disrupting digital business author Ray Wang in 10 Enterprise Analytics Trends to Watch.

    The Constellation Research founder and principal analyst notes that digital leaders are now grabbing 70% of overall market share, and more than 75% of profits. A Harvard Business Review Analytic Services report that features insights from Wang warns brands of “an existential imperative; those companies that do not evolve into data-driven organizations will be supplanted by those that do.”

    For most, a long way to go and a short time to get there

    The Inflection Point for the Data-Driven Enterprise report, based on a survey of 729 business leaders conducted by Harvard Business Review Analytic Services, shows that while 90% of respondents say they’re confident that their organizations will achieve their vision of a data-driven enterprise, most have an alarmingly long way to go:

    • While 86% say the ability to extract new value and insights from existing data and analytics applications is very important, only 30% say their organization is currently very effective at doing so.
    • While 78% say accessing and combining data from a variety of external data sources is very important, just 23% say their organization is currently very effective at doing so.

    And those new digital business models that are, according to Ray Wang, creating a winner-take-all market? Only 28% of respondents say that introducing new business models is a key goal of their evolution into a data-driven organization. For leaders this is key to digital transformation, says Wang. For the remaining 72% that don’t have new business model creation or business model evolution as a goal, there’s simply no time to wait.

    “This is a top-down strategic business model decision that boards have to address,” says Wang. “Boards aren’t doing their jobs because they don’t understand the problem: they’re in a data war, and data is the weapon.”

    Leaders are moving further ahead, faster

    In 10 Enterprise Analytics Trends to Watch, Wang notes that you’ll also see analytics leaders applying artificial intelligence for business agility and scale. This automation and augmentation when it comes to data and insights is set to move leaders and fast followers even further ahead when it comes to digital transformation.

    “The situation in almost every market is that executives realize that they need to transform. They want to start using artificial intelligence, for example,” says Wang. “But they don’t realize that these changes happen along a continuum. It’s an intensive, multi-year process.”

    As the next decade looms, the race is on to make the most, and more than competitors, of data. Is your 2020 vision for data and analytics clear?

    “Every board member and CEO needs to understand that data assets have to be managed the same way they manage any other asset. If they don’t, they will be disrupted.” - Ray Wang, Constellation Research

    Source: Microstrategy

  • How to make a case for the importance of competitive intelligence

    How to make a case for the importance of competitive intelligence

    Whether you’re in product development, sales, or marketing, if you deal with competitive intelligence (CI) gathering, you already know how important it is to your business. You see how much information exists in your industry space, and you probably spend significant time and effort capturing and interpreting that data. It’s all worth it though, because the insights you uncover help keep your business on top. You know that an investment in CI is an investment in your future competitiveness.

    But when it comes to garnering executive support, the case for CI isn’t always self-explanatory. Executives often lack firsthand knowledge and context when it comes to the ins and outs of intelligence gathering. They see the end results, but they don’t see the late nights and last-minute scrambles that go into producing those results.

    In order to secure the funding and resources you need to keep your CI system running smoothly, keep these points in mind:

    Emphasize inevitability

    Competitive intelligence is more than a trend; it’s an old concept that’s been given new life by advanced web-scraping technologies, allowing businesses to gather information at an unprecedented rate. In 2015, around 17% of US businesses surveyed were using some form of data-gathering technology to inform business decisions. In 2018, that number rose to 59%. That type of expansion signals that whether it happens today, tomorrow, or next year, your business will inevitably find a reason to invest in intelligence gathering. So why wait and fall further behind the competition? Nobody wants to feel like they’re missing out on the next big technological breakthrough, and you can use that to your advantage when you’re looking for executive investment.

    Bring the numbers

    If abstract benefits aren’t enough to make your case, take a look at your company’s recent history. Find instances where intelligence gathering has had a measurable impact on your business’s finances. Better yet, find instances where overlooked information caused your company to miss out on a profitable opportunity. These anecdotes will drive home the concept that good CI is directly linked to your business’s financial health. In fact, 92% of executives surveyed in 2018 said that data gathering and analysis is 'extremely' important to their companies’ successes. It’s no longer a luxury; it’s an investment your business can’t afford not to make.

    Think ‘better, stronger, faster’

    Once you have the c-suite’s attention, it’s time to talk potential. Whatever your current CI process looks like, chances are it could be improved. Whether you proactively skim the competition’s social media or play catch-up after major announcements, you’re probably not grabbing as much data as you could be, or distributing it as efficiently as you could with a competitive intelligence service. In a 2018 Grid report, businesses already using an intelligence platform achieve on average a 51% adoption rate among users. If 49% of your workforce is left out of the CI process, you’re undoubtedly missing out on the valuable insights those employees might be able to offer.

    Building out your CI process means ensuring that you’re capturing, organizing, and distributing information to the stakeholders who need it, in a timely and efficient manner. If you don’t have a full-time staff member or team devoted to it, developing and implementing this type of process can be a daunting task. That’s why more and more businesses are choosing to outsource their CI gathering needs.

    Source: CI Radar

  • How to Optimize Analytics for Growing Data Stores

    Every minute of every day, mind-blowing amounts of data are generated. Twitter users send 347,222 tweets, YouTube users upload 300 hours of video, and Google receives more than four million search queries. And in a single hour, Walmart processes more than a million customer transactions. With the Internet of Things accelerating at lightning speed – to the tune of 6.4 billion connected devices in 2016 (up 30 percent from 2015) – this already staggering amount of data is about to explode. By 2020, IDC estimates there will be 40 zettabytes of data. That’s 5,200 GB for every person on the planet.

    This data is a gold mine for businesses. Or, at least, it can be. On its own, data has zero value. To turn it into a valuable asset, one that delivers the actionable intelligence needed to transform business, you need to know how to apply analytics to that treasure trove. To set yourself up for success, start out by answering these questions:

    What Is the Size, Volume, Type and Velocity of your Data?

    The answers to these questions will help you determine the best kind of database in which to store your data and fuel your analysis. For instance, some databases handle structured data, and others are focused on semi-structured or unstructured data. Some are better with high-velocity and high-volume data.

    • RDBMS (e.g. DB2, Oracle, MySQL): structured data; rich features, ACID compliant, but scale issues.
    • Adaptive (e.g. Deep Information Sciences): structured data; fast read/write, strong scale, ACID compliant, flexible.
    • NoSQL (e.g. Cloudera, MongoDB, Cassandra): un/semi-structured data; fast ingest, not ACID compliant.
    • Specialty (graphing, column store, time-series): multiple data types; good for reading, no writing, ETL delays.
    • In-Memory (e.g. MemSQL, VoltDB): structured data; fast, but less scale and ETL delays for analytics.
    • NewSQL (e.g. NuoDB): structured data; good scale and replication, high overhead.
    • Distributed (e.g. Hadoop): structured data; distributed, document-based, slow batch-based queries.
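    As a rough illustration of how these characteristics can drive the choice, the sketch below maps a few data attributes to a category from the list above. The rules and the function name are invented simplifications for discussion, not a definitive selection algorithm; a real evaluation would weigh many more factors (consistency needs, query patterns, cost, skills).

    ```python
    # Illustrative sketch: map rough data characteristics to a database
    # category. The rules are deliberately simplified and invented.

    def suggest_db_category(structure: str, high_velocity: bool, needs_acid: bool) -> str:
        """structure is one of 'structured', 'semi-structured', 'unstructured'."""
        if structure != "structured":
            return "NoSQL"      # e.g. MongoDB, Cassandra: fast ingest, not ACID
        if needs_acid and high_velocity:
            return "NewSQL"     # e.g. NuoDB: good scale and replication
        if high_velocity:
            return "In-Memory"  # e.g. MemSQL, VoltDB: fast, less scale
        return "RDBMS"          # e.g. DB2, Oracle, MySQL: rich features, ACID

    print(suggest_db_category("semi-structured", True, False))  # NoSQL
    print(suggest_db_category("structured", False, True))       # RDBMS
    ```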

    Which Analytics Use Cases Will You Be Supporting?

    The type of use cases will drive the business intelligence capabilities you’ll require (Figure 1).

    • Analyst-driven BI. Operator seeking insights across a range of business data to find cross-group efficiencies, profit leakage, cost challenges, etc.
    • Workgroup-driven BI. Small teams focused on a sub-section of the overall strategy and reporting on KPIs for specific tasks.
    • Strategy-driven BI. Insights mapped against a particular strategy with the dashboard becoming the “single source of truth” for business performance.
    • Process-driven BI. Business automation and workflow built as an autonomic process based on outside events.


    Where Do You Want your Data and Analytics to Live?

    The main choices are on-premises or in the cloud. Until recently, for many companies – particularly those concerned about security – on-prem won out. However, that’s changing significantly as cloud-based solutions have proven to be solidly secure. In fact, a recent survey found that 40 percent of big data practitioners use cloud services for analytics and that number is growing.

    The cloud is attractive for many reasons. The biggest is fast time-to-impact. With cloud-based services you can get up and running immediately. This means you can accelerate insights, actions, and business outcomes. There’s no waiting three to four months for deployment and no risk of development issues.

    There’s also no need to purchase and install infrastructure. This is particularly critical for companies that don’t have the financial resources or skills to set up and maintain database and analytics environments on-premises. Without cloud, these companies would be unable to do the kind of analyses required to thrive in our on-demand economy. However, even companies that do have the resources benefit by freeing up people and budget for more strategic projects.

    With data and analytics in the cloud, collaboration also becomes much easier. Your employees, partners, and customers can instantly access business intelligence and performance management.

    Cloud Options

    There are a number of cloud options you can employ. Here’s a quick look at them:

    Infrastructure as a Service (IaaS) for generalized compute, network, and storage clusters. IaaS is great for flexibility and scale, and will support any software. You will be required to install and manage the software.

    Database as a Service (DBaaS), where multi-tenant or dedicated database instances are hosted by the service provider. DBaaS also is great for flexibility and scale, and it offloads backups and data management to the provider. Your data is locked into the provider’s database solution.

    Analytics as a Service (AaaS) provides complex analytics engines that are ready for use and scale as needed, with pre-canned reports.

    Platform as a Service (PaaS) is similar to DBaaS in that it scales easily and that application backups and data management are handled by the provider. Data solutions themselves are often add-ons.

    Software as a Service (SaaS) is when back office software is abstracted through a hosted application with data made available through APIs. Remote analytics are performed “over the wire” and can be limiting.

    How you leverage data can make or break your business. If you decide to go the cloud route, make sure your service provider’s database and analytics applications fit your current and evolving needs. Make sure the provider has the expertise, infrastructure, and proven ability to handle data ebbs and flows in a way that’s cost-effective for you and, equally important, ensures that your performance won’t be compromised when the data tsunami hits. Your business depends on it.

    Source: DataInformed

  • How to Sell Your C-suite on Advanced Analytics

    But just because businesses are more open to exploring analytics and executives are dropping data science buzzwords in meetings doesn’t mean you don’t still have to sell your C-suite on investing in such technology. Analytics done right, with the best tools and a skilled staff, can get extremely expensive, and your C-suite isn’t just going to write you a blank check, especially if you can’t communicate how this investment will positively impact the bottom line.

    As the co-founder of Soothsayer Analytics, which applies artificial intelligence to build analytics tools for companies, Christopher Dole has experienced firsthand how difficult it can be to sell senior leadership on the ROI of advanced analytics.

    Since the founding of his company two years ago, he has continued to hone his pitch on prescriptive analytics, and he’s learned what information C-suite executives look for both before and after the launch of an analytics platform. He listed four pieces of advice for how to not only pitch an analytics program, but also ensure its continued success after its launch.

    Do your homework

    Prior to even scheduling a meeting with senior leadership, you must first arm yourself with the answers to every question that might get thrown your way.

    “I would definitely plan on meeting with any relevant colleagues, peers, or other internal stakeholders about issues and opportunities that they’d like to address,” said Dole. “And once you have some ideas you should also, in advance, meet with your data team and identify any relevant data — preferably data that’s clean and comprehensive — so then when you’re actually in front of the C-suite or board you can start by clearly defining where you’re currently at in the analytics journey, whether it’s the descriptive, diagnostic, predictive, or prescriptive level. If leadership says that your company is already doing analytics, yet they can’t predict what will happen or what can be done to perturb it, then they aren’t really doing analytics, and you should clearly articulate that.”

    It’s also important during your research to find examples of other companies’ experience with analytics solutions similar to the ones you’re proposing.

    “Talk about the value it created for them,” said Dole. “So, for example, if you’re starting on an analytics initiative and you’re a telecom provider, talk about how a competitor tapped into their stream of customer data to reduce churn and provide millions of dollars per year of savings.” When generating a list of examples, he said, try to focus more on instances that generated revenue or prevented losses as opposed to reduced waste. “Making money is often seen as sexier than saving money.”

    Start with the low hanging fruit

    If you’re just starting out in the analytics game, it may be tempting to ramp up a state-of-the-art program. But it’s actually more important to get some early wins by capturing the low-hanging fruit.

    “If possible, start with a larger problem that can be easily split into sub projects,” said Dole. “For instance, if you decide to focus on customer understanding, start with scientific customer segmentation. That way, once you know who your customers are, you can start to solve other problems that would require that understanding as a foundation anyway, whether it’s identifying opportunities for cross-sell and upsell, predicting and preventing churn, or forecasting customer lifetime value. These quick wins can typically be achieved within 12 weeks.”
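    As a minimal sketch of the kind of "scientific customer segmentation" quick win described above (the customer data, features, and cluster count are invented for illustration; a real project would run a library implementation such as scikit-learn's KMeans on actual customer attributes):

    ```python
    # Minimal k-means customer segmentation sketch (pure Python, illustrative).
    # The data below -- (annual_spend, visits_per_month) for eight hypothetical
    # customers -- and the two starting centroids are invented.

    def kmeans(points, centroids, iterations=20):
        for _ in range(iterations):
            # Assign each point to its nearest centroid (squared distance).
            clusters = [[] for _ in centroids]
            for p in points:
                d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
                clusters[d.index(min(d))].append(p)
            # Recompute each centroid as the mean of its cluster.
            centroids = [
                tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
                for i, c in enumerate(clusters)
            ]
        return centroids, clusters

    customers = [(120, 2), (150, 3), (130, 2), (900, 12),
                 (950, 14), (110, 1), (880, 11), (940, 13)]
    centroids, segments = kmeans(customers, centroids=[(100, 1), (1000, 15)])
    for c, seg in zip(centroids, segments):
        print(f"segment around {c}: {len(seg)} customers")
    ```

    Once segments like these exist, the follow-on problems mentioned above (cross-sell targeting, churn prediction, lifetime value) can all reuse the same segment labels as an input feature.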

    Set the proper expectations

    It can be incredibly tempting to hype the potential payoff of analytics, but overselling it can result in the C-suite viewing outcomes as failures when they would otherwise be considered wins.

    “It may be a month or two before any snippets of insight can be garnered, so it’s important that they are patient during the process,” said Dole. “A lot of what a data scientist is doing is identifying, collecting and compiling clean data into usable formats, and this can often take up to 60 percent of their time. Make sure they understand that a properly structured analytics project typically provides as much as a 13x ROI. There are many steps to achieving this, and everyone needs to be aligned on the ultimate goal.”

    Above all, you should keep it simple, stupid. It’s all too easy for a data scientist to get bogged down in technical jargon and respond to questions with arcane answers.

    “Use rich visualizations when possible because it’s much easier to understand a graphic than an equation or complex model,” said Dole. “Remove as much of the math and science as possible and just focus on the insights and the value that it’s going to create as well as all of the potential to expand upon it.”

    Source: Information Management, 2016, Simon Owens

  • ING and TU Delft join forces in new AI lab

    ING and TU Delft join forces in new AI lab

    ING and TU Delft are pooling their knowledge and expertise in the field of artificial intelligence (AI) within the financial sector in the new AI for FinTech Lab (AFL). The goal of the collaboration within the AFL is to use artificial intelligence technology to improve the effectiveness and efficiency of data and software analysis.

    ING and TU Delft have already been collaborating on software research and development for some time. Within the new AFL, researchers and students from TU Delft will conduct research into the development of software for the financial sector, including autonomous software and systems for data analysis and data integration. Within this collaboration, ING contributes an indispensable IT infrastructure, an ambitious organizational structure for software development, and a leading position in the field of data fluency and analytics delivery.

    Validated solutions

    According to Arie van Deursen, professor of software engineering at TU Delft and scientific director of the AFL, the AFL is a logical next step for TU Delft in its collaboration with ING. “It offers the opportunity to develop new theories, methods, and tools in the field of artificial intelligence, and to attract new talent. We expect that the collaboration within the AFL will lead not only to groundbreaking theories, but also to validated solutions that can be widely disseminated.”

    Görkem Köseoğlu, Chief Analytics Officer at ING: “The use of customer data offers great opportunities to develop better services, but must at the same time be handled with care. Customers’ data matters greatly to them, and ING attaches great value to its customers’ trust. The collaboration with TU Delft is therefore essential to achieving both goals.”

    The AFL is located at two sites: the ING campus in Amsterdam and the TU Delft campus in Delft. It thus brings together students, software and data specialists, researchers, and entrepreneurs from both organizations.

    AI for FinTech Lab and ICAI

    The AFL is part of ICAI, the Innovation Center for Artificial Intelligence: a national network focused on technology and talent development between knowledge institutions, industry, and government in the field of artificial intelligence. ICAI's innovation strategy is organized around industry labs, research labs that comprise multi-year strategic collaborations with industry. The AFL will thus be able to exchange knowledge and expertise even more effectively with other ICAI partners, such as Elsevier, Qualcomm, Bosch, Ahold Delhaize, and the Dutch National Police.

    Source: BI platform

  • Integrating analytics into business processes: Human or Machine?

    Integrating analytics into business processes: Human or Machine?

    As data science and analytics teams continue to feel pressure to deliver more value from analytics, many organizations struggle with the processes and technology required to deploy models into production and more rapidly make data-driven decisions. When evaluating how to best undertake these activities, organizations should consider an important distinction to determine the best path forward.

    Human or machine?

    The pervasive level of mature analytics requires that a significant number of analytics products are integrated into business operations. These products can be used to help humans make a better decision in the context of their workflow. For example, a call center rep might see cross-sell offers prioritized based on the context of an interaction. Or a supply chain manager might receive an alert that the forecasted price of a key ingredient has significantly changed so that they can take action.

    Analytics products can also be integrated into business processes to guide the decision making machines. Examples include automatically underwriting a life insurance application or managing the dynamic pricing of airline seats.

    The roles, processes, and technology needed to inform a human decision maker are very different from those needed to inform a machine decision maker. Exactly who (or what) will be making a decision should be established early in order to ensure that the final product is designed for the correct type of end user.

    Many data science and analytics groups develop models while focusing primarily on the data, analytic techniques, and business problem. In a pilot phase, this is sufficient. However, there are significant differences between automating a decision for a machine versus informing a decision for a human. These differences need to be accounted for throughout the analytical development and deployment process.

    The differences

    While the algorithmic approach may be consistent whether informing a human or a machine to make a decision, the supporting architecture, data provisioning, and process monitoring needs to be very different.

    For example, when call center employees are delivering offers from a script, they are able to notice when something has gone haywire and what they are seeing makes no sense. They may or may not deal with the situation optimally, but they will notice quickly, report the issue, and attempt to adjust.

    An automated machine-based decisioning process, however, will only be able to assess problems within the exact context and parameters that are programmed into it. The process may not even realize that something has gone haywire, let alone report it or try to adjust. Care must be taken to build in more validation logic and/or to have a human monitoring a process as it runs, much like how humans monitor manufacturing processes.
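    A minimal sketch of that kind of built-in validation logic, using an invented dynamic-pricing example (the model, bounds, and review queue are all hypothetical): every automated decision passes a sanity check, and anything out of bounds is escalated to a human instead of being acted on blindly.

    ```python
    # Illustrative sketch: wrap an automated decision in validation logic so
    # out-of-bounds results are routed to a human instead of acted on.
    # The pricing model, bounds, and queue below are invented.

    def model_price(base: float, demand_factor: float) -> float:
        """Stand-in for a real dynamic-pricing model."""
        return base * demand_factor

    human_review_queue = []

    def guarded_price(base, demand_factor, floor=50.0, ceiling=500.0):
        price = model_price(base, demand_factor)
        if not (floor <= price <= ceiling):
            # Sanity check failed: escalate rather than publish the price.
            human_review_queue.append(
                {"base": base, "factor": demand_factor, "price": price})
            return None
        return price

    print(guarded_price(100, 1.5))   # 150.0 -- within bounds, auto-approved
    print(guarded_price(100, 40.0))  # None  -- escalated for human review
    print(len(human_review_queue))   # 1
    ```

    The guard is deliberately dumb; its value is that an obviously wrong decision stops at the check instead of being stamped out millions of times.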

    There is an inherent advantage to having many people observing the actions of an analytic process on an ongoing basis. Enabling human-based decisions provides opportunities for the organization to find issues and deal with them. A machine-based process, on the other hand, will simply continue stamping out decisions regardless of an obvious-to-a-human problem arising. A much higher level of monitoring and results tracking is required when deploying analytics to facilitate a machine-based decision.

    Another major difference between human-based and machine-based decisions is the speed and scale of the decisions being made. While the number of cross-sell offers made in a call center in a day may be impressive, it would pale in comparison to the number of decisions being made by machines optimizing output within a complex assembly line. Machine-based decisions often occur at a rate that is orders of magnitude higher than human decisions. This necessitates a more scalable, sophisticated, and performant pipeline from raw data to the recommended decision.

    From many to millions

    As the practice of analytics matures within a company, opportunities that drive significant value by informing human back office or strategic decisions are typically found early. The current frontier for most companies is not assisting a few decisions that create millions in value but assisting millions of decisions that each create a small amount of value. These small individual values can add up to a massive number in total and the pursuit of this frontier is the foundation of the analytics revolution that is currently underway.

    While organizations must aggressively pursue analytics that inform both human-based and machine-based decisions, the differences in the requirements for each must also be understood and accounted for from the beginning. The failure to recognize the distinction has led many organizations to struggle more than needed.

    Source: Datafloq

  • Intelligence, automation, or intelligent automation?

    Intelligence, automation, or intelligent automation?

    There is a lot of excitement about artificial intelligence (AI), and also a lot of fear. Let’s set aside the potential for robots to take over the world for the moment and focus on more realistic fears. There is a growing acceptance that AI will change the way we work. There is also agreement that it is likely to result in a number of jobs disappearing or being replaced by AI systems, and others appearing.

    This has fueled the discussion on the ethics around intelligence, especially AI. Thoughtful commentators note that it is unwise to separate the two. Some have suggested frameworks for the ethical development of AI. Underpinning any ethical discussion, however, is the question of what exactly AI will be used for. It is hard to develop an ethics framework out of the blue. In this blog, we unpick this issue a little, sharing thoughts about where and how AI is used and how this affects the value that businesses obtain from AI.

    Defining intelligence

    Artificial intelligence has been defined as the ability of a system to interpret data, learn from it, and then use what it has learnt to adapt and therefore achieve particular tasks. There are therefore three elements to AI:

    1. The system has to correctly interpret data and draw the right conclusions.

    2. It must be able to learn from its interpretation.

    3. It must then be able to use what it has learnt to achieve a task. Simply being able to learn or, indeed, to interpret data or perform a task is not enough to make a system AI-based.
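    As a toy illustration of those three elements (everything below is invented and far simpler than a real AI system): the sketch interprets a sensor reading, learns a threshold from corrective feedback, and uses what it has learnt to perform a task.

    ```python
    # Toy sketch of the three elements of AI: interpret data, learn from it,
    # and use what was learnt to act. Invented for illustration only.

    class ThresholdAgent:
        def __init__(self, threshold=50.0, rate=0.2):
            self.threshold = threshold
            self.rate = rate

        def interpret(self, reading: float) -> str:
            # 1. Interpret the data: classify the reading.
            return "high" if reading > self.threshold else "normal"

        def learn(self, reading: float, correct_label: str):
            # 2. Learn: nudge the threshold toward agreeing with feedback.
            if self.interpret(reading) != correct_label:
                direction = 1 if correct_label == "normal" else -1
                self.threshold += direction * self.rate * abs(reading - self.threshold)

        def act(self, reading: float) -> str:
            # 3. Use what was learnt to achieve a task.
            return "raise alarm" if self.interpret(reading) == "high" else "do nothing"

    agent = ThresholdAgent()
    agent.learn(60.0, "normal")    # feedback: a reading of 60 is actually normal
    print(agent.threshold > 50.0)  # True -- the threshold adapted upward
    print(agent.act(51.0))         # do nothing -- would have alarmed before learning
    ```

    A system with only the `act` step would be plain automation; it is the interpret-and-learn loop that makes it qualify under the definition above.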

    As consumers, most of our contact with AI is with systems like Alexa and Siri. These are definitely "intelligent," in that they take in what we say, interpret it, learn from experience and perform tasks correctly as a result. However, in business, there is general acceptance that much of the real value from AI will come from automation. In other words, AI will be used to mimic or replace human actions. This is now becoming known as 'intelligent automation'.

    Where does intelligence start and automation stop, though? There are plenty of tasks that can be automated simply and easily, without any need for an intelligent system. A lot of the time, the ability to automate tasks overshadows the need for intelligence to drive the automation. This typically results in very well-integrated systems, which often have decision-making capabilities. However, the quality of those decisions is often ignored.

    Good AI algorithms can suggest extremely good options for decisions. Ignoring this limits the value that companies can get out of their investments in AI. Equally, failing to consider whether the quality of the decision is good enough can lead to poor decisions being made. This undermines trust in the algorithm. This results in less use for decisions, again reducing the value. But how can you assess and ensure the quality of the decisions made or recommended by the algorithm?

    Balancing automation and intelligence

    An ideal AI deployment should strike a balance between automation and intelligence. If you lean too far towards the automation side and rely on simple rules-based automation, all you will be able to do is collect the low-hanging fruit. You will therefore miss out on the potential to use the AI system to support more sophisticated decision making. Lean too far in the other direction, though, and you get intelligence without automation: systems like Alexa and Siri. Useful for consumers, but not so much for businesses.

    In business, analytics needs to be at the heart of an AI system. The true measure of a successful AI deployment lies in being able to mimic both human action and human decision making.

    An AI deployment has a huge range of components, it would not be unreasonable to describe it as an ecosystem. This ecosystem might contain audio-visual interpretation functions, multisystem and/or multichannel integration, and human-computer interface components. However, none of those would mean anything without the analytical brain at the centre. Without that, the rest of the ecosystem is simply a lifeless body. It needs the analytics component to provide direction and interpretation of the world around it.

    Author: Yigit Karabag

    Source: SAS

  • Intelligent organizations and the BI of the future: how?

    More and more business decisions are made on the basis of data. The difficulty is that this also has to happen fast. When a customer visits the company website, is on the phone, or is standing at the customer desk, the system must immediately deliver the appropriate information. Clearly, this takes more than a digital card index. In this whitepaper, we discuss one of the requirements for the new generation of analytics solutions: they are cognitive.

    What are cognitive systems?

    Cognitive systems understand people, engage in discussion with them, and learn from them. In addition to their command of natural language, these systems are built on deep learning and machine learning. They process enormous amounts of structured and unstructured data at lightning speed to form hypotheses and give recommendations. Because they learn from every interaction, they become more and more effective over time.

    More than a game

    A well-known example of a cognitive system is Watson, IBM's supercomputer. While Watson became famous mainly as a contestant on the American quiz show Jeopardy!, other cognitive systems have a lot to offer the business world. All of this is possible with a cognitive analytics solution:

    • Discover insights and patterns that would otherwise have remained hidden.
    • Use data and information to make new discoveries.
    • Use these insights to make better decisions and thereby achieve better results.
    • Dig ever deeper in search of answers, and in that way innovate within the organization.

    Whitepaper: The next wave of business intelligence for the data-driven enterprise

    This whitepaper takes a closer look at today's requirements for analytics solutions, focusing on one product, IBM Cognos Analytics. Download the whitepaper at www.analyticstoday.nl and you will be amazed at what is possible nowadays in the field of BI tools.

    Source: www.analyticstoday.nl

  • Is BI dying? Or is it the definition discussion again?

    The business intelligence industry is in free fall. Spend on BI tools will decrease more than 80 percent in 2018, according to the First Half 2018 Market Survey from Nucleus Research. Companies like Domo have been forced to go public at a fraction of their original valuations. And behemoths like MicroStrategy are on a spending spree despite stagnant revenue.

    It’s not that people don’t want to analyze information. In fact, the demand is stronger than ever. Over 83 percent of business professionals want analytics in the applications they already use, instead of wasting precious time switching applications. Business intelligence is failing because users don’t want to leave their day-to-day applications to analyze data in a standalone BI tool. 

    Why BI Is Dying

    Business intelligence hasn’t lived up to its promise to give users unprecedented access to business insights. Vendors have spent millions trying to improve their user experiences and deliver self-service. But nearly every BI tool forces users to leave their current workflows and open standalone applications to analyze data. 

    Increasingly, application teams are looking for new ways to deliver analytics that encourage user adoption and meet customer demand. Many are turning to embedded analytics solutions to help them deliver dashboards, reports, and self-service analytics in their applications.  

    Just look at the adoption rates for BI compared to embedded analytics. Users adopt embedded solutions twice as much as standalone tools. The only way to make sure everyone in an organization is on the same data page is to embed analytics in the applications they already use. 

    The Embedded Analytics Arms Race

    As BI struggles, embedded analytics is growing. Embedded analytics now represents 60 percent of new or additional analytics purchases, according to Nucleus Research. Traditional BI vendors are trying hard to wedge themselves into the embedded analytics space, but their architectures, pricing models, and support structures haven’t proved to be well-suited for the needs of application teams. 

    “Not all analytics platforms were built to be embedded,” says Gartner in its 5 Best Practices for Choosing an Embedded Analytics Platform Provider report. “Some providers attempt to offer embedded analytics offerings that are the same products sold to direct customers but lack required capabilities for a truly embedded experience.” 

    When application teams attempt to leverage traditional BI solutions or build something themselves to deliver embedded analytics, they end up with a disjointed experience that users reject. 

    So, what’s the bottom line? 

    BI had a 20-plus-year run, but demand for standalone BI is waning while embedded analytics grows. People naturally want their information in context of where they work. It’s becoming harder for businesses to justify the price of traditional standalone BI when so many integrated solutions—complete with embedded analytics—are on the market. 

    Over the next few years, standalone BI vendors will need to rethink their architecture and business models if they really want to support embedded analytics, or they’ll struggle to stay afloat.

    Author: Steven Schneider

    Source: Information Management

  • Monitor, evaluate and act to improve with DevOps analytics

    Monitor, evaluate and act to improve with DevOps analytics

    Today, most companies can say that they have integrated some form of DevOps collaboration between their development and operations teams. They are breaking down the silos, communicating better, and making the company more efficient as a result. But is that really true? Is your DevOps movement doing what it set out to do?

    DevOps first came about in 2007-2008 to fix problems in the software industry and bring continuous improvement and greater efficiency with it. The idea was that, to be more successful, development and operations teams that had never worked together before would now share responsibilities, communicate with each other, and be transparent, thus forming a high-performing DevOps team.

    For most companies, especially analytics businesses, the word 'efficiency' is what sparks interest. If the main goal is to bring about efficiencies, shouldn’t there be some measurement available to make sure the target is being met? 

    Of course, there should be. And it’s called DevOps analytics.

    DevOps analytics is the analysis of machine data to find insights that can be acted upon. In this case, insights that can be responded to in order to optimize a sequence or a larger process quickly. DevOps data analytics can be set up and measured at any time during your DevOps journey. Of course, the sooner you do it, the sooner you will be able to measure your successes and failures and make necessary adjustments. Every day is critical, especially in a DevOps environment where teams are working faster and more reliably. 

    Getting started

    Start by listing your KPIs. You should have at least one KPI for every part of your product cycle: planning, development, testing, deployment, release, and monitoring. Don’t overlook adding a few KPIs for the active cooperation between different teams, since this is the heart of the DevOps culture. Think in terms of developers working with their operations counterparts to deliver operable applications, or operations specialists monitoring applications delivered by development.

    Useful KPIs should be obtainable, reviewable, and actionable, with the ‘actionable’ part playing a significant role in analytics for DevOps. We’ll circle back to that later on in this blog.

    If you’re managing a team, you might want to consider adding usage analytics to your routine. This is an excellent way to see who is looking at their dashboards and using the data to move things forward.  

    With DevOps touching on so many different aspects of the entire company (teams, people, processes, etc.), there are multiple DevOps dashboards that can be created. Most of the metrics you will want to measure will fit within three categories:

    1. The process

    This is the ultimate measurement. With DevOps teams working together to improve the entire service lifecycle, from design through development to production and support, this is where DevOps data analysis will drive efficiency and effectiveness. 

    A good time-to-market dashboard can be used to measure and optimize the product development cycle. Phases of the product that are delayed can be identified quickly, and overall project delivery time can be shortened by acting on those insights.

    Quality assurance dashboards can help in managing the release of the project. Detailed information on the cases in progress and the number of completed and failed cases all contributes to the decisions that need to be made. Bottom line: if the quality assurance manager decides the release isn’t up to par, there had better be data to back it up.

    It is recommended that you set alerts for key milestones that you want to reach, and get notifications if you are under or over the target. This will give you time to adjust the process and get those sequences back on track before it affects the entire process.
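    As a rough illustration of such milestone alerts, a few lines of Python can compare actual KPI values against targets and flag any that drift beyond a tolerance. The milestone names, targets, and tolerance below are hypothetical, not taken from any specific DevOps tool:

```python
# Hypothetical milestone KPIs: target vs. actual values.
milestones = {
    "build_duration_min": {"target": 15, "actual": 22},
    "deploys_this_week":  {"target": 10, "actual": 7},
}

def check_milestones(milestones, tolerance=0.10):
    """Flag any milestone more than `tolerance` (fractional) off its target."""
    alerts = []
    for name, m in milestones.items():
        deviation = (m["actual"] - m["target"]) / m["target"]
        if abs(deviation) > tolerance:
            alerts.append((name, round(deviation, 2)))
    return alerts

# Both sample milestones are off target by more than 10%, so both alert.
print(check_milestones(milestones))
```

A real setup would push these alerts to a dashboard or chat channel rather than print them, but the threshold check is the same idea.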

    Set up your dashboard to include milestones, and then monitor the progress daily, even hourly if necessary. Just imagine what your daily scrum meetings will look like when you pull up a visual dashboard showing the team’s progress on build duration or cycle times. Nothing beats the visual impact of a chart to display the hard work of the entire team. And at the end of the sprint, you can pull up the historical data and discuss the output and what can be improved in the coming period.

    Once you execute a successful activity, you can use it again and again. With DevOps teams putting more emphasis on releasing software faster and more reliably, reusing good processes can speed up the frequency of releases without diminishing the quality.

    2. The end user

    The holy grail of measurements. If customers are not satisfied with the product or release, then it doesn’t matter how efficient or effective your DevOps process is. Customer success is the base of any business growth, and there are many metrics you can choose from to measure this: NPS (Net Promoter Score), customer satisfaction score, customer effort score, churn rate, expansion revenue, and more.

    Set up a customer service dashboard to track the most crucial customer service metrics for your business. Some examples of the metrics this dashboard shows are customer churn, customer engagement, and customer experience, as well as metrics like help desk tickets, CRM tracker, and call center analytics. It may not be obvious, but all of these items can be indicators that something is off and in need of a quick fix to bring value back to your customers. 

    It will also help to keep track of customer satisfaction by asking your customers straight-up if they would recommend your product/service to a friend or colleague. High or low scores here are a direct reflection of your DevOps processes, and how you deliver quality products to your customers.
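    That "would you recommend us?" question feeds directly into NPS, which is computed as the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal sketch with made-up survey scores:

```python
def net_promoter_score(scores):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Invented responses: 4 promoters, 3 passives (7-8), 3 detractors.
scores = [10, 9, 9, 8, 7, 7, 6, 4, 10, 3]
print(net_promoter_score(scores))  # (4 - 3) / 10 * 100 = 10
```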

    3. The people

    People metrics, or employees, are a great way to measure success. Happy employees tend to have a tremendous cascading effect on the entire company, and the accompanying processes.

    If you’re doing DevOps right, a majority of your employees will be experiencing changes, almost daily, in the way they collaborate with other teams. HR teams can use analytics to keep their fingers on the pulse of these teams and stay ahead of any issues that may be deviating from the norm.

    Here are some of the main HR KPIs that can be associated with DevOps: employee turnover rate, productivity, satisfaction, and even employee compensation (check to make sure you’re paying your DevOps people what they’re worth). HR teams can use people analytics to provide insight into organizational performance and engagement.

    Keeping your DevOps people happy will keep the spirit of this movement (cooperation, communications, and transparency) alive and kicking in your company.  

    Make it actionable and gain a competitive edge

    Once you’ve started to measure DevOps KPIs, the next step will be to start implementing changes in the process by becoming predictive with your analytics. Predictive analytics uses AI (Augmented Intelligence) and ML (Machine Learning) to identify other patterns and relationships that might be hidden or harder to realize. This is the actionable part of your KPIs. By using predictive analytics, you can detect certain anomalies, act on them, and gain a real competitive edge.
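    As a sketch of the statistical core of anomaly detection (not of any vendor's ML pipeline), a simple z-score check can flag values in a KPI series that sit far from the mean; the cycle-time figures are invented:

```python
import statistics

def find_anomalies(series, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(series)
    stdev = statistics.pstdev(series)  # population stdev; 0 for constant series
    return [x for x in series if stdev and abs(x - mean) / stdev > threshold]

# Hypothetical deployment cycle times in hours; one run took far longer.
cycle_times = [12, 13, 11, 12, 14, 13, 12, 30, 11, 12]
print(find_anomalies(cycle_times))  # the 30-hour outlier is flagged
```

Production predictive analytics layers forecasting models on top of this, but the actionable step is the same: detect the deviation, then act on it.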

    Now you’re making the most of your DevOps journey.

    Author: Dana Liberty

    Source: Sisense

  • Qlik in the leaders quadrant of Gartner’s Business Intelligence and Analytics Platforms Magic Quadrant for the sixth year in a row

    QLIK, a leader in visual analytics, has been positioned by Gartner, Inc. in the leaders quadrant of the 2016 Business Intelligence and Analytics Platform Magic Quadrant Report*. This is the sixth consecutive year that Qlik has held this position in the leaders quadrant, based on completeness of vision and ability to execute. The Gartner Magic Quadrant is a snapshot of how vendors perform within a market segment, intended to help end users select companies to partner with or whose products and services to purchase. The full report can be downloaded at www.qlik.com/mq2016.

    In the report, Gartner states: “Leaders are vendors that have a solid understanding of product capabilities and a commitment to their customers’ success in the current market. They combine this with a clear and attractive pricing model that supports proof of value and incremental purchasing. In the modern BI and analytics market, purchasing decisions are largely influenced and made by the business, which demands user-friendly, easy-to-buy products that clearly deliver added value. These products enable powerful analytics without requiring technical expertise or upfront involvement from IT. In a rapidly evolving market, a leader must also show that it is not focused solely on its current operations. A leader needs a solid plan that underpins its future position as a market leader, thereby securing today’s buyers’ investment for the future.” Gartner further states that net new purchases in the market are dominated by vendors that focus on agility and ease of use for business users, combined with the ability to retain control over deployments and to promote the responsible creation, distribution, and use of analytical content within the platform.

    Lars Björk, CEO of Qlik: “We have always been a disruptive force in the BI market. In the past, our Data Discovery offering was still a visionary concept that other market players were eager to add to their existing solutions. Today, Data Discovery is simply the standard for business intelligence, and we have maintained our visionary leadership. As a pioneer in the market, Qlik will continue to take bold steps to build a modern platform that not only delivers beautiful visualizations, but also provides enterprise-class governance and manageability that can grow with customers on their visual analytics journey.”

    As the BI market matures, Qlik is focusing on the next transformation of that market in order to keep meeting users’ evolving demands. Qlik recently announced Qlik Sense Enterprise 2.0, a modern platform-based approach to business intelligence. The offering supports a wide range of use cases across the organization, from self-service data visualization, reporting, and dashboards to guided analytics and embedded analytics, all built on the patented QIX Associative Indexing Engine. The company has also introduced the Qlik® Analytics Platform, designed specifically to help developers easily build any data-driven app imaginable. With Qlik® Cloud, they can share these applications for free, improve collaboration, and easily publish interactive charts via Qlik Sense Charts. With the acquisition of DataMarket, it is now also possible to seamlessly access on-demand external data sources.

    Determined to stay ahead in customer service and support as well, Qlik has introduced Qlik Proactive Support, the first proactive monitoring-as-a-service solution for BI, which helps prevent and resolve problems. Qlik Proactive Support offers real-time anomaly detection, live diagnostics, and regular health-check reports, and Qlik Support proactively provides recommendations to maintain optimal performance and reduce risk.

    After years of continuous innovation and of transforming BI, Qlik maintains its conviction that everyone should be able to analyze any data source, on premise or in the cloud, internal or external, without limits or boundaries. With Qlik’s associative experience, users, individually and in groups, can ask and answer questions wherever and whenever they want. This lets users explore information and make discoveries without being restricted to a predefined path of questions. After all, insight comes from truly understanding data from every angle.

    About Qlik
    Qlik (NASDAQ: QLIK) is a leader in visual analytics. Its product portfolio meets customers’ growing need for reporting and visual self-service analysis across guided, embedded, and custom-built analytics. More than 37,000 customers rely on Qlik solutions to make sense of information from multiple sources, discovering hidden connections in their data that lead to insights that spark good ideas. Qlik is headquartered in Radnor, Pennsylvania, and has offices around the world that, together with over 1,700 partners, serve more than 100 countries.

    About the Magic Quadrant
    Gartner does not endorse any vendor, product, or service depicted in its research publications, nor does it advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

    Source: Emerce

  • Real-Time Business Intelligence: What is it and why should every business use it?

    Real-Time Business Intelligence: What is it and why should every business use it?

    It’s becoming increasingly obvious that the pace at which we create new data won’t slow down anytime soon. With millions of streams and touch points, most businesses create gigabytes of data every minute, yet our rate of data comprehension can't keep up.

    To get the most out of data, companies need to analyze it as soon as it is created, when it can provide the most immediate and relevant insights. Unlike traditional models that look at historical data for patterns, real-time analytics focuses on understanding information as it arrives to help make faster, better decisions. Real-time business intelligence is a necessity more than a luxury nowadays, so it’s important to understand exactly what it is and what it can do for you and your business.

    What is Real-Time Business Intelligence?

    Real-time business intelligence is the use of analytics and other data processing tools to give companies access to the most recent, relevant data and visualizations. More than anything, this up-to-the-minute information lets organizations make smarter decisions and better understand their operations. To provide real-time data, these platforms use smart data storage solutions such as Redshift data warehouses, visualizations, and ad hoc analytics tools.

    To successfully provide you with the best data, real-time BI tools use a combination of serverless analytics (where data is transmitted directly to a dashboard or visualization) and data warehouses. This allows dashboards to show both real-time and historic data in a holistic way. It also lets companies provide users with the data they need to complete their jobs more effectively, and even assists in predictive analytics.

    Why is Real-Time Business Intelligence crucial for organizations?

    At its core, real-time BI is about understanding data faster and using it to make wise snap decisions. For organizations that produce gigabytes, or in some cases even terabytes, of data, much of this information loses its relevance once it’s sitting in storage. Information about inventory levels, customer needs, ongoing services, and more can all be incredibly useful, but more so if it’s analyzed as soon as it’s generated.

    Real-time analytics and BI also empower users in organizations to perform their own research using the data available to them. This includes performing ad-hoc analysis on existing data or creating visualizations for new streams. Finally, real-time BI helps organizations better understand trends and create more accurate predictive models.

    Who uses Real-Time Business Intelligence?

    There are several industries, sectors, and fields where using real-time BI can help to optimize an organization:

    • Customer relationship management suites (CRMs) can use real-time data to provide better service for consumers. This includes providing better engagement by tailoring services and conversations to consumers’ known preferences in the moment.
    • Logistics organizations can better understand their supply chains and workflows as they evolve. This allows for better decisions in emergencies (how to best alter a route for faster delivery) or simply to create more efficient chains.
    • Inventory managers can see their stock falling and rising as it happens, letting them make better calls about which products to order. By combining with historic trends, they can also create predictive models for ordering that automate time-consuming tasks.
    • Manufacturers can collect real-time data from machinery and production chains, and see how they are performing, as well as their condition. This allows them to both improve efficiency and productivity and resolve any maintenance issues before they become a full-blown emergency.
    • Risk analysts can take advantage of incoming data to continuously modify their risk models and make better calls on insurance, loans, and a variety of other financial decisions.

    What are the Real-Time Business Intelligence best practices?

    These are some of the best ways to ensure your real-time BI tools give you the best results:

    • Understand the value of your data – The value of data decreases as it ages. Understand how quickly this change happens and make sure your BI tools optimize your information.
    • Have a strong infrastructure built for your data – Once collected, data must be stored, scrubbed, and parsed. Having a clear data warehousing solution and organized data flow can help you optimize your results.
    • Collect the right data – Most importantly, real-time analytics works best when you’re using the most relevant data. Focus on the data that can give you the most pertinent answers for each situation.

    Real-time BI is becoming a central aspect of organizations’ decision-making. By implementing the right solution and empowering your teams to use it, you can start making upgrades and finding the best insights in your data.

    Author: Shelby Blitz

    Source: Sisense

  • Looking back at the Business Analytics congress

    The Business Analytics congress 2016 is over! This year’s theme was “How advanced are your analytics?”. A day after the event, we can look back on a very successful edition.

    With more than 135 participants, interesting practical cases from De Nederlandse Bank, DHL, and Amazon, among others, and a packed networking floor, the event explored in depth the added value that predictive analytics can offer your organization.

    We look back on a fantastic day with inspiring speakers and participants who went home with plenty of tips & tricks, ready to put them to use in their own organizations.

    Missed this congress? The next Business Intelligence congress is already scheduled four months from now.

    360º BI: The business intelligence congress 2017

    The business intelligence field is transforming on many fronts. Technical possibilities are expanding with big data and analytics, as are the skills people need to change from information consumers into information producers. Demand for self-service options within the company has become virtually standard, while the number of information sources keeps growing. The classic BI environment needs to adapt to this. Challenges enough!

    High time to take a fresh look at where business intelligence stands in your organization.

    • Does the current setup still fit the ambitions?
    • What steps need to be taken to be better positioned to gain advantages from data?
    • How do you keep the product of business intelligence reliable?

    This congress centers on Business Intelligence Maturity. Various business intelligence topics covering people, organization, and technology will be discussed, and practical cases will show how success can be achieved.

    Would you like to stay up to date on developments in the field, be inspired by other organizations, and share knowledge? Then sign up now to attend this congress on 7 February 2017 at Supernova, Jaarbeurs. Go to http://www.analyticstoday.nl/events/het-business-intelligence-congres-2017/ for more information and registration.


  • The 7 most relevant metrics for sales managers

    The 7 most relevant metrics for sales managers

    Sales managers need to be savvy and strategic to get ahead. These are the 7 metrics every sales manager must know and be able to measure.

    We are now in an age where sales managers have a myriad of advanced data measurement and analytics options available to them. Through sales analytics software, sales managers can gain insight into their sales team’s pipeline and have a team that works more effectively and efficiently. But are you measuring the right things?

    Data analytics solutions have revolutionised sales measurement. They enable sales managers to pinpoint where their teams can generate more leads, cross-sell and upsell to existing customers, and define customer profitability. The potential exists for sales managers to enjoy great benefits; whether they realize them depends on how they use the solutions available.

    So, are you measuring the necessary metrics to ensure your sales team are working at optimum level? We outline the seven metrics every sales manager should know and use.

    1. The sales pipeline

    This is a great way to gauge a company’s health. Sometimes presented in a graphical format, it shows the sales opportunities the company currently has and an estimation of the amount of revenue the sales team is going to generate in the coming months. If the opportunities within the pipeline are managed well, the sales team will stay organised and feel more in control of their sales figures, giving the sales manager more confidence in the targets that can be achieved.

    What metrics should be measured in a sales team’s pipeline?

    • Number of potential deals in your pipeline
    • Average size of a deal (in €/$) in your pipeline
    • Average percentage of deals that are converted from leads to customers
    • Average time deals are in the pipeline (measured in days)
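    The four pipeline metrics above can be computed directly from deal records. A minimal sketch, where the field names and figures are hypothetical:

```python
from datetime import date

# Hypothetical closed deal records from a CRM export.
deals = [
    {"value": 12000, "won": True,  "opened": date(2024, 1, 5),  "closed": date(2024, 2, 9)},
    {"value": 8000,  "won": False, "opened": date(2024, 1, 10), "closed": date(2024, 1, 30)},
    {"value": 20000, "won": True,  "opened": date(2024, 2, 1),  "closed": date(2024, 3, 12)},
]

def pipeline_metrics(deals):
    """Deal count, average deal size, win rate, and average days in pipeline."""
    n = len(deals)
    return {
        "deal_count": n,
        "avg_deal_size": sum(d["value"] for d in deals) / n,
        "win_rate": sum(d["won"] for d in deals) / n,
        "avg_days_in_pipeline": sum((d["closed"] - d["opened"]).days for d in deals) / n,
    }

print(pipeline_metrics(deals))
```

In practice these numbers come straight out of a BI dashboard, but the underlying aggregation is no more than this.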

    2. Sales revenue

    Measuring the revenue a sales team brings in, instead of only their profit margin, gives a sales manager more insight into the business’ performance. If a company experiences steady “top-line growth”, its performance in that period can be viewed as positive even if earnings growth, or “bottom-line growth”, didn’t change.

    Measuring revenue also allows you to identify the profitability of the business. By calculating the profit ratio (divide net income by sales revenue), businesses can reveal how much of every dollar brought in by sales actually makes it to the bottom line.
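    As a quick worked example of the profit ratio described here (the figures are invented):

```python
def profit_ratio(net_income, sales_revenue):
    """Profit ratio: net income divided by sales revenue."""
    return net_income / sales_revenue

# $150k net income on $1.2M revenue: 12.5 cents of every dollar of sales
# makes it to the bottom line.
print(profit_ratio(150_000, 1_200_000))  # 0.125
```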

    3. Forecast accuracy

    Forecasts will never be exact, but there are tools available that can help a business create as accurate a forecast as possible. The accuracy of a sales team’s forecasts needs to be measured on an ongoing basis to ensure that they are continually reaching their predicted targets, or at least getting closer to them as time goes on. Producing accurate forecasts enables a company to reveal issues threatening the business as well as available opportunities.

    4. Sales funnel leakage

    No sales team wants a leaky funnel, but sometimes, with limited technology and manpower, this can happen. It’s imperative to know where the holes in your funnel are, how they occurred, and how you can ‘plug’ them. Things to review include:

    • Lead response time: a business that responds quickly to a sales qualified lead is more likely to win the sale
    • Rate of follow-up contact: persistence is key; a sales team should be continually following up with a lead via phone calls and emails until the lead is deemed no longer qualified

    By constantly monitoring this data and putting means in place to avoid opportunity leakage, the overall sales numbers will improve.
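    Lead response time, for instance, can be measured straight from a lead log. A minimal sketch, with invented timestamps and field names:

```python
from datetime import datetime

# Hypothetical lead log: when each lead arrived and when it got a response.
leads = [
    {"created":   datetime(2024, 3, 1, 9, 0),
     "responded": datetime(2024, 3, 1, 9, 45)},
    {"created":   datetime(2024, 3, 1, 10, 0),
     "responded": datetime(2024, 3, 1, 13, 0)},
]

def avg_response_minutes(leads):
    """Average minutes between lead creation and first response."""
    total = sum((l["responded"] - l["created"]).total_seconds() for l in leads)
    return total / len(leads) / 60

print(avg_response_minutes(leads))  # (45 + 180) / 2 = 112.5 minutes
```

Tracking this number over time shows whether a funnel-leakage fix is actually working.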

    5. Win vs loss rate

    It’s important to understand the reasons why leads buy or don’t buy a company’s product or service. This information is crucial as it can assist in improving a sales team’s close rate thus gaining more market share for the business. 

    6. Cross-sell and upsell opportunities

    Cross-selling and upselling can be complex and risky. However, with the challenges around new customer acquisition, businesses must find ways to improve sales from existing customers. With the right analytics tool, businesses can identify cross-selling and upselling opportunities in the organisation and ultimately, generate more sales for the business.

    7. Closure rate or “win rate”

    It’s important to be aware of how many leads or opportunities are being converted into customers. This metric focuses on the final stage of a sales team’s pipeline. By this point a sales team would have invested a lot of time and resources into the lead so this rate should be as high as possible.

    A low or constantly changing closure rate signifies a lack of competitiveness in the market: it means the value proposition being offered to leads is not good enough. It may also mean that the sales team requires additional training.

    Measuring a sales team’s performance has evolved from the simple spreadsheets used back in the 20th century. There are now advanced business intelligence software options that provide dynamic reporting capabilities with dashboards to help automatically track key metrics. This gives a sales manager the ability to become more proactive as well as make more insightful and strategic decisions that will benefit the company.

    Source: Phocas Software

  • The BI trends your business cannot neglect in the near future

    The BI trends your business cannot neglect in the near future

    According to the World Economic Forum’s Future of Jobs Report, the top five trends set to positively impact business growth through 2022 are (1) the increasing adoption of new technology, (2) the increasing availability of big data, (3) advances in mobile internet, (4) advances in AI, and (5) advances in cloud technology.

    The nexus of these and other trends, and their accelerated innovation and development (as an example, think of how fast we’ve gone from rotary phones to smartphones, to the dematerialization of other devices onto smartphones, and now to 5G), raises the imperative for organizations to focus their next-decade vision and investment strategy now.

    Consider these 2020 and beyond assertions for enterprise analytics and mobility from Ventana Research:

    • By 2020, analysis of streams of IoT event data will be a standard component of nearly all big data deployments.
    • By 2021, two-thirds of analytics processes will no longer simply discover what happened and why, they will also prescribe what should be done.
    • By 2022, one-half of organizations will re-examine the use of mobile devices and conclude the technology being used does not adequately address the needs of their workers, leading them to examine a new generation of mobile apps.

    And that’s just a start. In '10 Enterprise Analytics Trends to Watch in 2019', Ventana Research CEO Mark Smith notes that in addition to 5G, enterprise organizations’ mobility strategies must absolutely address accelerating technologies and capabilities, such as:

    • Device proximity features that can provide environmental context and suggest where to take action based on location.
    • Gestures and camera-based input that make it even easier and faster to engage with business applications.
    • Biometrics, from facial recognition to fingerprints, that enable significantly better device, data, and enterprise security.
    • High-quality device cameras that make it easy to capture, share, and use photos and videos and their data within business processes.
    • Augmented reality (AR) that enables the use of a mobile device’s camera to digitally interpose virtual objects to enhance work experiences.
    • Speech recognition and voice assistants on mobile devices that make it simpler for users to access information and act quickly.

    The future is here. Is your organization ready to take advantage of the accelerated innovation around enterprise analytics and mobility?

    Source: MicroStrategy

  • The essence of centralizing analytics: a health system perspective

    Hospitals and health systems continue to invest in data analytics, but (too) often a fragmented, decentralized approach to analytics delivery models results in excessive costs, inefficiency and missed opportunities to improve patient care.

    A number of factors have coalesced in recent years to catalyze greater investment in healthcare analytics – the ongoing transition to new payment models under value-based care, a greater emphasis on the health of populations, and increasing competition, as well as the explosion in available health data from electronic health records, laboratory test results, and wearable devices, to name a few.

    The momentum isn’t expected to slow down any time soon. A recent report from Zion Market Research predicts the global healthcare analytics market to grow to $68 billion in 2024 from approximately $20 billion in 2017, a compound annual growth rate of more than 19 percent.

    While there’s no question that provider organizations are busy writing checks to healthcare analytics vendors, there is some question about whether they’re getting an adequate bang for their buck.

    For example, a Deloitte survey of U.S. hospitals and health systems with greater than $500 million in revenues found that fewer than half of respondents said their organization had a clear, integrated data analytics strategy, while about one in four didn’t have a data governance model in place at all. Even more problematic, about one in three reported that they didn’t know their organizations’ total analytics spend.

    Multiple vendors, no single source of truth

    A common cause of many of these issues is a decentralized approach to analytics in which data analysis happens in different business units that do not share assumptions, analytics methods or insights broadly. In contrast, under a centralized delivery model, an experienced team of data analysts report to one function at the enterprise level, even if they are assigned to serve different business units, based on strategic priorities set at the corporate level. This business-oriented team of analysts meets the need of organizational stakeholders while maintaining and developing in-house intelligence.

    In large part, a centralized analytics delivery model is important because it offers an improvement to the fragmented, incomplete data governance models that too many providers still use. For example, it’s not uncommon for large health systems to contract with multiple vendors to analyze population health risk for groups of patients with different conditions, such as diabetes and osteoarthritis among others.

    This lack of a single source of truth in analytics can lead to different answers to the same question, such as conflicting guidance on levels of risk, and in turn, on the highest-priority patients to target for interventions. As a result of this fragmented and potentially conflicting information, when prioritizing care plans and interventions, the health system cannot build a consistent clinical profile with a 360-degree view of each patient that accounts for the same factors.

    This results in health system decision makers being left wondering which vendors’ information they should believe.

    Delivering analytics as a service across the organization

    In addition to the fragmentation of data, there are a number of common barriers that prevent hospitals from efficiently and cost-effectively deploying analytics across their organizations, including territorial disputes over data, unclear roles and responsibilities and competition for already-scarce resources.

    As with virtually all organizational transitions, success in centralizing analytics starts with buy-in at the top. Strong executive leadership must bring together talented people with deep experience in applying analytical expertise to solving pressing clinical and business issues.

    A best practice is to place a senior-level executive in charge of analytics, potentially in a Chief Data Officer role, to lead the organization’s centralization initiative. A key function of this role is to establish effective and comprehensive data governance practices, clearly defining what type of data the organization will collect, how the data is structured, who can access it, and how it gets reported and presented to different people in the organization, among other steps.

    Once the organization establishes a solid foundation for data, it will be ready to adopt a single analytics platform that delivers actionable information to decision makers. Today’s leading analytics platforms often employ machine-learning systems to automatically extract important insights that may not be otherwise apparent to human analysts.

    Ultimately, the aim is the creation of one internal, centralized professional services group within the organization that delivers analytics as a service to other stakeholders in the hospital. By structuring a hospital’s analytics functions this way, the organization can eliminate the fragmentation and cacophony of multiple systems that offer conflicting insights and prevent leadership from understanding the organization’s full analytics spend.

    Centralization in practice

    Already, prominent health systems like University of Michigan Health System (UMHS) and Beth Israel Deaconess Medical Center (BIDMC) have taken the leap to centralized analytics delivery models. UMHS, for example, has created comprehensive registries for population health and used them to generate predictive analytics that focus predominantly on chronic diseases. BIDMC, through its centralized analytics governance model, provides layers of decision support and analytics for its physicians, with the goal of understanding variations in cost and care to maximize quality, safety, and efficiency.

    In the future, the insights derived from centralized analytics delivery models are likely to help hospitals improve quality, lower costs, identify at-risk populations and better understand performance. For that to happen, however, hospitals and health systems must first overcome the fragmented, decentralized approach to analytics that prevents them from realizing the full value of their analytics investments.

    Source: Insidebigdata

  • The impact of analytics on the field of marketing

    The rise of analytics is affecting every industry. The field of marketing is ahead of the curve, having been a metrics-driven industry for decades, but which metrics a company decides to focus on, how they track, report, and analyze those metrics, and what they do with that analysis makes a big difference. CEO Garrett Mehrguth from Directive Consulting talks about the impact that analytics (overall and customer-facing) have had, have, and will have on the marketing world.

    Beware analysis paralysis

    If you’re in a data-driven industry, then there’s no downside to tracking and analyzing your performance data. However, this cuts both ways: the idea of “if it’s not measurable, it’s not valuable” has taken hold in a lot of marketing organizations. Relatedly, sometimes organizations will latch onto a metric just because they can measure it, regardless of whether it is truly tied to results. To complicate matters further, having too much data can cause organizations to become so fixated on tracking every metric that they spend less time actually doing their jobs. People get lost in dashboards, fiddling with numbers instead of actually making progress. These companies get stuck in a morass of data points, charts, and dashboards that look and feel important, but might not help improve the bottom line. This type of dashboard fetishism that stymies action is an example of analysis paralysis.

    “I see this constantly happening to in-house marketers and others. You get so much data you’re inundated with and you want to track everything, and then you forget that keeping track of everything, even when it’s automated, becomes such a component of your time that you actually no longer have time to move the needle. You’re spending all this time in this analysis paralysis…” said Garrett Mehrguth.

    Having the right focus

    In a world of distractions, both companies and individuals are constantly exhorted to have the right focus (while being exposed to a slew of inspirational posts, articles, Instagram accounts, etc). What this means for marketing companies (or any company involved in analytics) is that they should move away from vanity metrics and focus on the KPIs that really make a difference in their business or their clients’ businesses.

    Embedded analytics solutions give end-users easier insights into their own metrics by easily combining complex data from multiple sources. Buying that solution instead of building it in-house allows companies to build and ship robust analytics to their customers without investing a ton of developer resources in something that’s not the core competency of their business. However, where marketing companies can truly add value to their clients’ lives is with expertise: understanding which metrics matter to their bottom line and what those numbers and trends mean is important for the kinds of relationships that marketing companies should be trying to build with their clients.

    “We don’t work with any organizations that don’t have access to their own analytics and have a pretty good idea of what’s going on.” said Garrett Mehrguth. He also stressed that knowing which metrics matter most and having analytics that let them see and act on those numbers leads to better outcomes for marketers especially: “Knee-jerk marketing is the worst marketing in the world, and a lot of that reactivity comes from what marketers think their marketing should be accomplishing, not what reality says that their marketing can accomplish.”

    The right analytics solution gives clients a detailed look at how their companies are performing on their own terms and compared to their market peers. If a company is down year-over-year and their entire segment is down, then this level of transparency helps them see that and adjust expectations accordingly. Meanwhile, if their growth has been sluggish, but they’re still beating competitors, then they can see that as well. It’s all about helping users to get a better understanding of their world and keep their mindset focused on the possible and the achievable, instead of torturing themselves with unattainable goals. This kind of transparency also helps client relationships:

    “The reason I love being transparent and being held accountable to the quality of our efforts and our recommendations is that if you make an account more than they’re paying you, it’s pretty hard to get fired,” said Garrett Mehrguth.

    The holistic approach

    So often, when working with a consulting firm, the client is focused on end results, but doesn’t have any idea whether or not the relationship is working well until it’s either too late or the contract is over, at which point they may or may not have gotten what they wanted out of it. In-depth analytics allow service-providers to prove efficacy and value quickly, easily, and continuously. One interesting thing that Directive does with their in-house analytics takes their relationship-building to the next level. Their account managers use in-house metrics to take a holistic approach to customer service:

    Garrett Mehrguth: “We’re using our analytics for quality control based on qualitative data, all based on client feedback: ‘hey, this client’s a little uneasy,’ ‘hey, this client’s really happy,’ ‘this client needs some more attention.’ And now what we’re doing is using that data across the entire portfolio to get what we call leading indicators.”

    These leading indicators, created using a mix of the complex data at Directive’s disposal for in-house purposes (not client-facing), help them tie the actions the team is taking with clients to the results those clients are experiencing. In some cases, this means adjusting expectations or allocating additional resources to correct a situation. What it helps prevent is unhappy customers complaining late in the process, as per Garrett Mehrguth:

    “Sometimes it takes a little while to right the ship of a struggling client, but if you don’t know about that and you’re just waiting until they complain—which they have every right to do, because you haven’t been doing your job—then you’re behind the eight-ball.”

    So when it comes to marketing, in-depth analytics are good for marketers, marketing clients, and marketing service companies and consultants. They’re so important that Garrett Mehrguth sees major hurdles in the future for any marketing company that doesn’t offer detailed reporting and support to help their clients get the most out of it:

    “If you’re in marketing and you’re not correlating your efforts back to actual revenue with analytics, you’re going to struggle to close clients.”


    Analytics are everywhere. From a start as spreadsheets and CSV files to the era of self-service and the looming AI revolution around us, no industry can afford to stand still when it comes to its analytics. Marketing, steeped as it is in a history of metrics and data, is a leading-edge case. Marketers have more data than ever before and are struggling to avoid analysis paralysis and have the right focus. Service providers have to do more than just deliver amazing analytics and insights: they also need to be supportive, helping to guide their clients to make the smartest decisions and understand their markets. If your business can manage this, it’ll be able to prove value, delight its customers, and build the future the way you want.

    Source: Sisense

  • The massive impact of data science on the web development business

    “A billion hours ago, modern Homo sapiens emerged.
    A billion minutes ago, Christianity began.
    A billion seconds ago, the IBM personal computer was released.
    A billion Google searches ago… was this morning."

    - Hal Varian, Google’s Chief Economist, December, 2013 (From the book: Work Rules! by Laszlo Bock)

    The last line of the above quote characterizes the world’s hunger for information. Information plays a huge role in our life. Information consumed by our senses helps our mind in making decisions. But what happens when the mind is flooded with information? You get confused, annoyed and scared of decision-making. This is where your computers and processors come to rescue, and this is when the term 'information' is replaced by 'data'.

    Every minute, more than a hundred hours of video content is uploaded to YouTube. From application stores, over 50 billion apps have already been downloaded since 2008. There are more than 2 billion people signed up on social media websites. These numbers give you just a glimpse of the amount of data flowing through the optical fibers every second around the world. And now the question comes: how can this massive amount of data be made useful? The answer is analytics. If you know how to play with numbers and extract the nectar of useful insights from this huge amount of data using appropriate analytical tools, then you, my friend, are a real data scientist.

    Data science is helping many businesses, irrespective of them being B2B or B2C. But in this article, we are going to talk more about its role in one of the biggest B2B industries: Custom Web Development. If you are a web developer, you must not ignore the rise of data science in your profession, and if you are thinking about hiring one, then you should know about the latest trends to supervise the development process in a better way. So, let’s discuss the impact of data science in the transformation of web development:

    1. Re(de)fining the software solutions

    Not a very long time ago, web developers used to be creative with page layouts and menu details. It was generally guess-work, but now data science tells web developers about the layouts and details of the competitor websites. Hence, they can propose a unique design after carefully evaluating the competition.

    Also with the help of the latest analytical tools, web developers can know what the requirements of the end users are. They can suggest particular functions or features which are popular among the customers based on the analysis of consumer data. In this way, data science is assisting the developers in providing better and faster software solutions to their clients.

    2. Automatic updates

    Gone are the days when updates had to be manually administered by the developers. This is the era of automation. Machine learning has enabled tools to analyze consumer behavior and data available on social media platforms to come up with required updates. The websites are made self-learning so that they can improve themselves with the changing demands of the customers. It is possible only because data science is doing its job perfectly.

    Although this part still faces some challenges in creating customized solutions for different clients, custom web development services will soon make it a piece of cake with the help of data science.

    3. Customizing for end users

    So far, we have discussed how web development can be customized for clients using data science, but the real goal should be the satisfaction of end users. And satisfaction is a dependent variable of personalization. To create a personalized product for the users, you need to know them, and in this regard data science is helping web developers.

    The spending habits, interest areas, preferred websites, geographical location, age, gender, etc. – all this information about end users is used to create algorithmic models which can predict a consumer’s alignment towards your web apps. Using these models, you can not only give the user a personalized experience on the website but also strategically place your ads targeting specific customer segments, thus creating a win-win situation for both buyer and seller.
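
    As a minimal sketch of the kind of algorithmic model described above (the feature names, weights, and threshold here are entirely hypothetical; in practice such models are learned from historical consumer data), a logistic scoring function for targeting might look like this:

```python
import math

# Hypothetical logistic scoring model: estimates a user's affinity for
# a product category from profile features. All names and weights are
# invented for illustration; real models are trained on historical data.
WEIGHTS = {
    "monthly_spend_normalized": 1.8,   # spending habits
    "category_page_views": 0.9,        # interest areas
    "age_match": 0.6,                  # demographic fit
    "region_match": 0.4,               # geographical location
}
BIAS = -2.0

def affinity_score(features: dict) -> float:
    """Return a 0-1 probability-like score for ad/content targeting."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

user = {
    "monthly_spend_normalized": 0.7,
    "category_page_views": 1.2,
    "age_match": 1.0,
    "region_match": 1.0,
}
score = affinity_score(user)
# Users above a chosen threshold get the personalized experience / targeted ad.
show_targeted_ad = score > 0.5
```

    The same score can drive both personalization (which content to surface) and ad placement (which segment to target), which is the win-win the paragraph above describes.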

    4. Changing hot-skills

    Apart from changing the way the web is being designed by developers, data science is influencing the transformation of web development in one more way: by revolutionizing the job market. With the ever-changing needs of the industry, a web development company wants employees equipped with the skills to use the latest data and analytics tools.

    The developers looking for jobs today are expected to have knowledge of tools like Python and Google Analytics. They are asked about their proficiency in creating AI and ML programs in their interviews. Therefore, one has to stay updated to stay relevant.

    5. Customer’s expectations

    Do you get irritated when your Uber driver calls to ask about your pick-up location when it can be easily tracked by GPS and is clearly displayed on his device’s screen? Won’t you feel uncomfortable if you misspell something while typing on your messenger and autocorrect stops helping? And don’t you feel nice when you buy a phone online and the web app suggests covers for your new phone?

    Well, if the answer is yes, then you are becoming dependent on data science too. Don’t worry, you’re not the only one. Customers worldwide like the extra help provided by businesses. And this dependency on data will soon make the use of data science a hygiene factor in web development.


    Although it’s called Data Science, using it is nothing less than an art. It requires expertise and dedication to develop a web app which completely harnesses the potential of data science.

    Data science is a vast field. It is responsible for AI, machine learning, big data, analytics, etc. It also drives technologies such as the Internet of Things and AR/VR. Hence, when all the modern buzzwords of business are somehow related to data science, it would take absolute ignorance to neglect the role of data science in the development of websites and web apps.

    Source: Datafloq

  • The uses of workforce analytics in inventory management

    Inventory management software can make a big difference in helping to decrease your inventory gap and boost your profit. Here's what to know:

    You know better than anybody how many variables are in play during an average day in warehouse and inventory management. There’s low-turnover merchandise to look after, high-demand merchandise to re-order and equipment breakdowns to contend with. With global e-commerce raising the bar for efficiency and competitiveness, it just makes good business sense to look at inventory management software for supply chains. Here’s what you should know.

    Why worry about inventory gap?

    The primary advantage of reducing your inventory gap through smarter inventory management is cost savings. You’re probably not surprised — any gap or delay between inventory (supply) and the satisfaction of customer needs (demand) means lost profits. You also don’t want your warehouses or your vendors moving more product than they actually need. Inventory management dashboards can bring you better forecasting, a more visible and accurate approach to inventory counts and much more.

    Managing inventory more closely can reduce some of the more intolerable sources of inventory gap too, such as employee theft. When there’s a repeatable workflow for the handling of products and an accurate-in-real-time inventory management system to back it up, the attractive nuisance of having so much valuable merchandise close at hand starts looking a little less attractive.

    Then, think about the requirements of multichannel ecommerce sales. If your company offers customers and vendors a variety of platforms from which they can place orders or request service, you need a single order processor to keep everything straight. With modern-day warehouse management, you can easily automate your system. This can reduce ergonomic hazards employees face regularly, among other safety risks. Invest in a system that can automatically receive and prioritize orders and check orders against existing inventory. New software can also plot likely sales forecasts to inform production decisions.

    Maybe the bottom line here is that inventory gaps and poor inventory management are types of liabilities. They make you liable for employee theft. They make you liable to re-order product that’s already sitting in your warehouse. They might even make you the weak link in an otherwise efficient, well-running, multi-discipline supply chain. Here’s a closer look at some of the advantages of bringing modern inventory management software into your operations.

    Solve inventory problems proactively

    If you’re tired of playing catch-up or scrambling for a course of action after something unexpected happens, you’re a prime candidate for inventory management software. And one of the most common events in warehousing is finding out you have more — or less — inventory than you thought. The best software for warehousing is one that builds a digital database of every incoming and outgoing SKU scan, in real-time. You’ve probably done one or two scheduled hand counts and paper-based inventories in your time. But thanks to QR codes, bar codes, RF scanners, RFID tags and other technologies, your database of available inventory can now be up to date, always, with every change in custody accounted for, from shipping to warehouse stowing to order picking and, eventually, packing and labeling for shipment again. You can still do random counts for peace of mind, but you’ll never again wonder what you have or where it is — or whether you’ll be able to meet incoming demand.
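
    The core of such a system is simple to sketch: every scan is an event with a signed effect on the count, and the current stock per SKU is just the running sum. The event names, SKUs, and quantities below are invented for illustration:

```python
from collections import defaultdict

# Minimal sketch of a real-time inventory ledger: every SKU scan
# (receive, stow, pick, ship) is an event. Receives add stock, picks
# remove it, and stows/ships are custody changes with no count change.
EVENT_SIGN = {"receive": +1, "pick": -1, "stow": 0, "ship": 0}

def current_stock(events):
    """Replay scan events into an up-to-date count per SKU."""
    stock = defaultdict(int)
    for sku, event_type, qty in events:
        stock[sku] += EVENT_SIGN[event_type] * qty
    return dict(stock)

scans = [
    ("SKU-1001", "receive", 50),
    ("SKU-1001", "stow", 50),   # moved to racking, count unchanged
    ("SKU-1001", "pick", 8),
    ("SKU-2002", "receive", 20),
]
print(current_stock(scans))  # {'SKU-1001': 42, 'SKU-2002': 20}
```

    Because every change in custody is an event in the ledger, the count is always current, and random hand counts become a peace-of-mind check rather than the system of record.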

    Better manage your available space

    Devising more efficient and logical workflows is equal parts science and art. First-In-First-Out stowing of products and picking of customer orders is important for any number of reasons — including time-sensitive products and, let’s be honest, the accumulation of dust in product racks that comes from un-churned product. But making sure items are stowed with FIFO and size requirements in mind can help on both of these fronts and more. Just like your warehouse management software can provide real-time information about the materials coming and going from your facility, it can also give you a better understanding of your available storage and racking space. And it can help you arrive at better and more efficient solutions. For your stowing needs, management and pathfinding software can direct employees to the appropriate areas of the warehouse and automatically find bins of appropriate size and configuration for the incoming freight. The best systems can maximize chances to practice FIFO as well as cut down on mis-stows. What does this development mean for inventory gap? For a start, it means you’ll have a much lower chance of letting sensitive products spoil on your shelves because newer product was stacked on top or in front of it. Losing that many potential sales and having to scramble for new inventory can be devastating, and your bottom line might never recover if it happens at the wrong time of year or in the wrong season.
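
    The FIFO picking rule described above can be sketched with a simple queue of bins per SKU, ordered by stow date, so picks always draw down the oldest stock first (bin names, dates, and quantities here are hypothetical):

```python
from collections import deque
from datetime import date

# Each SKU maps to a queue of (bin, stow date, quantity), oldest first.
bins = {
    "SKU-1001": deque([
        ("BIN-A1", date(2019, 3, 1), 30),
        ("BIN-B4", date(2019, 4, 15), 20),
    ]),
}

def fifo_pick(sku, qty):
    """Return (bin, quantity) pairs drawn oldest-first for a pick order."""
    plan = []
    queue = bins[sku]
    while qty > 0 and queue:
        bin_id, stowed, available = queue[0]
        take = min(qty, available)
        plan.append((bin_id, take))
        if take == available:
            queue.popleft()          # oldest bin emptied
        else:
            queue[0] = (bin_id, stowed, available - take)
        qty -= take
    return plan

plan = fifo_pick("SKU-1001", 35)
print(plan)  # [('BIN-A1', 30), ('BIN-B4', 5)]
```

    Emptying the oldest bin before touching newer stock is exactly what prevents time-sensitive product from spoiling behind fresher inventory.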

    Balance anticipated sales with leaner manufacturing

    Have you ever been in a situation where your sales teams were requesting extra inventory so that they would never miss a sale, only to be rebuffed by your supply chain managers, who had already been moving to leaner manufacturing principles? Finding the ideal middle-ground is something inventory management software can help you with. Your operation probably depends on manufacturers interfacing with procurement managers in warehouses and retail stores. If that’s the case, the importance of being as proactive as possible about ordering (or not ordering) new inventory becomes apparent. For example, you might have a product that turns over in your warehouses only once or twice a year. But if your sales teams bring about an unexpected, off-season demand for one of those products, you want software that can call upon the necessary parties in as timely a fashion as possible to meet demand. And more than that, you want enough data coming in that this intelligent, de-centralized system is able to produce reliable lead time estimates for everybody waiting downstream.

    Final thoughts

    Bringing a deliberate and, as appropriate, technologically advanced approach to managing inventory can be helpful in a variety of ways — but most of all in helping make sure there’s never too wide a gap between what you’re earning and what you have the ability (or the inventory) to earn.

    Author: Megan Ray Nichols

    Source: Smart Data Collective

  • The vision of IBM on Analytics

    IBM’s Vision user conference brings together customers who use its software for financial and sales performance management (FPM and SPM, respectively) as well as governance, risk management and compliance (GRC). Analytics is a technology that can enhance each of these activities. The recent conference and many of its sessions highlighted IBM’s growing emphasis on making more sophisticated analytics easier to use by – and therefore more useful to – general business users and their organizations. The shift is important because the IT industry has spent a quarter of a century trying to make enterprise reporting (that is, descriptive analytics) suitable for an average individual to use with limited training. Today the market for reporting, dashboards and performance management software is saturated and largely a commodity, so the software industry – and IBM in particular – is turning its attention to the next frontier: predictive and prescriptive analytics. Prescriptive analytics holds particular promise for IBM’s analytics portfolio.

    The three basic types of analytics – descriptive, predictive and prescriptive – often are portrayed as a hierarchy, with descriptive analytics at the bottom and predictive and prescriptive (often referred to as “advanced analytics”) on the next two rungs. Descriptive analytics is like a rear-view mirror on an organization’s performance. This category includes variance and ratio analyses, dashboards and scorecards, among others. Continual refinement has enabled the software industry to largely succeed in making descriptive analytics an easy-to-use mainstream product (even though desktop spreadsheets remain the tool of choice). Today, companies in general and finance departments in particular handle basic analyses well, although they are not as effective as they could be. Our research on next-generation finance analytics shows, for example, that most financial analysts (68%) spend the largest amount of their time in the data preparation phases while a relatively small percentage (28%) use the bulk of their time to do what they are supposed to be doing: analysis. We find that this problem is mainly the result of issues with data, process and training.

    The upward shift in focus to the next levels of business analytics was a common theme throughout the Vision conference. This emphasis reflects a key element of IBM’s product strategy: to achieve a competitive advantage by making it easy for most individuals to use advanced analytics with limited training and without an advanced degree in statistics or a related discipline.


    The objective in using predictive analytics is to improve an organization’s ability to determine what’s likely to happen under certain circumstances with greater accuracy. It is used for four main functions:

    • Forecasting – enabling more nuanced projections by using multiple factors (such as weather and movable holidays for retail sales)
    • Alerting – when results differ materially from forecast values
    • Simulation – understanding the range of possible outcomes under different circumstances
    • Modeling – understanding the range of impacts of a single factor.
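
    The first two functions can be illustrated with a toy multi-factor forecast and an alert check (the baseline, weather factors, holiday uplift, and tolerance are invented numbers; a real model would estimate them from historical data):

```python
# Hypothetical multi-factor forecast: baseline demand adjusted for
# weather and a movable holiday, plus an alert when actuals diverge
# materially from the projection. All factor values are illustrative.
BASELINE_DAILY_SALES = 1000.0
WEATHER_FACTOR = {"sunny": 1.10, "rain": 0.85, "snow": 0.60}
HOLIDAY_UPLIFT = 1.40

def forecast(weather: str, is_holiday: bool) -> int:
    """Project daily unit sales from two illustrative factors."""
    units = BASELINE_DAILY_SALES * WEATHER_FACTOR[weather]
    if is_holiday:
        units *= HOLIDAY_UPLIFT
    return round(units)

def needs_alert(actual: float, predicted: float, tolerance: float = 0.15) -> bool:
    """Flag results that differ materially from the forecast."""
    return abs(actual - predicted) / predicted > tolerance

projected = forecast("rain", True)   # 1000 * 0.85 * 1.40 = 1190
print(projected, needs_alert(actual=900, predicted=projected))
```

    Simulation and modeling extend the same idea: sweep the inputs (weather scenarios, holiday timing) to map the range of outcomes, or vary one factor at a time to isolate its impact.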

    Our research on next-generation business planning finds that despite its potential to improve the business value of planning, only one in five companies use predictive analytics extensively in their planning processes.

    Predictive analytics can be useful for every facet of a business and especially for finance, sales and risk management. It can help these functions achieve greater accuracy in sales or operational plans, financial budgets and forecasts. The process of using it can identify the most important drivers of outcomes from historical data, which can support more effective modeling. Because plans and forecasts are rarely 100 percent accurate, a predictive model can support timely alerts when outcomes are significantly different from what was projected, enabling organizations to better understand the reasons for a disparity and to react to issues or opportunities sooner. When used for simulations, predictive models can give executives and managers deeper understanding of the range of potential outcomes and their most important drivers.

    Prescriptive analytics, the highest level, helps guide decision-makers to make the best choice to achieve strategic or tactical objectives under a specified set of circumstances. The term is most widely applied to two areas:

    • Optimization – determining the best choice by taking into account the often conflicting business objectives or other forms of trade-offs while factoring in business constraints – for example, determining the best price to offer customers based on their characteristics. This helps businesses achieve the best balance of potential revenue and profitability, or helps farmers find the least costly mix of animal feeds to achieve weight objectives.
    • Stochastic Optimization – determining the best option as above but with random variables such as a commodity price, an interest rate or sales uplift. Financial institutions often use this form of prescriptive analytics to understand how to structure fixed income portfolios to achieve an optimal trade-off between return and risk.
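
    The feed-mix example can be sketched with a tiny brute-force search over blend ratios (the prices, nutrition values, and protein requirement below are invented; a production system would use a proper linear programming solver rather than a grid):

```python
# Illustrative least-cost feed mix: blend two feeds to meet a protein
# requirement at minimum cost. All numbers are hypothetical.
feeds = {
    "corn":    {"cost_per_kg": 0.30, "protein_g_per_kg": 90},
    "soymeal": {"cost_per_kg": 0.55, "protein_g_per_kg": 440},
}
TOTAL_KG = 100.0
MIN_PROTEIN_G = 18000.0  # constraint tied to the weight objective

best = None
# Coarse 1-kg search over blend ratios (an LP solver would be exact).
for corn_kg in range(0, 101):
    soy_kg = TOTAL_KG - corn_kg
    protein = (corn_kg * feeds["corn"]["protein_g_per_kg"]
               + soy_kg * feeds["soymeal"]["protein_g_per_kg"])
    if protein < MIN_PROTEIN_G:
        continue  # blend is infeasible, skip it
    cost = (corn_kg * feeds["corn"]["cost_per_kg"]
            + soy_kg * feeds["soymeal"]["cost_per_kg"])
    if best is None or cost < best[0]:
        best = (cost, corn_kg, soy_kg)

cost, corn_kg, soy_kg = best
print(f"cheapest feasible mix: {corn_kg} kg corn, {soy_kg} kg soymeal")
```

    The structure mirrors any optimization problem: an objective (cost), constraints (total weight, minimum protein), and a search for the best feasible point. Stochastic optimization adds random variables (say, a fluctuating soymeal price) and optimizes an expected or risk-adjusted objective instead.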

    General purpose software packages for predictive and prescriptive analytics have existed for decades, but they were designed for expert users, not the trained rank-and-file. However, some applications that employ optimization for a specific purpose have been developed for nonexpert business users. For example, price and revenue optimization software, which I have written about, is used in multiple industries. Over the past few years, IBM has been making progress in improving ease of use of general purpose predictive and prescriptive analytics. These improvements were on display at Vision. One of the company’s major initiatives in this area is Watson Analytics. It is designed to simplify the process of gathering a set of data, exploring it for meaning and importance and generating graphics and storyboards to convey the discoveries. Along the way, the system can evaluate the overall suitability of the data the user has assembled for creating useful analyses and assist general business users in exploring its meaning. IBM offers a free version that individuals can use on relatively small data sets as a test drive. Watson is a cognitive analytics system, which means it is by nature a work in progress. Through experience and feedback it learns various things including terminologies, analytical methods and the nuances of data structures. As such it will become more powerful as more people use it for a wider range of uses because of the system’s ability to “learn” rather than rely on a specific set of rules and logic.

    Broader use of optimization is the next frontier for business software vendors. Created and used appropriately, optimization models can deliver deep insights into the best available options and strategies more easily, accurately, consistently and effectively than conventional alternatives. Optimization eliminates individual biases, flawed conventional wisdom and the need to run ongoing iterations to arrive at the seemingly best solution. Optimization is at the heart of network management and price and revenue optimization, to name two common application categories. Dozens of optimization applications (including ILOG, which IBM acquired) are available, but they are aimed at expert users.

    IBM’s objective is to make such prescriptive analytics useful to a wider audience. It plans to infuse optimization capabilities into all of its analytical applications. Optimization can be used on a scale from large to small. Large-scale optimization supports strategic breakthroughs or major shifts in business models. Yet there also are many more ways that the use of optimization techniques embedded in a business application – micro-optimization – can be applied to business. In sales, for example, it can be applied to territory assignments taking into account multiple factors. In addition to making a fair distribution of total revenue potential, it can factor in other characteristics such as the size or profitability of the accounts, a maximum or minimum number of buying units and travel requirements for the sales representative. For operations, optimization can juggle maintenance downtime schedules. It can be applied to long-range planning to allocate R&D investments or capital outlays. In strategic finance it can be used to determine an optimal capital structure where future interest rates, tax rates and the cost of equity capital are uncertain.
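    The territory-assignment case can be sketched with a simple greedy heuristic. This is a hypothetical illustration (the account names and revenue figures are invented, and real micro-optimization applications would also weigh profitability, buying units, and travel), showing only the core idea of balancing revenue potential across territories:

```python
# Hypothetical sketch of territory "micro-optimization": greedily assign
# accounts to sales territories so total revenue potential stays balanced.
# Accounts and revenue figures below are invented for demonstration.

def assign_territories(accounts, n_territories):
    """accounts: list of (name, revenue_potential) tuples."""
    territories = [{"accounts": [], "revenue": 0.0} for _ in range(n_territories)]
    # Place the largest accounts first, each into the currently
    # lightest territory -- a standard balancing heuristic.
    for name, revenue in sorted(accounts, key=lambda a: -a[1]):
        lightest = min(territories, key=lambda t: t["revenue"])
        lightest["accounts"].append(name)
        lightest["revenue"] += revenue
    return territories

accounts = [("Acme", 120), ("Globex", 95), ("Initech", 80),
            ("Umbrella", 60), ("Stark", 45), ("Wayne", 40)]
for i, t in enumerate(assign_territories(accounts, 2)):
    print(i, t["accounts"], t["revenue"])
```

    A full solver would express fairness and the other constraints as an objective function and solve them jointly; the greedy pass above only conveys what "balancing" means in this context.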

    Along the way IBM also is trying to make optimization more accessible to expert users. Not every company or department needs or can afford a full suite of software and hardware to create applications that employ optimization. For them, IBM recently announced Decision Optimization on Cloud (DOcloud), which provides this capability as a cloud-based service; it also broadens the usability of IBM ILOG CPLEX Optimizer. This service can be especially useful to operations research professionals and other expert users. Developers can create custom applications that embed optimization to prescribe the best solution without having to install any software. They can use it to create and compare multiple plans and understand the impacts of various trade-offs between plans. The DOcloud service also provides data analysis and visualization, scenario management and collaborative planning capabilities. One example given by IBM is a hospital that uses it to manage its operating room (OR) scheduling. ORs are capital-intensive facilities with high opportunity costs; they handle procedures that require specific individuals and different combinations of classes of specialists, and procedures also have different degrees of time flexibility. Without using an optimization engine to take account of all the variables and constraints, crafting a schedule is time-consuming. And since “optimal” solutions to business problems are fleeting, an embedded optimization engine enables an organization to replan and reschedule quickly to speed up decision cycles.

    Businesses are on the threshold of a new era in their use of analytics for planning and decision support. However, numerous barriers still exist that will slow widespread adoption of more effective business practices that take full advantage of the potential that technology offers. Data issues and a lack of awareness of the potential to use more advanced analytics are two important ones. Companies that want to lead in the use of advanced analytics need leadership that focuses on exploiting technology to achieve a competitive advantage.

    Author: Robert Kugel

  • Why caution is recommended when using analytics for censorship

    Why caution is recommended when using analytics for censorship

    Historically, concerns about over-zealous censorship have focused on repressive governments. In the United States (and many other countries), free speech has been a pillar of society since its founding. For the most part, government attempts at censorship or speech restrictions receive swift and successful push back. In recent times, however, a new path to censorship has arisen in the form of search engine and social media companies that are building analytically-based censorship algorithms.

    These organizations are using analytics to censor speech more aggressively than any past governmental effort and are somehow convincing a sizable portion of the population that it is a good thing. This post will outline why the use of analytics for centralized censorship is a steep and slippery slope and also lay out an alternative that will enable those same censorship analytics to provide people with a choice rather than a dictate.

    Where is the line?

    Let’s assume, for the sake of argument, that we all agreed that censorship is ethical and desired (of course, we don’t all agree on that, but just assume we do). Under those terms, we still have to agree on exactly where to draw the line that delineates what should be censored from what should not. Reaching such an agreement would be as impossible as deciding to censor in the first place. But, for the sake of argument, let’s assume we could all magically agree on the exact same lines in the sand. Does that mean we’re ready to be effective at implementing our censorship plan? No!

    Even after agreeing that we should censor information and agreeing on what to censor, we still have to build the analytical processes to flag the 'bad' content. As we all know, no algorithm will be perfect. So, do we err on the side of censoring too much 'legitimate' content to ensure we filter out all the 'illegitimate' content? Or do we make sure we allow all 'legitimate' content, even though that will also let some 'illegitimate' content sneak past? Once again, we’ll find it almost impossible to reach agreement.
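    To make that tradeoff concrete, here is a toy sketch (the scores and labels are invented, not drawn from any real system): a filter assigns each post a "likely illegitimate" score, and moving the cutoff exchanges over-blocked legitimate posts for missed illegitimate ones.

```python
# Toy illustration of the censorship-threshold dilemma. Each post has a
# score from a hypothetical model and a ground-truth label; raising or
# lowering the cutoff trades one kind of error for the other.

posts = [  # (model score, actually_illegitimate) -- invented data
    (0.95, True), (0.85, True), (0.70, False), (0.65, True),
    (0.40, False), (0.35, True), (0.20, False), (0.10, False),
]

def filter_errors(threshold):
    # legitimate posts wrongly blocked (false positives)
    over_blocked = sum(1 for s, bad in posts if s >= threshold and not bad)
    # illegitimate posts that slip through (false negatives)
    let_through = sum(1 for s, bad in posts if s < threshold and bad)
    return over_blocked, let_through

for t in (0.3, 0.6, 0.9):
    blocked, missed = filter_errors(t)
    print(f"threshold={t}: over-blocked={blocked}, missed={missed}")
    # -> at 0.3: over-blocked=2, missed=0; at 0.9: over-blocked=0, missed=3
```

    No threshold makes both error counts zero here, which is the point: any cutoff embeds a value judgment about which kind of mistake is more acceptable.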

    No matter what analytics we agree to, the models will still make errors. Our censorship will never perfectly match our intentions, even if we agreed to those intentions. Inherently, therefore, using algorithms to censor information will lead to disparities between intent and outcome. Is this an effective or ethical use of analytics?

    The reality today

    The concern today is that we have data science teams making up their own rules about what to censor and forcing us to accept it. The people drawing the lines in the sand are not representative of the general population and the people building the models won’t be any more successful than anyone else at effectively targeting the arbitrary lines drawn. This is a dangerous situation where unelected, anonymous people are deciding what information we see and who can speak.

    This isn’t just an ideological issue as some would suggest. Sure, some people will agree or disagree more with the current censorship being applied. But just remember that, even if you are comfortable with the decisions being made today because they fall in line with your world view, totally different decisions might be made tomorrow when someone else is in charge. Once you accept the right of these organizations to censor, the tables can be turned on you at some point, even if today that is not the case.

    Just think of the sticky situations we’ll get into based on the standards of today. If I post an April Fool’s article, do I risk being banned for spreading fake news? At what point is my view simply unpopular or contrarian and at what point is it 'dangerous and illegitimate' and worthy of being censored, along with me also being completely banished? These are not decisions to be made lightly.

    An alternative option to centralized censorship

    Personally, I don’t believe in censorship. However, some people do. Why not give us all a choice to view information as we prefer? The same algorithms being built to censor information by force can be made available as options we can turn on or off, much like we do with privacy settings. Let’s allow individuals to make the choice with regards to what they read, watch, or hear and what they don’t.

    There can be various filters aimed at hate speech that differ based on how the user chooses to define hate speech and how strict the user desires the filter to be. There can also be filters that knock out any political content of any type, for instance, if we just want a break from politics. When we want to catch up on politics, we can always turn the filter off. We can also have positive filters that elevate a topic we’re interested in. Perhaps a big sports event is upcoming and so I turn on the filter that requests more content than usual on the event. Or the other way around if I'm not into sports.

    Analytics can be used to filter any type of information in or out. We can make those analytics available for people to choose from instead of having faceless workers in Silicon Valley forcing their choices and their models on us all.

    If we aren’t careful, we’ll soon slip into an Orwellian world of extreme censorship and suppression of information. Of note is that the greatest risk today isn’t from the government, but from private corporations who control the flow of information in today’s world. This is one example where analytics are being used in ways that could lead to disaster if we don’t have a broader conversation as a society about how we should proceed.

    As outlined above, I’d love to see individuals enabled to make our own choices. Give us the ability to censor (or not) as we each see fit. There is no reason that the analytics of censorship can’t be steered in this direction of choice and away from the current dictatorial trajectory.

    Source: Datafloq

  • Why trusting your data is key in optimizing analytics

    Why trusting your data is key in optimizing analytics

    With the emergence of self-service business intelligence tools and platforms, data analysts and business users are now empowered to unearth timely data insights on their own and make impactful decisions without having to wait for assistance from IT. It's the perfect situation for more agile, insightful business intelligence and therefore greater business advantage, right?

    The reality is that even with these new BI tools at their fingertips, most enterprises still fall short of leveraging the real power of their data. If users don't fully trust the information (even if they're able to find and comprehend it), they won't use it when making business decisions. Until organizations approach their data analytics strategy differently - by combining all aspects of how the data is managed, governed, prepared, analyzed, and shared across the enterprise - a lack of trust will prevent a business' data from being useful and from driving successful business decisions, ultimately turning it into a liability rather than an asset.

    Finding the balance between agility and trust

    Although the self-service features of modern BI platforms offer more freedom and greater analytics power to data analysts and business users, they still require enterprises to manage and maintain data quality over time. Various roadblocks impede data analysts and business users from gaining access to the trusted data they need. Businesses can overcome common and critical challenges using tactics like:

    Building agility through proper data preparation

    Many times, data prep - the process of gathering, combining, cleaning, structuring, and organizing data - is missing from the analytics equation, especially when data analysts or business users are eager to get results quickly. However, having the data clearly structured with a common vocabulary of business terms (typically held in a business glossary or a data catalog) and data definitions ensures that people can understand the meaning of available data, instilling trust.

    Because data is pulled from both internal systems and external sources for reporting, profiling and cleansing data is essential to secure trust in data as well as to improve the accuracy and reliability of results. Any changes made to the data should be tracked and displayed, providing users with the full history of the data should they have questions when using the data.

    Freeing (and maximizing) the siloed data

    Data is often siloed within different business units, enterprise applications, spreadsheets, data lakes etc., making it difficult to scale and collaborate with others. The rise of self-service BI has made this problem even more relevant as more business users and teams have generated department-specific reports. People working in one silo are likely unaware of what data has already been prepared and shared in other silos, so time is wasted by reinventing data prep efforts and analytics rather than reusing and sharing them.

    Integrating data prep with self-service analytics unifies teams across the enterprise - including shrinking gaps between data analysts and the people who have more context about the data - and empowers data scientists with trusted, curated data so they can focus less on hindsight and more on foresight.

    Establishing 'a true north' through data governance

    Strong data governance practices provide an organization with structure and security for its business data. This is especially critical when data is distributed through many systems, data lakes, and data marts. Governance is the umbrella term for all the processes and rules for data, including assigned owners and data lineage - so users can clearly understand the data's past use, who has accessed it, and what changes were made (if any).

    Maintaining balance

    For an organization to fully realize the value of its data, it needs a shared, user-friendly approach where all users within a business have easy access to data they can trust to do their jobs, but in a way that is controlled and compliant, protecting data integrity. Organizations can balance the demands for convenience and collaboration with those of control by establishing and maintaining a three-tiered approach. The three tiers in this approach are:

    1: The data marketplace

    Enterprisewide data use begins with the data marketplace, where business users can easily find (or shop for) the trusted business data they need to gain analytics insights for critical decisions. The data marketplace is where all the rules of governance, shared common data prep, and shared data silos come together.

    This data marketplace concept is not a single tool, platform, or device. No single self-service data analytics tool can deliver the results organizations are looking for. The data marketplace is an overarching strategy that addresses data management and discovery with prep and governance to collect trusted data. The marketplace helps organizations address the challenges of finding, sharing, transmitting, analyzing, and curating data to streamline analytics, encourage collaboration and socialization, and deliver results. Creating a standard, collaborative approach to producing trusted, reusable, and business-ready data assets helps organizations establish a common portal of readily consumable data for efficient business analysis.

    2: Team-driven analytics

    Just as important as having quick and easy access to reliable data is the ability to share data with others in a seamless, consumer-friendly way, similar to how sophisticated online music, movie, and shopping platforms do. Through the data marketplace mentioned above, users can see the origin and lineage of data sets just as a consumer can see background information about the musical artist of a song just streamed on Spotify. Through this visibility, users see consistency and relevancy in models across groups and teams, and even ratings on data utilization, just as we use Yelp for reviews.

    Team commentary and patterns of data use dictate which models are most useful. Similar to sharing and recommending music to a friend, business users can collaborate and share data sets with other users based on previous insights they've uncovered. This team-driven and "consumerized" approach to data discovery and analytics produces quick and reliable business results.

    3: Augmented analytics

    A newer, more advanced feature of self-service analytics starting to emerge is augmented data insights: results based on machine learning and artificial intelligence algorithms. Using the Spotify example again, when augmented analytics is applied to the marketplace, data recommendations are made based on data sets the user has accessed, just as new music is recommended to consumers based on songs they've listened to earlier.
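    A minimal sketch of that recommendation idea (the user names and data set names below are hypothetical, and real augmented-analytics engines use far richer signals) scores unseen data sets by the overlap between users' access histories:

```python
# Sketch of "recommended data sets": suggest data sets to a user based on
# how similar other users' access histories are to theirs, analogous to
# music recommendations. All names and histories are invented.

access_log = {
    "ana":   {"sales_2023", "churn", "pricing"},
    "ben":   {"sales_2023", "churn", "inventory"},
    "carla": {"marketing", "web_traffic"},
}

def recommend(user, log):
    mine = log[user]
    scored = {}
    for other, theirs in log.items():
        if other == user:
            continue
        # Jaccard similarity between the two access histories
        sim = len(mine & theirs) / len(mine | theirs)
        if sim == 0:
            continue
        # credit each data set the similar user has seen but we haven't
        for ds in theirs - mine:
            scored[ds] = scored.get(ds, 0.0) + sim
    return sorted(scored, key=scored.get, reverse=True)

print(recommend("ana", access_log))  # ana and ben overlap, so ana is
                                     # pointed to ben's "inventory" set
```

    The same neighborhood-based scoring underlies many collaborative-filtering recommenders; production systems add ratings, recency, and learned models on top.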

    By automatically generating data results based on previously learned patterns and insights, augmented analytics relieves a company's dependence on data scientists. This can lead to huge cost savings for organizations because data scientists and analysts are expensive to employ and often difficult to find.

    By creating this fully integrated approach to how enterprises view and use their data, a natural shift will start to occur for the organization, moving from self-service analytics to shared business intelligence and "socialization", where all users across the organization are encouraged to contribute to and collaborate on business data for greater value and business advantage.

    A common marketplace

    Organizations that have started to make this shift are already starting to see business benefits. Similar to consumer platforms like Spotify and Amazon, in an interactive community of trust, users thrive and are inspired to share and collaborate with others. It is through this collaboration that users gain instant gratification for more insightful decision-making. Through social features and machine learning, they learn about data sets they otherwise never would have known existed. Because analysts can see business context around technical data assets and build upon others' data set recipes and/or reuse models, they can achieve better, faster decision-making and work more efficiently.

    As data complexity increases, the key to realizing the value of business data is pulling all of the different data management and analytics elements together through a common marketplace with a constant supply chain of business-ready data that is easy to find, understand, share, and most of all, trust. Only then does business data become truly intelligent.

    Author: Rami Chahine

    Source: TDWI
