16 items tagged "data driven"

  • 3 Reasons to implement a data strategy in your sales processes

    Sales managers are resilient folk. For many, adapting to leaps in technology, economic volatility, and radical shifts in buying behavior is the norm. Often they emerge stronger and better equipped to succeed. Not surprisingly, then, the Covid era, with its many unforeseen business challenges, has prompted many sales managers to examine themselves and their teams and to commit to upping their game. One essential in this tumultuous time is a data strategy.

    1. A data strategy for sales is a must

    The comforting veil of a pre-Covid world, where growth was infinite, resources were boundless, and the only perceived limit to success was one's level of ambition, has for many been lifted. And for some it has revealed blemishes that in more comfortable times were easier to ignore. One such organizational blemish is a lag in adopting and employing a data strategy that can empower salespeople and improve results.

    Let's face it: doing what you did yesterday is a good approach only if you believe that tomorrow will look like today. Not many sales managers share that view of the world anymore. Things are changing, and changing fast, and many sales organizations that haven't adopted a data strategy find themselves slow to react and at a disadvantage to their competitors.

    2. Data helps sales teams understand subtle changes in customer behavior

    The contrast between talking to sales organizations with a data solution and those without is striking. Sales organizations committed to data use the buying trends and behaviors of their best customers to educate and inform the rest of their customers, as well as to increase add-on sales and wallet share across their customer base. In a few mouse clicks, a rep can see what upsell and add-on opportunities exist and prioritize their calling efforts.

    Data-driven sales organizations can react to the pulse of their customers, often proactively, heading off issues before a customer is fully at risk. Subtle changes in purchasing behavior can reveal at-risk accounts and trends that the salesperson can address proactively to retain a customer, rather than trying to win them back after they leave.

    3. A data strategy is necessary to compete in a shrinking marketplace

    Further, as competition for a smaller number of customers in the marketplace heats up, data-driven sales organizations have a 360-degree view of their customers that allows them to share insights, improve the customer experience, and add value to every interaction. Customers have come to expect a level of communication and experience from their vendors that mirrors what they experience online.

    Companies that have not embraced a data strategy for sales find themselves at a tremendous disadvantage. In these rapidly changing times, sales managers of those organizations may be asking themselves how long they can afford to wait before they level the playing field for their team. 

    Author: Mark Giddens

    Source: Phocas Software

  • 6 myths about market research to stay away from

    Introduction

    In order for a company's products and services to be not just well-accepted but welcomed by clients and customers, a thorough understanding of the company's markets, competitors and customers needs to be reached. Evaluating the feasibility of a new product or service is central to an organization's operations, which is why almost every successful business today conducts market research. Market research helps to identify problems, spot growth opportunities, formulate marketing strategies, determine consumer needs, forecast sales, improve sales activities, and introduce new products and services. Without proper market research to drive your marketing strategies, especially regarding new products or services, you lack direction, which is a recipe for disaster. Information on market research and market research techniques is widely available nowadays, which has made the process easier to understand and execute. Why, then, do some organizations still fail to embrace market research? The larger problem is that they fall prey to the numerous myths that have unfortunately become associated with market research. In this article, let us examine these common myths and then untangle them to help you conduct better market research.

    Why do market research myths exist at all?

    Well, the short answer is that in today's fast-paced world, we are exposed to too much information too fast. As a result, we suffer from information overload, which causes confusion and an inability to distinguish information from misinformation. A larger problem arising from information overload is that different stakeholders in an organization hold different versions of the truth due to their exposure to divergent information, which makes it difficult for the organization to move strategically in one direction. Add to that the fact that many organizations never take the time to actually study market research in detail, and the result is the formation and propagation of market research myths. Consequently, they either write off market research completely without even conducting it, or conduct it in a flawed manner that doesn't bring accurate results. Millions of dollars are then spent on an incomplete or imperfect idea, plus the extra time spent fixing what's broken later on. Thus, to truly dispel market research myths, you need to conduct proper research on market research itself! Meanwhile, this article will raise your awareness of these common myths, so that you don't succumb to them, and can successfully predict which of your new ideas will thrive in the marketplace.

    6 common market research myths that you need to avoid

    1. Market research isn't necessary

    Most successful organizations in the world would disagree with you here. Market research is an integral step in a successful business development effort, whether you are part of a startup, launching a new product or rebranding your business. Even if you've been in business for a long time and think you know your customers pretty well, you never know when a new factor might emerge that changes customer behavior, or how the speed of societal change is continually altering the dynamics of the provider/customer relationship. Not conducting timely market research means failing to identify new, emerging trends in the market, leading to lost opportunities that your competitors are not going to miss, and in turn to a loss of market position. In a market as competitive and dynamic as today's, you need to keep a close watch on your target audience.

    2. We can do our own market research

    Of course you can. But unless you're a marketing professional with relevant experience in market research, have the necessary tools for it, have a dedicated internal research team with expertise in market research, or have hired a research vendor to do it for you, it's unlikely that your venture is going to be fruitful. Don't confuse market research with online research. Market research is a systematic discipline that requires careful planning, time, expertise, and an ability to marry a variety of data sets or sources of information to clearly articulate business questions and then answer them. Unless you've studied it, or done it before successfully with proven results, you're not equipped to conduct your own research, especially when the future of your organization is at stake. If you do it anyway, be ready for inaccuracies and impaired decisions.

    3. Market research takes too long

    Not necessarily. It depends on multiple factors such as your research objectives, key questions, the primary techniques used to collect data, the ideal respondents, the ease or difficulty of reaching them, the number of respondents needed, how the collected data will be analyzed, how it will be shared, etc. One thing is certain, however: long gone are the days of the lengthy survey timeline, when a market research project meant at least six months of data collection and analysis. Nowadays most research projects take one to two months at most, thanks to new methods and newly available data sources, as well as sophisticated technologies such as market intelligence software.

    4. Market research is too expensive

    With affordable and scalable research solutions and tools available on the internet today, this is simply not true. Market research can be expensive, but it doesn't have to be. Online research using survey tools, email marketing platforms and social media polls can be leveraged to provide valuable insight into your target audience's minds at virtually no cost. Then there are primary methods and techniques like focus groups and expert interviews, which come at a moderate cost. Or you could use a market intelligence tool that allows you to identify shifting trends as they emerge and make adjustments accordingly, enabling you to see the complete picture by bringing data from external sources into the mix. In any case, the insights gathered from market research will help you bring in much more than you invest in it. They bring competitive advantage and growth, building a future for your company, which is something you can't put a price tag on.

    5. Doing quantitative research is enough

    Granted, market research has a lot to do with numbers and statistics, but the impression that quantitative research is enough is mistaken. Surveys and polls will get you quantitative data, but knowing the reason behind a particular phenomenon, the "why" behind it, is crucial in order to tailor your product or service to your customers' needs and demands, and only qualitative research can get you that. This is why market research should always balance quantitative and qualitative research, with open-ended questions, customer feedback, and interviews included. It will be really helpful for your stakeholders if you complement each statistic or data point with a "why" when presenting your findings, i.e. the reason a certain demographic chooses to behave a certain way.

    6. Surveys are all you need to conduct market research

    While surveys are often the primary technique used to conduct market research, they're certainly not the only one. In addition to surveys, organizations employ techniques such as focus groups, interviews, field trials, social media listening, observation, research reports, research partners, competitive analysis, sales data, and market intelligence tools to gather and analyze data on the market and their customers. Surveys are, no doubt, a valuable research tool, but these other techniques provide a window into actual social behavior and the adoption of technologies, services, or other offerings among customers. Of these, market intelligence tools are a recent favorite, as they combine the benefits of a number of market research techniques.

    How can market intelligence tools help organizations with market research?

    According to Forbes, there are two problems with traditional market research. One, a number of times the data gathered using surveys and polls isn't actionable. For example, just because you have a good idea of who you're targeting doesn't mean you'll be able to formulate a winning marketing strategy around that insight. And two, people lie. Your respondents may lie simply to impress the person administering the survey. People want to make a good impression, whether the results are anonymous or not. Market intelligence allows you to look beyond the individual pieces of the marketing puzzle (like polls, surveys and focus groups) and focus on the big picture. When you utilize market intelligence, along with supplemental market research, you can merge strengths, eliminate weaknesses and holistically identify growth opportunities.

    Market intelligence tools can be a very capable means of gathering market research data, and one that is being used extensively by organizations. Such tools can gather and analyze data from the internet, including news, blogs, company websites, social media, regulatory and government portals, industry trade association reports, press releases, job boards, review websites, etc., in an automated manner, saving precious time, effort and resources. In addition, market intelligence tools are useful not only for gathering market research data, but also for informing strategic decision-making across the enterprise, as mentioned above. Market research is, in fact, a subset of the larger field of market intelligence.

    Market intelligence allows organizations to make informed decisions by supplying them with insights, trends, and patterns, propagating a data-driven business approach. Organizations that receive a steady stream of market intelligence have a bird's-eye view of their market and understand their current position in it, which in turn helps them formulate strategies for success and growth. Isn't that the idea behind conducting market research in the first place?

    Conclusion

    Successful organizations know their markets, understand their competitors, and focus on their customers' wants and needs. They do this by gathering all the information necessary for their businesses to be competitive. They know why people buy their products and services, not just when or where. Not conducting timely market research can result in indecision and inaction, fear of risk or of the truth, and/or too many options, which can lead to paralysis. Hopefully, this article was able to dispel myths about market research and clear up any ambiguities about why market research is essential. I also hope it shed some light on how market intelligence is a step forward from market research, and a useful strategy for organizations to invest in.

    Author: Malay Mehrotra 

    Source: Contify

  • Adopting Data Science in a Business Environment

    While most organizations understand the importance of data, far fewer have figured out how to successfully become a data-driven company. It’s enticing to focus on the “bells and whistles” of machine learning and artificial intelligence algorithms that can take raw data and create actionable insights. However, before you can take advantage of advanced analytics tools, there are other stops along the way, from operational reporting to intelligent learning.

    Digital transformation is dependent on adoption. But adoption and proficiency of new technologies can be disruptive to an organization. Mapping a data journey provides awareness and understanding of where your organization is to ultimately get where you want to go, with enablement and adoption of the technology throughout. Without the clarity provided by a data journey, your organization won’t be positioned to successfully deploy the latest technology.

    Here are the four elements of an effective data journey. 

    Determine Your Roadmap

    As with any trip, your data journey requires a roadmap to get you from where you are to where you want to go. Before you can get to your destination, the first step is to assess where you are.

    Most organizations begin with a focus on operational reports and dashboards, which can help you glean business insights from what happened, including how many products were sold, how often and where. They can also identify where problems exist, and deliver alerts about what actions are needed.

    Ultimately, most want to get to the point where analytics tools can help with statistical analysis, forecasting, predictive analytics and optimization. Armed with machine learning, manufacturers want to understand why something is happening, what happens if trends continue, what's going to happen next, and what's the best that can be done.

    Capture Data and Build Processes and Procedures

    Once you know where you want to go, it’s important to capture the data that is essential in helping you achieve your business goals. Manufacturers capture tremendous amounts of data, but if the data you collect doesn’t solve a business need, it’s not vital to your data processing priorities.

    This phase of your data journey isn't just about what data you collect; it's also about your data strategy: how you collect the data, pre-process it, protect it and safely store it. You need to have processes and procedures in place to handle data assets efficiently and safely. Questions such as how to leverage the cloud to gain access to data management tools, data quality capabilities and data infrastructure need to be answered.

    Make Data Accessible to Business Users

    Today, data – and business insights about that data – need to be accessible to business users. This democratization of data makes it possible for business users from procurement to sales and marketing to access the data that’s imperative for them to do their jobs more effectively.

    In the past, data was the domain of specialists, which often caused bottlenecks in operations while they analyzed the data. In this phase of the data journey, it's important to consider data management tools that can consolidate and automate data collection and analysis.

    Once data is freed from silos, it can be analyzed by more advanced analytics and data science tools to glean business insights that can propel your success.

    Change Company Culture for Full Adoption

    A data culture gap is a common barrier to the adoption of advanced data analytics tools. When the employees who are expected to use the data and insights don't understand the benefits data can bring to decision-making, it creates a roadblock. Your company won't be data-driven until your team embraces a data-driven culture and starts to use data intelligently.

    If you want to get the most out of the advanced data analytics tools that are available today and use data intelligently in your organization, you must first develop a solid foundation.

    First, you must be clear where you are in your organization’s data journey with a roadmap. Then create effective data processes, procedures, and collection methods, as well as identify what data management and analytics tools can support your initiatives. Finally, your team is key to adopting advanced data analytics tools, so be sure they are trained and understand how these tools can empower them. Once you have a solid analytics foundation, you’re ready to put machine learning to work to drive your collective success.

    Author: Michael Simms

    Source: Insidebigdata

  • Are you aware of the value of your data?

    While most executives understand that their data is an asset, many haven't harnessed the valuable insights accessible with a data analytics solution. The immense amount of data you generate may seem impossible to understand, but data analytics will transform it into clear, actionable information. Another way of looking at data: if you closed your doors tomorrow, what would the new owner be most interested in? The products on your shelves, or the insights into your customers?

    Better understand your customers

    Leveraging your data can help you better understand your customers. For instance, you can create robust customer profiles that include information such as sector, job title, geographical locations, channels they use, and preferences. Identify their purchasing behaviors such as what they are buying, what they aren’t, when, how often, in what quantity, and their lifetime value.

    Understanding your customers enables your sales team to recognize new cross- and up-selling opportunities and identify your top-performing accounts. Knowing your best customers means you can reinforce those relationships by periodically rewarding them with a special promotion for products they like. Another benefit of analytics is the ability to identify when a customer is declining. By analyzing customer buying habits and visit frequency, your team can quickly detect a reduction in order frequency or volume, and make a sales call to find out if there is a problem.
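
    As a concrete illustration, here is a minimal sketch of that kind of check, assuming a hypothetical order history file with customer_id and order_date columns: it compares each customer's recent ordering pace with their historical pace and flags accounts that have slowed down.

        import pandas as pd

        # Hypothetical order history: one row per order.
        orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

        # Split the history into the last three months and everything before.
        cutoff = orders["order_date"].max() - pd.DateOffset(months=3)
        recent = orders[orders["order_date"] >= cutoff]
        prior = orders[orders["order_date"] < cutoff]

        # Average monthly order counts per customer, before and after the cutoff.
        recent_rate = recent.groupby("customer_id").size() / 3
        prior_months = (cutoff - orders["order_date"].min()).days / 30
        prior_rate = prior.groupby("customer_id").size() / prior_months

        # Flag accounts whose recent pace fell below half their historical pace.
        rates = pd.concat([prior_rate, recent_rate], axis=1,
                          keys=["prior", "recent"]).fillna(0)
        at_risk = rates[rates["recent"] < 0.5 * rates["prior"]]
        print(at_risk.sort_values("prior", ascending=False))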

    Transactional data keeps an eye on product sales

    Transactional data such as time, place, price, discount, and payment method is generated at the point of sale. This data can help you measure the success of your various product lines. Analyzing your transactional data can tell you whether a product is gaining traction with your target customer base, or reveal an unexpected dip in sales.

    While it's important to determine which products aren't selling as expected, it's equally important to identify the products with high conversion rates; a high conversion rate may indicate that the price point is too low, for example. Finally, your transactional data can help you identify trends such as seasonal buying patterns. Knowing when sales increase due to the season can help you better manage the trend. If you know that sales for a particular product line typically increase in October, you can prepare by adjusting your stock level to meet the upcoming rise in demand.

    Be more strategic

    Even though many companies have adopted data analytics to guide their decision making, many others still rely on traditional approaches; without realizing it, they are a step behind their competition. Companies that use a data analytics solution to extract the value from their data have greater success. A study from the MIT Center for Digital Business found that companies that adopt a data-driven culture have 4% higher productivity and 6% higher profits. Data-driven companies rely on hard, verifiable data to back up their decision-making rather than making decisions based on intuition and gut feeling alone. An analytics solution can show you where to strategically deploy your business resources so you can gain a competitive advantage.

    Manage costs

    A major business resource is your capital. Managing your costs enables you to make the most profitable investments. Data analytics can help you lower costs companywide. For instance, analytics can help you track shipments and optimize deliveries to lower your shipping costs. Your marketing team can use analytics to trim marketing costs by creating targeted marketing campaigns and assessing their effectiveness. Finally, data analytics can help you improve employee performance and operational efficiencies across your various departments.

    To remain competitive in our data-driven economy, your business decisions must be based on credible evidence rather than on subjective experience. Data analytics helps companies achieve their goals by identifying fact-based, actionable insights so executives can develop effective strategies for each area of the business. 

    Source: Phocas Software

  • Digital transformation: Key goals and leaders

    According to Innosight research, the average company tenure on the S&P 500 Index in 1977 was 37 years. By 2027, it’s forecasted to be just 12. At the current churn rate, about half of the companies now on the S&P 500 will be replaced over the next ten years. Digital Darwinism is steadily accelerating.

    '52% of the Fortune 500 have been merged, acquired, have gone bankrupt, or fallen off the list since 2000', notes Constellation Research Founder and Disrupting Digital Business author Ray Wang in a recent webcast. 'That is an amazing stat when you think about the level of disruption that’s happening inside lots of organizations'.

    Wang notes that digital leaders are now creating not just a digital divide, but a winner-take-all market. Overall, digital leaders now take up to 69.8% of market share, versus 30% for everyone else. And in percentage of profits, they lead with 77.1% versus 22.9% for everyone else. 'Using data-driven business models, they are able to create an unfair advantage, and it's happening in every single market and every single industry', says the analyst and author.

    In Constellation Research’s latest digital transformation study, 68% of businesses with digital transformation projects are now seeing a positive ROI. Goals widely shared by these businesses include:

    • Reaching and engaging with customers more effectively
    • Building a competitive advantage in their current market
    • Implementing new, data-driven business models
    • Increasing revenue
    • Modernizing legacy IT and reducing costs
    • Improving agility
    • Faster innovation cycles
    • Improving the employee experience
    • Greater transparency
    • Compliance

    Who's leading the digital transformation charge?

    In 33% of organizations, notes Constellation's survey, it's the CIO who's leading digital transformation initiatives. In 23%, it's the CEO. In 20%, it's the CDO (a chief digital or chief data officer). And depending on who's leading, the digital transformation priorities for the business may differ.

    When a CIO leads, their top three priorities tend to be:

    1. Building a competitive advantage (38%)
    2. Modernizing legacy IT and reducing costs (38%)
    3. Implementing data-driven business models (33%)

    When a CEO leads, their top three priorities are:

    1. Engaging with customers more effectively (57%)
    2. Building a competitive advantage (50%)
    3. Increasing revenue (43%)

    For CDOs (chief digital or chief data officers), the top priorities are also different, with:

    1. Implementing data-driven business models and engaging in faster innovation cycles tied for first place (50%)
    2. Engaging with customers more effectively, modernizing legacy IT, and reducing costs tied for second (43%)

    Conclusion

    No matter who leads your organization's digital transformation, it is clear that businesses that transform digitally and become data-driven gain a competitive edge, now and in the future. When taking on the process of digital transformation for your business, make sure to align your data strategy with company goals and primary processes. Choosing the right person to lead this process is key to a successful transformation.

    Author: Tricia Morris

    Source: MicroStrategy

  • How do you create a data culture in your organization?

    Organizations with a clear data culture are more profitable than companies without one. But what exactly is a data culture? And if you want one, where do you start? Simply installing software is not enough, according to Tableau, maker of software for data analytics and visualization.

    When traveling, people talk about the culture of a country: the language its inhabitants speak, the social customs, the way people do things and the products they make. In an organization with a data culture, all employees speak the language of data. They work data-driven, and data forms the basis of what they do and make. In companies with a data culture, everyone has access to the data they need to make better decisions in their work. 'Getting there takes more than installing software', says James Eiloart, SVP EMEA at Tableau. 'More and more often we are asked to help our customers create a data culture in their organization. What that means is that we ask them a great many questions and then, using data, sketch a map of where they are now and the road to the point where all employees can do their work better with the help of data'.

    Eiloart is fond of the phrase 'driving insights', which in this context means generating more insight from data. 'The end goal is an organization whose employees confidently use data to answer questions and generate insights, so that they can do their work in the organization better. The fact is that most organizations are still a very long way from that'.

    Vision

    Bringing about change begins, as so often, with a vision. Simon Beaumont's business card says he is Global Director of the BI Centre of Excellence at real estate company Jones Lang LaSalle (JLL). At the Tableau Conference 2019 he prefers to call himself a Tableau Zen Master, meaning he has a proven skill in working with the software and speaks as an ambassador. He explains that JLL leases and manages real estate all over the world. The company uses data not only to determine the value of real estate for its clients or to build dashboards with maintenance response times. On request, JLL also investigates which cities are home to the most software developers, for companies that want to open a new office. Or how room temperature affects employee productivity. Or the relationship between the amount of parking space around an office building and people arriving on time. Beaumont: 'Our director Paul Chapman had a vision: he wanted JLL to use data in a different way than was customary in the real estate world. That vision was communicated to all employees, from the top down. Then the search for suitable software began'.

    Data literacy

    In a data culture, everyone works with data to a greater or lesser extent. Data literacy is not reserved for a handful of specialists. That requires an investment in people, in education and training. Mark Jewett is VP Marketing at Tableau: 'Data is a language that everyone can learn, but not everyone has to speak it fluently. Not everyone has to become an expert, but the basics of working with data should be familiar, such as logic and the difference between correlation and causation. If analytics shows a relationship between two numbers, that does not mean that one number caused the other. Many people find that a difficult concept to grasp, but it is important to teach them. Great data in the hands of people who do not understand it does not automatically lead to great decisions'.
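
    That distinction is easy to demonstrate. Below is a minimal sketch, with fabricated numbers, showing how two entirely unrelated quantities can be almost perfectly correlated simply because both trend upward over time; the correlation is real, but neither causes the other.

        import numpy as np

        rng = np.random.default_rng(0)
        months = np.arange(36)

        # Two unrelated quantities that both happen to grow over time.
        ice_cream_sales = 100 + 5 * months + rng.normal(0, 10, 36)
        software_downloads = 2000 + 80 * months + rng.normal(0, 150, 36)

        # The correlation is close to 1, yet neither quantity causes the
        # other; the shared upward trend is doing all the work.
        r = np.corrcoef(ice_cream_sales, software_downloads)[0, 1]
        print(f"correlation: {r:.2f}")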

    Training

    Employees do not have to go back to school for years; choose a form of training that suits the job and the person. Short e-learning modules, hackathons, workshops: the possibilities are endless. Jewett: 'Get people excited about data. Let them pick a topic they enjoy and browse through public data'.

    Those willing to train people themselves can also look beyond the obvious candidates when hiring for a team. In the Second World War the West had an enormous shortage of codebreakers, because the young men were at the front. The military leadership decided, very innovatively, to bring in women. Who knows what talent lies hidden in 'the other'? For those wary of so much retraining and upskilling, Eiloart quotes a well-known cartoon: the CFO asks the CEO, 'What if we invest in these people and they leave?' To which the CEO replies, 'What if we don't invest in them and they stay?'

    Executive support

    'If the executive board shares your data vision, that helps enormously', says Eiloart. 'We often see more willingness and commitment among users than among management'. Jewett adds that in his experience, the kind of organization that performs best sees a data culture as part of its identity. 'One of our customers, a European car brand, has "driving excellence" as its mission. The will to excel runs through the whole organization. Product engineering was always important, but data engineering is now being added, and it is pursued with the same drive'.

    Start now

    It is tempting to wait until all your data is in order, until it sits (so to speak) neatly arranged in your own data warehouse or data lake. 'Don't', advises Eiloart. 'Start with the data you have and see what insights it yields. Every day you wait puts you further behind the competition. Start the journey now, or you never will. Of course there must be a degree of governance and control, but don't wait for perfection'. Jewett advises distinguishing between types of data. 'Some data requires a high degree of accuracy, but not all of it. Quickly checking whether something is right is a different question from substantiating the annual financial report. Distinguish between the different requirements for control and privacy, and let governance vary where appropriate so that people can work more flexibly'.

    Source: Techzine

  • How autonomous vehicles are driven by data

    Understanding how to capture, process, activate, and store the staggering amount of data each vehicle is generating is central to realizing the future of autonomous vehicles (AVs).

    Autonomous vehicles have long been spoken about as one of the next major transformations for humanity. AVs are already a reality in delivery, freight services, and shipping, but the day when a car drives through the leafy suburbs with no one behind the wheel (level five autonomy, as it's also known) is still far off in the future.

    While we are a long way off from having AVs on our roads, IHS Markit reported last year that there will be more than 33 million autonomous vehicles sold globally in 2040. So, the revolution is coming. And it’s time to be prepared.

    Putting some data in the tank

    As with so many technological advancements today, data is critical to making AVs move intelligently. Automakers, from incumbents to Silicon Valley startups, are running tests and racking up thousands of miles in a race to be the leader in this field. Combining a variety of sensors to recognize their surroundings, each autonomous vehicle uses radar, lidar, sonar and GPS, to name just a few technologies, to navigate the streets and process what is around them to drive safely and efficiently. As a result, every vehicle is generating a staggering amount of data.

    According to a report by Accenture, AVs today generate between 4 and 6 terabytes (TBs) of data per day, with some producing as much as 8 to 10 TBs depending on the number of devices mounted on the vehicle. The report says that on the low end, the data generated by one test car in one day is roughly equivalent to that of nearly 6,200 internet users.
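
    That equivalence is easy to sanity-check. Assuming the report's 4 TB low end and decimal units, the comparison implies roughly 0.65 GB of data per internet user per day:

        # Back-of-the-envelope check of the comparison above,
        # assuming the 4 TB low end and decimal units.
        tb_per_day = 4
        internet_users = 6200
        gb_per_user = tb_per_day * 1000 / internet_users
        print(f"~{gb_per_user:.2f} GB per internet user per day")  # ~0.65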

    While it can seem a little overwhelming, this data contains valuable insights and ultimately holds the key in getting AVs on the road. This data provides insights into how an AV identifies navigation paths, avoids obstacles, and distinguishes between a human crossing the road or a trash can that has fallen over in the wind. In order to take advantage of what this data can teach us though, it must be collected, downloaded, stored, and activated to enhance the decision-making capabilities of each vehicle. By properly storing and managing this data, you are providing the foundation for progress to be made securely and speedily.

    Out of the car, into the ecosystem

    The biggest challenge facing AV manufacturers right now is testing. Getting miles on the clock and learning faster than competitors to eliminate errors, reach deadlines, and get one step closer to hitting the road. Stepping outside of the car, there is a plethora of other elements to be considered from a data perspective that are critical to enabling AVs.

    Not only does data need to be stored and processed in the vehicle, but also elsewhere on the edge and some of it at least, in the data center. Test miles are one thing, but once AVs hit the road for real, they will need to interact in real-time with the streets they are driving on. Hypothetically speaking, you might imagine that one day gas stations will be replaced by mini data centers on the edge, ensuring the AVs can engage with their surroundings and carry out the processing required to drive efficiently.

    Making the roads safer

    While it might seem that AVs are merely another technology humans want to use to make their lives easier, it’s worth remembering some of the bigger benefits. The U.S. National Highway Traffic Safety Administration has stated that with human error being the major factor in 94% of all fatal accidents, AVs have the potential to significantly reduce highway fatalities by addressing the root cause of crashes.

    That’s not to say humans won’t be behind the wheel at all in 20 years, but as artificial intelligence (AI) and deep learning (DL) have done in other sectors, they will augment our driving experience and look to put a serious dent in the number of fatal road accidents every year, which currently stands at nearly 1.3 million.

    Companies in the AV field understand the potential that AI and DL technology represents. Waymo, for example, shared one of its datasets in August 2019 with the broader research community to enable innovation. With data containing test miles in a wide variety of environments, from day and night, to sunshine and rain, data like this can play a pivotal role in preparing cars for all conditions and maintaining safety as the No. 1 priority.

    Laying the road ahead

    Any company manufacturing AVs or playing a significant role in the ecosystem, from edge to core, needs to understand the data requirements and implement a solid data strategy. By getting the right infrastructure in place ahead of time, AVs truly can become a reality and bring with them all the anticipated benefits, from efficiency of travel to the safety of pedestrians.

    Most of the hardware needed is already there: radars, cameras, lidar, chips and, of course, storage. But understanding how to capture, process, activate, and store the data created is central to realizing the future of AVs. Data is the gas in the proverbial tank, and by managing this abundant resource properly, you might just see that fully automated car in your neighborhood sooner than expected.

    Author: Jeff Fochtman

    Source: Informationweek

  • How modern ETL processes lead to better decision making

    We live in a world of data: there’s more of it than ever before, in a ceaselessly expanding array of forms and locations. Dealing with data is your window into the ways data teams are tackling the challenges of this new world to help their companies and their customers thrive.

    In recent years we’ve seen data become vastly more available to businesses. This is mostly due to the rise of data warehouses, decrease in cost for data storage, and modern ETL tools that make storing and accessing data more approachable than ever before. This has allowed companies to become more and more data driven in all areas of their business. In fact, being data driven has become ubiquitous and imperative to survival in today’s climate. This article will discuss at a high level how modern businesses are leveraging new technology to ingest a wider variety of data sources. 

    As of 2019, according to Blissfully's 2020 SaaS trends report, companies in all phases of business growth make use of many SaaS applications. Smaller companies (0-50 employees) use an average of 102 SaaS applications, while mid-sized ones (101-250 employees) use about 137. Enterprises, meanwhile, can have as many as 288 SaaS applications in use across the entire organization!

    These SaaS products range from CRMs to marketing automation apps to billing software to help desk tools, etc. These applications can log thousands to millions of rows of data made available via API, much of which can hold valuable insight for businesses. Add to this the company’s own internally-gathered product data and it’s clear that even a small or medium-sized business can do a lot with the data at its disposal. In this article, we’ll cover what the ETL process is, why it’s important to powering data-driven businesses, and how the rise of modern ETL tools has changed the ETL process forever.

    The ETL process: foundation of data-driven businesses

    Every company wants every team within their business to make smarter, data-driven decisions. Customer support teams look at trends in support tickets or do text analysis on conversations to understand where they can provide better onboarding and documentation. Marketing teams want better visibility into their ad performance across different platforms and the ROI on their spend. Product and engineering teams dig into productivity metrics or bug reports to help them better prioritize their resources. 

    The ETL process empowers all these different teams to get the information they need to understand and perform their jobs better. Businesses ingest data from a wide array of sources through the ETL process, which stands for Extract, Transform, Load. The prepared data is then available for analysis and use by the various teams who need it, as well as for advanced analytics, embedding into applications, and use for other data monetization efforts. Whatever you want to do with data, you have to ETL it first. 

    Explaining the traditional ETL process

    Traditionally, the extraction step of the process involves batch-pulling the data from the source via files, application database, or an API. In the transformation step, the data is cleaned and modeled so that it’s fit for analysis and ingestible by the data warehouse. Finally, the cleaned tables are loaded into the data warehouse for analysis. 

    This traditional ETL process is very difficult to scale. It often requires full-time data engineers to develop and maintain the scripts that keep the data flowing. This is because the data providers often make changes to their schemas or APIs, which then break the scripts that power the ETL process. Every time there’s a change, the data engineers scramble to update their scripts to accommodate them, resulting in downtime. With businesses now needing to ingest data from so many disparate (often fast-moving) data sources, maintaining ETL scripts for each one is not scalable. 
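
    To make that fragility concrete, here is a minimal sketch of a traditional hand-rolled batch ETL script; the API endpoint, field names, and warehouse table are all hypothetical. A single upstream change, such as the provider renaming the email field, breaks the transform step and the pipeline stops until an engineer patches it.

        import requests
        import psycopg2

        # Extract: batch-pull raw records from a (hypothetical) SaaS API.
        resp = requests.get("https://api.example-crm.com/v1/contacts", timeout=30)
        resp.raise_for_status()
        raw = resp.json()["results"]

        # Transform: clean and model the records. If the provider renames
        # "email" or "created_at", this raises KeyError and the load never runs.
        rows = [
            (r["id"], r["email"].lower().strip(), r["created_at"][:10])
            for r in raw
        ]

        # Load: write the cleaned rows into the warehouse.
        conn = psycopg2.connect("dbname=warehouse user=etl")
        with conn, conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO contacts (id, email, created_date) VALUES (%s, %s, %s)",
                rows,
            )
        conn.close()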

    Better living through ETL tools: the modern ETL process

    The modern ETL process follows a slightly different order of operations, dubbed ELT. This new process arose as a result of the introduction of tools to update the ETL process, as well as the rise of modern data warehouses with relatively low storage costs.

    Today, ETL tools do the heavy lifting for you. They have integrations for many of the major SaaS applications, and have teams of engineers who maintain those integrations, taking the pressure off of your in-house data team. These ETL tools are built to connect to most major data warehouses, allowing businesses to plug in their apps on one end and their warehouse on the other, while the ETL tools do the rest.

    Users can usually control orchestration via simple drop-down selections within the apps, alleviating the need to stand up your own servers or EC2 box, or to build DAGs to run on platforms like Airflow. ETL tools also typically offer more robust options for appending new data incrementally, or updating only new and modified rows, which allows for more frequent loads and closer-to-real-time data for the business. With this simplified process for making data available for analysis, data teams can focus on finding new applications for data to generate value for the business.
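
    Under the hood, incremental loading typically boils down to a watermark: remember the most recent modification timestamp already loaded, and pull only rows newer than it. A minimal sketch of the idea, with hypothetical endpoint, table, and column names:

        import requests
        import psycopg2

        conn = psycopg2.connect("dbname=warehouse user=etl")

        # Watermark: the latest modification time already in the warehouse.
        with conn.cursor() as cur:
            cur.execute("SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM contacts")
            watermark = cur.fetchone()[0]

        # Pull only rows modified since the watermark, not the full history.
        resp = requests.get(
            "https://api.example-crm.com/v1/contacts",
            params={"updated_since": str(watermark)},
            timeout=30,
        )
        resp.raise_for_status()
        changed = resp.json()["results"]
        # ...upsert `changed` into the warehouse; a full re-extract is avoided.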

    The ETL process and data warehouses

    Data warehouses are the present and future of data and analytics. Storage costs on data warehouses have drastically reduced in recent years, which allows businesses to load as many raw data sources as possible without the same concerns they might have had before.

    Today, data teams can ingest raw data before transforming it, allowing them to do the transformations in the warehouse instead of in a separate staging area. With the increased availability of data and a common language, SQL, to access it, the business gains more flexibility in leveraging its data to make the right decisions.
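
    In practice, an in-warehouse transformation is just SQL executed where the raw data already lives. A minimal sketch, with hypothetical raw and modeled table names; reshaping the model later means rewriting the SQL, not re-ingesting the source:

        import psycopg2

        conn = psycopg2.connect("dbname=warehouse user=analytics")

        # The raw table was loaded untransformed; modeling happens inside
        # the warehouse itself, on top of the raw data.
        with conn, conn.cursor() as cur:
            cur.execute("""
                CREATE TABLE IF NOT EXISTS monthly_revenue AS
                SELECT date_trunc('month', order_date) AS month,
                       customer_id,
                       SUM(amount) AS revenue
                FROM raw_orders
                GROUP BY 1, 2
            """)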

    The modern ETL process: delivering better results, faster

    Under the traditional ETL process, as data and processing requirements grew, the chance that on-premise data warehouses would fail grew as well. When this happened, IT would have to swoop in to fix the issue, which usually meant adding more hardware.

    The modern ETL process in today’s data warehouses sidesteps this issue by offloading the compute resource management to the cloud data warehouse. Many cloud data warehouses offer compute scaling that allows for dynamic scaling when needs spike. This allows data teams to still see scalable performance while holding increased numbers of computationally expensive data models and ingesting more large data sources. The decreased cost in compute power along with compute scaling in cloud data warehouses allows data teams to efficiently scale resources up or down to suit their needs and better ensure no downtime. The bottom line is that, instead of having your in-house data and/or IT team fretting over your data storage and computing issues, you can offload that pretty much completely to the data warehouse provider.

    Data teams can then build tests on top of their cloud data warehouse to monitor their data sources for quality, freshness, and so on, giving them quicker, more proactive visibility into any problems with their data pipelines.

    From ETL to ELT and beyond

    Data, analytics, and BI have radically evolved since their inception. We are leaps and bounds beyond Excel tables and on-prem-centric data sources. Cloud-native data warehouses, cloud-native architecture in analytics and BI platforms, and embedded analytics powered by these systems have redefined what it means to be truly data-driven in our modern age.

    The ETL process has been updated and can now deliver insights from a wide array of datasets, which helps companies and teams of all kinds make smarter decisions, faster. It also opens the doors for advanced analytics, next-level data monetization, and much more. Whatever you’re building with your data, a modern ELT setup will help you get more from your data, easier.

    While we have seen a large migration to data warehouses, looking to the future, we might see even another evolution that involves data lakes being more widely used as the initial destination for all raw data sources, offering even further cost benefits. Additionally, new tools are starting to take form that allow for transformation within these data lakes, thus continuing the evolution of tools and processes within the data pipeline following the path of ETL to ELT.

    Author: Adam Luba

    Source: Sisense

  • How the need for data contributes to the evolution of the CFO role

    The role of the CFO has evolved in recent years from the person in control of the purse strings to the trusted right hand of the CEO.

    Their importance was further enhanced during the pandemic as they were often required to oversee changes in a matter of days or even hours that would normally have taken months or years to bring to fruition. They are no longer just the person in charge of the money, but a strategic planner whose insights and counsel inform some of the company’s biggest business decisions.

    But although their role has evolved, the technology which helps them is still playing catch-up, with the lack of reliable analytics and data one of the biggest hurdles to progress.

    A changing role and the need for data

    The finance function has traditionally been known for its stability and process-based culture. But Covid brought about a need for quick-fire decisions, where the rule book had to be re-written overnight or thrown out completely. Data has always been central to agile business planning, forecasting and analysis, all of which are now core to the modern CFO role. The pandemic sped up the need for this type of tech, yet many firms still lack the insights to do the job properly.

    This level of data collection and insight requires the right technology. But in a survey of CFOs by Ernst & Young, eight out of 10 respondents said legacy technology and system complexity was one of their top three inhibitors to progress.

    Data that tells ‘one truth’

    Companies are awash with information; each department has its own KPIs and methods of reporting. But what CFOs really need to perform their modern role well is not just data, but the ability to connect the dots and gain a holistic view.

    They need integrated financial insights on accounting and finance data, better traceability, and operational reporting on things such as customer segmentation, products and revenue assurance.

    To enable this, Teradata's multi-cloud software integrates data quickly across finance systems, such as Oracle, PeopleSoft, and SAP, in near real-time. It also has a pre-built financial data model that's ready to receive and structure data to enable controlled, user-friendly access. This all helps reconcile data from a wide variety of sources into a trusted, compliant platform.

    Turning insights into action

    Just one example of how this level of data insight has helped a firm was when a global retail company needed to modernize their analytics ecosystem. Their processes were manual and time consuming. Their spreadsheet-based model results couldn’t feed downstream models and analytics.

    Crucially, a lack of trust in the model also meant analytics results had limited use to the business. The company worked with Teradata to create a finance-driven foundational data model. It provided in-depth detail on things like revenue and costs from aggregate views of branches, products, vendors and customers.

    This information enabled the financial justification for strategic business decisions. It’s this level of detail that can continue to enable CFOs to retain their position of trusted advisor to the CEO and an indispensable asset to the company.

    Source: CIO

  • More and more organizations are basing their actions on their data

    Many corporations collect data but don't end up using it to inform business decisions. This has started to shift.

    All in all, 2020 will go down as one of the most challenging and impactful years in history. It will also be known as one of the most transformative, with companies and individuals adjusting quickly to the new normal in both work and play, with a 'socially distant' way of life changing how people interact and communicate.

    Even amidst the chaos, we saw an influx of existing technologies finding new industry opportunities, such as videoconferencing tools, streaming platforms such as Netflix, telehealth applications, EdTech platforms, and cybersecurity, to name a few. All of these technologies are powered by one fundamental thing, yet this entity isn't being tapped to its full potential by SMBs and enterprises alike.

    That thing is data, collected by companies with the intent to inform business decisions and better understand and serve their customers. However, from what I have seen, more than 80 percent of data that businesses generate goes unused. This will drastically change in the next three years, with the majority of the data consumed being put to use.

    What's driving this trend

    Data generation was already a hot topic prior to the COVID-19 pandemic, with a projected 59 zettabytes (ZB) of data created, captured, and copied over the last year, according to IDC. This trend has only accelerated with the pandemic as companies fast-track digital transformation initiatives. Adding to this, the ongoing health crisis is resulting in the avoidance of face-to-face interactions during the workday, causing digital interactions to increase tenfold. This has created even more data through connectivity tools and applications.

    Companies have realized that analyzing this data can help leaders make better-informed decisions rather than relying on gut feeling. Data has become so important to companies' success that according to Gartner, by 2022, 90 percent of today's corporate strategies will unequivocally list information as a critical enterprise asset and analytics as an essential competency. Leading organizations know that in order to drive success in their industry, they have to leverage data and analytics as a competitive differentiator, fueling operational efficiencies and innovation.

    Setting up for success

    Though the majority of data collected by businesses currently goes to waste, there are more tools emerging to help companies unify consumed data, automate insights, and apply machine learning to better leverage data to meet business goals.

    First, it's important to take a step back to evaluate the purpose and end goals here. Collecting data for the sake of having it won't get anyone very far. Companies need to identify the issues or opportunities associated with the data collection. In other words, they need to know what they're going to do with every single piece of data collected.

    To determine the end goals, start by analyzing and assessing the different types of data collected, to determine whether each was beneficial to the desired outcome, or had the potential to be but wasn't leveraged. This will help identify any holes where other data should be tracked. It will also help hone the focus on the more important data sets to integrate and normalize, ultimately making data analysis a more painless process that produces more usable information.

    Next, make sure the data is useful - that it's standardized, integrated across as few tech platforms as possible (i.e., not a different platform for every department or every function), and that the collection of specific data follows company rules and industry regulations.

    Finally, use data in new ways. Once your organization has integrated data and technology solutions, the most meaningful insights can often only be found using multidimensional analytics dashboards that take data from two previously siloed functions to understand how pulling a lever in one area affects costs or efficiencies in another.
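
    As a simple sketch of such a cross-functional view (the CSV extracts and column names are hypothetical), the example below joins support-ticket data with sales data so that service load in one function can be read against revenue in another:

        import pandas as pd

        # Hypothetical extracts from two previously siloed systems.
        tickets = pd.read_csv("support_tickets.csv")  # customer_id, ticket_id
        sales = pd.read_csv("sales.csv")              # customer_id, revenue

        ticket_counts = tickets.groupby("customer_id").size().rename("tickets")
        revenue = sales.groupby("customer_id")["revenue"].sum()

        # One joined view: does heavy support load coincide with high or
        # low revenue? Siloed reporting could never ask this question.
        view = pd.concat([ticket_counts, revenue], axis=1).fillna(0)
        print(view.corr())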

    Using data to streamline business processes and lower costs

    One industry that's collecting data and using it efficiently to optimize business processes is the telematics industry. Before the digital transformation era, fleet managers and drivers had to rely on paper forms for vehicle inspections or logging hours of service. Now, many telematics-driven companies are relying on connected operations solutions to collect, unify, and analyze data for a variety of tasks such as improving fuel management, driver safety, optimized routing, systematic compliance, and preventive maintenance.

    We have seen fleets with hundreds of assets switch from out-of-the-box telematics solutions to a more business-focused solution, which allows them to leverage data insights from their connected operations and realize meaningful improvements and cost savings. One such client recently reported saving $800,000 annually in field labor costs and $475,000 annually in fleet maintenance and repairs, and they've seen compliance with their overdue maintenance reduction initiative rise from around 60 percent to 97 percent. It's clear that data contains the answers to an organization's challenges or goals. The question remains whether the organization has the tools to unearth the insights hidden in its data.

    Empowering decision makers through data

    The most important piece to the entire data chain is ensuring the right data insights get into the hands of decision makers at the right time. What use is accurate, analyzed data if it goes unused - as most of today's data does? Including the right stakeholders from across all business functions in the data conversations may unearth current challenges, as well as new opportunities that may have not otherwise been known. This is a step that many companies are now recognizing as crucial for success, which is why we will see more data consumed and put to use over the next three years.

    If they haven't already, executives and decision-makers at all levels should start looking at business operations through a data-centric lens. Companies that recognize and act on the fact that their competitive edge and profit growth lie in the insights hidden in their operational data can expect to see immediate ROI on their efforts to mine that data for golden insights. Those that do nothing about this now might just find themselves in a race to the bottom.

    Author: Ryan Wilkinson

    Source: TDWI

  • Pyramid Analytics: Main lessons learned from the data-driven drilling and production conference

    Pyramid Analytics: Main lessons learned from the data-driven drilling and production conference

    It was great to be at the data-driven drilling and production conference in Houston on June 11 and 12. The conference was well attended by hundreds of oil and gas (O&G) professionals looking to use technology to minimize downtime, enhance safety, and deliver digital transformation throughout their businesses.

    We talked to dozens of attendees looking to educate themselves about modern data collection and ingestion methods, better information management and integration processes, E&P automation & control systems, more efficient change management, drilling optimization techniques, and advanced and predictive analytics.

    As an analytics and BI vendor, we were there to learn more about how practitioners are using advanced analytics, particularly AI and machine learning, to extract more value out of their data.

    Three key themes

    In our conversations with attendees and other vendors, three key themes emerged:

    • The persistence of data silos

      No surprise here: data silos aren’t going anywhere. The upstream organizations we spoke to struggle with data sharing across departments. It’s a common scenario for users to have limited access to distributed data. It is also common for upstream organizations to perform analytics using numerous tools (many of the individuals we spoke to freely admitted to using three or four different BI tools). This perpetuates the cliché: there is no single version of the truth. The result is duplicate data, duplicate efforts for reporting, duplicate logic and business rules, and more. As a result, collaboration and efficiency suffer.
    • AI and ML operationalization remain elusive

      Many of the professionals we talked to lack effective systems for putting advanced analytics into production. Here’s a common scenario. A line-of-business user will throw data scientists a set of data and say, 'here’s the data, do your magic'. The data isn’t always optimized, so data scientists often spend time prepping the data before they can even analyze it. Then they analyze the data using standalone ML software applications before outputting a flat file and sending it to a business analyst to reload into one of several desktop-based BI applications. This results in a perpetual cycle of extracting, importing, analyzing, exporting, re-importing, and re-analyzing data. The whole process is cumbersome and inefficient, and the meaningful insights derived from AI and ML initiatives remain limited (a minimal sketch of a more integrated alternative follows this list).

    • It’s hard to move beyond legacy analytics systems 

      For many O&G companies, there is a strong desire to adopt new data and analytics technologies; they acknowledge legacy tools simply aren’t equipped to quickly accommodate newer sources of data or perform advanced and prescriptive analytics. However, the difficulty of migrating from legacy systems often holds people back, no matter how siloed their data environment is. Many organizations have had their current desktop-based analytics solutions in place for years, in some cases decades, and the huge store of analytic models, dashboards, and reports built up over that time cannot easily be migrated or re-created.
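
    To make the round-trip problem from the second theme concrete, here is a minimal sketch of a more integrated alternative, assuming pandas and scikit-learn and using invented drilling data: preparation and model live in one pipeline, so scoring new readings requires no flat-file export and re-import. This is an illustration, not any specific vendor's product.

    ```python
    import pandas as pd
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression

    # Hypothetical sensor readings; in the round-trip described above, prep,
    # modeling and reporting would each live in a separate tool and file.
    df = pd.DataFrame({"vibration": [0.2, 0.9, 0.4, 1.1, 0.3, 1.0],
                       "temp_c":    [60, 95, 70, 99, 65, 92],
                       "failure":   [0, 1, 0, 1, 0, 1]})

    # One pipeline keeps data prep and the model together, so scoring
    # new data needs no export/re-import step.
    model = Pipeline([("scale", StandardScaler()),
                      ("clf", LogisticRegression())])
    model.fit(df[["vibration", "temp_c"]], df["failure"])

    new_readings = pd.DataFrame({"vibration": [0.8], "temp_c": [90]})
    print(model.predict_proba(new_readings)[0, 1])  # failure risk, straight to the analyst
    ```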

    The three challenges identified above are tough. But that doesn’t make trying to solve them any less urgent. And from our perspective, this doesn’t make them any less solvable. The price of inaction is too high. No one can stand on the sidelines while the technology environment changes.

    Author: Brigette Casillas

    Source: Pyramid Analytics

  • Qlik: The consequences of insufficient data skills

    Qlik: The consequences of insufficient data skills

    A gap has emerged between the data-driven ambitions of organizations and the actual ability of employees to extract value from data. This is the conclusion of a new report by Accenture and Qlik entitled 'The Human Impact of Data Literacy', conducted on behalf of The Data Literacy Project.

    Data is a gold mine that can drive innovation and growth within a company. However, when employees struggle to understand data, it can affect productivity and business value. Accenture and Qlik surveyed 9,000 employees worldwide. The study shows that companies lose on average more than five working days (roughly 43 hours) per employee each year to stress caused by information, data and technology issues: employees put off their work, for example, or even call in sick. Worldwide, the costs run into the billions: $109.4 billion in the US; $15.16 billion in Japan; $13.17 billion in the United Kingdom; $10.9 billion in France; $9.4 billion in Australia; $4.6 billion in India; $3.7 billion in Singapore; $3.2 billion in Sweden; and $23.7 billion in Germany. Insufficient skills to interpret and analyze data affect an organization's ability to compete in a data-driven economy. Almost all surveyed employees (87%) see data as a valuable asset, yet only a minority actually uses data in decision making. Just a quarter of the surveyed employees believe they are fully equipped to use data effectively, and only 21% of respondents say they are confident in their data literacy: the ability to read, understand, question and work with data. Moreover, 37% of employees base their decisions on data, while almost half (48%) base decisions on their intuition.

    Skills and training

    A lack of data skills reduces productivity in the workplace. Three-quarters (74%) of surveyed employees report feeling overwhelmed or unhappy when working with data, which affects their overall performance. It leads, for example, to working with data being avoided: 36% resort to alternative methods. A large majority (61%) finds that the overload of data causes stress in the workplace. This also emerges elsewhere in the study: just under a third (31%) of the global workforce reports taking at least one day of sick leave because of stress around information, data and technology. 'No one doubts the value of data, but many companies need to reinvent their approach to data management, analysis and decision making. This means their workforce must have the skills and training needed to seize the new opportunities data offers', says Sanjeev Vohra, Group Technology Officer and Global Data Business Lead at Accenture. 'Data-driven companies that focus on continuous learning are more productive and gain a major competitive advantage'.

    Offering guidance in a data-driven world

    To remain successful in the data revolution, business leaders must ensure that their employees become more confident and more capable. Data-literate employees are quicker to say they have the confidence to make better decisions, and are in turn trusted more to make those decisions. Training in data literacy would benefit productivity, 37% of employees believe.

    Jordan Morrow, Global Head of Data Literacy at Qlik and chair of the Data Literacy Project Advisory Board, adds: 'Despite the fact that companies recognize the value of data for their business success, most still struggle to build teams that can actually capture that value. Instead of investing in employees' self-sufficiency in working with data, much attention has gone to providing self-service access to data. The problem is that without the right training or the necessary tools, employees cannot get anywhere with data. Think of it as fishing without rods, bait or nets: bringing someone to the waterside doesn't mean they can catch a fish'.

    Five steps

    The report 'The Human Impact of Data Literacy' by Qlik and Accenture lists five steps to help organizations shape their data literacy strategy and work toward a data-driven workforce, including clear data expectations and a culture of co-evolution.
    Qlik and Accenture are founders of the Data Literacy Project, which aims to further data literacy. This global community encourages in-depth discussion and develops the tools needed to shape a confident, successful and data-literate society.

    Source: BI-platform

  • Talend: A personal data lovestory

    Talend: A personal data lovestory

    When I was in my mid-twenties, I thought I had it all. I had just recently graduated from a top law school, passed the California Bar Exam, and was working as a junior associate at a prestigious San Francisco law firm. Three short years later, I had turned my back on law and embarked on a career in technology, which, after many twists and turns, including stints as an analyst at Gartner, positions at a number of start-ups (some of which were actually somewhat successful), and roles at some of the world’s largest companies (Dell and EMC), has landed me in my current position on Talend’s product marketing team.

    Over the years, I have been asked many times why I left the practice of law. My usual answer has always been what you would expect. Quality of life (i.e. no time left for a personal life), office politics (need to cozy up to the right partners to advance), and an unhealthy dislike for billable hours (who wants to document and charge for every minute of every day) were some of my go-to responses. But now that I have been working at Talend for more than half a year, I have realized that the true reason went much deeper than that. Let me try to explain.

    Talend provides data integration, quality and management solutions to organizations of all sizes, from smaller companies to some of the world’s largest enterprises. Our number one goal is to make sure that organizations have all the data they need to make the right decisions and take the right actions, whether it is to have more compelling engagements with customers, develop better products, make better strategic decisions or make more efficient and cost-effective operational decisions. And I believe in this goal. When you think about it, this is the exact opposite of what a lawyer does.

    A lawyer’s job (and I am speaking from the perspective of a trial lawyer, which is what I was) is to limit the data, evidence in legal parlance, that is used by the ultimate decision maker (whether a jury or a judge) to what favors your client’s side as much as possible. Through a variety of motions before a trial and objections during trial (think of terms like hearsay, prejudicial, or irrelevant that you have heard in numerous TV shows and movies), lawyers try to limit the data or evidence that should be considered in making the ultimate decision.

    While this seems to work fine in an adversarial situation, think what it would be like if your business decisions were made the same way. What if a company decided to develop one product over another because the product development team for the chosen product was able to limit what the other team could share with the executive decision makers? Or if a decision to expand into a new territory was made based on incomplete market data from all regions?

    I have always been a data head deep down. In college, my favorite class (and my highest grade) was statistics. Looking back on it, I think I realized at a subconscious level that limiting or hiding data was not what I wanted to do for a living. That’s why I find it so appropriate that I ultimately ended up at Talend, a company whose goal is the opposite.

    If you are guilty of being as data driven as I am and want to ensure that you have all the information you need to make the right decisions and take the right actions, consider how your organization can benefit from improved data transparency and data access.

    Author: Jason Tolu

    Source: Talend

  • The Real Business Intelligence Trends in 2022  

    The Real Business Intelligence Trends in 2022

    Many companies are still adapting to changed requirements due to the COVID-19 pandemic. Although the situation now seems less acute and more long-term changes toward a ‘new normal’ are on the horizon, day-to-day business is far from settled. Some companies are dealing with last year’s decline in orders, while others are coping with the ongoing supply chain disruptions or are still in the midst of adapting their business model to the changed requirements or better equipping themselves for possible future crises.

    A look at this year’s business intelligence trends reveals that companies are still working to position themselves well for the long term and are strengthening the foundations of their data usage. Rather than chasing hype, companies are addressing the root causes of their challenges (e.g., data quality) and tackling the holistic establishment of a data-driven culture.

    The BARC Data, BI and Analytics Trend Monitor 2022 illustrates which trends are currently regarded as important in addressing these challenges by a broad group of BI and analytics professionals. Their responses provide a comprehensive picture of regional, company and industry specific differences and offer insights into developments in the BI market and the future of BI.

    Our long-term comparisons also show how trends have developed, making it possible to separate hype from stable trends. BARC’s Data, BI and Analytics Trend Monitor 2022 reflects on the business intelligence, analytics and data management trends currently driving the market from a user perspective.

    The Most (and Least) Important Business Intelligence Trends in 2022

    We asked 2,396 users, consultants and vendors for their views on the most important BI, analytics and data management trends, delivering an up-to-date perspective on regional, company and industry-specific differences and providing comprehensive insights on the BI, analytics and data management market.

    Data quality/master data management, data-driven culture and data governance are the three topics that practitioners identified as the most important trends in their work.

    At the other end of the spectrum, mobile BI, augmented analytics and IoT data and analytics were voted as the least important of the twenty trends covered in BARC’s survey.

    Master data and data quality management in first position has retained this ranking over the last five years while the second most important trend, establishing a data-driven culture, has steadily increased in importance. 

    The significance of these two topics transcends individual regions and industry sectors. Establishing a data-driven culture is a trend that was newly introduced to the BARC Trend Monitor three years ago. Starting from fifth position in the first edition, it made its way up to third place in the last two years and is now ranked number two. 

    Data governance has also increased in importance. Having held fourth position for several years, it rose to number three this year. Data discovery and visualization and self-service analytics (ranked fourth and fifth respectively) have been equally consistent trends, but both have now taken a back seat to data-driven culture.

    Our View on the Results

    Master data and data quality management has been ranked as the most important trend for five years in a row now. The stability of this trend shows that practitioners consider good quality data significantly more relevant than other trend topics with a much broader presence in the media. It also reflects the fact that many organizations place high emphasis on master data and data quality management because they have not yet reached their goals.

    This is in line with findings of other BARC Surveys that repeatedly show that companies are constantly battling with insufficient data quality as a hurdle to making better use of data. Hence, master data and data quality management will remain very important and is also linked to the equally stable significance of data governance, which was ranked in fourth position for four consecutive years before climbing to third place this year.

    Establishing a data-driven culture has increased in importance and is now ranked as the second most important trend. Since its introduction to the Trend Monitor in 2019, this trend has always ranked among the top five and is constantly gaining in prominence. This can be explained by the rising awareness that fostering a data-driven culture is vital to realizing the full data potential of a company. 

    Data discovery and data visualization and self-service BI have slipped down the rankings slightly this year. However, being ranked fourth and fifth in our list of 20 topics underlines their importance to organizations. All the top trends combine organizational and technological elements, and they form a solid foundation on which most companies are keen to place great emphasis.

    The top five trends represent the foundation for organizations to manage their own data and make good use of it. Furthermore, they demonstrate that organizations are aware of the relevance of high quality data and its effective use. Organizations want to go beyond the collection of as much data as possible and actively use data to improve their decision making processes. This is also supported by data warehouse modernization, which holds on to sixth position this year. 

    Some trends have slightly increased in importance since last year (e.g., data catalogs and alerting). However, most have stayed the same or just changed one rank.

    There are some major shifts in the downward trends. Data preparation by business users dropped from rank seven to rank ten due to alerting and agile BI development climbing the rankings. Mobile BI also fell three places to rank eighteen. In this case, a continuous downward trend can be observed over the last four years.

    Source: Business Application Research Center (BARC)

  • Using the right workforce options to develop AI with the help of data

    Using the right workforce options to develop AI with the help of data

    While it may seem like artificial intelligence (AI) has hit the jackpot, a lot of work needs to be done before its potential can really come to life. In our modern take on the 20th century space race, AI developers are hard at work on the next big breakthrough that will solve a problem and establish their expertise in the market. It takes a lot of hard work for innovators to deliver on their vision for AI, and it’s the data that serves as the lifeblood for advancement.  

    One of the biggest challenges AI developers face today is processing all the data that feeds into machine learning systems, a process that requires a reliable workforce with relevant domain expertise and high standards for quality. To address these obstacles and get ahead, many innovators are taking a page from the enterprise playbook, where alternative workforce models can provide a competitive edge in a crowded market. 

    Alternative workforce options

    Deloitte’s 2018 Global Human Capital Trends study found that only 42% of organizations surveyed said their workforce is made up of traditional salaried employees. Employers expect their dependence on contract, freelance and gig workers to dramatically increase over the next few years. Accelerating this trend is the pressure business leaders face to improve their workforce ecosystem, as alternative workforce options give companies the possibility to advance services, move faster and leverage new skills. 

    While AI developers might be tempted to tap into new workforce solutions, identifying the right approach for their unique needs demands careful consideration. Here’s an overview of common workforce options and considerations for companies to select the right strategy for cleaning and structuring the messy, raw data that holds the potential to add rocket fuel to your AI efforts:

    • In-house employees: The first line of defense for most companies, internal teams can typically manage data needs with reasonably good quality. However, these processes often grow more difficult and costlier to manage as things progress, calling for a change of plans when it’s time to scale. That’s when companies are likely to turn to alternative workforce options to help structure data for AI development.
    • Contractors and freelancers: This is a common alternative to in-house teams, but business leaders will want to factor in the extra time it will take to source and manage their freelance team. One-third of Deloitte’s survey respondents said their human resources (HR) departments are not involved in sourcing (39%) or hiring (35%) decisions for contract employees, which 'suggests that these workers are not subject to the cultural, skills, and other forms of assessments used for full-time employees'. That can be a problem when it comes to ensuring quality work, so companies should allocate additional time for sourcing, training and management.
    • Crowdsourcing: Crowdsourcing leverages the cloud to send data tasks to a large number of people at once. Quality is established using consensus: several people complete the same task, and the answer provided by the majority of the workers is chosen as correct (a minimal consensus sketch follows this list). Crowd workers are paid based on the number of tasks they complete on the platform provided by the workforce vendor, so it can take more time to process data outputs than it would with an in-house team. This can make crowdsourcing a less viable option for companies that are looking to scale quickly, particularly if their work requires a high level of quality, as with data that provides the intelligence for a self-driving car, for example.
    • Managed cloud workers: This option, which has emerged over the last decade, combines the quality of a trained, in-house team with the scalability of the crowd. It is ideally suited to data work because dedicated teams stick with projects for longer and develop expertise in a company’s business rules over time. That means they can deepen their context and domain knowledge while providing consistently high data quality. However, teams need to be managed in ways that optimize productivity and engagement, and that takes know-how. Companies should look for partners with tested procedures for communication and process.
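
    As referenced above, here is a minimal sketch of consensus-based quality control, assuming several crowd workers label the same task; the labels and the agreement threshold are invented:

    ```python
    from collections import Counter

    # Consensus by majority vote: the same labeling task is sent to several
    # crowd workers, and the most common answer wins. Data is hypothetical.
    def consensus(answers, min_agreement=0.6):
        label, votes = Counter(answers).most_common(1)[0]
        agreement = votes / len(answers)
        # Below the threshold, the task would be re-queued for more workers.
        return label if agreement >= min_agreement else None

    print(consensus(["car", "car", "truck"]))  # 'car' (2 of 3 agree)
    print(consensus(["car", "truck"]))         # None: no majority, re-queue
    ```

    Tasks that fail to reach the agreement threshold would typically be routed to additional workers, which is one reason crowd output can take longer to process than an in-house team's.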

    Getting down to business

    From founders and data scientists to product owners and engineers, AI developers are fighting an uphill battle. They need all the support they can get, and that includes a dedicated team to process the data that serves as the lifeblood of AI and machine learning systems. When you combine the training and management challenges that AI developers face, workforce choices might just be the factor that determines success. With the right workforce strategy, companies will have the flexibility to respond to changes in market conditions, product development and business requirements.

    As with the space race, the pursuit of AI in the real world holds untold promise, but victory won’t come easy. Progress is hard-won, and innovators who identify strong workforce partners will have the tools and talent they need to test their models, fail faster and ultimately get it right sooner. Companies that make this process a priority now can ensure they’re in the best position to break away from the competition as the AI race continues.

    Author: Mark Sears

    Source: Dataconomy

  • From data-driven to data-informed decision making

    From data-driven to data-informed decision making

    Many organizations are just starting to make their decision making data driven; others are further along. At first glance, the more prominent place of Big Data and algorithms in organizational decision making seems a purely positive development. Who wouldn’t want to follow the customer journey, shorten lead times and be maximally agile? Who wouldn’t want smart algorithms that automate complex detective work and difficult decisions?

    However, decision making driven by Big Data and algorithms has a number of pitfalls: decisions that lean too heavily on data foster a culture in which employees are less critical, take less responsibility and trust their own knowledge and experience less. These pitfalls apply above all when the data and algorithms are not yet sufficiently developed, which is the case at many organizations. That is why we argue for 'data-informed' decision making, in which organizations strike a balance between data and algorithms on the one hand, and intuition, grounded in knowledge and experience, on the other. In this way of working, the employee is still in control. He does not hide behind data and algorithms, but uses them to make smarter decisions.

    The upside of data-driven decision making

    The Big Data revolution arose from the growing volume and richness of the data being collected and stored. Moreover, smart tooling makes extracting and analyzing data ever easier. Organizations such as Google, Tesla and the campaign teams of Hillary Clinton and Donald Trump have broken new ground with their data-driven decision making. Google, for example, uses Big Data and complex algorithms to optimize advertisements so that they match the target audience as closely as possible. Tesla uses sensors and Big Data to detect and fix technical problems remotely (or even predict and prevent them), making recalls a thing of the past. Such applications are not reserved for hip startups, scaled-up multinationals or presidential candidates with deep pockets. Anyone can start steering by data, for example by beginning with a single process or product.

    Dutch transport companies use a predictive model to determine the deployment of vehicles and staff. This helps them streamline mobility during peaks and gives them the chance to improve their service time and again. Energy companies use data for preventive maintenance and for making their processes more sustainable. Professional football clubs use data during matches to enhance the fan experience, by tracking players on the pitch or letting fans record and share footage via social media and smartphones.

    The pitfalls of data-driven decision making

    When organizations make decisions purely on the basis of data and algorithms, we call that 'data driven' or 'data centric'. Many processes and even decisions are (partly) automated, the human brain fades into the background and data takes center stage in decision making. When algorithms and data are not yet sufficiently developed, this increases the risk of the following pitfalls:

    • Assumptions are insufficiently tested;
    • Context knowledge is insufficiently applied;
    • The data is unreliable.

    Assumptions are insufficiently tested

    In the run-up to the economic crisis of 2008, many financial institutions steered by risk models that almost no one understood anymore. They grossly underestimated the risk of mortgage products. They hardly questioned the models, but used them to justify that they were acting correctly. The result: a systemic miscalculation that almost no one saw coming, with disastrous consequences.

    This example illustrates how risky it is to leave the assumptions behind algorithms untested by humans, and what happens when we lose trust in our own intuition. Intuition can be a valuable complement to data, because either one alone rarely covers the relevant reality.

    Context knowledge is insufficiently applied

    Statistics Netherlands (CBS) stated that the Dutch borrowed more in 2011, based on higher credit card spending. But what was really going on? The Dutch were ordering more products online, and the credit card was often the only available means of payment. CBS counted all credit card transactions as loans, including ordinary payments. In other words: someone who paid for a book or a plane ticket online with a credit card was, according to CBS, someone who could no longer borrow from the bank and therefore used their credit card.

    This example illustrates the danger of blindly following the data without context knowledge. With context knowledge, an analyst would have analyzed and interpreted at a lower level of detail (the type of credit card spending).
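
    A minimal sketch with invented transaction data shows the difference: counting every transaction as a loan repeats the mistake, while filtering on transaction type applies the context:

    ```python
    # Hypothetical credit card transactions. Counting them all as borrowing
    # repeats the mistake described above; filtering by type uses the context.
    transactions = [
        {"amount": 25.0,  "type": "online_purchase"},
        {"amount": 300.0, "type": "revolving_credit"},  # actual borrowing
        {"amount": 45.0,  "type": "online_purchase"},
    ]

    naive_lending = sum(t["amount"] for t in transactions)
    actual_lending = sum(t["amount"] for t in transactions
                         if t["type"] == "revolving_credit")
    print(naive_lending, actual_lending)  # 370.0 vs 300.0
    ```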

    The data is unreliable

    In the campaign for the 2016 US presidential elections, the teams of both Hillary Clinton and Donald Trump made eager use of Big Data and algorithms, among other things for accurate polling and the efficient deployment of campaign resources. Trump won, despite his limited budget (just half of Clinton’s). The story goes that the Clinton team’s data was less reliable: poll respondents did not dare admit to her team that they were going to vote for Trump. They were more honest with Trump’s team, which, against all the polls, saw the victory coming five days in advance.

    Trust in Big Data for election campaigns is now being questioned. Yet there was nothing wrong with the algorithms that were developed or with the approach in general; with unreliable data, however, they turn out to be of little value or even harmful. People can simply lie or give socially desirable answers, and it is not for nothing that the social sciences apply all kinds of strategies to minimize this. It is therefore important to test assumptions and data quality regularly.
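
    As a minimal illustration of such routine testing, the sketch below runs a basic completeness check over invented survey responses before they feed a model; real checks would also cover plausibility and consistency:

    ```python
    # Minimal recurring data-quality check before a model run; the fields
    # and the sample records are hypothetical.
    def quality_report(rows, required=("respondent_id", "answer")):
        issues = []
        for i, row in enumerate(rows):
            missing = [f for f in required if row.get(f) in (None, "")]
            if missing:
                issues.append((i, missing))
        return issues

    sample = [{"respondent_id": 1, "answer": "A"},
              {"respondent_id": 2, "answer": ""}]  # incomplete response
    print(quality_report(sample))  # [(1, ['answer'])]
    ```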

    Incorrect or incomplete knowledge can have disastrous and unethical consequences

    In the American justice system, automated data analysis is used to calculate the probability of recidivism. No human is involved anymore: the data is crunched and that determines whether or not someone is released early. Scientists speak of the doomsday scenario of fully automated justice. Professor of law and informatization Corien Prins: 'Because at a certain point it is out of your hands, and then you no longer have any say in it.'

    The importance of intuition

    Intuition is often seen as something vague or intangible. That has mainly to do with the definitions used: 'sensing something without thinking about it' or 'knowing by feeling, without having to think about it'. What is often forgotten is that intuition is built on knowledge and experience: the more knowledge and experience, the better developed the intuition. Intuition is called 'supra-rational'. After all, it works quickly, effortlessly and unconsciously, in contrast to the 'normal' rational thought process, which is slow, complex and conscious. In his book Blink: The Power of Thinking Without Thinking, Malcolm Gladwell described how certain art critics can see in a fraction of a second whether a painting is genuine or fake, without immediately being able to explain why. The development of artificial intelligence is not yet advanced enough to replace these experts.

    However, deciding on the basis of intuition or gut feeling has its limitations. We carry quite a few biases, and some truths are counter-intuitive. You think you only buy the groceries you really need; yet it turns out you regularly fall for 'buy three, pay for two' offers, so you regularly throw food away. Confirmation bias (tunnel vision) is a common one: we only see the data points that fit our view, and alternatives stand no chance. Moreover, as humans we are unable to analyze gigantic amounts of data in a short time without calculation errors, the way a computer can. For these human shortcomings, data and algorithms help us make better decisions.

    From data driven to data-informed

    As an organization, you should not settle for data alone or intuition alone. They are two sources that reinforce each other. What is the optimal balance? That is determined above all by the state of the technology. In areas where algorithms and artificial intelligence cannot yet replace intuition, it is wise to apply 'data-informed' decision making (see Figure). In this approach, data is not leading, as it is in data-driven decision making, but an enrichment of our own capabilities. After all, we ourselves are insufficiently able to know all the information, combine it, apply it and work flawlessly. We do have the qualities to weigh non-measurable factors, we know explanations and can give meaning to the data. And above all: we can take responsibility. Data provides us with information, but we also use intuition to make decisions. The same concept is applied in aviation: however well the autopilot works, the human pilot remains ultimately responsible, and his knowledge and experience are needed to make decisions based on what the aircraft proposes. Both working purely data driven and working purely on intuition have their limitations, so combine the best of both to make fast, well-founded decisions as an organization.

    Figure. Data driven and data-informed (illustration by Nick Leone, inspired by Fishman (2014), 'The Dangers of Data Driven Marketing').

    Case: Data-driven improvement at the Sociale Verzekeringsbank

    The Sociale Verzekeringsbank (SVB, the Dutch social insurance bank) wants to serve its customers optimally, which requires insight into the customer journey. Using Process Mining, the SVB maps the digital customer journey from data, across customer channels. This data is ultimately used to steer and improve the customer journey. The SVB formulated research questions about the expected customer journey, for example: 'How many customers who ultimately complete a transaction offline did visit the online portal?' and 'On which web page do customers drop off?' Data analysts generated insight into the actual customer journey. The analysis showed, for example, that more customers than expected dropped off from online to offline, and that they did so mainly on one specific web page in the portal. The results were interpreted by domain experts within the organization, who immediately indicated that the drop-off was most likely caused by an extra authentication step. Further analysis showed that this step appeared rather unexpectedly in the process: customers were not prepared for it, so they no longer understood it and/or were not willing to take an extra step. Based on the joint conclusions, improvement proposals were drawn up for the process, IT and web content, and their effectiveness was in turn tested through data analysis.
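
    A minimal sketch of the kind of drop-off analysis described above, using invented session data, counts the last page seen in each abandoned portal session. Process Mining tools work over full event logs, but the idea is the same:

    ```python
    from collections import Counter

    # Hypothetical portal sessions: pages visited, and whether the customer
    # completed the transaction online.
    sessions = [
        (["home", "login", "extra_auth"], False),
        (["home", "login", "extra_auth"], False),
        (["home", "login", "extra_auth", "confirm"], True),
        (["home", "login"], False),
    ]

    # For abandoned sessions, the last page seen is the drop-off point.
    drop_offs = Counter(pages[-1] for pages, completed in sessions if not completed)
    print(drop_offs.most_common())  # [('extra_auth', 2), ('login', 1)]
    ```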

    With data alone, the SVB would have gained little insight into the context of the customer journey and customers' motives, and no improvement would have been realized. And with intuition alone, there would have been far less insight into the actual customer journey. Customers often behave differently than expected, and not every behavior and motive of the customer can (yet) be captured in data.

    The basic ingredients of data-informed working

    A data-informed decision making culture is recognizable, besides the optimal use of data, by critical thinking, trust in one's own judgment and a (mutual) understanding of the why behind decisions. Part of this is periodic testing of the decision models, for example by regularly analyzing automated decision making processes after the fact, or by using feedback from customers and other stakeholders as input for your decision models. This culture of data-informed improvement requires a well-kept data infrastructure and expertise in the field of data science.
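
    As a minimal illustration of such an after-the-fact review, the sketch below compares a model's past approval decisions against the outcomes reported later; all records are invented:

    ```python
    # Minimal retrospective check of an automated decision process:
    # compare the model's past decisions with outcomes reported later.
    decisions = [  # (model said "approve"?, customer later defaulted?)
        (True, False), (True, False), (True, True),
        (False, False), (True, False),
    ]

    approved = [d for d in decisions if d[0]]
    error_rate = sum(1 for _, defaulted in approved if defaulted) / len(approved)
    print(f"Default rate among approvals: {error_rate:.0%}")  # 25% -> revisit the model
    ```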

    Finally, some concrete tips for data-informed decision making:

    • Make sure your workforce knows how to handle data. To be competitive as an organization, employees must be critical, able to perform and interpret complex analyses, and able to define actions.
    • Make sure you keep interpreting and testing data with your intuition, and vice versa. For example, work with hypotheses or research questions rather than hunting for arbitrary correlations. This sharpens your understanding of what the data really means and what is actually happening in the process or with the customer.
    • Innovate and explore with new data solutions in a 'sandbox' to encourage new analyses and analysis methods. Implement them once the solution has been tested and the quality of the data and the algorithm is in order.

    Source: managementsite.nl, 23 January 2017
