28 items tagged "Decision Making"

  • 3 Strategic Questions the Media Industry’s Future Depends On

    There is no question that the media industry is experiencing dramatic disruption on many fronts—in the way it creates content, distributes content to consumers, and monetizes audiences. These changes are driven by seismic shifts in consumer behavior and an explosion of both consumer- and B2B-facing technologies. The disruption reveals itself in the fast growth of newer content brands like Refinery29 and Vice, the increased use of technologies like Outbrain and Taboola to drive traffic, and the growth of programmatic approaches to advertising revenue. As we reflect on disruption across the industry, there are key strategic questions, all of which fundamentally consider balance:

    • What is the right balance between humans and technology across the full media and advertising ecosystem?
    • How do we maximize our creativity as an industry while integrating data-driven approaches?
    • When and how do we shift our businesses from legacy operating models to ones that better reflect the future?

    It is these tensions that now shape the most important considerations for advertisers, their agencies, and the media companies that convene audiences at scale.

    Humans and Automation

    The tension between human-driven and technology-driven capabilities is often miscast, positioning automation as a threat to the people that drive our industry. However, the more strategic opportunity is to enable humans to do what they do best and leverage technology to drive processes that are best served either by highly repeatable algorithmic tasks or by analytical complexity that surpasses the capacity of the human brain. If we draw the line carefully between these complementary approaches, we can unleash the talent in our organizations and apply humans to areas of growth and competitive differentiation. The grounding principle is: let humans do what humans do best, and let technology do what technology does best.

    Approaches to content, distribution, and monetization across the media industry all afford opportunities to explore the nuances of blending people and automation:

    Content and creative. In the content arena, long-form, quality journalism depends on the highest-caliber talent in reporting and editing. However, new technologies like CrowdTangle are better suited to spotting trends from social media to inform reporting and to identifying the optimal promotional mix for a news organization. National Public Radio and Upworthy are just two media organizations using CrowdTangle to power fast-moving social media trend analysis and news curation. Advertising also offers opportunities to blend humans and automation. At one end of the spectrum, the recent creative from Nike celebrating “losers” could only have come from the raw ideation of the best creative minds. At the automated end, hyper-targeted 1:1 digital campaigns may benefit not only from precision in finding consumers but perhaps also from dynamic creative strategies that match multiple creative options with precise audience targets, an approach that can only be executed at scale through technology, via tools like CPXi’s AdReady.

    Distribution strategies. In traditional marketing communications, a well-informed brief will shape a human-driven strategy and insight-guided planning process. Smart cross-functional teams cull through ideas on the best ways to find and influence consumers to embrace a particular perception or take a specific action. While that human ideation is still critical, the inputs get exponentially richer with the right use of data and technology. The transparency and volume of social media interactions, for example, enable us to look past traditional demographic or psychographic characteristics to find clusters of consumers or conversations that are defined by data science to have mathematical density and importance as real communities. Execution against such sophisticated targeting strategies is guided by human insight but also requires powerful data analysis and technology. In any data- or technology-driven process, if people don’t connect the dots between different parts of the strategy, add judgment and context to analyses, and help frame the questions that data enables us to answer, we will not achieve the right outcome. The balance is subtle and sophisticated.

    Monetization. Traditionally, marketplaces for media were largely created by people. Sales people from media companies, agency account teams serving brands, and the clients themselves connected demand with supply. Briefs from clients informed RFPs from agencies, which informed responses and pitches from media companies. Of course, at this moment, particularly for any media that is digital, supply and demand can intersect in real time via bidding in a range of auctions and exchanges. Programmatic technologies allow us to perform a match between placement and price in a highly dynamic, high-volume environment, as a complement to the people-driven processes. To declare that programmatic will become the entire marketplace of the future is too extreme and undervalues the balance between humans and technology. The future demands a balance between big ideas like sponsorships and branded entertainment that can only be developed through conversation and human ideation, and highly efficient media amplification strategies that can best be executed via technology.

    The more sophisticated our approach to balancing humans and technology, the more likely that we can simultaneously unleash the creativity and intelligence of our teams, while making them able to get more impact out of the ideas they create through scalable technologies.

    Creativity and Data

    Creativity sits at the core of brand stewardship, advertising, and content creation. Chief marketing officers and their teams contemplate and shape the brand attributes that best define the relationship between a product or service and its customers. Creative agencies unfurl their best ideas to make advertising memorable while informing or entertaining audiences. And at media companies, journalists, photographers, video producers, and illustrators bring their talent, skills, and experience to shaping stories and features, large and small. But the increased availability of data, and perhaps more importantly, the ability to derive meaningful insights from it, provide new opportunities to inform our creative ideas and to measure their impact.

    A willingness to embrace data strategies as part of a creative process can become a point of differentiation and advantage:

    Content creation. Whether we are shaping important news stories as journalists or producing award-winning advertising creative, storytelling is the means by which we connect messages to audiences in resonant, meaningful ways. Historically, content creation was an exclusively human process but careful blending of technology into the mix can drive even greater editorial or advertising success. Think of the story-building inspiration of a mood board, which is a collage of visual stimuli that evoke the essence, tone, identity, and intent of a potential advertising campaign hoping to reach a specific target audience. By contrast, contemplate the possibility of monitoring a data-defined cluster of that same target audience to evaluate the visual media (think Instagram, Pinterest, memes) that they might be sharing in real time on Twitter. A world of transparent social media engagement offers powerful new sources of insight into the content that most readily engages communities of customers. This data-driven approach yields a dynamic mood board algorithmically calculated based on tweet and retweet volumes within the target audience cluster (full credit to Scale Model at Betaworks for this concept). Data-driven processes cannot replace human creativity and judgment, but they can be a rich complement.

    Marketing strategies. A sharp creative mind can generate ideas to engage audiences by bringing stories to life. Revlon’s Times Square billboard, which projects real-time images of people gathered below on the street over the tagline “Love Is On” (also displayed online), would not have surfaced but for creatives who connected strategy with a means to bring the brand to life. It is one of countless examples where the sheer power of human ideas defines success. But as channels and platforms proliferate, it becomes less feasible to see creativity as the sole factor in deciding how to impact audiences—from media-mix models to precise digital targeting approaches. As the options continue to multiply, data becomes an objective means to evaluate potential strategies across paid, owned, and earned channels. And, instead of traditional demographic breaks dominating the media choices, data reveals more dynamic and meaningful views of audience segmentation to elicit true engagement. Still, while data can offer a starting point for more nuanced views of clustering, it is human judgment that allows us to discern the data worth a keener focus.

    Measurement. Data is used most robustly for measuring the results and impact of engagement strategies. Whether the metric is outcome-based, like sales, leads, and traffic, or more qualitative, like brand perception and lift, data is widely used to understand the effect of campaigns and to develop audience. And yet, measurement is perhaps the arena most challenged on the metrics front. The fragmentation of digital platforms has fostered a lack of consistent standards, and many of the most innovative experiences rely solely on proprietary publisher-owned metrics for reporting. Impact measurement only becomes more complex as ideas are executed across platforms. And most measurement scenarios do not offer a seamless view across paid, owned, and earned data sets. Yet even as better, more consistent measurement emerges, data will only take us so far in the journey to understand impact. Thoughtful analytics are best married to human judgment to derive insight laden with broader context. Ultimately, human judgment is best poised to truly understand the more subtle dimensions of brand equity and influence.

    Data enables us to free up time previously deployed against the manual parsing and review of the many marketing, communications, and media options, offering new opportunities to apply human creativity to bigger ideas that capture audiences’ imaginations.

    Future Transformation

    The future requires change on a massive scale for most organizations, and the best approach involves leadership embracing the complexity not only of developing the right strategy but of executing it with deep attention to the details that matter. Operating models can shift, but doing so requires a conscious approach to a range of issues, including organizational structure, workflows, technology platforms, and overall change management. The product mix can be re-architected to rely increasingly on newer and high-growth offerings, but not before buyers are ready to embrace the new opportunities beyond experimentation. The challenge is to lead the marketplace and be sure new supply connects with demand in real time. And from a financial perspective, investment decisions and revenue expectations require careful forecasting and pacing against expectations to understand the multifaceted shift from legacy business lines to newer ones.

    Disruption of the media industry often feels like a brute force, moving quickly and without discretion. However, the way we must respond as participants in the ecosystem is quite the opposite, requiring judicious, nuanced approaches. The critical concept is to balance the tensions to drive powerful results.

  • 5 Lessons for leaders aiming to disrupt their industry

    Being a disruptive leader takes vision, communication and balance. Here are five things I've learned as I've disrupted my industry.

    I am naturally a disruptive leader — someone who not only embraces change but thrives in it, driving innovation and transformation. I love pushing boundaries and surprising people with bold decisions that change our company and industry. Seeing your strategic vision and hard work pay off as your company pivots and the industry follows feels exhilarating. But it can also be exhausting and even damaging if you don't understand your decisions' true impact.

    Growing StoneAge from a small traditional manufacturing company that sells high-pressure waterblasting tools to an industry-leading automation and robotics firm, I've made good and bad decisions as we've disrupted ourselves and the markets we serve. Changing how people think and work is difficult, and it always takes longer than one thinks. I've pushed too hard, made decisions too quickly and intimidated people with our hard-charging approach during my mission to change our company, industry and the world.

    As we embark on our next phase of disruption, I'm taking what I've learned over the last decade and applying the lessons to our approach — and below, I'll share those lessons with you:

    1. Disruption requires planning and careful decision-making

    I tend to underestimate the effort and planning required to make a disruptive change. When we decided to change our sales model from distribution to direct (which was highly disruptive to our niche industry), we began executing less than eight weeks from making the decision. While the industry was surprised, and it took them a while to respond, I didn't fully prepare my team for the effort either. Through brute force, we made it happen, but it was far more difficult and costly than it had to be. I learned that a clearly articulated vision and a well-executed plan are far less taxing to the organization. We now operate off of a 2030 Vision and a well-organized plan that keeps us on track while allowing us to be agile as we learn and grow.

    2. Acknowledge uncertainty and risk

    It can be challenging for leaders to say "I don't have all the answers," but the truth is, we don't. Disruptive leaders have a vision for what can be, but there is no way to guarantee outcomes. Rather than pretend that the road ahead will be smooth, admit to your team that there will be challenges, mistakes and maybe even turmoil as you travel towards disruption. Admit that the company may have to pivot as new challenges pop up (ahem, Covid), and share with them how the company will handle it by making good decisions, staying focused on problem-solving and being agile and curious.

    3. Be transparent and honest

    When we changed our business model from distribution to direct sales, we were moving fast, and we didn't want anyone in the industry to get wind of what we were doing. So, the executive management team holed up in a conference room dubbed "The War Room" and mapped out our game plan. We decided to lock the door at night so no one could come in and see our whiteboards filled with ideas and decisions. We told our employees that this was top secret, and we wouldn't be sharing the plan until we had it figured out.

    The lack of transparency was a departure from how we usually do things. We are an open-book company and share our financials and strategy with everyone in the company. The lack of transparency scared people, and people felt left out and in the dark. In fact, on an employee survey a few months later, someone described feeling like a mushroom, left in the dark and fed poop. Don't let this happen. Be transparent and honest with your people. You are asking them to do hard things, because disruption isn't easy. Honor them by telling them the truth and trusting them to help you achieve your vision and plan.

    4. It takes time for seeds to grow

    It's challenging to be patient, and if you underestimate how long changing a company or industry takes, you might not let the seeds you planted grow. I made this mistake in the past, changing direction when I didn't see immediate results, and I had to slow down and backtrack to get the company on the right path again. Operating in line with our 2030 Vision, we now take a long-term view of progress and are more patient as we innovate. The results are remarkable; not only are we disrupting the industry, but we are also having a lot more fun while we do it. People are aligned and clear about our mission. They can see the results, making them feel part of something bigger than themselves. The seeds have sprouted and are developing buds, and we are nurturing the next growth phase.

    5. Embrace the paradox of change

    The Paradoxical Theory of Change states that the more you try to change by being something you are not, the more you stay right where you are. Organizations are the same. As you look to disrupt yourself, your company and your industry, don't forget who you are at your core. Stay true to your company values and your vision. Change happens when you build upon who you are as a company, not when you try to become an entirely different one. At my company, we use our 2030 Vision and what we call the "Recipe For Success," our core operating principles, to guide us as we assess new opportunities or consider new strategic initiatives while remaining true to who we are as a company. Doing so pays off; we can see where we want to go and feel confident to get there, because we know who we are.

    What excites me most as I continue to mature as a disruptive leader is the balance I am finding between long-term planning and living in the moment. The thrill of making an impact can take over your life, but if you are only living in the future, it isn't easy to enjoy the present. The goal is to find the sweet spot of both being present and enjoying the moment while at the same time setting yourself and your company up for success in the future.

    Author: Kerry Siggins

    Source: Entrepreneur

  • 7 Tips to integrate new research technologies in your organization

    The pandemic has accelerated the adoption of emerging research technologies. However, it's important to consider not just your current needs but also what these technologies actually do. Here are seven key tips for brands and agencies to consider when adopting new ResTech.

    ResTech is booming amid the pandemic. Recognising that they need ongoing insights at a time of overwhelming uncertainty and change, brands are investing in tech to get a real-time and accurate understanding of their customers.

    In today’s unpredictable landscape, using tech is no longer optional in research. And in many projects, emerging tech can play an important role in driving business outcomes.

    How ResTech is used alongside traditional techniques

    Let me demonstrate with an example. A global beverage brand approached us recently to get a more nuanced understanding of the desirability of its drinks. Pre-pandemic, the obvious techniques would have been in-person ethnography and in-person interviews and focus groups – techniques that are slow and time-consuming but appropriate for this type of work.

    New tech bridged the gap. We recruited hundreds of consumers for in-home missions and used the Rival Technologies platform to get quantitative and qualitative responses on their behaviours and attitudes. After gathering hundreds of videos in just a few days, we used facial coding software and voice tonal analysis to capture sentiment beyond what consumers said. Select participants joined us for virtual in-depth interviews, with real-time transcription software and text analytics aiding the analysis. The project involved five languages in seven countries and was executed from end to end in just a few weeks.

    This example illustrates the need to use mixed methodologies and numerous platforms to get a richer and more complete understanding of people’s attitudes, behaviours, and underlying motivations. Through our experience, we know that maximising the ROI of emerging research technologies is more of an art than a science.

    Seven strategies for integrating ResTech

    Below, find a few key learnings to consider.

    1. Purpose-fit the tech you are considering.

    Tech is never a complete replacement for a traditional research method. Don’t try to force a technology just because it’s cool. Business and research objectives should take precedence. Tech will do some parts of your research project better than traditional approaches, but it will also do other things poorly. Don’t compromise data quality for the sake of innovation.

    2. Prepare for a conversation.

    Tech companies care about the length of the relationship. Their business model is built around yearly subscriptions.

    On the other hand, agencies (and even client-side teams) think of tech on a project-by-project basis. Research teams care about sample size and operational costs, which are a lower focus for a software company. This puts the tech vendors and researchers at cross purposes.

    There’s no single way of doing this correctly, but understanding the tech company’s model and explaining what you’re after is a good first step.

    3. Find the right balance.

    Established vendors are safe bets, but their solutions operate within a box – customisations are often time-consuming and expensive, if at all possible.

    Newer players have more flexibility and can deliver value in surprising ways. Set aside budget and time to play with emerging methodologies because this is where innovation happens.

    That said, aim to have traditional methods as a fallback. Some Reach3 clients, for example, have done parallel studies using Rival and existing vendors to understand the differences between the two. Eventually, brands discover that the new technology delivers deeper, richer insights, but having the traditional tech available provides peace of mind in the early stages.

    4. Get support.

    DIY has its place and time, but if you’re using new tech, you need to make sure the vendor has invested enough in customer success. No matter how great the technology is, if it’s new in the market, you’ll likely run into unforeseen issues. Having solid operations and customer success teams that can help bring your creative ideas to life with their tech is crucial.

    5. Budget appropriately.

    Traditionally, research projects are costed by treating all expenses as direct costs, with a services margin overlaid on top to arrive at a profit. Tech is not a direct cost – it's an enabler that lets you do exponentially more. As an agency, if you use the traditional costing model, you will price yourself out of the running.

    While your first project will have a steep learning curve, over time there will be efficiencies and eventually, tech will save time and money.

    6. Be clear about liability.

    When working with customer data, be cognisant of who will take on the liability of the tech. Typically, the relationship is between the service provider (tech company) and the client (who owns/is responsible for the customer data). The research agency is not a natural fit in the mix. If you’re on the agency side, engage your client, the tech vendor, and your legal counsel early to navigate this discussion more smoothly.

    7. Experiment with care.

    You will occasionally come across a piece of tech that is amazing at doing something specific, which may seem like an exciting add-on to a research methodology. Be cautious when doing this. Unless you have proper dev support, expanding the application of the tech is a far bigger task than it might seem. Doing constrained, smaller experiments can help you validate new use cases without taking on a huge burden.

    Maximizing the value of new tech

    Deploying innovative research tech should never be scary or disruptive. If you’re a corporate researcher, the easiest way to test new techniques is to work with agencies who are already experts in new technologies you’re interested in.

    In the end, ResTech is best deployed by researchers who understand the business issue and know how to integrate technology in a fashion that automates and accelerates insights-generation so humans can do what they do best: use insights to create game-changing business recommendations.

    Author: Bala Rajan

    Source: GreenBook

  • Business Intelligence & Analytics Fueled by Decision Science

    The field of decision science focuses on making data-informed decisions. Decision science helps to analyze the impact of a decision on the business. The best decisions are often made with a combination of data and precise business questions. The more precise the questions, the more precise the data requirements will be.

    Harvard’s Center for Health Decision Science (CHDS) explains that this unique science is a “collection of quantitative techniques” applied to decision-making at both the individual and population levels. It includes “decision analysis, risk analysis, cost-benefit and cost-effectiveness analysis, constrained optimization, simulation modeling, and behavioral decision theory.” Further, “decision science provides a unique framework for understanding public health problems.” 

    However, decision science is not just applied to public health but also pricing decisions such as the optimal price for a product or service; product decisions such as measuring profitability vs. customer satisfaction; marketing decisions such as allocating budget across different marketing activities like public relations, advertising, or sales promotion; and finally, HR decisions such as hiring or firing decisions or performance evaluations. 

    When the right type and volume of data are used to make any of the above decisions, the decisions are far more likely to be accurate and effective. 

    Why Use Decision Science in Business Analytics?

    In businesses, different types of decisions are made daily. As decisions have direct impacts on business performance, they come with inherent risks as well as payoffs. Every time a business decision is made, the risks and potential benefits are quantified and measured. The process of making informed business decisions through a combination of quantitative data analysis, data visualization, and deep modeling techniques is known as decision science. 

    So, to put it in one sentence, decision science “is the process of analyzing the impact of a decision on a business.” The two primary components of decision science are data and a set of tools, which may be both qualitative and quantitative. Qualitative tools include content analytics or data visualization tools. In contrast, quantitative tools include statistical or machine learning (ML) solutions – for example, linear regression may be used to study the impact of advertising budget on sales growth. The data for each business case helps answer business questions, and the set of tools helps analyze the data for making informed decisions. 

    The Role of Data in Effective Decision-Making

    In a typical scenario, a business analyst may use sales data to predict the total number of customers likely to buy a product. If high-quality and high-volume data are available for this exercise, then this type of analytics can help in making multiple future decisions. 

    Data quality plays a critical role in decision science, without which the decisions will neither be reliable nor accurate. Another related requirement for effective decision science is a precise business question to narrow down the exact data sets.

    An infographic from KDNuggets.com explains how decision science differs from data science.

    While data science is an interdisciplinary field designed to extract insights from data, decision science involves the use of both qualitative and quantitative techniques to analyze data and insights for better business decisions. Though data is equally important for both the sciences, the approaches to data analysis and applied mechanisms are quite different.

    Using Scenario Analysis to Gauge Outcomes in Business Analytics

    In some business cases, the decision may involve identifying the customer adoption rate for a product or measuring the impact of change in a government policy on your business. In those cases, a scenario analysis may be used to compare two or more probable “outcomes” so that the most suitable decision is taken based on the result of the comparison. These outcomes may include a scenario describing what is most likely to happen, a scenario describing what is least likely to happen, and a third scenario describing the extreme that could happen.
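    Such a scenario comparison can be sketched as a probability-weighted expected value. All probabilities and adoption rates below are hypothetical:

```python
# Hypothetical scenarios for the adoption rate of a new product
scenarios = {
    "most likely":  {"probability": 0.60, "adoption_rate": 0.15},
    "least likely": {"probability": 0.10, "adoption_rate": 0.02},
    "extreme":      {"probability": 0.30, "adoption_rate": 0.35},
}

# Probability-weighted expected adoption rate across the three outcomes
expected = sum(s["probability"] * s["adoption_rate"] for s in scenarios.values())

# The spread between extremes shows how sensitive the decision is
rates = [s["adoption_rate"] for s in scenarios.values()]
spread = max(rates) - min(rates)

print(f"expected adoption: {expected:.1%}, scenario spread: {spread:.1%}")
```

    Weighting each outcome by its probability yields a single figure to compare against alternatives; it is usually worth checking how sensitive that figure is to the assumed probabilities before acting on it.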

    Use of Statistics in Determining Outcomes in Business Analytics

    A “statistically significant” result indicates that a particular finding is unlikely to have occurred by chance alone. This type of analysis can be applied to both qualitative and quantitative data. A good example of qualitative analysis is a survey to gauge customer sentiment. The results of this survey will help identify whether the customers are satisfied or dissatisfied with your business. 

    Statistical analysis also yields a confidence interval around the survey results. The confidence interval quantifies the uncertainty in the estimate and can be computed for any survey question. 
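    As a concrete sketch, a 95% confidence interval around a survey proportion can be computed with the normal approximation; the survey numbers below are invented for illustration:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation confidence interval for a survey proportion."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical survey: 312 of 400 respondents report being satisfied
low, high = proportion_ci(312, 400)
print(f"satisfied: 78.0%, 95% CI: [{low:.1%}, {high:.1%}]")
```

    With n = 400 the margin works out to roughly four percentage points; for small samples, or proportions near 0 or 1, a Wilson score interval is the safer choice.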

    Data-Driven Decision Making: Benefits of Decision Science 

    Business decisions that are based on data are more likely to be successful than decisions made without data. This is especially true for large decisions that will have a significant impact on the future of the business. 

    The typical benefits of data-driven decision-making are increased certainty around outcomes, increased chances of outcomes matching your expectations, and enhanced understanding of customers. As you gain a better understanding of customers and competitors through data-informed decisions, the odds of making wrong decisions are substantially reduced. 

    Here are some major benefits of using decision science in an organization:

    • It helps businesses make unbiased, data-informed decisions. 
    • When used with decision support systems, decision science can enable enhanced interpretations and effective decisions promptly. 
    • It can offer a competitive edge in a business environment requiring intelligent data interpretations. 
    • It helps senior management identify uncertainties, value outcomes, and other issues involved in business decisions. 
    • Decision science often helps compare available alternatives and zero in on the optimal solution.

    The Decision Science Role

    In decision science, the analyst takes a “360 view” of the business challenge. By combining different types of data analysis, data visualizations, and behavioral understanding of customers, the decision scientist can make specific, data-informed decisions. 

    The average decision scientist works with various data sources, insights, and highly specific business questions to make business decisions. So, the decision scientist must be a superior data analyst and be skilled in business. The decision scientist analyzes insights as they relate to specific business problems at hand. 


    Decision science is frequently used in the military, business, government, law and education, public health, and public policy. CHDS uses decision analytics to create policies designed to improve population studies through “systematic integration of scientific evidence” to measure the value of outcomes such as mortality rates, quality of life, and costs.

    In the future, data science will progress toward more automation and further evolution of AI-enabled platforms, including augmented reality, robotization of industry processes, and reinforcement learning. In sharp contrast, decision science will move toward automated decision-making and data empowerment. The rising importance of decision science in industries will lead to increasing demand for specialists.

    Author: Paramita Ghosh

    Source: Dataversity

  • Business Intelligence still hot…

    Business Intelligence outdated? Nothing could be further from the truth, as proved by the Heliview congress 'Decision making by smart technologies', held last Tuesday in the Brabanthallen in Den Bosch.

    200 client organizations listened to presentations by, among others, Rick van der Lans, Peter Jager, Frank de Nijs, and Arent van 't Spijker. Alongside the familiar messages, there was also plenty of news to hear in Den Bosch.

    New technologies make far more possible. Social media and modern big data technology enable organizations to extract much more value from their data. How organizations should go about that is often still a challenge. Applying the technology is not a goal in itself; the point is to produce added value for the organization, whether through process optimization or by serving customers better through product development. In the extreme, data can even be the engine behind new business concepts or models. A prerequisite, however, is a clear business vision (whether or not produced through intelligent use of data and information). That is important to avoid blowing millions on new technology without direction.

    For the attendees, some of what they heard was familiar, but some of it confronted them with themselves. One thing is certain: the role of data and information in doing business intelligently is far from played out. Business Intelligence is alive and well.

    30 JANUARY 2015

  • Context is key for organizations making data-driven decisions

    As organizations enter a new year, leaders across industries are increasingly collecting more data to drive innovative growth strategies. Yet to move forward effectively, these organizations need greater context around their data to make accurate and streamlined decisions.

    A recent Data in Context research study found that more than 95% of organizations suffer from a data decision gap, which is the inability to bring together internal and external data for effective decision-making. This gap imposes a number of challenges on organizations, including regulatory scrutiny and compliance issues, missed customer experience opportunities, employee retention problems, and resource drainage due to increased manual data workload.

    While the influx of data is endless, organizations that fail to obtain a holistic, contextual view of complete datasets remain at risk for ineffective decision-making and financial waste. However, with the proper systems and technologies in place, companies can overcome the data decision gap to foster success in 2022.

    Siloed Systems Create Fragmented Data

    Fragmented data and disorganized internal systems have plagued companies for years, making it difficult for organizations to harness the full potential of their data due to a lack of context. Information technology has also drastically evolved, presenting companies with hundreds of different applications to choose from for storing data. However, this range of multiple siloed systems can create disparities in data.

    For example, financial services organizations might utilize different systems for each of the products they offer to customers and those systems might not be joined together on the back end. When trying to make informed decisions about a given customer, financial services professionals will need to consider all the available data on that customer to take the right course of action – but they can do so only if they are able to look at that data holistically. Without a single customer view in place, financial and other institutions might struggle to address customer needs, creating negative experiences.

    To combat this issue, organizations need their data to move across systems in real-time feeds. Lags in data processing create missed customer opportunities if employees cannot access the latest view of up-to-date information. However, the right technologies can take fragmented data and make it accessible to individuals across a company, giving multiple employees comprehensive views of timely data.

    Outdated Data Impacts Employee Workloads

    With data constantly evolving, organizations need to implement effective Data Management systems to ensure employees are equipped with the time and knowledge they need to navigate through data seamlessly. Data can become outdated at a fast rate, and manually monitoring for these changes requires sustained energy from employees, which can prevent them from utilizing their time and talents in more productive ways. This can lead to burnout and generate retention issues. 

    Tools like artificial intelligence, entity resolution, and network generation can solve this by updating datasets in real time, giving employees more time to manage their workloads, conduct investigations, and pursue efforts to create stellar customer experiences. Not only do these technologies help improve employee routines, but they are also the key to cleaning up data, catching fraud, and enabling organizations to avoid regulatory and compliance issues.

    Regulatory Scrutiny and Compliance Issues

    The aforementioned study found that nearly half of respondents experienced issues with regulatory scrutiny and compliance efforts as a result of the data decision gap. This comes as no surprise given that organizations are required to have appropriate controls on data, especially in industries like financial services.

    Within financial services, regulators are enforcing stricter rules for organizations to remain compliant with their Anti-Money Laundering (AML) and Know Your Customer (KYC) models. While teams may attempt to keep customer records up to date by leveraging different systems, the underlying problem is data lineage and data quality. When regulators see any inconsistencies in a company’s data, they impose costly fines or freezes in operations until the data is sorted, creating major setbacks both internally and externally. 

    Inconsistencies in data create a lack of trust, which can spark differing views around company operations. This leads to discussions over issues that could have been better managed if a more comprehensive and accessible view of data had been available from the outset. 

    Final Thoughts

    In a world where data will continue to grow exponentially over the next several years, organizations must work to overcome the data decision gap. Organizations will always face challenges as internal and external circumstances continue to evolve, but by adopting technologies and processes to ensure data is always reflective of the latest developments, they can make the best possible decisions.

    Author: Dan Onions

    Source: Dataversity

  • Data access: the key to better decision making

    When employees have better access to data, they end up making better decisions.

    Companies across sectors are already well in the habit of collecting relevant historical and business data to make projections and forecast the unknown future. They’re collecting this data at such a scale that 'big data' has become a buzzword technology. They want lots of it because they want an edge wherever they can get it. Who wouldn’t?

    But it’s not only the quantity and quality of the data a company collects that play a pivotal role in how that company moves forward, it’s also a question of access. When businesses democratize access to that data such that it’s accessible to workers throughout a hierarchy (and those workers end up actually interacting with it), it increases the quality of decisions made on lower rungs of the ladder. Those decisions end up being more often data-informed, and data is power.

    But that’s easier said than done lately. Businesses have no issue collecting data nowadays, but they do tend to keep it cordoned off.

    Data sticks to the top of a business hierarchy

    A business’s C-suite (often with help from a technical data science team) makes the big-picture decisions that guide the company’s overall development. This means the employees using data to inform a chosen course of action (like last year’s revenue versus this year’s revenue, or a certain client’s most common order) are either highly ranked within the company, or are wonky data specialists. Data lives behind a velvet rope, so to speak.

    But this data would be eminently useful to people throughout an organization, regardless of their rank or tenure. Such a level of access would make it more likely that data guides every decision, and that would lead to more desirable business outcomes over time. It might even overtly motivate employees by subtly reinforcing the idea that results are tracked and measured.

    Data tends not to trickle down to the appropriate sources

    Who better to have a clear view of the business landscape than the employees who work the front lines every day? What would change if employees scattered throughout an organization suddenly had access to actionable data points? These are the people positioned to actually make a tweak or optimization from the get-go. Whoever comes up with a data-informed strategy, these are the people who actually implement it. But an organization-level awareness of an actionable data point doesn't necessarily equate to action.

    As previously established, data has a high center of gravity. It is managerial food for thought on the way to designing and executing longer-term business strategies.

    But when companies change their culture around access to data and make it easy for everyone to interact with data, they make every worker think like such a strategist.

    By the time a piece of data reaches the appropriate person, it's not necessarily in a form he or she can interact with or understand

    As much as managers might like to think otherwise, there are people in their organization thinking in less than granular terms. They aren't necessarily thinking about the costs their actions may be imposing on the company, or about the overall bottom line. That's why it's important for data to be in a form that people can use and understand, because it doesn't always reach them that way.

    Getting data into a usable, understandable form happens by preserving connections between departments and avoiding disconnects.

    There seems to be a big data disconnect at the intersection of engineering and product development

    This intersection is where a business's technical prowess meets its ability to design a great product. While the two pursuits are clearly related on the way to great product design, it's rare for one person to excel at both.

    The people who design groundbreaking machine learning algorithms aren’t necessarily the people who design a groundbreaking consumer product, and vice versa. They need each other’s help to understand each other.

    But data is the shared language that makes understanding possible. Not everyone has years of data science training, not everyone has business leadership experience, but even people doing menial things can still benefit from great access to data. Coming across the year’s growth goal, for example, might trigger a needle-moving idea from someone on how to actually get there. Great things happen when employees build a shared understanding of the raw numbers that drive everything they do.

    Businesses already collect vast amounts of data in the course of their day-to-day operations. But they could start using that data more effectively by bringing it out from behind the curtain and giving employees across the board easy access to it. The motivation for doing so should be clear: when more people think about the same problem in the same terms, that problem is more likely to be solved.

    All they need is access to the data that makes it possible.

    Author: Simone Di Somma

    Source: Insidebigdata

  • Data integration applied to BI: making data useful for decision making

    In this technology-driven world, the influx of data can seem overwhelming, if not properly utilized. With data coming in from so many different sources, the only way to extract real insights from these raw inputs is through integration.

    Properly integrated data has a trickle-down effect on all business processes, such as sales, vendor acquisition, customer management, business intelligence, etc. Implementing this level of integration enables businesses to make continuous improvements to their products and services.

    Business intelligence (BI) is one of the most significant data integration use cases. An effective BI process incorporates everything from predictive analytics to reporting and operations management. But this sort of comprehensive analytics framework requires integrated enterprise data to identify process inefficiencies, missed opportunities, and other improvement areas.

    What complicates BI integration?

    Given that enterprise information comes from different sources in varying formats and often contains inconsistencies, duplicates, and errors, users must ensure that quality issues identified during the data extraction process do not propagate to their end results. Unchecked, these issues undermine the integrity and accuracy of reporting, which in turn degrades decision making and leads to further inefficiencies across business processes.

    Creating well-defined integration processes that not only consolidate data but standardize it for consistency and quality can make high-quality data readily available for decision making.

    Streamlining BI integration: best practices

    Raw data becomes valuable when transformed into analytics-ready, actionable information. By bringing disparate formats together into a unified data repository, an integrated BI system offers better visibility and efficiency into the enterprise assets.

    Therefore, successful BI initiatives are a combination of an effective integration and analytics strategy. The best practices stated below can help you make the best of it:

    Document a BI strategy

    Every business has a reporting process in place. Before implementing a new BI strategy, it’s important to evaluate existing systems to identify the areas that need improvement. Based on that information, you can design a new strategy, which can include several components depending on your specific business structure. However, the major ones that cannot be ignored include the following:

    • Narrow down the data source channels essential for your reporting process. This may consist of stakeholder or departmental information from databases, files, or web sources.
    • The purpose of BI tools is to track business KPIs with supporting data. Identifying the custom KPIs for your organization is therefore imperative to present a broad picture of your business's growth and losses.
    • Set a format for reporting: visual or textual. Based on your preferences and the input sources, you can select a vendor for the BI system.

    Set up data integration tools

    The integration stage of the entire process will be time-consuming. You can go about it in two ways:

    • Opt for the manual approach, where you rely on your developers and IT team to develop a BI architecture for your custom requirements.
    • The simpler and faster approach is to buy an enterprise-ready integration solution from the market. These solutions extract data from different sources using built-in connectors, transform it into the required format, and load it into the destination system connected to your BI tools. Several data integration solutions offer out-of-the-box connectivity to BI tools, so purchasing one serves the dual purpose of integration and reporting.
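
    Whether built in-house or bought off the shelf, these solutions follow the same extract-transform-load (ETL) pattern. Below is a minimal sketch in Python; the source formats, field names, and in-memory "warehouse" are hypothetical stand-ins for real connectors and destination systems.

    ```python
    # A minimal sketch of the extract-transform-load pattern. All source
    # systems, field names, and the destination are illustrative only.

    def extract(sources):
        """Pull raw records from several source systems (here: plain lists)."""
        for source in sources:
            yield from source

    def transform(records):
        """Standardize fields so downstream BI tools see one consistent format."""
        for r in records:
            yield {
                # The two sources name the customer field differently.
                "customer": r.get("customer", r.get("cust_name", "")).strip().title(),
                "amount": round(float(r.get("amount", 0)), 2),
            }

    def load(records, destination):
        """Append cleaned records to the destination store (here: a list)."""
        destination.extend(records)

    # Two hypothetical source systems with inconsistent formats.
    crm = [{"customer": " alice smith ", "amount": "19.99"}]
    billing = [{"cust_name": "BOB JONES", "amount": 42}]

    warehouse = []
    load(transform(extract([crm, billing])), warehouse)
    ```

    An enterprise integration product wraps the same three steps behind built-in connectors and a visual interface, but the logic of consolidating and standardizing data before it reaches BI tools is identical.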

    Factor in data security

    Setting up security measures before implementing BI is imperative in protecting your information assets against data breaches. By configuring authorization or authentication protocols and outlining procedures to carry out secure data processes, you can control access to data sets.

    BI is no longer a privilege for enterprises; it's a necessity that enables organizations to stay ahead of the competition and optimize decision-making.

    Identifying the challenges in their reporting journey and implementing the best practices mentioned above will help organizations leverage the BI capabilities and become data-focused.

    Author: Ibrahim Surani

    Source: Dataversity

  • Do We Need Decision Scientists?

    Twenty years ago, there was a great reckoning in the market research industry. After spending decades mastering the collection and analysis of survey data, the banality of research-backed statements like “consumers don’t like unhealthy products” belied the promise of consumer understanding. Instead of actionable insights, business leaders received detailed reports filled with charts and tables providing statistically proven support for research findings that did little to help decision-makers figure out what to do.

    So, the market research industry transformed itself into an insight industry over the past twenty years. To meet the promise of consumer understanding, market researchers are now focusing on applying business knowledge and synthesizing data to drive better insights and ultimately better decisions.

    A similar reckoning is at hand for data scientists and the AI models they create. Data scientists are mathematical programmer geeks that can work near-miracles with massive complex data sets. But their focus on complex data is much like the old market research focus on surveying consumers – without a clear focus on applying business knowledge to support decision-making, the results are often as banal as the old market research reports. Improving data and text mining techniques and renaming them Machine Learning, then piling on more data and calling it Big Data, doesn’t change that fundamental problem.

    Today’s data scientists can learn a lot from companies like Johnson & Johnson, Colgate, and Bayer. These leaders have successfully transformed their market research functions into insight generators and decision enablers by combining analytical tools with the business skills required to drive better decisions.

    Data scientists could follow a similar path, but what if we took a much bigger leap?

    Leaping to Decision Scientists

    According to Merriam-Webster, science is “knowledge about or study of the natural world based on facts learned through experiments and observation.”

    Applying that definition to data science highlights the critical disconnect. Data scientists do not exist inside companies to study data – they are there to generate knowledge to help business decision-makers make better decisions. By itself, collecting more data and analyzing it faster does not result in objective knowledge that can be relied on for better decision-making. The science of data is not the science of business.

    Now imagine the evolution of a new decision scientist role. Since decision-making effectiveness is almost perfectly correlated with business performance, a focus on studying and building knowledge about business decisions and decision-making will directly advance business goals.

    Rather than studying how to generate, process and analyze generalized business data, tomorrow’s decision scientists will focus on tracking, understanding and improving business decision-making.

    Decision scientists will answer critical business questions.

    Decision scientists will live in an exciting new world where the current lack of scientific knowledge about business decision-making means major discoveries will happen every day. Even better, every decision is a natural experiment, setting the stage for incredibly rapid learning. Finally, businesses will benefit directly by applying new understanding to make better, faster decisions.

    Decision scientists will focus on mapping the structure of business decisions and understanding the process business people use to make decisions, including the mix of data, experience and intuition needed to guide recommendations and decisions that deliver business value. 

    Decision Scientists Will Decode the DNA of Business

    By mapping the decisions that drive a business and then tracking those decisions’ inputs and outputs, decision scientists can bring decision-centric, business-focused scientific direction to the disconnected layers of today’s data-centric world. Much like a complete map of our DNA highlights the genes and interventions important for better health, a comprehensive map of our decisions can focus efforts to drive the most business impact.

    • Decisions – Who makes decisions, and how are their choices made? How can we measure decision success and map that back to the optimal mix of decision inputs and most effective decision processes?
    • Business Issues – How can decisions be used to model a predictable connection between the inputs and outputs of a business? How are business goals connected, via decisions, from the CEO to the front-line manager?
    • Insights – What is the scientific definition of an insight? What is the best role for human synthesis in generating insights, and can that human role be modeled and even automated for faster, more efficient insight generation?
    • Analytics – What analysis and mathematical models are needed to support critical decisions? How can we align top-down analysis that starts with key decisions and business issues with bottom-up analysis that starts with key variables and data sources?

    Decision scientists will shift the focus from the science of inputs (data and analysis) to the science of outputs (recommendations and decisions). Of course, data science will continue as an important activity, except now it will be directed not only by the technical challenges of complex data sets but also by the complex needs of decision-makers. This shift will significantly improve the business value of data and analytics, making tomorrow’s decision scientists an indispensable business resource.

    Author: Erik Larson

    Source: Forbes

  • Everything you need to know about a database management system and its uses

    Strong database management facilitates fast and effective business decision-making.

    Data drives everyday decision-making to help businesses complete tasks and accomplish their goals. Therefore, it requires proper management. But how do you effectively manage business data to ensure quick decision-making and smooth workflows? Using a database management system is the answer.

    A database management system makes it easier to store, organize, and share data across your business departments. It pulls data from the various tools, platforms, and applications your business uses and centralizes its storage so it can be easily searched and retrieved. It also eliminates risks such as data loss that delay or disrupt daily workflows.

    If you’re someone who works with data day in and day out or who relates to the everyday challenges of managing databases, this blog is for you. We explain what a database management system is and how you can use it to ensure data integrity and streamline data management processes.

    What is a database management system?

    A database management system is a software platform that helps you store and organize data. It creates a single centralized data source that can be used by stakeholders across departments. It combines the capabilities of data manipulation, analytics, and reporting to ensure better use of key data points.

    A database management system acts as an interface between your databases and employees. Employees can add, update, access, and delete data in the databases, based on the levels of permissions you assign to them. You can use database management software for:

    • Data management: Store, manage, categorize, and update business data.
    • Data retrieval: Find specific data points using the search functionality.
    • Queries: Run queries to perform specific actions such as calculations.
    • Data replication: Create duplicate instances of data and use them as a distributed database among employees.
    • Data security: Ensure data is secure from malicious attacks, unauthorized access, and mishaps such as accidental deletion.
    • Data conversion: Transfer data from one database to another—also known as data migration.

    Why do you need a database management system?

    For people like you who depend on data to get their jobs done, using a database management system has multiple benefits. It assists with structured data management to ensure easy access and sharing. It also frees you from time-consuming manual processing tasks such as finding a specific data point and sharing it with employees.

    In addition, database management software ensures business data is shared only with relevant internal or external stakeholders. This helps mitigate risks such as information loss or unauthorized access.

    Here are a few benefits of implementing a database system into your work processes:

    • Increases productivity due to fewer data-related errors
    • Speeds up decision-making with timely and uninterrupted access to data
    • Improves data sharing and security by allowing access to only authorized users

    Your business’s need for database management software depends on how your employees use data. For instance, some might use it for daily research (normal priority), while others might use it to develop software tools (high priority). Keep such usage scenarios in mind when deciding whether or not to use database management systems.

    Types of database management systems

    1. Relational database management system

    A relational database is a collection of data that is related to each other so different data points can be combined for better usability. The related points could be time, data, or logic, and the relation can be categorized in the following ways:

    • One to one: A data point in one table is related to a single data point in another table.
    • One to many: A data point in one table is related to multiple data points in another table.
    • Many to one: Multiple data points in one table are related to a data point in another table.
    • Many to many: Multiple data points in one table are related to multiple data points in another table.

    A relational database management system is software that manages the storage and shareability of relational databases. It organizes data in a relational database by forming functional dependencies between multiple data points. It also stores data in an organized manner so it’s easier for employees to find and use data for their daily tasks.

    A relational data structure uses structured query language (SQL) to allow employees to run queries and find the information they need. A relational database management system typically:

    • Stores large volumes of data
    • Enables fast data-fetching
    • Allows users to simultaneously access multiple data elements
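
    As a concrete illustration, Python's built-in sqlite3 module can sketch both a one-to-many relationship and an SQL query over it. The table and column names below are illustrative only.

    ```python
    # A minimal sketch of a relational one-to-many model and an SQL query,
    # using Python's built-in sqlite3 module. Schema and data are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customers(id),  -- one customer, many orders
            total REAL
        );
        INSERT INTO customers VALUES (1, 'Acme Corp'), (2, 'Globex');
        INSERT INTO orders VALUES (1, 1, 250.0), (2, 1, 99.5), (3, 2, 40.0);
    """)

    # Join the related tables to answer a business question in one query:
    # how many orders has each customer placed, and for what total amount?
    rows = conn.execute("""
        SELECT c.name, COUNT(o.id), SUM(o.total)
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.name ORDER BY c.name
    """).fetchall()
    ```

    The JOIN pulls the related rows back together, which is exactly the "combining different data points for better usability" that relational systems promise.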

    2. Object-oriented database management system

    An object-oriented database is a collection of data that is presented in the form of an object. Multiple data points are combined into a single unit or object, making it easier for employees to find and use data. This type of database is used to accomplish high-performance tasks, such as software development and programming, that require faster decision-making.

    An object-oriented database management system is software that stores and manages databases as objects. It allows employees to look for complete objects instead of individual data points, resulting in a quicker search. An object-oriented database structure typically:

    • Maintains a direct relationship between database objects and real-world scenarios so the objects don’t lose their purpose
    • Provides an object identifier for employees to quickly locate objects and use them
    • Handles different data types such as pictures, text, and graphics

    3. Hierarchical database management system

    A hierarchical database is a collection of data that is organized into a tree-like structure wherein the stored data is connected through links and arranged from top to bottom. The primary data point is at the top, and the secondary data points follow in hierarchy depending on their relevance. Your business’s organizational structure is a perfect example of a hierarchical database.

    A hierarchical database management system is software that stores and manages hierarchical databases. It maintains accuracy in data hierarchy or flow based on the usage in work processes. Data within a hierarchical system is typically:

    • Easy to add and delete
    • Easy to search and retrieve
    • Organized in a one-to-many relational data model

    4. Network database management system

    A network database is a collection of data where each data point is connected to multiple primary and secondary data points. Having interconnected data points makes this data model more flexible in terms of usage.

    A network database management system is software that stores and manages the interrelated data points in a network database. This software was built to overcome a shortcoming of the hierarchical database model, which doesn't allow interconnections between data points beyond the top-to-bottom flow. A network database system typically:

    • Facilitates quick data access
    • Supports many-to-many relational database models
    • Allows users to create and manage complex database structures

    Who uses a database management system?

    Below, we share a couple of examples of professionals who use a database management system. Please note that these are just a few examples; there are many other professionals for whom data is a top priority in accomplishing their tasks.



    Application programmers

    These are professionals who interact with databases to develop software apps and tools. They mostly use an object-oriented database management system to write code and then convert it into objects for better usability. Converting large codebases into smaller objects makes things less confusing for application programmers, especially when checking the performance of the developed applications.

    Data analysts

    These are professionals who collect raw business data and organize it into a database. They mostly use SQL in a relational database management system to identify raw data, draw valuable insights from it, and convert the insights into action points to impact business decision-making.

    DBMS software applications are also used in the following industry functions:

    • Railway reservation systems: A database management system is used to manage information such as ticket bookings, train timings, and arrival/departure status.
    • Library management: A database management system is used in libraries to manage the list of books. This includes keeping track of issuing dates, patron names, and author names.
    • Banking and finance: A database management system is used to manage the list of bank transactions, mode of payments, account details, and more.
    • Educational institutions: A database management system is used to manage the list of students, classes, lecture timings, and the number of hours logged in by both teachers and students.

    Use database management systems to enhance business decision-making

    Data is key to better decision-making, and efficient database management is key to getting data right. Therefore, it’s essential to manage your business data for effective usage, accessibility, and security.

    Author: Saumya Srivastava

    Source: Capterra

  • Finance forced to modernize by digitalization


    Driven by technology and growing transparency requirements, finance is becoming the data hub of the organization. How does that work in practice? Three CFOs offer a look behind the scenes.

    The finance function is modernizing. With real-time analytics, data available anytime and anywhere, new collaboration tools, and a considerable dose of adaptability, it is taking on the constantly changing conditions under which the company must demonstrate its added value to its stakeholders. 'In practically every industry, the business model is under pressure,' says Robert van de Graaf, a CFO with broad experience in the financial sector. 'And everywhere, people feel the need to formulate answers to that. After all, it is a matter of "to disrupt or to be disrupted".' That finance plays a leading role in that process is, in Van de Graaf's eyes, 'logical'. 'Finance is, after all, the guardian of the continuity of the company and its underlying business model.'


    The sensational examples (Uber and Airbnb threatening entire industries) naturally attract the most attention, but the changes are not always immediately drastic and visible. That does nothing to diminish the need to confront them, Van de Graaf believes. 'In many industries, companies still have a data advantage over their customers, but in three to five years that will no longer be the case. Think of the impact of the Internet of Things in the home, or scanners with which you can assess your own health. If you wait until then, it is too late. You have to start answering now the question of what that development will mean for your company.'

    In its role as driver of that process, finance must step out of its comfort zone, Van de Graaf believes. 'I would like to see finance encourage short-cycle projects. Don't demand a business case up front; instead agree, for example, that within three months it will become clear whether or not a customer need has been tapped. Keep the teams small at first, because that fosters creativity.'


    It is a process with which office furnisher Koninklijke Ahrend has experience. 'The real estate crisis of 2008 halved the European market for office furnishing and thus had a major impact on this company's revenue,' says CFO Rolf Verspuij. On top of that came the effects of trends such as digitalization. 'Flexible and remote working caused a further decline in volume. The sale of cabinets and workstations was the cork the company floated on, but that era was ending, that much was clear.'

    In 2012, the company therefore changed course. 'Naturally,' finance played a leading role in that transformation, says Verspuij, who joined Ahrend in early 2012. 'Ultimately, it is about achieving financial targets. For every component of the new business model, it must be determined whether and to what extent it contributes to the result.' Moreover, the new course required adapting information systems to provide more insight into performance.

    'Flexibility and adaptability have become very important for our organization,' says Verspuij, 'because we now distinguish ourselves through market orientation. In the past, Ahrend was more or less a production house: we developed and produced a beautiful product and sold it in high volumes. Now the motto is listening to the market. We then respond with new concepts and all kinds of smart furnishing solutions.'


    In doing so, Ahrend now makes use of digitalization itself: if you can't beat them, join them. 'For example, we make things easy for the facility manager by equipping furniture with advanced technologies that make it possible to register usage and create an overview. Management, maintenance, and cost monitoring can then be carried out more efficiently.' It is also becoming interesting to apply innovations that make use of the mobile phone.

    Thanks to all these changes, Ahrend's revenue has risen again, even above its 2007 level, partly due to a number of acquisitions last year. The changes also make a visibly positive contribution to the results. 'We are not there yet,' says Verspuij, 'but big steps have been taken.'


    At HVC Groep, a waste, energy, and raw materials company, market shifts have likewise led to digitalization of the product and service portfolio. For example, HVC is fitting waste containers with chips. The availability of data is a driving factor in the broadening of the information provided by the finance function, says CFO Ingrid Tigchelaar. 'There is clearly an interplay between supply and demand. The call for transparency and the technological possibilities for data collection and analysis reinforce each other.' HVC Groep is owned by public authorities and in that sense is a 'classic' utility company. 'That means numerous stakeholders who want ever more information about the performance and operations of the organization,' says Tigchelaar. 'Transparency has become a social norm.'

    She is currently working on the transformation needed to meet that norm. 'At its core, HVC Groep is a round-the-clock process-technology company. We were already used to collecting a lot of data about our operations, mainly with a view to monitoring the continuity and safety of our business processes. However, that data was always used internally only. Making it suitable for other stakeholders requires a quality upgrade; the outside world simply sets different quality requirements for that information. With all the consequences that entails for the ordering, organization, reporting, and presentation of that data.'


    Characteristic of the way finance is developing, says Tigchelaar, is that financial and non-financial information are becoming increasingly intertwined. 'In this type of company, the challenges do not lie in the processing of financial data; that is well in order. What matters is that you can connect that financial data to all the other data that is important to the business. A single source of information: that enormously increases its reliability.'

    Which data are we talking about? Tigchelaar cites the amount of recycled waste as an example. 'Agreements on this have been made with the national government in the so-called VANG targets. For instance, 75 percent of waste must be recycled by 2020. These targets have been adopted by local governments, which want to be accountable to their citizens. That means that we, as a collector and processor of waste, must provide information about it; this is also laid down in the service agreements with our stakeholders.'

    Another example is the recently concluded fire and accident insurance. 'For that, proper registration of all incidents in the company is of great importance. We were already doing that, but only for internal use. Showing external stakeholders that we are in control places additional demands on the collection and processing of the relevant data.'


    Source: www.fd.nl


  • Getting real value out of BI: closing the gap between analytics and potential


    “The output of BI is used in organizational decision-making, since it is not the process or technologies that are used by decision-makers but rather their output.” (Arisa Sholo, Copenhagen Business School, 2012)

    Traditional Business Intelligence (BI) is not working. BI is supposed to help businesses make data-informed decisions to improve outcomes, but the reality is that most are falling back on gut instinct to drive their actions. Thanks to a serious mismatch between existing BI infrastructure, tools and end-users, there’s a big gap between analytics and potential. While BI seems to have taken several steps forward since its inception, it has also managed to jump backward with every technological advance.  

    To get a clearer picture of what we’re talking about here, we need to step back a little. 

    BI grew within the IT-centric system of record in the 1980s, where those of us with business questions would go to the specialists who ran our databases, ask for reports, and through the painful, iterative dance of “that wasn’t exactly what I was looking for, I actually need *this*” eventually end up with… something, at least.

    By the early 2000s, BI had evolved into visual-based data discovery, which offered a simple proposition: what if, instead of waiting for IT to figure out what they were looking for, business users could make their *own* damn charts, using simple drag-and-drop user interfaces?

    Tremendous success followed. Business Intelligence tools that turned dimensions and measures into charts and graphs flourished, and new names like Tableau, Qlik, and Spotfire started to eclipse the venerable Cognos, Business Objects, and Microstrategy. And as the technical folks responsible for implementing this new breed of tool got asked for more and more sophisticated analyses, eager product leaders converted requests into shiny new features and shipped them at a breakneck pace.

    Visual-based data discovery has a simple premise: Most business questions aren’t hard to answer if you know what you’re looking for, and if a business user understands her question, she should be able to drag and drop dimensions and measures until a chart reveals the answer. Dashboards and self-service BI tools are meant to make analytics quick and easy, correct?

    And there’s the rub.

    Industry analyst firms are circulating a disappointing statistic: between 80 and 90% of knowledge workers lack the technical skills, data literacy, or access to make effective use of BI tooling.

    This is because today’s self-service BI tools are influenced by the needs of data specialists (trained business analysts, data engineers, DBAs, and data scientists) who require more advanced features and capabilities to support their use cases. So that’s why you will notice that Tableau has as many control surfaces as a 737, why Qlik has an expression editor, and why Sisense allows filter values to be expressed in code.  All the vendors in the BI space convert feature requests to software, and race to anticipate the next request of our buyers – which leads to tremendously powerful tools with features 90% of users don’t know how to use.

    Essentially, self-service BI has become too complex, and too impenetrable for its target user. So now, we’ve put ourselves back into the IT-centric system of record. Not only are we back to where we started, but now we’re doing it with tools the data pros don’t like (they don’t need a visual tool, they can write SQL and Python) and the business users don’t like or don’t know how to use.

    So what? Throw it all out? Go back to stone tablets? What if, instead of siloing analytics within the analyst community, you could incorporate data and insights into every workflow your team uses, to drive every decision? 

    This is infused analytics, the much-needed, seamless evolution of data analytics that BI has been sorely missing. 

    Here’s how we can do it: let’s hook our BI infrastructure – with its powerful cloud data warehouses, its beautiful data models, and its ability to mash up metrics from many sources – directly to the tools we *do* know how to use to analyze data. Plug a set of views and a good NLQ engine into Google Sheets, or Excel, or MS Teams, and let’s actually answer our next questions. At least for the 80% of our questions that are easily predictable (does anyone really need to guess what the VP of Sales is going to ask about?) – and for the other 20%, well, that’s why we have statisticians and data analysts on the team.

    Author: Scott Castle

    Source: Sisense

  • Google: a prime example of an intelligent organization basing its decisions on data


    Google is a company in which fact-based decision-making is part of the DNA and where Googlers (that is what Google calls its employees) speak the language of data as part of their culture. In Google the aim is that all decisions are based on data, analytics and scientific experimentation.

    About Google

    Google is a multinational Internet and software corporation specialized in Internet search, cloud computing, and advertising technologies, based in Mountain View, California, USA. Google’s mission is to organize the world‘s information and make it universally accessible and useful. And with this mission, Google is very serious about using information to inform their decisions.

    Data to inform decision-making

    In companies data should be collected to provide answers to the most important questions and unless you are clear about the questions you need to answer, data is pretty useless. In Google today, the aim is to start with questions and be very clear about the information needs at the outset. Their executive chairman Eric Schmidt says: “We run the company by questions, not by answers. So in the strategy process we’ve so far formulated 30 questions that we have to answer […] You ask it as a question, rather than a pithy answer, and that stimulates conversation. Out of the conversation comes innovation. Innovation is not something that I just wake up one day and say ‘I want to innovate.’ I think you get a better innovative culture if you ask it as a question.” There are thousands of great examples of how Google applies this thinking but let’s look at a great case example from their HR department.

    Fact-based Decision-Making at Google

    Within its global HR function, Google has created a People Analytics Department that supports the organization in making HR decisions with data. One question Google wanted answered was: do managers actually matter? This is a question Google has wrestled with from the outset; its founders questioned the contribution managers make. At one point they actually got rid of all managers and made everyone an individual contributor, which didn't really work, and managers were brought back in.

    Project Oxygen

    Within the people analytics department, Google has created a group called the Information Lab, which comprises social scientists who focus on longer-term questions with the aim of conducting innovative research that transforms organizational practice within Google and beyond. This team took on the project of answering the question 'Do managers matter?' – codenamed 'Project Oxygen'. So the objectives and information needs were clearly defined.

    What Data to Use?

    The team first looked at the data sources that already existed: performance reviews (top-down reviews of managers) and an employee survey (bottom-up reviews of managers). The team plotted this data on a graph, which revealed that managers were generally perceived as good. The problem was that the data didn't show much variation, so the team decided to split it into the top and bottom quartiles.


    Using a regression analysis the team was able to show a big difference between these two groups in terms of team productivity, employee happiness, and employee turnover. In summary, the teams with the better managers were performing better and employees were happier and more likely to stay. While this has confirmed that good managers do actually make a difference, it wouldn’t allow Google to act on the data. The next question they needed an answer to was: What makes a good manager at Google? Answering this question would provide much more usable insights.
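    The quartile-split-and-compare step can be sketched in a few lines of standard-library Python. The ratings and outcomes below are invented for illustration, and a single-variable fit stands in for Google's (surely richer) regression analysis.

    ```python
    from statistics import mean, linear_regression

    # Hypothetical manager ratings (1-5) paired with a team outcome, e.g. retention %
    ratings  = [3.2, 4.8, 4.5, 2.9, 4.9, 3.0, 4.7, 3.1]
    outcomes = [78, 95, 92, 75, 97, 76, 94, 77]

    # Split managers into the bottom and top quartiles by rating
    pairs = sorted(zip(ratings, outcomes))
    q = len(pairs) // 4
    bottom, top = pairs[:q], pairs[-q:]

    print("bottom-quartile mean outcome:", mean(o for _, o in bottom))  # 75.5
    print("top-quartile mean outcome:", mean(o for _, o in top))        # 96

    # A simple regression of outcome on rating (statistics.linear_regression, Python 3.10+)
    slope, intercept = linear_regression(ratings, outcomes)
    print("outcome gain per rating point:", round(slope, 1))
    ```

    With made-up data like this, the gap between quartile means is what makes the difference actionable, exactly the effect the Google team was looking for.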

    New Data Collection

    So the team introduced two new data collections. The first was a 'Great Managers Award', through which employees could nominate managers they felt were particularly good. As part of the nomination, employees had to provide examples of behaviours that they felt showed the managers were good managers. The second data set came from interviews with the managers in each of the two quartiles (bottom and top) to understand what they were doing (the managers didn't know which quartile they were in). The data from the interviews and from the Great Managers Award nominations was then coded using text analysis. Based on this, the analytics team was able to extract the top eight behaviours of high-scoring managers as well as the top three reasons managers struggle in their role.
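    The coding step can be illustrated with a toy keyword-based coder. The nomination texts and the keyword-to-behaviour mapping below are invented for illustration; Google's actual qualitative coding scheme would have been far more sophisticated.

    ```python
    from collections import Counter

    # Hypothetical free-text nominations from a "Great Managers Award"
    nominations = [
        "she coaches me through tough problems",
        "gives the team autonomy and does not micromanage",
        "always coaches new hires patiently",
        "communicates a clear vision for the team",
    ]

    # Toy coding scheme mapping keywords to behaviour codes (an assumption, not Google's)
    codes = {
        "coach": "is a good coach",
        "micromanage": "empowers, does not micromanage",
        "vision": "has a clear vision",
    }

    # Code each text, then rank behaviours by frequency (as in extracting a "top 8")
    counts = Counter()
    for text in nominations:
        for keyword, behaviour in codes.items():
            if keyword in text:
                counts[behaviour] += 1

    for behaviour, n in counts.most_common():
        print(behaviour, n)  # "is a good coach" ranks first with 2 mentions
    ```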

    Using the Insights

    Google used different ways of sharing these insights with the relevant people, including a new manager communication that outlined the findings and expectations. But sharing the insights alone wasn't enough; Google saw a need to act on them. Many concrete actions followed this analysis; here are some key ones:

    • Google started to measure people against these behaviours. For that purpose it introduced a new twice-yearly feedback survey
    • Google decided to continue with the Great Manager Award
    • Google revised the management training

    An Intelligent Organization

    Google is a great example of how good decision-making should be supported by good data and facts. Google clearly followed the five steps I outline in my book ‘The Intelligent Company: Five steps to success with Evidence-based Management’:

    1. Defining the objectives and information needs: ‘Do managers matter?’ and ‘What makes a good manager within Google?’
    2. Collecting the right data: using existing data from performance reviews and employee surveys and creating new data sets from the award nominations and manager’s interviews.
    3. Analysing the data and turning it into insights: simply plotting of the results, regression analysis and text analysis.
    4. Presenting the Information: new communications to the managers
    5. Making evidence-based decisions: revising the training, measuring performance in line with the findings, introducing new feedback mechanisms.

    Author: Bernard Marr

    Source: Smart Data Collective

  • How a Video Game Helped People Make Better Decisions


    Researchers in recent years have exhaustively catalogued and chronicled the biases that affect our decisions. We all know the havoc that biased decisions can wreak. From misguided beliefs about the side effects of vaccinating our children, to failures in analysis by our intelligence community, biases in decision making contribute to problems in business, public policy, medicine, law, education, and private life.

    Researchers have also long searched for ways to train people to reduce bias and improve their general decision making ability – with little success. Traditional training, designed to debias and improve decision-making, is effective in specific domains such as firefighting, chess, or weather forecasting. But even experts in such areas fail to apply what they’ve learned to new areas. Weather forecasters, for instance, are highly accurate when predicting the chance of rain, but they are just as likely as untrained novices to show bias when making other kinds of probability estimates, such as estimating how many of their answers to basic trivia questions are correct.

    Because training designed to improve general decision making abilities has not previously been effective, most efforts to debias people have focused on two techniques. The first is changing the incentives that influence a decision: taxing soda, for example, in the hopes that the increased cost will dissuade people from buying it. The second approach involves changing the way information for various choices is presented or choices are made, such as adding calorie information to fast-food menus or offering salad as the default side order to entrées instead of French fries. However, these methods are not always effective, and when they are, they affect only specific decisions, not decision-makers’ ability to make less biased decisions in other situations.

    My research collaborators and I wondered if an interactive training exercise might effectively debias decision-makers. (The team included Boston University’s Haewon Yoon, City University London’s Irene Scopelliti, Leidos’ Carl W. Symborski, Creative Technologies, Inc.’s James H. Korris and Karim Kassam, a former assistant professor at Carnegie Mellon University.) So we spent the past four years developing two interactive, “serious” computer games to see if they might substantially reduce game players’ susceptibility to cognitive bias.

    There was scant evidence that this kind of one-shot training intervention could be effective, and we thought our chances of success were slim. But, as we report in a paper just published in Policy Insights in the Behavioral and Brain Sciences, the interactive games not only reduced game players’ susceptibility to biases immediately; those reductions also persisted over time. Participants who played one of our games, each of which took about 60 minutes to complete, showed a large immediate reduction in their commission of the biases (by more than 31%), and showed a large reduction (by more than 23%) at least two months later.

    The games target six well-known cognitive biases. Though these biases were chosen for their relevance to intelligence analysis, they affect all kinds of decisions made by professionals in business, policy, medicine, and education as well. They include:

    • Bias blind spot – seeing yourself as less susceptible to biases than other people
    • Confirmation bias – collecting and evaluating evidence that confirms the theory you are testing
    • Fundamental attribution error – unduly attributing someone’s behavior to enduring aspects of that person’s disposition rather than to the circumstance in which the person was placed
    • Anchoring – relying too heavily on the first piece of information considered when making a judgment
    • Projection – assuming that other people think the same way we do
    • Representativeness – relying on some simple and often misleading rules when estimating the probability of uncertain events

    We ran two experiments. In the first experiment, involving 243 adult participants, one group watched a 30-minute video, “Unbiasing Your Biases,” commissioned by the program sponsor, the Intelligence Advanced Research Projects Activity (IARPA), a U.S. research agency under the Director of National Intelligence. The video first defined heuristics – information-processing shortcuts that produce fast and efficient, though not necessarily accurate, decisions. The video then explained how heuristics can sometimes lead to incorrect inferences. Then, bias blind spot, confirmation bias, and fundamental attribution error were described and strategies to mitigate them were presented.

    Another group played a computer game, “Missing: The Pursuit of Terry Hughes,” designed by our research team to elicit and mitigate the same three cognitive biases. Game players make decisions and judgments throughout the game as they search for Terry Hughes – their missing neighbor. At the end of each level of the game, participants received personalized feedback about how biased they were during game play. They were given a chance to practice and they were taught strategies to reduce their propensity to commit each of the biases.

    We measured how much each participant committed the three biases before and after the game or the video. In the first experiment, both the game and the video were effective, but the game was more effective than the video. Playing the game reduced the three biases by about 46% immediately and 35% over the long term. Watching the video reduced the three biases by about 19% immediately and 20% over the long term.
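    The reported percentages are relative reductions in how often participants committed the biases. A small helper makes the arithmetic explicit; the before/after scores below are made up for illustration, not the study's data.

    ```python
    def relative_reduction(before: float, after: float) -> float:
        """Percent reduction in bias commission from pre- to post-training."""
        return 100 * (before - after) / before

    # Hypothetical example: 50 biased responses before training, 27 after
    print(round(relative_reduction(50, 27), 1))  # 46.0, i.e. a 46% reduction
    ```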

    In a second experiment, involving 238 adult participants, one group watched the video “Unbiasing Your Biases 2” to address anchoring, projection, and representativeness. Another group played the computer detective game “Missing: The Final Secret,” in which they were to exonerate their employer of a criminal charge and uncover criminal activity of her accusers. Along the way, players made decisions that tested their propensity to commit anchoring, projection, and representativeness. After each level of the game, their commission of those biases was measured and players were provided with personalized feedback, practice, and mitigation strategies.

    Again, the game was more effective than the video. Playing the game reduced the three biases by about 32% immediately and 24% over the long term. Watching the video reduced the three biases by about 25% immediately and 19% over the long term.

    The games, which were specifically designed to debias intelligence analysts, are being deployed in training academies in the U.S. intelligence services. But because this approach affects the decision maker rather than specific decisions, such games can be effective in many contexts and decisions – and with lasting effect. (A commercial version of the games is in production.)

    Games are also attractive because once such approaches are developed, the marginal costs of debiasing many additional people are minimal. As this and other recent work suggests, such interactive training is a promising addition to the growing suite of techniques that improve judgment and reduce the costly mistakes that result from biased decision making.

    Source: http://www.scoop.it/t/strategy-and-competitive-intelligencebig


  • How business leaders can learn from firefighters


    The Firefighting Dichotomy

    When problems or crises arise in a business context, we often hear leaders refer to how busy their teams are firefighting. This paints a picture of chaos in the workplace, with overworked and stressed-out teams running from problem to problem, not addressing an issue before they are interrupted by the next predicament. Managers constantly juggle where to prioritize overworked people and often make rash decisions to try to put quick fixes in place.

    According to the Cambridge Dictionary, firefighting (in a business context) is “spending time on problems that need to be dealt with quickly, instead of working in a calm, planned way.” But in the real world of fighting fires, firefighting is far from chaotic and unplanned. If you think you’re firefighting, you’re not thinking like a firefighter. In contrast, firefighters are highly trained, dedicated, and coordinated teams who plan, practice, and prepare for a range of different crisis scenarios. They must — their lives and the lives of those in emergency depend on them being able to respond in an assured, controlled, and well-rehearsed manner no matter the disaster.

    What It Takes To Put Out The Fires

    I recently spoke with my new colleague Rockwell, an experienced volunteer firefighter, to discuss these conflicting views of firefighting. When he was a teenager in rural Tennessee, the local rural fire service put out a fire at his home, saving him and his family. This motivated him to sign up, train, and gain certification as a volunteer firefighter as soon as he was old enough to do so. We agreed that the business world has much to learn from firefighters to be fully fit for any crisis or future scenario. Leaders must:

    • Build high-performance, focused teams. In a business crisis scenario, there’s often an adrenaline rush, with individuals striving to be heroes. However, there’s no room for individual heroics in the fire service. Firefighters operate as a high-performance, close-knit unit where everyone knows their responsibility and is expected to perform their role. The pump operator and frontline firefighters collaborate in unison to ensure a consistent supply of water at high pressure and with the right pattern to deal with the type of fire. There is no room for lapse of concentration or focus. When a firm faces a major challenge, it’s teamwork and collaboration that will pull you through — not individual heroics. Better to have a team of heroes than an individual superstar.

    • Establish clear communication and delegation channels. As you would expect, clear, consistent, and frequent communication in a fire emergency is critical. The incident commander is the ultimate decision-maker but is dependent on the flow of information and situational awareness from frontline teams. Decision-making is devolved to point of need. If the zone one commander at the front of a building fire says the structure is about to collapse, the zone one team will pull out. Equally, teams in other zones are notified to withdraw from the scene if they also face danger. Similarly, in our newfound hybrid working environment, senior leaders need to focus their attention on supporting frontline leaders. Frontline leaders need a clear channel for bidirectional communication with senior leadership.

    • Train their people for future scenarios. Firefighters must complete a rigorous training, qualification, and assessment probationary period. Volunteers complete the same training as full-time firefighters. Training does not stop with qualification. Regular scenario-based training is crucial for effectively dealing with different emergency situations — from high-rise building, aircraft, or tank fires to special situations involving flammable liquids. Undertaking frequent dry runs for different fire scenarios builds muscle memory so that for each situation, every firefighter is prepared. They know their role, what to expect from their colleagues, what equipment is needed, and what to do. Fire commanders monitor training progress through a cycle of continual, 360-degree feedback. Business and tech leaders must similarly ensure that their employees are prepared for any future scenario by developing a future fit learning culture. Rapid change is driving demand for continuous upskilling, and traditional training is not filling the skills gap. Success is achieved through embedding learning into work, performance goals, and a leadership coaching culture.

    • Ensure teams have the right tools and technology and that both are future fit. Reliable, fully operational equipment is the bedrock of the firefighter’s job. A firefighter will never go into a situation without being 100% ready. If people are worn out or equipment is not in proper working order, they will not respond. Each firefighter puts a lot of focus on testing, cleaning, and caring for their gear. They need to rely on their equipment as much as their colleagues. Standards play a critical role and ensure familiarity, consistency, and reliability. For example, every fire engine is equipped with two 150-foot cross-lay hoses, specified for the same pressure and minimum water flow. Technology executives must standardize and prune their portfolios. They must optimize the technology stack to become future fit and ensure teams have the right technology for future work.

    • Base decisions on facts, not gut reactions. When firefighters arrived at the scene of one house fire, they were greeted by a panic-stricken woman screaming “my baby is in there.” The crew at first assumed the worst: a young child was caught in the blaze, demanding immediate action. But training kicked in. The threat was less severe: it was her pet dog she was talking about. Often, technology teams are expected to drop everything and respond to a business owner’s “baby” — their project is critical, a higher priority over and above the rest. But with root cause analysis and proper criteria for prioritization, the facts often flag that this is not the case. Additionally, when dealing with risk, we can prepare for the worst, but we should not overdramatize without all the facts. There may be smoke, but that does not mean the fire is flaming.

    • Be prepared to adapt. As much as firefighters prepare for different fire scenarios, they can often face a situation they were not quite prepared for. Rockwell told a story of when he was part of a crew called out to a fire at an abandoned house. This is not unusual. Often, homeless people occupying a building will light a small fire for warmth that then escalates into a house fire. The normal priority is to first check if it is safe to enter the structure and identify if any occupants have succumbed to smoke and are unconscious. However, in this situation, the team entering the building discovered that the owner, a farmer, was using it to store flammable propane in tanks. The farmer neglected to display notices outside, warning of the dangers of explosive gases being stored inside. While the team was trained for a flammable scenario, they entered the building unaware of this hidden danger. This left them with no option but to pull back, inform the wider crew that the situation was not as it seemed, and reequip for the changed situation. In business, we cannot anticipate every scenario but must learn to be adaptive. Adaptiveness is the key to business success. Adaptive enterprises continuously assess the environment, anticipating new customer-obsessed opportunities and demonstrating the ability to reshape their core business models and organizational structure.

    • Finish what was started, and capture learning before moving on. Contrast the often chaotic way businesses close out work with the discipline of the fire service. Imagine if a fire crew jumped into its truck ready to dash to another call without fully safety-checking the scene. What if there was an unchecked hot spot that caused the fire to rekindle? The fire crew ensures full closure, confirming that the fire is fully extinguished and that the situation is safe for the subsequent cleanup and overhaul process. After each incident, there is a full debrief to assess what worked well, what could have been better, and whether there are any ideas or different equipment that would have made the job easier. It was this approach that led to the invention of the “jaws of life” hydraulic cutter to rescue people from crashed vehicles. This device replaced the previously used cylinder saw, which was noisy, slower, dangerous, and caused extra distress to the trapped victim. Tech leaders are now learning from safety-critical professionals how to improve incident management practices. They recognize that incidents, in theory, should drive follow-up analysis and corrective action once the fire is out. No incident should be left simmering.

    Author: Phil Brunkard

    Source: Forrester

  • How modern ETL processes lead to better decision making

    We live in a world of data: there’s more of it than ever before, in a ceaselessly expanding array of forms and locations. Dealing with data is your window into the ways data teams are tackling the challenges of this new world to help their companies and their customers thrive.

    In recent years we’ve seen data become vastly more available to businesses. This is mostly due to the rise of data warehouses, decrease in cost for data storage, and modern ETL tools that make storing and accessing data more approachable than ever before. This has allowed companies to become more and more data driven in all areas of their business. In fact, being data driven has become ubiquitous and imperative to survival in today’s climate. This article will discuss at a high level how modern businesses are leveraging new technology to ingest a wider variety of data sources. 

    As of 2019, according to Blissfully’s 2020 SaaS trends report, companies in all phases of business growth make use of many SaaS applications. Smaller companies (0-50 employees) use an average of 102 SaaS applications, while mid-sized ones (101-250 employees) use about 137. Enterprises, meanwhile, can have as many as 288 SaaS applications in use across the entire organization!

    These SaaS products range from CRMs to marketing automation apps to billing software to help desk tools, etc. These applications can log thousands to millions of rows of data made available via API, much of which can hold valuable insight for businesses. Add to this the company’s own internally-gathered product data and it’s clear that even a small or medium-sized business can do a lot with the data at its disposal. In this article, we’ll cover what the ETL process is, why it’s important to powering data-driven businesses, and how the rise of modern ETL tools has changed the ETL process forever.

    The ETL process: foundation of data-driven businesses

    Every company wants every team within their business to make smarter, data-driven decisions. Customer support teams look at trends in support tickets or do text analysis on conversations to understand where they can provide better onboarding and documentation. Marketing teams want better visibility into their ad performance across different platforms and the ROI on their spend. Product and engineering teams dig into productivity metrics or bug reports to help them better prioritize their resources. 

    The ETL process empowers all these different teams to get the information they need to understand and perform their jobs better. Businesses ingest data from a wide array of sources through the ETL process, which stands for Extract, Transform, Load. The prepared data is then available for analysis and use by the various teams who need it, as well as for advanced analytics, embedding into applications, and use for other data monetization efforts. Whatever you want to do with data, you have to ETL it first. 
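    As a minimal sketch of those three steps (the CSV source, field names, and the SQLite stand-in for a warehouse are all illustrative, not a prescription), the pipeline might look like this:

```python
import csv
import sqlite3

def extract(path):
    # Extract: batch-read raw rows from a source file
    # (could equally be an API pull or an application database dump)
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: clean and model the data so it is fit for analysis
    return [
        {"email": r["email"].strip().lower(), "plan": r["plan"] or "free"}
        for r in rows
        if r.get("email")  # drop rows missing the key field
    ]

def load(rows, conn):
    # Load: write the cleaned table into the warehouse
    conn.execute("CREATE TABLE IF NOT EXISTS users (email TEXT, plan TEXT)")
    conn.executemany(
        "INSERT INTO users (email, plan) VALUES (:email, :plan)", rows
    )
    conn.commit()
```

    Real pipelines differ mainly in scale and robustness, but the extract-transform-load shape is the same.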

    Explaining the traditional ETL process

    Traditionally, the extraction step of the process involves batch-pulling the data from the source via files, application database, or an API. In the transformation step, the data is cleaned and modeled so that it’s fit for analysis and ingestible by the data warehouse. Finally, the cleaned tables are loaded into the data warehouse for analysis. 

    This traditional ETL process is very difficult to scale. It often requires full-time data engineers to develop and maintain the scripts that keep the data flowing. This is because the data providers often make changes to their schemas or APIs, which then break the scripts that power the ETL process. Every time there’s a change, the data engineers scramble to update their scripts to accommodate them, resulting in downtime. With businesses now needing to ingest data from so many disparate (often fast-moving) data sources, maintaining ETL scripts for each one is not scalable. 

    Better living through ETL tools: the modern ETL process

    The modern ETL process follows a slightly different order of operations, dubbed ELT. This new process arose as a result of the introduction of tools to update the ETL process, as well as the rise of modern data warehouses with relatively low storage costs.

    Today, ETL tools do the heavy lifting for you. They have integrations for many of the major SaaS applications, and have teams of engineers who maintain those integrations, taking the pressure off of your in-house data team. These ETL tools are built to connect to most major data warehouses, allowing businesses to plug in their apps on one end and their warehouse on the other, while the ETL tools do the rest.

    Users can usually control orchestration via simple drop-down selections within the apps, alleviating the need to stand up your own servers or EC2 boxes or to build DAGs to run on platforms like Airflow. ETL tools also typically offer more robust options for appending new data incrementally, or updating only new and modified rows, which allows for more frequent loads and closer to real-time data for the business. With this simplified process for making data available for analysis, data teams can focus on finding new applications for data to generate value for the business.
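    The incremental-append idea can be sketched as follows, assuming each source row carries an `updated_at` watermark column (a common but not universal convention; the field names are illustrative):

```python
def incremental_pull(source_rows, last_watermark):
    """Pull only rows created or modified since the last load.

    A sketch of the incremental strategy modern ETL tools offer,
    assuming every source row has a comparable `updated_at` value.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    # Advance the watermark to the newest row seen, so the next run
    # picks up only rows changed after this point
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark
```

    Because each run moves only the delta, loads can happen far more frequently than a full batch re-pull.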

    The ETL process and data warehouses

    Data warehouses are the present and future of data and analytics. Storage costs on data warehouses have drastically reduced in recent years, which allows businesses to load as many raw data sources as possible without the same concerns they might have had before.

    Today, data teams can ingest raw data before transforming it, allowing them to do the transformations in the warehouse instead of in a separate staging area. With the increased availability of data, and SQL as a common language for accessing it, the business gains more flexibility in leveraging its data to make the right decisions.
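    A toy illustration of this ELT pattern, using SQLite as a stand-in for a cloud warehouse and made-up table names: the raw data is loaded first, and the transformation is expressed in SQL where the data already lives:

```python
import sqlite3

# Load first: raw rows land in the warehouse untransformed
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1999, "paid"), (2, 500, "refunded"), (3, 2500, "paid")],
)

# Transform in place: plain SQL builds the cleaned, analysis-ready table
conn.execute("""
    CREATE TABLE orders_clean AS
    SELECT id, amount_cents / 100.0 AS amount_usd
    FROM raw_orders
    WHERE status = 'paid'
""")
```

    Because the raw table stays in the warehouse, the transformation can be revised and re-run at any time without re-extracting from the source.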

    The modern ETL process: delivering better results, faster

    Under the traditional ETL process, as data and processing requirements grew, the chance that on-premise data warehouses would fail grew as well. When this happened, IT would have to swoop in to fix the issue, which usually meant adding more hardware.

    The modern ETL process in today’s data warehouses sidesteps this issue by offloading compute resource management to the cloud data warehouse. Many cloud data warehouses offer compute scaling that allows for dynamic scaling when needs spike. This lets data teams maintain scalable performance while housing a growing number of computationally expensive data models and ingesting more large data sources. The decreased cost of compute power, along with compute scaling in cloud data warehouses, allows data teams to efficiently scale resources up or down to suit their needs and minimize downtime. The bottom line is that, instead of having your in-house data and/or IT team fretting over your data storage and computing issues, you can offload that almost completely to the data warehouse provider.

    Data teams can then build tests on top of their cloud data warehouse to monitor their data sources for quality, freshness, and more, giving them quicker, more proactive visibility into any problems with their data pipelines.
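    One such test might check freshness. Here is a sketch, with an arbitrary 24-hour threshold, of flagging a source whose newest row is too old (the function name and threshold are illustrative):

```python
import datetime

def check_freshness(latest_loaded_at, now, max_lag_hours=24):
    """Return True if the newest loaded row is within the allowed lag.

    A sketch of the kind of freshness test teams layer on top of a
    warehouse; in practice `latest_loaded_at` would come from a
    MAX(loaded_at) query against the target table.
    """
    lag = now - latest_loaded_at
    return lag <= datetime.timedelta(hours=max_lag_hours)
```

    A failing check would page the data team before stale numbers ever reach a dashboard.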

    From ETL to ELT and beyond

    Data, analytics, and BI have radically evolved since their inception. We are leaps and bounds beyond Excel tables and on-prem-centric data sources. Cloud-native data warehouses, cloud-native architecture in analytics and BI platforms, and embedded analytics powered by these systems have redefined what it means to be truly data-driven in our modern age.

    The ETL process has been updated and can now deliver insights from a wide array of datasets, which helps companies and teams of all kinds make smarter decisions, faster. It also opens the doors for advanced analytics, next-level data monetization, and much more. Whatever you’re building with your data, a modern ELT setup will help you get more from your data, easier.

    While we have seen a large migration to data warehouses, looking to the future, we may see yet another evolution in which data lakes are more widely used as the initial destination for all raw data sources, offering even further cost benefits. Additionally, new tools are starting to take shape that allow for transformation within these data lakes, continuing the evolution of tools and processes within the data pipeline along the path from ETL to ELT.

    Author: Adam Luba

    Source: Sisense

  • Making better staffing decisions with data analytics

    Data analytics can help companies choose between Staff Augmentation and Managed Services when coming up with the right HR model.

    Many companies have discovered that it is very difficult to find a pool of talented employees. Fortunately, new advances in big data technology are helping companies find better-qualified workers.

    Data analytics technology is very important in assessing the performance of staffing services. Companies can use data analytics to improve their hiring processes.

    What Are the Benefits of Data Analytics in Staffing?

    The Forbes Research Council has highlighted a number of significant benefits of leveraging big data in human resources. These benefits include the following:

    • Improving workforce planning. Big data has become more important as employers move towards a more heterogeneous workplace. They can use data analytics to assess the performance of employees in different situations, including traditional in-house workers, remote workers, and independent contractors. It also helps with creating a solid hiring model, and it has been shown that big data can minimize employment risks during the hiring process.
    • Reducing staffing costs. Employers can use HR software that relies extensively on data analytics technology to minimize the need for unnecessary HR professionals. This reduces costs considerably.
    • Identifying overlooked talent. There are a lot of great employees who go unnoticed in the labor market, and most workers are better at some services than others. Data analytics helps companies match the right employees or applicants with the right responsibilities.
    • Anticipating hiring needs. There are a lot of challenges that employers face when they try to forecast future staffing needs. Big data and predictive analytics help companies project future employment needs and allocate sufficient capital to their human resources.
    • Improving employee retention. Big data helps companies assess employee satisfaction by analyzing a variety of key metrics. This helps organizations take a data-driven approach to improving employee retention.
    • Choosing the right staffing model. There are different staffing models available to companies. Data analytics helps them assess the effectiveness of different staffing models in various industries and companies with similar business models. A data-driven approach to choosing a staffing model can be very helpful.

    The benefits of big data in staffing and HR are very clear.

    Using Data Analytics to Choose the Right Talent Acquisition Model

    Human resources are a driving force in digital transformation, which is why many IT companies seek the services of people with deep expertise in their fields. Data analytics is helping with this change.

    Today, the digital transformation sphere is rolling on a fast track, and IT companies seek to be part of these developments by taking advantage of the latest advances in big data technology.

    It is because of this digital transformation that different companies seek to develop digital products, applications, web pages, software, and the like. They have found that data analytics makes it easier to achieve these objectives.

    However, some companies are facing the downside of not having an effective IT department that can take on digital advancements or utilize the latest data analytics technology.

    This is usually because many organizations focus mainly on a traditional business model and believe they have no room for an IT team. If your company falls under that category, don’t be dismayed. There is a perfect solution for keeping your company up-to-date with the digital world.

    So, what’s the way out? External hiring modes that use the latest data analytics technology.

    What are these hiring modes? There are two, and you can select whichever is best for your company: staff augmentation and managed services.

    These modes are commonly employed by IT companies who seek to develop digital products, applications, web pages, software, as the case may be.

    “What method would be suitable for my company?”, you may ask. Both methods can be highly effective, depending on the manner in which you desire to use them.

    However, I’m in no place to decide which method is best for you, so in this article, I’ll merely offer an unbiased comparison between Staff Augmentation and Managed Services. We will start with an overview of the two options and then discuss the importance of using data analytics to choose between them.

    Staff Augmentation VS Managed Services in Focus

    In this section, you will find an in-depth view of Staff Augmentation versus Managed Services, and how each can be utilized to help your IT company in its own way.

    • Foremost, in comparing the two IT sourcing models, managed services are devoted to delivering an outcome, while staff augmentation is focused mainly on providing inputs to your company.
    • When you contract with a company that provides managed services, it is the service provider that takes total control of all or part of your IT service component.

    On the other hand, when out-tasking, the service provider commits to investing specified resources at a cost.

    • In the case of risk-taking, when engaging in managed services, it is the Managed Service Provider (MSP) that takes on all the risks involved.

    On the other side of the spectrum, Staff augmented services require the clients to take on all the delivery risks.

    • In terms of the engagement period, an augmented team focuses on the job without any long-term commitment.

    While with managed services, the Managed Service Provider trains your available staff to increase their expertise. This may take a prolonged period.

    • Finally, there is pricing. With managed services, the pricing model is bound to service levels and outcomes, while with augmented services, pricing is linked to availability and the period of work.

    Data Analytics Can Help You Select Between the Two

    Data analytics helps make much better staffing decisions. Now that you understand the two staffing models that you can choose from, you can start using data analytics to choose the one that is ideal for your company. You can look at the performance of both Staff Augmentation and Managed Services for different companies by using publicly available data. This can help you choose a model that is best for your respective company.

    Author: Farnaz Erfan

    Source: Smart Data Collective

  • More and more organizations are basing their actions on their data

    Many corporations collect data but don't end up using it to inform business decisions. This has started to shift.

    All in all, 2020 will go down as one of the most challenging and impactful years in history. It will also be known as one of the most transformative, with companies and individuals adjusting quickly to the new normal in both work and play, with a 'socially distant' way of life changing how people interact and communicate.

    Even amidst the chaos, we saw an influx of existing technologies finding new industry opportunities, such as videoconferencing tools, streaming platforms such as Netflix, telehealth applications, EdTech platforms, and cybersecurity, to name a few. All of these technologies are powered by one fundamental thing, yet this entity isn't being tapped to its full potential by SMBs and enterprises alike.

    That thing is data, collected by companies with the intent to inform business decisions and better understand and serve their customers. However, from what I have seen, more than 80 percent of data that businesses generate goes unused. This will drastically change in the next three years, with the majority of the data consumed being put to use.

    What's driving this trend

    Data generation was already a hot topic prior to the COVID-19 pandemic with a projected 59 zettabytes (ZB) of data created, captured, and copied over the last year according to IDC. This trend has only accelerated with the pandemic as companies are fast-tracking digital transformation initiatives. Adding to this, the ongoing health crisis is resulting in the avoidance of face-to-face interactions during the workday, causing digital interactions to increase tenfold. This has created even more data through connectivity tools and applications.

    Companies have realized that analyzing this data can help leaders make better-informed decisions rather than relying on gut feeling. Data has become so important to companies' success that according to Gartner, by 2022, 90 percent of today's corporate strategies will unequivocally list information as a critical enterprise asset and analytics as an essential competency. Leading organizations know that in order to drive success in their industry, they have to leverage data and analytics as a competitive differentiator, fueling operational efficiencies and innovation.

    Setting up for success

    Though the majority of data collected by businesses currently goes to waste, there are more tools emerging to help companies unify consumed data, automate insights, and apply machine learning to better leverage data to meet business goals.

    First, it's important to take a step back to evaluate the purpose and end goals here. Collecting data for the sake of having it won't get anyone very far. Companies need to identify the issues or opportunities associated with the data collection. In other words, they need to know what they're going to do with every single piece of data collected.

    To determine the end goals, start by analyzing and assessing the different types of data collected to determine whether each was beneficial to the desired outcome, or has the potential to be but wasn't leveraged. This will help identify any holes where other data should be tracked. It will also help hone the focus on the more important data sets to integrate and normalize, ultimately making data analysis a more painless process that produces more usable information.

    Next, make sure the data is useful - that it's standardized, integrated across as few tech platforms as possible (i.e., not a different platform for every department or every function), and that the collection of specific data follows company rules and industry regulations.

    Finally, use data in new ways. Once your organization has integrated data and technology solutions, the most meaningful insights can often only be found using multidimensional analytics dashboards that take data from two previously siloed functions to understand how pulling a lever in one area affects costs or efficiencies in another.

    Using data to streamline business processes and lower costs

    One industry that's collecting data and using it efficiently to optimize business processes is the telematics industry. Before the digital transformation era, fleet managers and drivers had to rely on paper forms for vehicle inspections or logging hours of service. Now, many telematics-driven companies are relying on connected operations solutions to collect, unify, and analyze data for a variety of tasks such as improving fuel management, driver safety, optimized routing, systematic compliance, and preventive maintenance.

    We have seen fleets with hundreds of assets switch from other out-of-the-box telematics solutions to a more business-focused solution, which allows them to leverage data insights from their connected operations and realize meaningful improvements and cost savings. One such client recently reported saving $800,000 annually in field labor costs and $475,000 annually in fleet maintenance and repairs, and they've seen compliance with their overdue maintenance reduction initiative go from around 60 percent to 97 percent. It's clear that data contains the answers to an organization's challenges or goals. The question remains whether the organization has the tools to unearth the insights hidden in its data.

    Empowering decision makers through data

    The most important piece to the entire data chain is ensuring the right data insights get into the hands of decision makers at the right time. What use is accurate, analyzed data if it goes unused - as most of today's data does? Including the right stakeholders from across all business functions in the data conversations may unearth current challenges, as well as new opportunities that may have not otherwise been known. This is a step that many companies are now recognizing as crucial for success, which is why we will see more data consumed and put to use over the next three years.

    If they haven't already, executives and decision-makers at all levels should start looking at business operations through a data-centric lens. Companies that recognize and act on the fact that their competitive edge and profit growth lies in the insights hidden in their operational data can expect to see immediate ROI on their efforts to mine their data for golden insights. If they're not doing something about this now, they might just be in a race to the bottom.

    Author: Ryan Wilkinson

    Source: TDWI

  • Recent study unveils lack of sound data infrastructure in healthcare organizations

    In the race to unearth enterprise insights, the modern health system is like a prospector whose land contains precious metals deep beneath the surface. Until the organization extracts and refines those resources, their value is all but theoretical. In much the same way, only after harmonizing its data can a health system run analytics to inform stronger decision making and realize its full potential.

    In a survey commissioned by InterSystems, Sage Growth Partners found that most health system executives prioritize analytics as a fundamental step toward their broader goals. But they don’t have the tools to get there — at least not yet.

    Just 16% of integrated delivery networks rate their clinical data quality as excellent, 55% consider their supply chain data poor or average, and 87% say their claims data is poor, average, or good. All told, only 20% of organizations fully trust their data. Yet providers recognize the urgent need for healthy data to power analytics, as evidenced by the 80% who say creating and sharing high-quality data is a top priority for the next year.

    These data challenges have real consequences. Poor, untimely decisions and the inability to identify gaps in care translate to severe financial impacts for the enterprise and less desirable outcomes for patients. But while the precious metals remain underground, health systems have the opportunity to start digging today.

    Barriers to Healthcare Insights

    Now 12 years after the HITECH Act accelerated the move to electronic data, healthcare has yet to address bedrock issues such as the lack of a centralized database, challenges integrating multiple data sources, low-quality information, and the failure to create standardized reports. Sage’s findings revealed a harsh truth: Health systems cannot use analytics to generate actionable insights until they overcome these obstacles.

    More than half of surveyed executives acknowledge that poor data impedes enterprise decision making and their ability to identify gaps in care. What’s more, 51% point to data integration and interoperability as the most significant barriers to leveraging analytics for the good of the organization.

    On the ground, the disconnect has meant that health systems are strapped with huge data latency and duplication challenges, despite massive investments in data warehouses. Although many organizations designed dashboards and predictive or prescriptive models, most of these tools either fail to reach production or scale past the walls of a single department due to workflow integration issues. Clinical, claims, and other data, meanwhile, remain siloed.

    Health systems simply haven’t built the infrastructure to produce accurate, real-time, high-quality data.

    Healthy Data and Analytics: Healthcare’s Future

    COVID-19 forced C-suites to make big decisions more often and more quickly, from managing overworked staff to allocating resources among sick and dying patients. Even tracking health outcomes morphed into a tall task. The whiplash of the pandemic led the industry to an inflection point: 85% of executives told Sage that real-time, harmonized data is vital for leaders to make informed operational decisions.

    To make the right moves at the right time, health systems need the most reliable information. That requires strong data technology from start to finish, encompassing pipeline capabilities, aggregation, normalization, standardization, a robust data model, and consistent access.

    If any element of that equation is missing, health system decision making will continue to lag. But success can transform the enterprise.

    Imagine a group of executives — each trusting their data — receiving timely, standardized reports about their health system. Knowing the underlying data is healthy, they would all be confident in the veracity of the insights and ready to draw conclusions. One InterSystems customer, for example, can see in real time and retrospectively how many patients are within a given department, empowering informed staffing decisions and lowering costs with the click of a button.

    Clinical departments stand to gain similar benefits. Interoperability enables them to see previously hidden correlations, improving patient care and outcomes. At InterSystems, we saw how a precise understanding of data enabled a health system client to set effective data governance protocols, which steered clinicians to take quick, knowledgeable action when it mattered most.

    And at a time when artificial intelligence and machine learning models promise to optimize patient care, it’s all the more important that clinicians trust the data driving those insights. Otherwise, these advances will struggle to deliver anything beyond hype.

    Bridge the Healthcare Insights Gap

    Most health systems recognize that it’s time to harmonize their data in pursuit of analytics-driven insights. Organizations that don’t act quickly can bet that their competitors will. When everyone is sitting on precious metals, the only reasonable option is to invest in the technologies that are proven to sift soil and rock from the gold.


    Author: Fred S. Azar

    Source: InterSystems

  • Talend: A personal data lovestory

    When I was in my mid-twenties, I thought I had it all. I had just recently graduated from a top law school, passed the California Bar Exam, and was working as a junior associate at a prestigious San Francisco law firm. Three short years later, I had turned my back on law and embarked on a career in the technology field, which after many twists and turns, including stints as an analyst at Gartner, positions at a number of start-ups (some of which were actually somewhat successful) and some of the world’s largest companies (Dell and EMC), has landed me at my current position at Talend’s product marketing team.

    Over the years, I have been asked many times why I left the practice of law. My usual answer has always been what you would expect. Quality of life (i.e. no time left for a personal life), office politics (need to cozy up to the right partners to advance), and an unhealthy dislike for billable hours (who wants to document and charge for every minute of every day) were some of my go-to responses. But now that I have been working at Talend for more than half a year, I have realized that the true reason went much deeper than that. Let me try to explain.

    Talend provides data integration, quality and management solutions to organizations of all sizes, from smaller companies to some of the world’s largest enterprises. Our number one goal is to make sure that organizations have all the data they need to make the right decisions and take the right actions, whether it is to have more compelling engagements with customers, develop better products, make better strategic decisions or make more efficient and cost-effective operational decisions. And I believe in this goal. When you think about it, this is the exact opposite of what a lawyer does.

    A lawyer’s job (and I am speaking from the perspective of a trial lawyer, which is what I did) is to limit the amount of data, evidence in the legal parlance, that is used by the ultimate decision maker (whether it is a jury or a judge) as much as possible to what favors your client’s side. Through a variety of motions before a trial and objections during trial (think of terms like hearsay, prejudicial, or irrelevant that you have heard in numerous TV shows or movies), lawyers try to limit the data or evidence that should be considered in making the ultimate decision.

    While this seems to work fine in an adversarial situation, think what it would be like if your business decisions were made the same way. What if a company decided to develop one product over another because the product development team for the chosen product was able to limit what the other team could share with the executive decision makers? Or if a decision to expand into a new territory was made based on incomplete market data from all regions?

    I have always been a data head deep down. In college, my favorite class (and my highest grade) was statistics. Looking back on it, I think I realized at a sub-conscious level that limiting or hiding data was not what I wanted to do for a living. That’s why I find it so appropriate that I ultimately ended up at Talend, a company whose goal is the opposite.

    If you are guilty of being as data driven as I am and want to ensure that you have all the information you need to make the right decisions and take the right actions, consider how your organization can benefit from improved data transparency and data access.

    Author: Jason Tolu

    Source: Talend

  • The benefits of continuous market monitoring

    The benefits of continuous market monitoring

    Market intelligence studies are often event-driven, with a clear beginning and end. Usually, there is a decision that needs to be substantiated by specific market insights. As a consequence, managers scramble to collect the required data or engage their research partner. After the decision has been made, daily routine dominates activities again. After some time, this cycle repeats itself, triggered by a new question that needs to be validated or substantiated.

    Considering the latter, would it not make way more sense to track your market on a more continuous basis?

    For several reasons we think it does!

    Continuously monitoring your market provides actionable insights that are highly valuable for decisions on marketing, strategic planning, positioning and product development. If you consider your products and brands key assets, knowing how they are used, perceived and recognized by your clients is crucial to steering your company’s future growth and development.

    You can measure your brand and product by focusing on:

    • General performance
    • Awareness (aided & spontaneous)
    • Usage (penetration rates & market shares)
    • Customer satisfaction
    • Price position

    Of course, the parameters above are only suggestions; the possibilities are endless. Structural market monitoring offers many benefits; we will highlight three of them:

    1. Always have access to the right market information

    Imagine you are asked to contribute to a new strategic plan for your department; you’ll definitely need figures about market size, potentials, trends & drivers, shares and competitor performance. And you need those figures as soon as possible. Obtaining this information can be extremely time consuming, especially if you have to start from scratch. By continuously monitoring your market, you ensure direct access to relevant market information, based on market definitions shared across your organization.

    It does not matter if you need to provide information for a strategic plan, marketing campaign or product development session; all information is directly available at your fingertips.

    2. Track market developments over time

    By monitoring your market on a continuous basis, you are able to recognize long-term trends.

    • Is your product or brand gaining more awareness?
    • Is this also resulting in growing sales figures, or are they lagging behind?
    • Is the new marketing campaign paying off?

    By defining clear KPIs for your product or brand you can draw the bigger picture to track developments over time.

    You can also compare the impact of actions over time. One organization may benefit from surveying customers yearly; another may conduct monthly or quarterly research. It all depends on your audience, goals, and business objectives. Market monitoring lets you measure performance over time.
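    Tracking a KPI across survey waves can be sketched in a few lines of Python. The wave labels and awareness figures below are purely illustrative, not taken from any real study:

```python
# Hypothetical quarterly survey waves: aided brand awareness (% of respondents).
# All figures are illustrative examples.
waves = {
    "2023-Q1": 31.0,
    "2023-Q2": 33.5,
    "2023-Q3": 34.0,
    "2023-Q4": 37.5,
}

quarters = list(waves)
# Quarter-over-quarter change reveals the trend a single snapshot would miss.
changes = [round(waves[b] - waves[a], 1) for a, b in zip(quarters, quarters[1:])]
print(changes)  # deltas in percentage points between consecutive waves
```

    A single wave only gives a snapshot; the deltas between waves are what let you judge whether a campaign is paying off.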

    3. Be on top of your market

    We all know: daily operations are time consuming, and research shows that most business leaders have the constant feeling they are missing out on key market developments. By continuously monitoring your market, you stay on top of developments, drivers and new competitors. This is an absolute necessity in today’s fast-paced world.

    Stop using general assumptions to drive strategic decision making; use data and fact-based evidence instead.

    Key takeaway

    You can measure your product or brand on a wide variety of aspects, depending on the objectives of your organization. Surveying customers on a regular basis (for instance) provides your organization with actionable insights that can steer marketing, strategic planning, positioning and product development decisions. Additionally, continuous market monitoring gives you direct access to the right information, lets you track developments over time and keeps you on top of your market.

    Source: Hammer Market Intelligence

  • The human impact of data literacy

    The human impact of data literacy

    What if I told you only 32% of business executives said that they’re able to create measurable value from data, and just 27% said their data and analytics projects produce actionable insights? Let me put it another way: How excited would you be if I said I made you some chocolate chip cookies, but I only put in 32% of the required sugar and 27% of the required flour?

    I sure hope you wouldn’t eat those cookies. The cookies would be underprepared and not correctly baked with all the necessary ingredients for tasty success. To make an analogy, there are companies creating data and analytics (think: making cookies) without the necessary cultural and organizational ingredients to derive the greatest value from their creations.

    To help others better understand how data literacy – properly and programmatically implemented – can encourage organizations to use these needed ingredients, I recently co-presented a webinar with Martha Bennett, VP and Principal Analyst, from Forrester, and Rishi Muchhala, Manager of Enterprise Intelligence, from Nemours Children’s Health System. The webinar had thousands of attendees, and we received many good questions. I’ve formulated them and provided detailed answers below.

    Question topic 1: What about the data culture of an organization?

    This was a recurring theme in each of the questions that were asked and for good reason. The number one obstacle to data literacy success has nothing to do with data, technology or the software you deploy; it has everything to do with your culture and the people in your organization. Now, how many of you reading this think changing a culture is easy? If so, trust me – it’s not.

    Changing a culture is definitely not easy. It involves changing the DNA of an organization, so that people embrace – not just accept – data. This means data fluency, data literacy, analytical competence and data mentoring must be encouraged and reinforced at multiple touchpoints throughout the organization. Part of the solution is convincing people at all levels that data is empowering.

    Question topic 2: What are key areas to focus on in a data literacy program?

    This question is very large in scope, and you could get lost trying to address all facets of a data literacy program. Below are a few key areas a data literacy program should concentrate on.

    • Leadership – For any data literacy program to succeed, it must have leadership buy-in. The leaders of any organization set the tone and agenda for cultural change, marking how to measure it, conveying its progress and extolling its virtues.
    • Tailored learning – Remember that each individual is at his or her own data literacy stage, and we cannot expect a program to succeed if we try to fit everyone into the same puzzle space. One size does not fit all – people learn at different speeds in different ways, and you should provide for differing learning experiences that nurture data literacy growth across that spectrum.
    • Curiosity, creativity and critical thinking – Work hard to foster the '3 Cs of Data Literacy', which form the foundational pillars of nearly all data literacy programs. People should have a strong desire to know and understand, as well as engage in divergent and novel thinking. This is more likely to occur when the tenets of such thinking are embedded in every part of a data literacy program.

    Mind you: I am not recommending that everyone go back to school, study statistics and so forth. But, I am saying we need a culture that encourages the questioning and challenging of assumptions.

    Question topic 3: Who should lead the data literacy effort in the company?

    This is another great question. I have been approached by people who wonder if a grassroots movement among the employee base is the key to data literacy success. I have been approached by people who wonder if it is the executive team that leads the charge. The short answer is both.

    In order for your data literacy program to succeed, you must have leadership and executive buy-in. By having buy-in from the executive team, you ensure the workforce understands the company is behind the data literacy initiative. Then, create excitement through grassroots work and data literacy evangelists. These two techniques help organizations drive a holistic and inclusive approach to data literacy.


    The human impact of data literacy cannot be overemphasized. A workforce and society empowered by data leads to smarter, better-informed decision making, which makes us less prone to errors, groupthink and orthodoxy. This means we will be more open to challenging others’ practices that are not supported by evidence and also more accepting of data-based feedback that challenges our own approaches. In short, as a society, increased data literacy can only help us grow, as professionals and people, enriching and deepening our perspectives.

    Author: Jordan Morrow

    Source: Qlik

  • The key to implementing analytics successfully into your business

    The key to implementing analytics successfully into your business

    Today’s analytics are so powerful and accessible that they can drive value-based decision-making at all levels of the organization. No matter your industry, applications for analytics are available for any department—marketing, HR, operations, and others—enabling every team member to access and leverage insights they understand.

    Most senior leaders believe achieving these results means transformative changes, not only to enterprise technologies but also to the fundamental ways their employees do their jobs. This simply is not the case. Real-world evidence supports the often-counterintuitive idea that new analytics technologies must adapt to employees’ existing (good) habits when it comes to decision-making, rather than employees changing their habits to accommodate unfamiliar tools.

    This approach also reduces the complexity of both analytics implementation and user adoption. After all, “advanced analytics can drive value only if employees use them to make decisions,” as McKinsey describes: “[But] The more familiar and intuitive a model is, the more likely it is to gain acceptance... integrating the analytics into core processes can make new systems feel like a natural extension of existing ones, rather than an abrupt change.”

    Here we take a closer look at how analytics can align with existing, effective workflows, thereby helping employees in their decision-making roles rather than forcing them to change. We will explore the key elements that make up a successful modern analytics environment as well.

    Modern Analytics Must “Fit In” With Existing Best Practices

    A recent Korn Ferry report identifies one of the greatest workforce challenges of 2021: “needed capabilities are changing faster than it’s possible for organizations to shift their workforces.” That’s why leading companies are attempting to bring fact-based decisions into all levels of the organization. They are “democratizing” access to advanced analytics as part of a “business decision supply chain.”

    The business decision supply chain can be defined generally as a sequence of processes by which raw data is transformed into trusted, actionable insights that everyday business users can leverage for each of their own unique purposes. In the right environment, these insights not only provide the best value for each employee’s decision-making but are also easy for that employee to access and understand—no heavy technical training required. The result is twofold:

    1. Employees are happier—they are better at their jobs with less guessing and manual work.
    2. Results per employee are better across the enterprise, driving business value and resiliency.

    Making employees happy seems like a small goal for enterprise technology, but it’s actually critical to securing consistent use of analytics tools throughout the organization. The most powerful analytics are useless if users don’t feel the need or desire to use them.

    The Three Key Elements to Successful Analytics

    Good employees don’t live in a vacuum. They are aware of the business challenges surrounding their organizations and are most often open to some degree of change. With that in mind, we’ve identified three key elements that contribute to a successful, data-driven decision-making environment, in the subject areas of people, processes, and technology. If you’re early in analytics adoption, consider how to approach these three factors as you begin.

    1. Help People Decide and Adapt to Market Changes

    In practice, individual teams and the initiatives with which they are most familiar shape analytics success for organizations. Leading companies are working to instill a profound sense of trust in analytics capabilities among their employees as a result.

    But business leaders must proactively understand the needs of those specific teams before deciding which data resources are right for them. They should maintain the essential goal of helping those employees in their decision-making—rather than forcing change alongside new technology adoption—as they move forward.

    Korn Ferry highlights an example of this change in their report, namely the changing role of a hypothetical salesperson. With fewer in-person visits to clients, her focus has shifted to become more data-focused: Today, she builds compelling proposals for clients using analytics. But while the salesperson’s focus has shifted, her responsibilities and KPIs have not fundamentally changed. She simply has more resources at her disposal, even if the broader business climate has reduced her opportunities to engage clients face-to-face.

    2. When It Comes to Processes, Integrate and Improve

    In practice, integrating analytics is about keeping “what works” for employees—processes they prefer and that drive results. For example, marketers can transition away from subjective decision-making and begin thinking strategically based on a wide variety of data-driven insights. Their responsibilities have not changed, but the ways in which they drive value are more accurate, and they are more confident.

    McKinsey cites another example—that of a mining company where dispatchers and operators were subjected to an alarm they most often ignored. That’s because the alarm did not fit into their existing workflow and therefore served no practical purpose. It was by “embedding the alert mechanism directly into the [particular] monitoring system” that those workers found it useful as the alert coordinated with their preferred processes.

    3. Make Sure Your Analytics Technology Has “Legs to Stand On”

    As you consider analytics platforms, remember this: All the best capabilities in the world won’t matter if employees don’t use the tools. How employees access and use analytics is therefore foundational to adoption and, ultimately, ROI. Analytics environments that lend themselves to this level of adoption are critical as a result.

    Realize Business Value, and Employee Success

    New technologies always involve learning requirements. But the fundamental advantages of existing processes and goals that employees have come to value must be enhanced, not eliminated. Data leaders must consider both user and business needs and then assign data resources and self-service capabilities to employees with those in mind. That’s how the business decision supply chain becomes truly successful, transforming data into real business value.

    Author: Omri Kohl

    Source: Pyramid Analytics

  • The omnipresent challenge of collaborating effectively

    The omnipresent challenge of collaborating effectively

    Whether talking about collaboration tools or just the idea of collaboration, at this point in time it should not be “news.” We’ve collaborated forever, and the tools have been around for well over 20 years.

    And yet it seems we still struggle to figure out how to collaborate effectively.

    Maybe it is our org structures, where competing goals, or just different leaders, cause conflicts.

    Maybe it is biases and underlying tensions, where we either seek to be part of the crowd or worry about what an effective collaboration means to me as an individual.

    Maybe it is that we just assume we all know how to collaborate and yet fail to make it work.

    Whatever the reason, organizations that talk the collaboration talk often fail to walk the walk effectively.

    Let me give you an example.

    There is plenty of research and commentary showing that effective collaboration requires, at a minimum, a well-declared goal/purpose and clearly defined roles for “collaborators.”

    But let’s look at it in practice using, you guessed it, Gartner’s Enterprise Technology Adoption Profiles (ETAs).

    I’m gonna make some connections between a few different research studies, so bear with me. There are 3 profiles where the “control” attribute is based on a Collaborative Approach between business and IT:

    • SCD – Strict Planners, Collaborative, Dynamic pace of change
    • ACR – Accommodating Planners, Collaborative, Responsive pace of change
    • FCM – Flexible Planners, Collaborative, Measured pace of change

    In our study of high quality deals (situations where both the customer and vendor are happy), we saw one of these groups stand out in terms of the % of high quality deals among our respondents:

    • SCD – 50%
    • ACR – 15%
    • FCM – 12%

    A big, big difference. That stood out, but then we did a second survey, where we asked respondents how much they agreed with a statement that basically said “we regret nearly every technology product we purchase via subscription.” The results (higher is worse, of course):

    • SCD – 28%
    • ACR – 71%
    • FCM – 71%

    Again, a huge difference. It seems that our SCD group has figured out how to collaborate. Their strict approach to technology planning, and their focus on moving as fast as is prudently possible, helps them stand out. The other two seem to muddle along, trying to delay or avoid decisions (yes, we have other research that shows those profiles have lots of no-decisions) as much as possible.

    But just recently, I found some other data in our research. I’ll be writing about this with some colleagues, but we recently asked about the roles responsible for making or influencing decisions around SaaS and IaaS/PaaS. What jumps out for me is that all of these ETA groups involve a diverse set of roles in decisions, with a range of 8 to 11 roles typically involved.

    But there was a big difference: the average number of roles designated as decision makers. The means for the SCD group were 3.69 (SaaS) and 3.20 (IaaS/PaaS). The means for the ACR group were 5.12 and 4.77. Finally, the means for the FCM group were 5.82 and 5.45. The two groups that struggle with effective decisions designate significantly more people as decision makers (vs. influencers). More decision makers is not a bad thing, provided the specific decision responsibilities are clear. But if they are muddled, it backfires.
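    A per-group mean like the ones above is a simple aggregation over survey responses. The sketch below uses hypothetical response counts (not Gartner’s actual data) to show the shape of the computation:

```python
from statistics import mean

# Hypothetical responses: number of roles designated as decision makers
# for SaaS purchases, keyed by ETA profile. Values are illustrative only.
responses = {
    "SCD": [3, 4, 4, 3],
    "ACR": [5, 6, 5, 4],
    "FCM": [6, 5, 6, 7],
}

# One mean per profile; a higher mean suggests decision responsibility
# is spread across more people, which can muddy accountability.
means = {group: mean(vals) for group, vals in responses.items()}
print(means)
```

    Even at toy scale, the same pattern appears: the group with the fewest designated decision makers is easiest to reason about.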

    Putting the data from the different studies together, it seems that the ACR and FCM groups have a lot of unclear roles and responsibilities, leading to paralysis, passing the buck, or just getting stuck. (BTW, another fact from our studies: these two groups have many more no-decisions than any other ETA group.)

    It is also important to remember that the ACR and FCM groups are two of the largest ETA groups, typically accounting for well over 40% of the market. For vendors, this is both a warning signal and an opportunity. Collaboration, done right, is powerful (look at the SCD group). You have an opportunity to help customers that want to be collaborative, but haven’t figured out how, to do it more effectively. It will make a big difference for you, and for them.

    Author: Hank Barnes

    Source: Gartner

  • The results are in: analytics lead to better product decisions

    The results are in: analytics lead to better product decisions

    You already know how infusing analytics into your daily apps and workflows has boosted performance and opened up insights to making smarter, better decisions. Now, product decision makers say they’ve found that providing data analytics to their customers enhances the value of their products. In a new Harris Poll study commissioned by Sisense, product decision makers overwhelmingly (93%) report that providing customized and personalized analytics to customers, at the point of decision, would give their organization an advantage over others in the industry.

    It’s clear to product decision makers that analytics offerings in products drive business value for their customers, as well as for their own businesses. In fact, 100% think that at least a segment of their customers are in need of better data and analytics. 75% say that most or all of their customers have this need. This represents an enormous opportunity for product decision makers to differentiate their products and services from the competition.

    Additionally, 80% of respondents even feel such offerings could allow them to charge up to nearly 20% more for their products and services because of the value of the insights that advanced analytics provides to customers and users.

    Digging Deeper: Key points About Infusing Analytics

    The best (and best-selling) apps and software solutions on the market have analytics and data features in some form. The ability to harness the robust insights provided by your data is what separates industry leaders from the rest of the pack. But far too often, these insights aren’t used in creative, strategic ways to drive growth. In fact, 38% of product decision makers believe their customers look at data and analytics monthly or less often.

    Taking a closer look, the analytics that product decision makers believe would be most useful to customers are prescriptive (what actions to take) (56%) and predictive (what may happen in the future) (53%). More specifically, 81% of product decision makers feel that if they could provide their customers with personalized and customized data and analytics, it should be embedded – including embedding in communication software/platforms (57%), in custom-built apps (55%) and in off-the-shelf business or SaaS apps (52%).

    An advanced analytics solution embedded into products and apps doesn’t just crunch complex data into easy-to-understand insights — it presents actionable intelligence to the right users at the right decision-making points in their workflows. This type of in-workflow enhancement opens up critical, actionable intelligence where, when, and how the user needs it. This empowers them to take action at the best time.

    Current Barriers to Analytics Success

    Lack of skill is not the issue here. Product decision makers are confident in their customers’ ability to use the data and analytics they receive to make decisions. Nearly three quarters (74%) say all or most of their customers are skilled in doing so. Rather, around 1 in 10 (11%) feel their customers simply do not have the data and analytics they need to make decisions and take appropriate actions.

    The deeper issue is that nearly all (92%) product decision makers currently deliver data and analytics to their customers via non-embedded methods. Specifically, 57% still deliver it by web portal/dashboards and 60% by email. Looking to the future, 81% of product decision makers say that if they could provide their customers with personalized data and analytics, it should be provided by embedding those into communication software or platforms, custom-built apps or off-the-shelf business or SaaS applications.

    In short, product decision makers have found that what drives growth in their companies will also help boost their customers’ operations.

    Author: Rachel Burstyn

    Source: Sisense

  • Using Business Intelligence in an optimal way to support business decisions

    Using Business Intelligence in an optimal way to support business decisions

    By utilizing a fact-based, real-time, singular version of the truth, business people are empowered to achieve and maintain a competitive edge through the use of industry specific business intelligence. Executives, CFOs, branch managers and your sales team have immediate access to crucial information to make fast and educated decisions.

    Data is available across core business processes such as industry trends, customer behavior, productivity, inventory, and detailed financial analysis. Business intelligence software extracts the information, transforming it into clear insights to enable actionable and strategic decision-making so people can readily achieve their goals.

    Better sales decisions

    Industry specific business intelligence enables companies to discover detailed sales trends based on their customers’ preferences, reactions to promotions, online shopping experiences, purchasing habits, and patterns and trends which affect sales.  Leveraging customer buying habits permits a company to decide the best course of action to retain valuable customers and take advantage of missed sales opportunities.

    By drilling down to such comprehensive insights, a company can quickly decide which link-sell opportunities to increase or which products are best for cross selling.  By identifying customers in decline, a business can determine the best plan to reposition the product before they stop buying altogether. Sales managers are able to identify the best type of customers, where to find them, and determine the most effective acquisition and conversion strategies. By identifying bottom-buyers, a company may make decisions around the best promotional strategies or whether to let those customers go.

    Having a clear picture of sales trends also allows collaboration in marketing and management decision-making. 

    Better marketing strategy

    By monitoring trends to determine customer preferences, a company can quickly make strategic marketing decisions to best capitalize on their products or services. Data analytics software can identify promotional returns and analyze campaign outcomes. A company can now use it to decide how to prioritize campaigns, tailor promotions, and engage in social media to maximize marketing efforts. This enables a company to make decisions that will fine-tune their marketing strategies, reduce overhead and garner a better return on investment.

    Better business decisions 

    Data analytics allows Executives to make decisions based on statistical facts. Those facts can be used to guide choices about future company growth by evaluating a long-term view of the market and competition. Data analytics can help Executives decide how to streamline processes by using visualizations identifying the productivity in each area of the company, including employee management. By identifying actionable insights, a manager can determine the most effective strategies to improve employee productivity, streamline the recruitment process, or reduce employee turnover. Data analytics allows Executives to funnel all of the facts into making crucial operational decisions.

    Better inventory decisions

    Using data analytics to identify problem areas and opportunities allows a company to make decisions that will refine their inventory management. For example, the decision to reduce excess inventory also reduces the cost to maintain it. With better visibility, a company can make better decisions about how much to order and when. Knowing a product’s ordering patterns along with the best times, prices, and quantities to buy also allows managers to change pricing tiers to increase profit margins and capitalize on every opportunity.

    Better financial decisions

    Data analytics offers an up-to-date view of a company’s financial picture. A Manager can view profit and loss, general ledger, and balance sheet figures through features such as Phocas’ Financial Statements. Top-notch BI will allow businesses to drill all the way down to individual transactions to get instant answers about revenue opportunities and cost concerns. By examining incoming and outgoing finances of the present and past, a business can make decisions based on the company’s future financial status. Breaking down revenue by location evaluates the strength of product lines by branch. For example, a business may decide to remove a specialty item from one location and increase its promotion in another. Customizing the dashboard allows Executives to track key performance indicators (KPIs) to enable effective financial oversight and management.
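    The "revenue by location" roll-up described above is, at its core, a group-by aggregation over transaction rows. Here is a minimal sketch with hypothetical branches, product lines and amounts (not Phocas data or its API):

```python
from collections import defaultdict

# Hypothetical transaction rows: (branch, product_line, revenue).
# All names and amounts are illustrative.
transactions = [
    ("North", "Specialty", 1200.0),
    ("North", "Standard", 8400.0),
    ("South", "Specialty", 6100.0),
    ("South", "Standard", 7900.0),
]

# Roll revenue up by (branch, product line) -- the drill-down view that
# shows where a product line is strong or weak.
revenue = defaultdict(float)
for branch, line, amount in transactions:
    revenue[(branch, line)] += amount

print(dict(revenue))
```

    In this toy data, the specialty line is weak in North and strong in South, the kind of contrast that might prompt shifting promotion from one branch to the other.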

    By analyzing data and monitoring critical business operations, a company is well positioned for successful strategic decision-making based on factual insights and “one view of the truth.”

    Source: Phocas Software

  • What Does Being Data-Driven Really Mean? A F1 Racing Analogy

    What Does Being Data-Driven Really Mean? A F1 Racing Analogy

    Did you watch the Hungarian F1 race on Sunday, July 31st? Max Verstappen qualified 10th on Saturday but won the race. Lewis Hamilton started 7th and finished second. The pole sitter, George Russell, ended up third. The two Ferrari drivers, who qualified 2nd and 3rd and were favored to perform well, ended up 4th and 6th respectively. Something went awry for Ferrari, but what?

    What is Strategy?

    This is yet another story about what strategy really means. On lap 21 the TV camera panned to the corners before the main straight. The two Ferrari drivers, Carlos Sainz and Charles Leclerc, were under 1 second apart, chasing down the leader, George Russell. As the camera followed the leading three cars, it looked like the Ferraris were about to overtake the leader. Then we heard over the Ferrari radio: ‘box, box, box’. Box means ‘head to the pits’. Just as Ferrari’s speed and tire management looked like they were paying off, and they might take the lead on track, someone at Ferrari decided to take the cars out of the battle. What? Why?

    As the race unfolded, things went from bad to worse. One of the Ferraris put on hard tires. All the other front runners were on soft or medium tires. A couple of other cars were on hard tires, but their race pace was off and they were not competitive. Putting on hard tires didn’t make sense. From there on, the Ferrari lost position and was uncompetitive with the top drivers.

    Post-Race Review

    In the post-race interviews the real problem surfaced. Christian Horner, team principal of the winning Red Bull, gave it away. It seems that on the previous day, in warmer weather, the hard tire had offered higher performance than the soft or medium tires. That was what the data said then.

    When Mattia Binotto was interviewed, the Ferrari team principal noted that his cars “didn’t perform as expected”. Ferrari had decided to put on hard tires on race day (Sunday) using data from Saturday’s qualifying session. Horner’s killer phrase was this: “You only had to look up from the screen.”

    Here is the story in my words. Ferrari relied too much on data alone to inform, even dictate, their strategy. Red Bull and Mercedes also used data, but they added a human element. They looked up from their screens and saw how the hard tires were actually performing in the cooler weather, and so they discounted the data because conditions had changed. Ferrari didn’t. Ferrari assumed the data was reliable and trusted it. And it was reliable. But it didn’t apply to the actual race conditions of the day!

    What Being Data-Driven Really Means

    Conclusion: being data-driven does not mean using (only) data, or using data to automate every decision. Some decisions need to be augmented: machines or humans using data is not always enough. Sometimes you need to look up, or away, from the data.

    So to be data-driven does not mean to (only) use data. I like to say that to be data-driven really means “to help business leaders ask smarter questions of the business and environment around them”. Ferrari didn’t ask a smarter question. They asked a question from the previous day and got an answer for the previous day. The other teams asked a different, smarter question: they discounted the stale data and expanded their data set with what they could see. (Toto Wolff quote)

    The Ferrari drivers, who should have been on the podium at the end of the race, finished in the doldrums.

    Author: Andrew White

    Source: Gartner

  • Why market research should always be taken into account when making corporate decisions

    Not long ago, a mailer from a major industry trade organization fluttered into our box, advertising a marketing research conference entitled “Earning a Place at the Table.” The blurb asks, chillingly, if market researchers have a place at the table where decisions are made, whether they’ve earned that place – and if they’d even know what to do if they had it. Wow.

    Being notoriously thick-skinned, we overlooked the insult. But this is scary stuff, with an unmistakable whiff of inferiority complex. Why should the organizers ask professional market researchers such skittish questions? Whence this cringing, whining conference theme? Market research is at the very root of all intelligent, directed action – or so we believe. What sensible person or organization acts without understanding context, alternatives, and probable consequences? Only a dope or an egomaniac would dive into a business, social, or military program without due diligence – in other words, without good intelligence. In still other words, without proper marketing research.

    So maybe the problem lies in what currently constitutes “proper marketing research”. The Advertising Research Foundation’s most recent conference anointed the consumer the “new marketing compass”, assuring attendees that acknowledging this would lead the way to profitability and, one assumes, to a place at the table where decisions are made. The difficulty, however, is that proper research isn’t always the old-model stuff, no matter how it gets dressed up for the new millennium.

    Run-of-the-mill mid-20th-century market research is a superb rearview mirror but a crummy windshield. As a trend predictor, it often produces what we call “excellent answers to meaningless questions”. Using the old model, market research companies and departments tend to crank out wads of analysis that may be impressive, may be phrased brilliantly, may even be true – but which is valueless as a leading indicator, which is what you need your “compass” to be: something that points you in the right direction.

    Frighteningly, there seems to be a robust market for lagging indicators. Is it any wonder market researchers are increasingly banished from the table where the decisions are made? Relevant decisions in business revolve around knowing what customers want, so the businesses can create, develop, and maintain brand loyalty. OK, to a limited extent, traditional research can help brands know that.

    But there’s much more to the story. The way the world works nowadays, present tense thinking is no longer enough. It’s essential to predict not only what customers want, but what they will want. Who wants you at the table telling folks what happened last time!?

    Some marketers are blessed with a great “feel” for the market at any given time, but few can really see the future. And although forward-thinking researchers modestly disclaim their godhood, it’s true that some updated methodologies do a remarkably good job of measuring the direction and velocity of customer values – and identifying the values for which customers have the highest expectations. This makes it easier to leverage those values to the brand’s benefit.
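    To make “direction and velocity of customer values” a little more concrete, here is a minimal sketch of one way it could be quantified: fit a trend line to a value’s importance score across survey waves, so the slope gives the velocity and its sign the direction. The data, value name, and method are illustrative assumptions, not the methodologies the author has in mind.

    ```python
    import numpy as np

    # Hypothetical tracking data: importance of "sustainability" to customers,
    # measured across four quarterly survey waves.
    waves = np.array([1, 2, 3, 4])
    score = np.array([62.0, 64.5, 68.0, 72.5])

    # Least-squares trend line: slope = velocity (points per wave).
    velocity, intercept = np.polyfit(waves, score, 1)
    direction = "rising" if velocity > 0 else "falling"
    ```

    A value that is both highly rated and rising fast would be the kind of leading indicator the author argues a brand should leverage, in contrast to a static snapshot of last quarter’s satisfaction score.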

    The latest research techniques that are able to slip behind respondents’ unconscious defenses and other right-brained shenanigans can show where customers’ loyalty drivers lie, and which way lies not just present but also future happiness – for the brand as well as for the customer.

    Up until 1989, making the customer happy was enough to differentiate a brand and maintain a loyal customer base, and measuring that was just fine. After that, “happiness” or “satisfaction” paled into “expectation”. And unless you can always meet or exceed expectations – and measure and report that status – you won’t keep your customers loyal, and you won’t be profitable, the ostensible outcome of the decision-making exercise.

    The problem then becomes, what gives you the best chance of always meeting or exceeding customer expectations? The best way is via research that constantly updates the understanding of the direction and velocity of customer values. We deliberately repeat that exact phrase, because understanding the concept – and acting upon it – is what will bring research companies back to “the table”. Sadly, most businesses don’t, won’t, or can’t understand the concept. Most of them realize how little they understood only when they find that their “compass readings” misled them and their brand goes belly up.

    If market research is not impacting corporate decision-making, there are surely reasons. And the blame isn’t always with the researchers. Where were the decision-makers when the research was ordered? Where were the marketing geniuses when the page was blank? For every marketing disaster, there are plenty of fingers to point, and more than enough pointees to go around. New Coke and any of the recent Kmart or GAP advertising strategies did not happen in a vacuum.

    So, what’s the solution? How do we get market research back into the boardroom, back to the decision-making table? A good first step is for researchers to recognize – and admit aloud – the limitations of traditional research and the benefits of updated, predictive methodologies. Second, they need to acknowledge that most companies are awash with data but possess little meaningful intelligence, my version of “we-taught-them-how-to-count-now-we-have-to-teach-them-how-to-think”! This will require some guts and some new learning, but the investment in this kind of intellectual capital will reap rewards for everyone.

    The next step is to discuss the new realities openly and candidly with our clients. Make them aware that databases are not intelligent; that even the prettiest PowerPoint presentation is useless if it offers no understanding of customers’ evolving values; that closed-circuit thinking is likely to result only in lots more numbers, but very little insight. If that’s all you bring to the boardroom, you don’t get invited to the table. And rightly so.

    Ignorance, denial, sclerotic thinking by management – these are some reasons why market research is kept away from “the table”. But let’s be frank: Another reason is how few researchers show up at the table with anything truly useful. This can and must change, and the sooner the better. If not, we’re doomed to year after year of conferences like the one mentioned above.

    Author: Robert Passikoff

    Source: GreenBook Blog
