48 items tagged "data analytics"

  • 3 Important don'ts when entering the market with a new business

    3 Important don'ts when entering the market with a new business

    Entering the market with a new business is an exciting experience. Your marketing strategy will play a crucial role in the success of your company. Here are three of the most common marketing mistakes that new founders tend to make, and how you can overcome them.

    When most people think about starting a business, they often think too far ahead. Even though the long-term plan is important, it’s also important to take some time to think about your short-term marketing strategy.

    Instead of focusing heavily on niche specific marketing mistakes, we are going to take a look at mistakes that new business owners can make when working out how they are going to market their company and target their customers.

    1. Don’t assume you need a complete website to start marketing

    One of the first mistakes many business owners make is believing they must have a complete website before they can start marketing to their audience. A useful trend among new startups is creating a ‘coming soon’ page for potential customers.

    A coming soon page is a way to give customers a taste of what’s to come when your website and business officially launches. We often have it set in our minds that it is impossible to promote something that doesn’t yet exist, but that’s not true.

    2. Don’t forget a contact page

    Surprisingly, many new business owners forget about this crucial aspect of their website. It doesn't matter if you’re creating a coming soon page or launching the full website on day one, you must have an easy way for customers to contact your business!

    There are a variety of free and premium contact form builders available for your website. These builders give you the freedom and flexibility to build custom contact pages so your customers can communicate what they need help with regarding your product or service. You could also use this as an opportunity to discover your customers’ pain points as they pertain to your niche, so you can improve your company once you notice a recurring problem.

    Contact pages are relevant because they are a way for you to keep in contact with the people who buy your products or services, potentially adding them to your email marketing list. This strategy could open the door for future email offers and makes marketing new products to a test audience easy.

    3. Don’t be afraid to experiment

    When people start marketing their website to a broader audience, they often feel as if their way is the only right way. You have to be willing to question your notions of what customers want, and run split tests on your marketing campaigns and on-site ads to learn what your customers expect from your brand.

    For example, if you’re running a site about content marketing and keep pushing a free checklist for new bloggers and no one is responding, maybe it’s time to think about other things that could benefit your target audience. For instance, you can run a split test with two ad campaigns, one offering the free checklist and one offering access to an SEO webinar, and see which one is the most attractive.

    The point here is that sometimes you have to put different options out there, see how the customers respond, and use the resulting analytics data to determine where to take your business model next.
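    To make the comparison concrete, here is a minimal Python sketch of scoring such a split test with a two-proportion z-test; the campaign names and numbers are invented for illustration.

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test statistic comparing variant A and variant B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented results: the free-checklist ad vs. the SEO-webinar ad
checklist = {"conversions": 48, "visitors": 2000}
webinar = {"conversions": 90, "visitors": 2100}

z = z_score(checklist["conversions"], checklist["visitors"],
            webinar["conversions"], webinar["visitors"])

# A |z| above roughly 1.96 means the difference is unlikely to be noise
# (95% confidence); a negative z here favors the webinar variant.
print(round(z, 2))
```

    In practice you would let both campaigns run long enough to collect comparable traffic before reading the score.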


    As a new business owner, your marketing strategy will likely change rapidly during the first few years of business. You’ll learn how to handle customer issues and how you can use their problems to build a better product.

    After you’ve nailed down a formula, it’s essential to keep your mind open and anticipate small changes while marketing. These small changes can and will add up to more significant changes over time. One final tip: be prepared to evolve with your brand and your customers. Don’t get too comfortable or set in your ways.

    If the past decade has taught us anything, it’s that marketing is continually changing based on social media, customer perception and needs, and how you present your business to potential leads.

    Author: Thomas Griffin

    Source: Business.com

  • 3 Predicted trends in data analytics for 2021

    3 Predicted trends in data analytics for 2021

    It’s that time of year again for prognosticating trends and making annual technology predictions. As we move into 2021, there are three trends data analytics professionals should keep their eyes on: OpenAI, optimized big data storage layers, and data exchanges. What ties these three technologies together is the maturation of the data, AI and ML landscapes. Because there already is a lot of conversation surrounding these topics, it is easy to forget that these technologies and capabilities are fairly recent evolutions. Each technology is moving in the same direction -- going from the concept (is something possible?) to putting it into practice in a way that is effective and scalable, offering value to the organization.

    I predict that in 2021 we will see these technologies fulfilling the promise they set out to deliver when they were first conceived.

    #1: OpenAI and AI’s Ability to Write

    OpenAI is a research and deployment company that last year released GPT-3 -- artificial intelligence that generates text mimicking text produced by humans. This AI offering can write prose for blog posts, answer questions as a chatbot, or write software code. It has risen to a level of sophistication where it is getting difficult to discern whether a given passage was written by a human or a machine. Where this type of AI is already familiar to people is in writing email messages; Gmail anticipates what the user will write next and offers word or sentence prompts. GPT-3 goes further: the user can create a title or designate a topic and GPT-3 will write a thousand-word blog post.

    This is an inflection point for AI, which, frankly, hasn’t been all that intelligent up to now. Right now, GPT3 is on a slow rollout and is being used primarily by game developers enabling video gamers to play, for example, Dungeons and Dragons without other humans.

    Who would benefit from this technology? Anyone who needs content. It will write code. It can design websites. It can produce articles and content. Will it totally replace humans who currently handle these duties? Not yet, but it can offer production value when an organization is short-staffed. As this technology advances, it will cease to feel artificial and will eventually be truly intelligent. It will be everywhere and we’ll be oblivious to it.

    #2: Optimized Big Data Storage Layers

    Historically, massive amounts of data have been stored in the cloud, on hard drives, or wherever your company holds information for future use. The problem with these systems has been finding the right data when needed. It hasn’t been well optimized, and the adage “like looking for a needle in the haystack” has been an accurate portrayal of the associated difficulties. The bigger the data got, the bigger the haystack got, and the harder it became to find the needle.

    In the past year, a number of technologies have emerged, including Iceberg, Hudi, and Delta Lake, that are optimizing the storage of large analytics data sets and making it easier to find that needle. They organize the hay in such a way that you only have to look at a small, segmented area, not the entire data haystack, making the search much more precise.

    This is valuable not only because you can access the right data more efficiently, but because it makes the data retrieval process more approachable, allowing for widespread adoption in companies. Traditionally, you had to be a data scientist or engineer and had to know a lot about underlying systems, but these optimized big data storage layers make it more accessible for the average person. This should decrease the time and cost of accessing and using the data.

    For example, Iceberg came out of an R&D project at Netflix and is now open source. Netflix generates a lot of data, and if an executive wanted to use that data to predict what the next big hit will be in its programming, it could take three engineers upwards of four weeks to come up with an answer. With these optimized storage layers, you can now get answers faster, and that leads to more specific questions with more efficient answers.
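    The effect of that segmentation can be illustrated with a toy sketch. This is not Iceberg’s actual implementation, just a pure-Python simulation of the partition-pruning idea these storage layers rely on: group data by a key up front so a query only touches the matching segment instead of the whole haystack.

```python
from collections import defaultdict

# Invented viewing records: 30 days x 100 shows = 3,000 rows.
records = [
    {"date": f"2021-01-{d:02d}", "show": f"show-{i}", "views": i * d}
    for d in range(1, 31) for i in range(100)
]

# A naive store scans everything (the whole haystack).
def full_scan(rows, date):
    return [r for r in rows if r["date"] == date]

# A partitioned store groups rows by date up front, so a query
# touches only one partition (the small, segmented area).
partitions = defaultdict(list)
for r in records:
    partitions[r["date"]].append(r)

def pruned_scan(parts, date):
    return parts.get(date, [])

hit_full = full_scan(records, "2021-01-15")
hit_pruned = pruned_scan(partitions, "2021-01-15")
assert hit_full == hit_pruned          # same answer...
print(len(records), len(hit_pruned))   # ...while examining 100 rows, not 3,000
```

    Real table formats add file-level statistics and metadata on top of this, but the payoff is the same: the query planner skips data it can prove is irrelevant.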

    #3: Data Exchanges

    Traditionally, data has stayed siloed within an organization, never leaving it. But it has become clear that another company may have valuable data in its silo that can help your organization offer a better service to your customers. That’s where data exchanges come in. To be effective, however, a data exchange needs a platform that offers transparency, quality, security, and high-level integration.

    Going into 2021, data exchanges are emerging as an important component of the data economy, according to research from Eckerson Group. According to this recent report, “A host of companies are launching data marketplaces to facilitate data sharing among data suppliers and consumers. Some are global in nature, hosting a diverse range of data sets, suppliers, and consumers. Others focus on a single industry, functional area (e.g., sales and marketing), or type of data. Still, others sell data exchange platforms to people or companies who want to run their own data marketplace. Cloud data platform providers have the upper hand since they’ve already captured the lion’s share of data consumers who might be interested in sharing data.”

    Data exchanges are very much related to the first two focal points we already mentioned, so much so that data exchanges are emerging as a must-have component of any data strategy. Once you can store data more efficiently, you don’t have to worry about adding greater amounts of data, and when you have AI that works intelligently, you want to be able to use the data you have on hand to fill your needs.

    We might reach a point where Netflix isn’t just asking the technology what kind of content to produce but the technology starts producing the content. It uses the data it collects through the data exchanges to find out what kind of shows will be in demand in 2022, and then the AI takes care of the rest. It’s the type of data flow that today might seem far-fetched, but that’s the direction we’re headed.

    A Final Thought

    One technology is about getting access to data, one is about storing and understanding new data, and one is about acting on the information that data provides. As these three technologies begin to mature, we can expect steady growth and to see them all intersect at just the right time.

    Author: Nick Jordan

    Source: TDWI

  • 3 Things we have learned about CI during the time of COVID-19

    3 Things we have learned about CI during the time of COVID-19

    There is no adequate way to express the effect COVID-19 has had on society. It’s changed the way we live and the way we work. Competitive intelligence (CI) might seem like an 'extra' in the time of COVID, but it’s more crucial to your bottom line now than ever.

    Here are three lessons we’ve learned about competitive intelligence for businesses in the era of COVID-19.

    1: Every single deal matters and good competitive intel equals more revenue

    CI is about driving action. It’s not enough to simply push CI to stakeholders and have no action being taken as a result. This causes competitive intelligence to become a nice-to-have at best and a cost-center at worst. When CI is used to drive decision making and action, it becomes critical to revenue generation.

    The current scarcity of deals increases the likelihood of a competitor being present in a deal, so you need to ensure that you're setting up your CI to be easily leveraged by your sales team.

    Here are a few ways you can use CI to help sales win deals:

    • Battlecards: Provide your sales team with competitive battlecards. Making sure battlecards are easily accessible and up-to-date with the most current CI will enable your sales team to knock competitors out of deals quickly.

    • Deep dive competitive training: Take time to sit with sales and do a deep dive into one of your chief competitors. Add role-playing to the training so sales can practice selling against that specific competitor.

    • Leverage field intel: Your reps spend all day talking to prospects, and in doing so, they gather excellent intel on your competition. Give sales the ability to share great field intel so they can help their fellow reps win more deals.

    2: There are more competitive signals being put out there than ever before

    While sales have been declining, marketing engagement has increased significantly, specifically marketing email open rates and website visits. This means buyers might not be ready to sign a check quite yet, but they are certainly looking to educate themselves with content and virtual events in the meantime.

    This means that your competitors’ marketing teams are likely putting out more content and campaigns than ever before, both on and off their website. Tracking and analyzing these signals is crucial to understanding your competitors’ strategies, and since there is more of an emphasis on engaging and educating prospects, there are now more competitive signals to glean intelligence from.

    Here are some competitive signals you should be keeping an eye out for:

    • Messaging changes: Track your competitors’ homepages and other website pages for any changes in messaging; these changes signal how they are adjusting their strategy during the COVID era.

    • Employer reviews: Find out what former and current employees of your competitors are saying about them. Employee reviews can give you visibility into competitor strategy, like what investments (or lack thereof) are being made. Glassdoor now lets you filter reviews by “COVID-19” so you can see how your competitors are handling the crisis internally.

    • Marketing campaigns: Marketing teams are putting out more content than ever to educate and engage buyers. Keep track of your competitors’ social media campaigns, content initiatives, and virtual events to see how they are currently engaging the market.
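    As a sketch of how messaging tracking might be automated, here is a minimal Python example that fingerprints page copy and flags a change between two snapshots. The snapshot text and alert wording are hypothetical; a real tracker would fetch and normalize the live pages.

```python
import hashlib

# Hypothetical snapshots of a competitor's homepage hero copy,
# captured on two different days.
snapshot_monday = "The #1 platform for revenue teams. Book a demo."
snapshot_friday = "Remote-ready revenue platform. Start a free trial."

def fingerprint(text):
    """Stable fingerprint of page copy; identical text yields an identical digest."""
    return hashlib.sha256(text.strip().lower().encode("utf-8")).hexdigest()

def messaging_changed(old, new):
    """True when the stored fingerprint no longer matches the live page."""
    return fingerprint(old) != fingerprint(new)

if messaging_changed(snapshot_monday, snapshot_friday):
    print("Alert: competitor messaging changed, review the new positioning")
```

    Storing only the digest per page keeps the tracker cheap to run daily across many competitor pages.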

    3: Optimal distribution is key to getting stakeholders to take action on CI

    Remote work is the new reality, and with that comes certain challenges. You may feel like competitive intelligence is being ignored if you aren’t interacting with your stakeholders in-person, or that some context is being lost. 

    The key to getting others to take action on CI is to deliver it to them in a format that is optimal for their consumption. Stakeholders all have different needs: sales needs to win more deals, executives need guidance on strategy, marketing needs to understand messaging and campaigns, and product needs to understand the competitor roadmap. In addition to having different needs, your stakeholders consume information differently. Tailor your information and communication method for each stakeholder. 

    Here are examples of how you can distribute competitive intelligence to different stakeholders:



    • Executive Team (guidance on strategy). Intelligence types: team changes, financial data (SEC filings, etc.), messaging changes, new customers and partners. CI format: dashboards, weekly CI digests, periodic CI updates via remote meetings.

    • Sales (win more deals). Intelligence types: pricing changes, messaging changes, positive/negative product reviews, employee reviews. CI format: battlecards, intel updates via chat (Slack, etc.), competitor trainings.

    • Marketing (run better campaigns). Intelligence types: website changes, messaging changes, marketing campaigns, social media activity. CI format: weekly CI digests, alerts of high-priority shifts.

    • Product (roadmap guidance). Intelligence types: team changes, new customers and partners, positive/negative product reviews, product updates, pricing and packaging changes. CI format: dashboards, weekly CI digests, alerts for high-priority shifts.

    Embracing the new way of working

    No one can predict the future, but we all must adapt to our present reality. There will likely be more changes coming down the road for businesses, and the best you can do is stay cognizant of trends and continue to enable your teams and serve your customers.

    Author: Lauren Kersanske

    Source: Crayon

  • 4 Trends That Are Driving Business Intelligence Demands

    Many organizations have sung the praises of business intelligence for years, but many of those firms were not actually realizing its full benefits. That picture is beginning to change as advanced analytics tools and techniques mature.

    The result is that 2016 will definitely be the ‘year of action’ that many research firms have predicted when it comes to data analytics. That, at least, is the view of Shawn Rogers, chief research officer at Dell Statistica, who believes “we are at a tipping point with advanced analytics.”

    If Rogers sounds familiar, it may be due to his early connection to Information Management. Rogers was, in fact, the founder of Information Management when it was originally called DM Review Magazine. He is now in his second year as chief research officer for Dell Statistica. “Prior to that I was an industry analyst. I worked for Enterprise Management Associates and I covered the business intelligence, data warehousing and big data space.”

    Rogers believes there are a number of key trends driving business intelligence today that are making it more useful for a greater number of organizations.

    “The maturity in the market has helped everyone evolve to a much more agile and flexible approach to advanced analytics. I think there are four things that are driving that which make it exciting,” Rogers says.

    “One of them is the new sophistication of users,” Rogers notes. “Users have become very comfortable with business intelligence. They want advanced insights into their business, so they’re starting to look at advanced analytics as that next level of sophistication.”

    “They’re certainly not afraid of it. They want it to be more consumable. They want it to be easier to get to. And they want it to move as fast as they are. The users are certainly making a change in the market,” Rogers says.

    The market is also benefitting from new technologies that are enhancing the capabilities of advanced analytics.

    “It now functions in a way that the enterprise functions,” Rogers explains. “Now the technology allows advanced analytics on all of the data within your environment to work pretty much at the speed of the business.”

    Certainly not insignificant is the economic advantage of more competition from data analytics tool vendors.

    “There are all kinds of solutions out there that are less money. It has opened the door for a much wider group of companies to leverage the data in their enterprise and to leverage advanced analytics,” Rogers observes.

    “Lastly, the data is creating some fun pressure and opportunities. You have all these new data sources like social and things of that nature. But even more importantly we’re able to incorporate all of our data into our analysis,” Rogers says.

    “I know that when I was in the press and as an analyst I used to write a lot about the 80/20 rule of data in the enterprise – the 20 percent we could use and the 80 percent that was too difficult. Now, with all these new technologies and their cost benefits, we’re not ignoring this data. So we’re able to bring in what used to look like expensive and difficult-to-manage information, and we’re merging it with more traditional analytics.”

    “If you look at more sophisticated users, and economic advantage, and better technology, and new data, everything is changing,” Rogers says. “I think those four pieces are what are enabling advanced analytics to find a more critical home in the enterprise.”

    Finally, the other key trend driving the need for speed when it comes to analytics and business intelligence return on investment is where those investments are coming from. Increasingly they are not from IT, Rogers stresses.

    “I think there has been a big shift and most of the budgets now seem to be coming from the line of business – sales, marketing, finance, customer service. These are places where we’re seeing budgets fly with data-driven innovation,” Rogers says.

    “When you shift away from the technology side of innovation and move toward the business side, there is always that instant demand for action. I think that saturation of big data solutions, the saturation of analytics tools, and a shift from IT to the business stakeholder standpoint is creating the demand for action over just collecting data,” Rogers concludes.

    Source: Information Management

  • 5 Arguments that will convince sales people of the value of analytics

    5 Arguments that will convince sales people of the value of analytics

    Many sales reps have a certain way of doing things. Implementing new processes or adding new tools or technologies that attempt to change their habits can often be met with resistance.

    Sales reps rely on the “tried-and-true” methods learned from predecessors, or they lean on personal knowledge and experience to manage individual customers and plan their approach. Gut feel has been the leading driver of sales strategies for many years, but in today’s fast-paced and competitive environment, sales reps need every advantage they can get.

    A recent McKinsey article suggested, “driving sales growth today requires fundamentally different ways of working, as well as outstanding execution across large, decentralized sales teams and channel partners. While many sales leaders accept this reality in principle, they don’t put sufficient energy or focus into driving that level of change. Advances in digital and analytics, however, mean that sales leaders can now drive and scale meaningful changes that pay off today and tomorrow.”

    So, if you’re a sales rep that doesn’t think you need data analytics, here are five reasons why you do:

    1. There are always more sales opportunities than you think 

    This alone should steer your team toward data analytics. Data can uncover trends in your customers’ buying behavior that can help you identify gaps in their ordering. In addition, your customers’ data can also reveal upsell or cross-sell opportunities that can help you increase your sales volume across a much wider swath of products, without impacting any of your existing sales. While your gut feel may tell you to spend more time with a customer, data can help you understand why, pointing you to new complementary products that can quickly grow your sales.

    2. It is critical to uncover challenges before they impact your bottom line

    There is a good chance one or more of your customers purchase products from other suppliers. What if that same customer started to buy less from you and more from that other supplier that recently entered the market? What if that decline occurred over several months? Would you even know? These are difficult questions to ask and answer, but if you’re like many sales people, you have dozens of customers that you are working with and a slow decline in sales with a single customer may go unnoticed. With data at your fingertips, from your laptop to your mobile device, you can constantly monitor your customers' purchasing habits, and ask questions about negative trends before they start to impact your company’s bottom line and your paycheck.
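    As an illustration of the kind of monitoring described above, here is a minimal Python sketch that flags customers whose orders have fallen for several consecutive months. The customer names, figures, and thresholds are invented; a real system would pull these totals from your sales data.

```python
# Hypothetical monthly order totals per customer, oldest to newest.
orders = {
    "Acme Corp":    [5200, 5100, 5300, 5150],
    "Borealis Ltd": [4800, 4300, 3900, 3400],   # the slow, easy-to-miss slide
}

def declining(monthly_totals, months=3, tolerance=0.02):
    """Flag a customer whose orders fell month over month for `months`
    consecutive periods, ignoring dips smaller than `tolerance` (2%)."""
    recent = monthly_totals[-(months + 1):]
    drops = [
        (prev - cur) / prev > tolerance
        for prev, cur in zip(recent, recent[1:])
    ]
    return len(drops) == months and all(drops)

at_risk = [name for name, totals in orders.items() if declining(totals)]
print(at_risk)
```

    A report like this surfaces the gradual declines a busy rep would otherwise only notice after the customer has already switched suppliers.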

    3. Retaining customers is easier than finding new ones

    This is related to number two, but it deserves its own bullet point. Retention is a simple business reality that makes your business data even more important. Underserved customers are underserved for a variety of reasons. Perhaps they are new and got lost in the shuffle, or turnover at the sales rep position has left them without support for a period of time. Perhaps they have made several large purchases over the last year and deserve better pricing, or they were once loyal customers whose sales have slowly declined, and they are at risk of leaving for a competitor. Engaging these at-risk customers requires that you recognize the signs before they take their business elsewhere.

    4. It will make your life easier 

    Access to data analytics has often been limited to the IT team or specially trained individuals. Data analytics turns raw data into actionable intelligence. No more reading outdated spreadsheets, guessing where your next sale will come from, or wondering what information to share with your customer during your next sales meeting. Business intelligence software is designed to help you quickly mine value from data so you can make the right decision for you and your customers. Rows and columns of data are presented in charts, graphs and tables that you can click to uncover transaction-level details, bringing to the surface the accounts that need your attention the most. Data analytics helps you eliminate the guesswork in your job and focus on which customers you can help the most, while also helping you achieve your sales goals.

    5. It helps you prepare to perform

    Imagine going into a customer meeting with their entire order history at your fingertips, or an understanding of their recent commitment to a certain brand, style or size of product. How will that information shape your next product presentation or sales proposal? You can turn your customers into data advocates by reviewing with them weekly reports about their engagement with you. Could that information help them improve efficiencies, capitalize on sales promotions or recognize holes in their own ordering? As you share and use your data to help them, you show them that you are committed to their success, as well as your own.

    Data analytics is a powerful tool for sales people who are looking to maximize their performance, grow sales and retain customers. The result of implementing analytics is faster revenue growth at the same or improved margins, along with improved customer satisfaction. If you’re not using data to drive your business, there’s no better time than the present to start.

    Source: Phocas Software

  • Applying data science to battle childhood cancer

    Applying data science to battle childhood cancer

    Acute myeloid leukaemia in children has a poor prognosis, and its treatment options have gone unchanged for decades. One collaboration is using data analytics to bring a fresh approach to tackling the disease.

    Acute myeloid leukaemia (AML) kills hundreds of children a year. It's the type of cancer that causes the most deaths in children under two, and in teenagers. It has a poor prognosis, and its treatments can be severely toxic.

    Research initiative Target Paediatric AML (tpAML) was set up to change the way that the disease is diagnosed, monitored and treated, through greater use of personalised medicine. Rather than the current one-size-fits-all approach for many diseases, personalised medicine aims to tailor an individual's treatment by looking at their unique circumstance, needs, health, and genetics.

    AML is caused by many different types of genetic mutation, alone and together. Those differences can affect how the cancer should be treated and its prognosis. To understand better how to find, track and treat the condition, tpAML researchers began building the largest dataset ever compiled around the disease. By sequencing the genomes of over 2,000 people, both alive and deceased, who had the disease, tpAML's researchers hoped to find previously unknown links between certain mutations and how a cancer could be tackled.

    Genomic data is notoriously sizeable, and tpAML's sequencing had generated over a petabyte of it. As well as difficulties thrown up by the sheer bulk of data to be analysed, tpAML's data was also hugely complex: each patient's data had 48,000 linked RNA transcripts to analyse.

    Earlier this year, Joe Depa, a father who had lost a daughter to the disease and was working with tpAML, joined with his coworkers at Accenture to work on a project to build a system that could analyse the imposing dataset.

    Linking up with tpAML's affiliated data scientists and computational working group, Depa along with data-scientist and genomic-expert colleagues hoped to help turn the data into information that researchers and clinicians could use in the fight against paediatric AML, by allowing them to correlate what was happening at a genetic level with outcomes in the disease.

    In order to turn the raw data into something that could generate insights into paediatric AML, Accenture staff created a tool that ingested the raw clinical and genomic data and cleaned it up, so analytics tools could process it more effectively. Using Alteryx and Python, the data was merged into a single file, and any incomplete or duplicate data removed. Python was used to profile the data and develop statistical summaries for the analysis – which could be used to flag genes that could be of interest to researchers, Depa says. The harmonised DataFrame was exported as a flat file for more analysis.
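    The merge, de-duplicate, and profile steps described here can be sketched in plain Python. The field names, values, and records below are entirely hypothetical; the real pipeline used Alteryx and Python on clinical and genomic data far larger than this.

```python
from statistics import mean

# Hypothetical raw inputs: clinical records plus a per-patient expression
# value for one RNA transcript. Field names are illustrative only.
clinical = [
    {"patient_id": "p1", "age": 4, "outcome": "remission"},
    {"patient_id": "p1", "age": 4, "outcome": "remission"},   # duplicate row
    {"patient_id": "p2", "age": 9, "outcome": None},          # incomplete row
    {"patient_id": "p3", "age": 2, "outcome": "relapse"},
]
genomic = {"p1": 12.7, "p2": 3.1, "p3": 18.4}

# 1. Merge clinical and genomic data into one record per patient
#    (keying by patient_id also collapses exact duplicates).
merged = {}
for row in clinical:
    pid = row["patient_id"]
    merged[pid] = {**row, "expression": genomic.get(pid)}

# 2. Remove incomplete records.
clean = [r for r in merged.values() if all(v is not None for v in r.values())]

# 3. Profile: the kind of statistical summary an analyst might use
#    to flag genes of interest.
summary = {
    "patients": len(clean),
    "mean_expression": round(mean(r["expression"] for r in clean), 2),
}
print(summary)
```

    The point of automating these steps is exactly the one Depa makes: researchers spend their time analysing the harmonised output rather than cleaning the raw data by hand.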

    "The whole idea was 'let's reduce the time for data preparation', which is a consistent issue in any area around data, but particularly in the clinical space. There's been a tonne of work already put into play for this, and now we hope we've got it in a position where hopefully the doctors can spend more time analysing the data versus having to clean up the data," says Depa, managing director at Accenture Applied Intelligence.

    Built using R, the code base that was created for the project is open source, allowing researchers and doctors with similar challenges, but working on different conditions, to reuse the group's work for their own research. While users may need a degree of technical expertise to properly manipulate the information at present, the group is working on a UI that should make it as accessible as possible for those who don't have a similar background.

    "We wanted to make sure that at the end of this analysis, any doctor in the world can access this data, leverage this data and perform their analysis on it to hopefully drive to more precision-type medicine," says Depa.

    But clinical researchers and doctors aren't always gifted data scientists, so the group has been working on ways to visualise the information, using Unity. The tools they've created allow researchers to manipulate the data in 3D, and zoom in and out on anomalies in the data to find data points that may be worthy of further exploration. One enterprising researcher has even been able to explore those datasets in virtual reality using an Oculus.

    Historically, paediatric and adult AML were treated as largely the same disease. However, according to Dr Soheil Meshinchi, professor in the Fred Hutchinson Cancer Research Center's clinical research division and lead for tpAML's computational working group, the two groups stem from different causes. In adults, the disease arises from changes to the smallest links in the DNA chain, known as single base pairs, while in children it's driven by alterations to larger chunks of their chromosomes.

    The tpAML has allowed researchers to find previously unknown alterations that cause the disease in children. "We've used the data that tpAML generated to probably make the most robust diagnostic platform that there is. We've identified genetic alterations which was not possible by conventional methods," says Meshinchi.

    Once those mutations are found, the data analysis platform can begin identifying drugs that could potentially target them. Protocols for how to treat paediatric AML have remained largely unchanged for decades, and new, more individualised treatment options are sorely needed.

    "We've tried it for 40 years of treating all AML the same and hoping for the best. That hasn't worked – you really need to take a step back and to treat each subset more appropriately based on the target that's expressed," says Meshinchi.

    The data could help by identifying drugs that have already been developed to treat other conditions but may have a role in fighting paediatric AML, and by giving the pharmaceutical companies that make those drugs hard evidence that starting the expensive and risky process of trialling them for a new use could be worthwhile.

    Using the analytics platform to find drugs that can be repurposed in this way, rather than created from scratch, could cut the time it takes for a new paediatric AML treatment to be approved by years. One drug identified as a result has already been tested in clinical trials.

    The results generated by the team's work have begun to have an impact for paediatric AML patients. When the data was used to show that a subset of children with the disease who carried a particular genetic marker were at particularly high risk, the treatment pathway for those children was altered.

    "This data will not only have an impact ongoing but is already having an impact right now," says Julie Guillot, co-founder of tpAML.

    "One cure for leukaemia or one cure for AML is very much unlikely. But we are searching for tailored treatments for specific groups of kids… when [Meshinchi] and his peers are able to find that Achilles heel for a specific cluster of patients, the results are dramatic. These kids go from a very low percentage of cure to, for example, a group that went to 95%. This approach can actually work."

    Author: Jo Best

    Source: ZDNet

  • Are you aware of the value of your data?

    Are you aware of the value of your data?

    While most executives understand that their data is an asset, many haven’t harnessed the valuable insights accessible with a data analytics solution. The immense amount of data you generate may seem impossible to understand, but data analytics will transform it into clear, actionable information. Another way of looking at data: if you closed your doors tomorrow, what would a new owner be most interested in? The products on your shelves, or the insights into your customers?

    Better understand your customers

    Leveraging your data can help you better understand your customers. For instance, you can create robust customer profiles that include information such as sector, job title, geographical locations, channels they use, and preferences. Identify their purchasing behaviors such as what they are buying, what they aren’t, when, how often, in what quantity, and their lifetime value.

    Understanding your customers enables your sales team to recognize new cross- and up-selling opportunities and recognize your top performing accounts. Knowing your best customers means you can reinforce those relationships by periodically rewarding them with a special promotion for products they like. Another benefit of analytics is the ability to identify when a customer is declining. By analyzing customer buying habits and visit frequency, your team can quickly detect a reduction in order frequency or volume, and make a sales call to find out if there is a problem.
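As a rough sketch of that last idea (the customers, dates, and 50% threshold here are all invented for illustration), detecting a decline can be as simple as comparing a customer's recent order rate with their historical rate:

```python
from datetime import date

# Hypothetical order history: customer name -> list of order dates.
orders = {
    "acme":   [date(2024, m, 1) for m in (1, 2, 3, 4, 5, 6)],
    "globex": [date(2024, 1, 1), date(2024, 1, 15), date(2024, 2, 1),
               date(2024, 2, 15), date(2024, 3, 1)],
}

def is_declining(dates, as_of=date(2024, 6, 30), recent_days=90):
    """Flag a customer whose recent order rate fell well below their norm."""
    dates = sorted(dates)
    span_days = max((as_of - dates[0]).days, 1)
    overall_rate = len(dates) / span_days            # orders per day, all time
    recent = [d for d in dates if (as_of - d).days <= recent_days]
    recent_rate = len(recent) / recent_days          # orders per day, lately
    return recent_rate < 0.5 * overall_rate          # threshold is illustrative

flagged = sorted(c for c, ds in orders.items() if is_declining(ds))
print(flagged)
```

Here "globex" would be flagged for a follow-up sales call, since its orders stopped months ago, while "acme" keeps ordering at its usual pace.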

    Transactional data keeps an eye on product sales

    Transactional data such as time, place, price, discount, and payment method is generated at the point of sale. This data can help you measure the success of your various product lines. Analyzing your transactional data can tell you whether a product is gaining traction with your target customer base, or reveal an unexpected dip in sales.

    While it’s important to determine which products aren’t selling as expected, it’s equally important to identify the products with high conversion rates; a high conversion rate may indicate that the price point is too low, for example. Finally, your transactional data can help you identify trends such as seasonal buying patterns. Knowing when sales increase due to the season helps you manage the trend. If you know that sales for a particular product line typically increase in October, you can prepare by adjusting your stock levels to meet the upcoming rise in demand.
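A minimal sketch of spotting such a seasonal pattern (the transactions below are hypothetical): aggregate sales by calendar month across years and look for the peak:

```python
from collections import defaultdict

# Hypothetical transactions: (ISO date string, sale amount).
sales = [
    ("2023-10-05", 1200), ("2023-10-21", 900), ("2023-11-02", 400),
    ("2024-10-03", 1500), ("2024-10-19", 1100), ("2024-11-07", 450),
]

# Aggregate across years by calendar month to expose seasonality.
monthly = defaultdict(float)
for day, amount in sales:
    monthly[day[5:7]] += amount          # day[5:7] is the "MM" part

peak_month = max(monthly, key=monthly.get)
print(peak_month, monthly[peak_month])
```

In this toy data, October ("10") is the recurring peak, which is exactly the signal you would use to raise stock levels ahead of time.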

    Be more strategic

    Even though many companies have adopted data analytics to guide their decision making, many others still rely on traditional approaches. Without realizing it, they are a step behind their competition. Companies that use a data analytics solution to extract the value from their data, on the other hand, have greater success. A study from the MIT Center for Digital Business found that companies that adopt a data-driven culture have 4% higher productivity rates and 6% higher profits. Data-driven companies rely on hard, verifiable data to back up their decision-making rather than making decisions on intuition and gut feel alone. An analytics solution can show you where to strategically deploy your business resources so you can gain a competitive advantage.

    Manage costs

    A major business resource is your capital. Managing your costs enables you to make the most profitable investments. Data analytics can help you lower costs companywide. For instance, analytics can help you track shipments and optimize deliveries to lower your shipping costs. Your marketing team can use analytics to trim marketing costs by creating targeted marketing campaigns and assessing their effectiveness. Finally, data analytics can help you improve employee performance and operational efficiencies across your various departments.

    To remain competitive in our data-driven economy, your business decisions must be based on credible evidence rather than on subjective experience. Data analytics helps companies achieve their goals by identifying fact-based, actionable insights so executives can develop effective strategies for each area of the business. 

    Source: Phocas Software

  • Augmented analytics: when AI improves data analytics

    Augmented analytics: when AI improves data analytics

    Augmented analytics: the combination of AI and analytics is the latest innovation in data analytics. For organizations, data analysis has evolved from hiring “unicorn” data scientists to having smart applications that, thanks to AI, provide actionable insights for decision-making in just a few clicks. 

    Augmenting by definition means making something greater in strength or value. Augmented analytics, also known as AI-driven analytics, helps in identifying hidden patterns in large data sets and uncovers trends and actionable insights. It leverages technologies such as Analytics, Machine Learning, and Natural Language Generation to automate data management processes and assist with the hard parts of analytics. 

    According to Gartner, by the end of 2024, 75% of enterprises will operationalize AI, driving a 5x increase in streaming data and analytics infrastructures. The capabilities of AI are poised to augment analytics activities and enable companies to internalize data-driven decision-making while enabling everyone in the organization to easily deal with data. This means AI helps in democratizing data across the enterprise and saves data analysts, data scientists, engineers, and other data professionals from spending time on repetitive manual processes.

    How does AI improve analytics?

    The latest advances in Artificial Intelligence play a significant role in making business processes more efficient and powerful with the help of automation. Analytics, too, is becoming more accessible and automated because of AI. Here are a few ways in which AI is contributing to analytics:

    • With the help of machine learning algorithms, AI systems can automatically analyze data and uncover hidden trends, patterns, and insights that can be used by employees to make better-informed decisions. 
    • AI automates report generation and makes data easy-to-understand by using Natural Language Generation.
    • Using Natural Language Query (NLQ), AI enables everyone in the organization to intuitively find answers and extract insights from data, thereby improving data literacy and freeing time for data scientists.
    • AI helps in streamlining BI by automating data analytics and delivering insights and value faster.
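To make the first point above concrete, here is a minimal, hypothetical sketch of automated anomaly detection: flagging values that sit far from the typical level of a series. Real AI-driven tools use far richer models, but the principle is the same:

```python
from statistics import mean, stdev

# Hypothetical daily sales; flag days far from the typical level.
daily_sales = [210, 198, 205, 220, 202, 480, 215]

mu, sigma = mean(daily_sales), stdev(daily_sales)
# A value more than two standard deviations from the mean is "anomalous".
anomalies = [x for x in daily_sales if abs(x - mu) > 2 * sigma]
print(anomalies)
```

The 480 spike stands out automatically; a human then decides whether it is a data error, a promotion, or something worth investigating.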

    So, how does it work?

    While traditional BI used rule-based programs to deliver static analytics reports from data, augmented analytics leverages AI techniques such as Machine Learning and Natural Language Generation to automate data analysis and visualization. 

    • Machine Learning learns from data and identifies trends, patterns, and relationships between data points. It can use past instances and experiences to adapt to changes and improvise on the data. 
    • Natural Language Generation uses language to convert the findings from machine learning data into easy-to-decipher insights. Machine Learning derives all the insights, and NLG converts those insights into a human-readable format.
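As a toy illustration of that two-step pairing (the revenue figures and the sentence template are invented for the example), a trend can be fitted to a series and the finding templated into plain language:

```python
# Hypothetical monthly revenue series (index = month number).
revenue = [100, 104, 109, 113, 118, 122]
months = list(range(len(revenue)))

# "Machine learning" step: a least-squares trend line fitted by hand.
n = len(revenue)
mean_x = sum(months) / n
mean_y = sum(revenue) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, revenue)) \
    / sum((x - mean_x) ** 2 for x in months)

# "Natural language generation" step: template the finding as a sentence.
direction = "rising" if slope > 0 else "falling"
insight = f"Revenue is {direction} by roughly {slope:.1f} units per month."
print(insight)
```

Production NLG systems generate much more varied prose, but the division of labour is the same: the model finds the pattern, the language layer explains it.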

    Augmented analytics can also take in queries from users and generate answers in the form of visuals and text. This entire process of generating insights from data is automated, making it easy for non-technical users to interpret data and identify insights.

    Augmented analytics for enterprises

    Business Intelligence can help in making improved business decisions and driving better ROI by gathering and processing data. A good BI tool collects important data from internal and external sources and provides actionable insights out of it. Augmented analytics simply improves business intelligence and helps enterprises in the following ways:

    1. Accelerates data preparation

    Data analysts usually spend most of their time extracting and cleaning data. Augmented analytics removes much of this painstaking work by automating the ETL (extract, transform, load) process and delivering clean data that is ready for analysis.
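A toy end-to-end ETL run, using only the Python standard library (the CSV content and the table name are invented for the example):

```python
import csv
import io
import sqlite3

# "Extract": a hypothetical raw CSV export (in practice, a file or API dump).
raw = "customer,amount\nAcme, 100\nGlobex, 250\n"

# "Transform": trim whitespace and cast the amount to an integer.
rows = [
    (r["customer"].strip(), int(r["amount"]))
    for r in csv.DictReader(io.StringIO(raw), skipinitialspace=True)
]

# "Load": insert into a local SQLite table ready for analysis.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (customer TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)
```

Augmented analytics platforms automate pipelines like this at scale, across many sources, so analysts can start from queryable data rather than raw exports.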

    2. Automates insight generation

    Once the data is prepared and ready for processing, augmented analytics uses it to automatically derive insights. It uses machine learning algorithms to automate analyses and quickly generate insights that would take days or months if produced manually by data scientists and analysts. 

    3. Allows querying of data

    Augmented analytics makes it easy for users to ask questions of and interact with their data. With the help of NLQ and NLG, it takes in queries in natural language, translates them into machine language, and then produces meaningful results and insights in easy-to-understand language. This makes data analytics a two-way conversation in which businesses can ask questions of their data and get answers in real time.
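A heavily simplified sketch of that question-and-answer loop (real NLQ systems parse language with machine learning rather than keyword matching, and the product names here are made up):

```python
# Hypothetical product totals: the "data" side of the conversation.
totals = {"widgets": 5200, "gadgets": 3100}

def answer(question: str) -> str:
    """Toy NLQ: match a known product name in the question, answer in prose."""
    for product, total in totals.items():
        if product in question.lower():
            return f"Total sales for {product}: {total}."
    return "Sorry, I don't have data on that."

print(answer("What were sales of widgets last year?"))
```

Even this crude version shows the shape of the exchange: a natural-language question goes in, a structured lookup happens, and a human-readable answer comes back.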

    4. Empowers everyone to use analytics products

    The feature of querying data makes it possible for professionals to delve deeper into their data and also enables everyone in the organization to use analytics products. Enterprises no longer require data scientists or professionals with technical expertise to use BI tools to analyze data. This has led to an increase in the user base of BI and analytics tools.

    5. Automates report generation and dissemination

    With augmented analytics, insights can be generated from data at the speed of thought. These insights can further be used to automate report writing, saving a lot of manual effort in report generation. 

    Augmented analytics in action

    Augmented analytics can be used to solve various business problems. Some use cases and applications include demand forecasting, fraud and anomaly detection, deriving customer and market insights, performance tracking, and so on. Here are a few examples:

    • Banking and financial institutions use augmented analytics to generate personalized portfolio analysis reports.
    • Retail and FMCG companies use intelligence powered by augmented analytics to track market insights and make informed decisions.
    • Companies in the financial services sector use recommendations and insights mined by augmented analytics to detect and prevent fraud or anomalies.
    • Media and entertainment companies use insights generated from augmented analytics to provide tailored content to their users.
    • Marketing and sales functions across businesses use augmented analytics to extract data from external and internal sources and gain insights into sales, customer trends, and product performance.

    Wrapping up

    The complexity and scale of data being produced and used by businesses across sectors are more than humans alone can handle. Enterprises have started adopting the new AI wave in analytics to tackle data and improve their processes. Augmented analytics is the disruptor, and leveraging it with BI platforms can help businesses to analyze data faster, optimize their operations and make data teams more productive.

    Author: Neerav Parekh

    Source: Dataconomy

  • BI dashboards: best practices

    BI dashboards: best practices

    If you want your business intelligence dashboards to succeed, you'll need to make sure you follow these best practices along the way. Here's what to know.

    Business intelligence (BI) dashboards are increasingly used by companies around the world. If you use one or intend to, knowing some business intelligence best practices can help you avoid pitfalls.

    Here are 10 business intelligence best practices to follow as you design a dashboard and choose which information to display.

    1. Identify your reporting requirements

    BI dashboards make it easy to gather statistics and turn them into reports. Before diving into that task, clarify what to include in the report and which departments will read it.

    For example, the accounting department likely needs substantially different metrics than your customer service team. Get confirmation of the necessary details and the intended audience first to save yourself from wasting time on extra work and including irrelevant information.

    2. Choose a dashboard to meet your needs

    There are several kinds of BI dashboards on the market:

    • Strategic: These aggregate crucial details about your organization’s current status and health while highlighting opportunities for expansion.
    • Analytical: These show data variables across a set timeframe and help you spot trends.
    • Operational: These focus on key performance indicators and real-time operational changes.
    • Tactical: Mid-level managers most commonly use these dashboards, which give deep dives into a company’s processes by showing weekly metrics and trends.

    Find business intelligence dashboard examples based on the category above that most closely matches your needs before investing in a solution. Doing that increases the chances of feeling satisfied with your investment.

    3. Design your dashboard to minimize distractions

    One of the most useful dashboard design best practices to follow involves getting rid of superfluous information. Make your dashboards useful for everyone by following the five-second rule. Pick the dashboard’s content carefully so that anyone looking at it can get the details they need in a few seconds.

    Scrutinize the information and verify that each graphic or text snippet serves a well-defined purpose. If it doesn’t, take it out. Adding too much data to your dashboard could make it more challenging for people to focus on the parts that matter most to their work.

    4.  Call attention to relevant numbers

    Some viewers may appreciate graphic helpers that highlight statistics. For example, one of the Power BI dashboard best practices Microsoft recommends for its product is to use a card visualization for numerical figures.

    If you use a different BI product without that feature, consider other ways to help numbers stand out. For example, you might put them in a bright color or increase the size of the figure compared to the surrounding text.

    5. Restrict dashboard access to authorized parties

    Working with a BI dashboard also means engaging in the appropriate security measures. Some content management systems allow you to only give administrative capabilities to people with the right credentials. You could take the same approach with your BI interface.

    Think about setting role-based privileges based on whether a person requires editing privileges for their work or only needs to look at the content. Adjust or remove an individual’s access as appropriate, such as when they get promotions or leave the company.
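As an illustrative sketch (the role names and permission sets here are hypothetical, not tied to any particular BI product), role-based privileges can be modelled as a simple mapping from roles to allowed actions:

```python
# Hypothetical role-to-permission mapping for a dashboard.
ROLES = {
    "viewer": {"read"},
    "editor": {"read", "edit"},
    "admin":  {"read", "edit", "manage_users"},
}

def can(role: str, action: str) -> bool:
    """Check whether a role is allowed to perform an action."""
    return action in ROLES.get(role, set())

print(can("viewer", "edit"), can("editor", "edit"))
```

When someone changes jobs or leaves, access is adjusted by reassigning or removing their role rather than editing permissions one by one.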

    Also, encourage everyone to demonstrate good password hygiene, including using a different password for each service and never sharing credentials.

    6. Arrange your data according to the inverted pyramid model

    News professionals understand the inverted pyramid approach well. It involves mentioning the most important information first in an article and devoting the most overall space to it. The less-crucial details appear near the end of the piece and may only encompass a single paragraph.

    You can follow dashboard design best practices by letting the inverted pyramid model dictate how you show the data. For example, feature the main details inside the largest panes or sections.

    7. Select the right kind of chart

    Charts can be ideal for helping executives deal with the challenge of interpreting data and using it for decision-making. You’ll get the best results when you pick a chart type that aligns with your needs and the type of data presented.

    For example, line charts work well for showing trends over time, while pie charts let you show how single categories relate to an overall value. You might also use a vertical bar graph to help users compare differences side by side. The main thing to remember is that no one chart is the universal ideal.

    8. Include the most important information on a single screen

    If you’ve spent time checking out business intelligence dashboards, it may have become obvious that all the crucial details are immediately presented and don’t require swiping between several screens. Allowing people to see the essential material on one screen is the best approach because it increases clarity and helps you stick to your main points.

    Think about how some of the people who see the content may have packed schedules and might feel eager to get the information they need without wasting time. We discussed earlier how you should cut out unnecessary information to prevent distractions. This is a related point, but it’s a tip that encourages you to think about which data to show first while remembering your audience’s requirements.

    9.  Consider optimizing your dashboard for mobile users

    Web designers know how important it is to design content for mobile phones, especially since many people view content on those devices more often than on their computers. One of the related Power BI best practices is to tweak your dashboard for those who view it on smartphones.

    Doing that involves switching the content from Web View to Phone View in the dashboard’s upper-right corner. You’ll only see that option as the dashboard’s owner. While in Phone View, you can adjust the layout so that it appears differently to phone and computer users by rearranging tiles or changing their sizes and shapes.

    If you use a different product, determine whether it has a mobile-friendly option.

    10. Display data in the proper context

    As you design your chart, pay attention to how factors like the relative size and color of content on the BI dashboard could lead people to draw certain conclusions, not all of them necessarily correct. Ensure that you use labels and source citations to help people see the data in the right framework and not get the wrong ideas.

    You’ve probably seen at least a few dashboards that looked fantastic at first glance but later realized they did not offer enough context. In that case, you probably came away with some questions and uncertainties. Including reference points for the statistics and charts on a dashboard helps viewers feel confident while digesting the material.

    Tips to guide your efforts

    These business intelligence best practices will help you get the most out of any dashboard you purchase and use. Remember that it’s also valuable to devote sufficient time to training yourself or your colleagues on how to use the tool. Each BI tool on the market has different features and layouts.

    The more thoroughly you get acquainted with them, the easier it’ll be to get the results you want.

    Author: Kayla Matthews

    Source: Smart Data Collective

  • Big Data on the cloud makes economic sense

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness

    For almost 10 years, only the biggest of technology firms such as Alphabet Inc.’s Google and Amazon.com Inc. used data analytics on a scale that justified the idea of ‘big’ in Big Data. Now more and more firms are warming up to the concept. Photo: Bloomberg

    On 27 September, enterprise software company SAP SE completed the acquisition of Altiscale Inc.—a provider of Big Data as-a-Service (BDaaS). The news came close on the heels of data management and analytics company Cloudera Inc. and data and communication services provider CenturyLink Inc. jointly announcing BDaaS services. Another BDaaS vendor, Qubole Inc., said it would offer a big data service solution for the Oracle Cloud Platform.

    These are cases in point of the growing trend to offer big data analytics using a cloud model. Cloud computing allows enterprises to pay for software modules or services used over a network, typically the Internet, on a monthly or periodical basis. It helps firms save relatively larger upfront costs for licences and infrastructure. Big Data analytics solutions enable companies to analyse multiple data sources, especially large data sets, to take more informed decisions.

    According to research firm International Data Corporation (IDC), the global big data technology and services market is expected to grow at a compound annual growth rate (CAGR) of 23.1% over 2014-2019, and annual spending is estimated to reach $48.6 billion in 2019.

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness.

    MarketsandMarkets, a research firm, estimates the BDaaS segment will grow from $1.8 billion in 2015 to $7 billion in 2020. There are other, even more optimistic estimates: research firm Technavio, for instance, forecasts this segment to grow at a CAGR of 60% from 2016 to 2020.

    Where does this optimism stem from?

    For almost 10 years, it was only the biggest technology firms, such as Alphabet Inc.’s Google and Amazon.com Inc., that used data analytics on a scale that justified the idea of ‘big’ in Big Data. In industry parlance, three key attributes are often used to understand the concept of Big Data. These are volume, velocity and variety of data—collectively called the 3Vs.

    Increasingly, not just Google and its rivals, but a much wider swathe of enterprises are storing, accessing and analysing a mountain of structured and unstructured data. The trend is necessitated by growing connectivity, falling cost of storage, proliferation of smartphones and huge popularity of social media platforms—enabling data-intensive interactions not only among ‘social friends’ but also among employers and employees, manufacturers and suppliers, retailers and consumers—virtually all sorts of connected communities of people.

    A November 2015 IDC report predicts that by 2020, organisations that are able to analyse all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their less analytically oriented peers.

    The nascent nature of BDaaS, however, is causing some confusion in the market. In a 6 September article on Nextplatform.com, Prat Moghe, founder and chief executive of Cazena—a services vendor—wrote that there is confusion regarding the availability of “canned analytics or reports”. According to him, vendors (solution providers) should be carefully evaluated, and aspects such as moving data sets between different cloud and on-premises systems and the ease of configuring the platform need to be kept in mind before making a purchase decision.

    “Some BDaaS providers make it easy to move datasets between different engines; others require building your own integrations. Some BDaaS vendors have their own analytics interfaces; others support industry-standard visualization tools (Tableau, Spotfire, etc.) or programming languages like R and Python. BDaaS vendors have different approaches, which should be carefully evaluated,” he wrote.

    Nevertheless, the teething troubles are likely to be far outweighed by the benefits that BDaaS brings to the table. The key drivers, according to the IDC report cited above, include digital transformation initiatives being undertaken by a lot of enterprises; the merging of real life with digital identity as all forms of personal data becomes available in the cloud; availability of multiple payment and usage options for BDaaS; and the ability of BDaaS to put more analytics power in the hands of business users.

    Another factor that will ensure growth of BDaaS is the scarcity of skills in cloud as well as analytics technologies. Compared to individual enterprises, cloud service providers such as Google, Microsoft Corp., Amazon Web Services and International Business Machines Corp. (IBM) can attract and retain talent more easily and for longer durations.

    Manish Mittal, managing principal and head of global delivery at Axtria, a medium-sized Big Data analytics solutions provider, says the adoption of BDaaS in India is often driven by business users. While the need is felt by both chief information officers and business leaders, he believes that the latter often drive adoption as they feel more empowered in the organisation.

    The potential for BDaaS in India can be gauged from Axtria’s year-on-year business growth of 60% for the past few years—and there are several niche big data analytics vendors currently operating in the country (besides large software companies).

    Mittal says that the growth of BDaaS adoption will depend on how quickly companies tackle the issue of improving data quality.

    Source: livemint.com, October 10, 2016


  • Competencies and capabilities for success with (big) data analytics


    For companies in every industry, big data analytics is of great value. This value arises, among other things, from a sharper focus on the customer and from improved processes. Extracting that value, however, is not easy. Many organisations underestimate the costs, complexity and competencies required to get there.

    Big data analytics

    Big data analytics helps analyse data sets that are generally much larger and more varied than the data types found in traditional business intelligence or data warehouse environments. The goal of big data analytics is to identify hidden patterns, unknown correlations, market trends, customer preferences and other informative business insights.

    Why is success with big data hard to achieve?

    Success with big data is not a given. Many organisations struggle with deploying big data in several respects. The following aspects can be distinguished:

    • Big data analytics is seen as a technology project rather than a transformation that takes place on several fronts within the organisation.
    • The vendor ecosystem is fragmented and changes rapidly.
    • New technologies and architectures demand new skills from users.
  • Data analytics: From studying the past to forecasting the future

    Data analytics: From studying the past to forecasting the future

    To compete in today's competitive market place, it is critical that executives have access to an accurate and holistic view of their business. The key element to sifting through a massive amount of data to gain this level of transparency is a robust analytics solution. As technology is constantly evolving, so too are data analytics solutions. 

    In this blog, three types of data analytics and the emerging role of artificial intelligence (AI) in processing the data are discussed:

    Descriptive analytics

    As the name suggests, descriptive analytics describes what happened in the past. This is accomplished by taking raw historical data, whether from five minutes or five years ago, and presenting an easy-to-understand, accurate view of past patterns or behaviors. By understanding what happened, we can better understand how it might influence the future. Many businesses use descriptive analytics to understand customer buying patterns, sales year-over-year, historical cost-to-serve, supply chain patterns, financials, and much more.
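A tiny example of descriptive analytics in practice (the order values are invented): summarizing raw historical figures into a handful of easy-to-read statistics:

```python
from statistics import mean, median

# Hypothetical order values from last quarter.
order_values = [250, 320, 180, 410, 290, 275, 510]

# Descriptive analytics: summarize what already happened.
summary = {
    "orders": len(order_values),
    "total": sum(order_values),
    "average": round(mean(order_values), 2),
    "median": median(order_values),
}
print(summary)
```

A real solution computes the same kinds of summaries over millions of rows and presents them as dashboards, but the idea is identical: condense the past into figures a reader can absorb at a glance.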

    Predictive analytics

    This is the ability to accurately forecast or predict what could happen moving forward. Understanding the likelihood of future outcomes enables the company to better prepare based on probabilities. This is accomplished by taking the historical data from your various silos such as CRM, ERP, and POS, and combining it into one single version of the truth. This enables users to identify trends in sales, forecast demand on the supply chain, and set purchasing and inventory levels based on a number of variables. 
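As a deliberately naive illustration (the demand figures are invented, and real predictive analytics uses far richer models than this), a first-cut forecast can be a simple moving average of recent history:

```python
# Hypothetical weekly demand; forecast next week as the mean of the last 3.
demand = [120, 135, 128, 140, 150, 146]

window = demand[-3:]                      # the most recent observations
forecast = sum(window) / len(window)      # naive moving-average baseline
print(round(forecast))
```

Baselines like this are useful mostly as a sanity check: a genuine predictive model combining CRM, ERP and POS variables should beat it, and if it doesn't, something is wrong with the model.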

    Prescriptive Analytics

    This solution is the newest evolution in data analytics. It takes the previous iterations to the next level by revealing possible outcomes and prescribing courses of action. In addition, it will also show why a given outcome is likely to happen. Prescriptive analytics answers the question: what should we do? Although this is a relatively new form of analytics, larger retail companies are successfully using it to optimize customer experience, production, and purchasing and inventory in the supply chain, to make sure the right products are delivered at the right time. In the stock market, prescriptive analytics can recommend when to buy or sell to optimize your profit.

    All three categories of analytics work together to provide the guidance and intelligence to optimize business performance.

    Where AI fits in

    As technology continues to advance, AI will become a game-changer by making analytics substantially more powerful. A decade ago, analytics solutions only provided descriptive analytics. As the amount of data generated increased, solutions started to offer predictive analytics. As AI evolves, data analytics solutions are also changing and becoming more sophisticated. BI software vendors are currently positioning themselves to be the first to market with an AI offering that enhances prescriptive analytics. 

    AI can help sales-based organizations by providing specific recommendations that sales representatives can act on immediately. Insight into customer buying patterns will allow prescriptive analytics to suggest products to bundle, which ultimately increases order size while reducing delivery costs and the number of invoices.

    Predictive ordering has enabled companies to send products you need before you order them. For example, some toothbrush or razor companies will send replacement heads in this way. They predict when the heads will begin to fail and order the replacement for you. 

    Improving data analytics for your business

    If you are considering enhancing your data analytics capability and adding artificial intelligence, we encourage you to seek out a software vendor that offers you industry-matched data analytics that is easy and intuitive for everyone to use. This means pre-built dashboards, scorecards, and alerts developed around the standard KPIs for your industry.

    The next step is collaborating with the vendor to customize the software to fit your business and augmenting it with newer predictive analytics and machine learning-based AI.

    Source: Phocas Software

  • Data conscious consumers: What about them and how to target them?

    Data conscious consumers: What about them and how to target them?

    Data is big news these days. Tech behemoths like Google and Facebook make a lot of money from our data, and people are growing increasingly aware of how valuable their data is to companies of all sizes.

    But as data becomes increasingly valuable for companies, consumers are starting to question how much data they are prepared to give away. Many people are becoming uncomfortable with the idea of giving away their personal information.

    It’s easy to understand their concerns. Huge data breaches reach the headlines on a regular basis. It seems like every week, a large and respected brand loses millions of passwords or credit card details.

    As consumers become warier about handing over their data, this poses a challenge for brands. How can you persuade your customers that it’s in their interests to hand over their data? And how can you market to them more effectively as a result?

    Focus on the value exchange

    If a consumer sees little value in handing over their data, they will be far less likely to do so. As such, your focus should be on trading data for something of value.

    This idea has been around for a long time. Every time you sign up for an email list in return for a voucher or free eBook, this is the value exchange at work. Some companies use the concept of gated content whereby the consumer is given access to valuable content on a website in return for their data.

    One of the most common ways that companies use this value exchange is to provide a better experience for the consumer in return for their data. In this case, the consumer may provide an app with permission to access their location, and the app then provides them with directions or specific products based on where they are.

    In short, value exchange needs to be evident in some form. You need to convince your customers that they will enjoy a better experience or receive something of value in return for their data.

    Understand different types of data consciousness

    Consumers are all different, and they have different ideas about how their data should be used. Some consumers are perfectly happy to hand over their data, while others hold the opposite view.

    Brands need to understand the differences between consumers before they can start marketing to them effectively. Consumers can broadly be separated into three groups:

    1. Data unconcerned: These consumers do not care how their data is used, and they are happy to hand over their data readily.

    2. Data pragmatists: These consumers are more guarded about their data, but they are willing to give it away if they can see a clear value exchange.

    3. Data fundamentalists: These consumers are not willing to give away their data under any circumstances.

    According to research from the Data & Marketing Association (DMA), the percentages of the population in each group are roughly as follows:

    • Data unconcerned: 25%
    • Data pragmatists: 50%
    • Data fundamentalists: 25%
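    A sketch of how such a segmentation might be applied in practice; the 0-10 "willingness to share" score and its thresholds are invented purely for illustration:

    ```python
    # Illustrative segmentation: classify consumers into the three DMA-style
    # groups from a hypothetical 0-10 "willingness to share data" survey score.

    def classify(willingness_score):
        if willingness_score >= 8:
            return "data unconcerned"
        if willingness_score >= 3:
            return "data pragmatist"
        return "data fundamentalist"

    survey = {"alice": 9, "bob": 5, "carol": 1}
    segments = {name: classify(score) for name, score in survey.items()}

    # Only the first two groups are realistic marketing targets.
    targets = [n for n, seg in segments.items() if seg != "data fundamentalist"]
    ```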

    Clearly, when it comes to your marketing efforts, you want to be targeting those consumers in the 75% of the population under the ‘Data Unconcerned’ and ‘Data Pragmatists’ groups.

    So how should you do this? There are three key principles to focus on.

    1. Hyper-Personalization

    Personalization has long been an important concept in marketing. But these days, businesses need to go beyond basic information like the consumer’s name and location. The focus should be on hyper-personalization.

    Hyper-personalization uses data like browsing behavior, purchasing behavior and real-time data to change the message you send to your customers.

    The first thing you will need to do is collect the data. You need quality data to personalize effectively, and that means you need to know the types of people who buy specific products, how much they spend, the types of models they are interested in, which brands they like, and more. Look at Spotify’s annual ‘Wrapped’ campaign, where the company sends users an annual roundup of their yearly listening. At first sight it’s a fun, quirky way to see user data in action. But it also shows exactly which data is being collected.

    Context also comes into it. Factors like location, the weather, important events, seasons and real-time pricing can all be used in your messaging.

    You could launch a browser abandonment campaign where you target people who were looking at a product but did not make a purchase, perhaps offering them a discount if they buy it now.
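    A browse-abandonment audience like the one described can be built from raw behavioral events; this sketch uses an invented event format:

    ```python
    # Sketch of a browse-abandonment audience (event names are made up): find
    # users who viewed a product but never purchased it.

    events = [
        {"user": "u1", "type": "view", "product": "p9"},
        {"user": "u1", "type": "purchase", "product": "p9"},
        {"user": "u2", "type": "view", "product": "p9"},
        {"user": "u3", "type": "view", "product": "p4"},
    ]

    viewed = {(e["user"], e["product"]) for e in events if e["type"] == "view"}
    bought = {(e["user"], e["product"]) for e in events if e["type"] == "purchase"}

    # Candidates for a discount offer on the product they abandoned.
    abandoners = sorted(viewed - bought)
    ```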

    In short, the more personalized you can make your messaging, the more effective it will be.

    2. Convenience

    Beyond personalization, businesses also need to focus on the convenience of their messaging. Hyper-personalized communications need to be delivered through the right channels at the right times.

    This means gathering data about the engagement techniques that work best for different types of consumers, and then using these to provide greater convenience for them.

    3. Relevance

    With all this collecting of data, there is a genuine concern that your business will annoy your customers and they will opt out of your communications.

    As a result, it is necessary to be careful about how data is gathered and how consumers are contacted. Data conscious consumers will have strict preferences about how they want to be communicated with. This preference data is essential to avoid alienating them.

    Set up a preference center where customers are asked how they want to communicate and the types of messages they want to receive. This can be done at the sign-up stage or later if preferred, perhaps by sending an email requesting the information.

    Brands must also work to clarify the value of signing up to a service. If you can’t explain why you want their details, you shouldn’t have them. Use the Abercrombie & Fitch app for inspiration. Users who download it are rewarded with points that can add up to substantial discounts, and there’s a clear correlation between performing an action (i.e. registering) and receiving a gift.

    Give consumers the option to opt-out of communications as well. Consumers may decide they don’t want your weekly newsletter, but rather than unsubscribing from all your communications, they may want to keep the promotions. Having the option to choose different types of emails they want to receive can be helpful in this case.
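    One way to model such a preference center (the field names are illustrative) is a per-customer, per-message-type opt-in map, so opting out of one stream leaves the others intact:

    ```python
    # Minimal preference-center model: each contact opts in or out per message
    # type, so unsubscribing from the newsletter does not silence promotions.

    preferences = {
        "customer-42": {"newsletter": True, "promotions": True, "surveys": False},
    }

    def can_send(customer, message_type):
        """Only send a message type the customer has explicitly opted into."""
        return preferences.get(customer, {}).get(message_type, False)

    # Customer opts out of one stream only; promotions are unaffected.
    preferences["customer-42"]["newsletter"] = False
    ```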

    By managing customers’ data preferences effectively, businesses can ensure the right messages get to them more often.

    Gather data and use it wisely

    Marketing is changing all the time. Customers are more data conscious than ever, and this shows no signs of changing. To reach the right people with the right messages at the right time, you need to focus on gathering as much data as possible, without annoying your customers, and understanding your customers’ preferences.

    That way, you can continue to reach them with hyper-personalized marketing messages that generate sales for your company.

    Author: Jason Lark

    Source: Dataversity

  • DataOps and the path from raw to analytics-ready data

    DataOps and the path from raw to analytics-ready data

    For the first time in human history, we have access to the second-by-second creation of vast quantities of information from nearly every activity of human life. It’s a tectonic shift that’s transforming human society. And among the myriad impacts is an important one for every business: the shift in data users’ expectations. In the same way that the advent of smartphones triggered expectations of access and convenience, the explosion in data volume is now creating expectations of availability, speed, and readiness. The scalability of the internet of things (IoT), AI in the data center, and software-embedded machine learning are together generating an ever-growing demand in the enterprise for immediate, trusted, analytics-ready data from every source possible.

    It makes complete sense, since there’s a direct correlation between your business’s ability to deliver analytics-ready data and your potential to grow your business. But as every data manager knows, yesterday’s infrastructure wasn’t built to deliver on today’s demands. Traditional data pipelines using batch and extended cycles are not up to the task. Neither are the legacy processes and lack of coordination that grew out of the siloed way we’ve traditionally set up our organizations, where data scientists and analysts are separate from line-of-business teams.

    As a result, enterprises everywhere are suffering from a data bottleneck. You know there’s tremendous value in raw data, waiting to be tapped. And you understand that in today’s data-driven era, success and growth depend on your ability to leverage it for outcomes. But the integration challenges presented by multi-cloud architecture put you in a difficult position. How can you manage the vast influx of data into a streamlined, trusted, available state, in enough time to act? How can you go from raw to ready for all users, in every business area, to uncover insights when they’re most impactful? And perhaps most importantly, how can you make sure that your competitors don’t figure it all out first?

    The raw-to-ready data supply chain

    There’s good news for everyone struggling with this issue.

    First, the technology is finally here. Today’s data integration solutions have the power to collect and interpret multiple data sets; eliminate information silos; democratize data access; and provide a consistent view of governed, real-time data to every user across the business. At the same time, the industry trend of consolidating data management and analytics functions into streamlined, end-to-end platforms is making it possible for businesses to advance the speed and the accuracy of data delivery. And that, in turn, is advancing the speed and accuracy of insights that can lead to new revenue creation.

    And second, we’re seeing the emergence of DataOps, a powerful new discipline that brings together people, processes, and technologies to optimize data pipelines for meeting today’s considerable demands. Through a combination of agile development methodology, rapid responses to user feedback, and continuous data integration, DataOps makes the data supply chain faster, more efficient, more reliable, and more flexible. As a result, modern data and analytics initiatives become truly scalable, and businesses can take even greater advantage of the data revolution to pull ahead.

    What is DataOps for analytics?

    Like DevOps before it, which ignited a faster-leaner-more-agile revolution in app development, DataOps accelerates the entire ingestion-to-insight analytics value chain. Also like DevOps, DataOps is neither a product nor a platform; it’s a methodology that encompasses the adoption of modern technologies, the processes that bring the data from its raw to ready state, and the teams that work with and use data.

    By using real-time integration technologies like change data capture and streaming data pipelines, DataOps disrupts how data is made available across the enterprise. Instead of relying on the stutter of batch orientation, it moves data in a real-time flow for shorter cycles. Additionally, DataOps introduces new processes for streamlining the interaction among data owners, database administrators, data engineers, and data consumers. In fact, DataOps ignites a collaboration mentality (and a big cultural change) among every role that touches data, ultimately permeating the entire organization.
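    Conceptually, change data capture replaces batch reloads with a flow of row-level change events; this simplified sketch (no real CDC library is used, and the event format is invented) applies such events to a live replica:

    ```python
    # Conceptual change-data-capture sketch: instead of re-reading a whole
    # table in a nightly batch, only row-level change events flow through the
    # pipeline and are applied to a continuously updated replica.

    change_log = [
        {"op": "insert", "id": 1, "row": {"customer": "acme", "total": 100}},
        {"op": "update", "id": 1, "row": {"customer": "acme", "total": 150}},
        {"op": "insert", "id": 2, "row": {"customer": "globex", "total": 75}},
        {"op": "delete", "id": 1, "row": None},
    ]

    replica = {}
    for event in change_log:  # in a real pipeline, a continuous stream
        if event["op"] == "delete":
            replica.pop(event["id"], None)
        else:  # insert and update both upsert the latest row state
            replica[event["id"]] = event["row"]
    ```

    Each consumer sees the current state within one event's latency instead of waiting for the next batch cycle.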

    What does DataOps look like from a data-user perspective?

    In a subsequent post, I’ll delve more granularly into the technical and procedural components of DataOps for Analytics, looking at it from an operational perspective. For this post, where I want to highlight the business impact, I’ll start with a quick overview of what DataOps looks like from a data-user perspective.

    • All data, trusted, in one simplified view: Every data-user in the enterprise has 24/7 access to the data (and combinations of data) they need, in an intuitive and centralized marketplace experience. Analysts of every skill level can load, access, prepare, and analyze data in minutes without ever having to contact IT.
    • Ease of collaboration: It becomes faster and easier for data scientists and business analysts to connect, collaborate, and crowd-source key information. For example, the identification and surfacing of the most popular and reliable data sets becomes possible.
    • Reliability and accuracy: Because the data is governed and continuously updated, with all users drawing from the same data catalogue, trust is high, teams are aligned, and insights are reliable.
    • Automation: Users are freed to ask deeper questions sooner, thanks to the automation of key repeatable requests. And with AI-enabled technologies that suggest the best visualization options for a given data set, chart creation is faster and easier, too. Other AI technologies point users toward potential new insights to explore, prompting them to reach relevant and previously undiscovered insights.
    • Ease of reuse: Data sets do not have to be generated again and again, for every application, but rather can be reused as needs arise and relevance expands – from planning and strategy to forecasting and identifying future opportunities in an existing client base.
    • Increased data literacy: DataOps fosters the easiest kind of data literacy boost by automating, streamlining, and simplifying data delivery. Regardless of existing skill levels, every member of your team will find it much more intuitive to work with data that’s readily available and trusted. At the same time, DataOps buttresses the more active efforts of skills training by delivering reliable data in real time. Getting the right data to the right people at the right time keeps even the most advanced analysts moving forward in new directions.

     What are the business outcomes?

    In every era, speed has given businesses a competitive advantage. In the data-driven era, where consumers expect real-time experiences and where business advantage can be measured in fractions of a second, speed has become more valuable than ever. One of the fundamental advantages of DataOps for Analytics is the speed of quality data delivery. The faster you can get data from raw to ready (ready for analysis, monetization, and productization), the faster you can reap all the benefits data promises to deliver.

    But speed is just the beginning. By delivering governed, reliable, analytics-ready data from a vast array of sources to every user in the enterprise, the raw-to-ready data supply chain becomes an elegant lever for business transformation and growth. Here are four key areas where DataOps galvanizes transformation:

    1. Customer intelligence: With an agile data supply chain, you can much more efficiently use analytics to improve customer experience and drive increased lifetime value. Discover deeper customer insights faster, and use them to customize interactions; increase conversion; and build long-term, one-to-one customer relationships by offering personalized experiences at scale.
    2. Reimagined processes: Accelerating, streamlining, and automating your data pipelines enables teams across your organization to more quickly and effectively optimize every aspect of business for efficiency and productivity. This includes automating processes, reducing costs, optimizing the overall supply chain, freeing up scarce resources, improving field operations, and boosting performance.
    3. Balanced risk and reward: Nimble data-delivery empowers analytics users to get timely insight into internal and external factors to make faster, smarter decisions around risk. Leaders can manage production; keep data current, consistent, and in the right hands; and stay compliant while preparing for the future.
    4. New business opportunities: And finally, a raw-to-ready data supply chain gives you the power to develop new products, services, and revenue streams with insights gleaned from data and/or to monetize the data itself. This may be the most exciting opportunity we’re seeing with DataOps for Analytics today; it’s certainly the most transformative. For example, consider how storied American conglomerate GE has transformed a century-old business model (selling hardware) to create a digital platform for commodifying their data. And think about how tech behemoths like Amazon and Google have used their massive stores of data and agile analytics capabilities to attack and disrupt traditional markets like insurance, banking and retail.

    The heart of digital transformation

    If you’re a CIO or CDO launching, or already underway with, strategic digital transformation programs for competitive viability, data is the key. To thrive, your initiatives need an agile, integrated data and analytics ecosystem that provides a raw-to-ready data supply chain, accelerates time-to-insight, and enables a rapid test-and-learn cycle. That’s DataOps for Analytics, and it’s the dawn of a new era in the evolution of the data-driven organization.

    Author: Mike Capone

    Source: Qlik

  • Dealing with data preparation: best practices - Part 1

    Dealing with data preparation: best practices - Part 1

    IBM is reporting that data quality challenges are a top reason why organizations are reassessing (or ending) artificial intelligence (AI) and business intelligence (BI) projects.

    Arvind Krishna, IBM’s senior vice president of cloud and cognitive software, stated in a recent interview with the Wall Street Journal that 'about 80% of the work with an AI project is collecting and preparing data. Some companies aren’t prepared for the cost and work associated with that going in. And you say: ‘Hey, wait a moment, where’s the AI? I’m not getting the benefit.’ And you kind of bail on it'.

    Many businesses are not prepared for the cost and effort associated with data preparation (DP) when starting AI and BI projects. To compound matters, hundreds of data and record types and billions of records are often involved in a project’s DP effort.

    However, data analytics projects are increasingly imperative to organizational success in the digital economy, hence the need for DP solutions.

    What is AI/BI data preparation?

    Gartner defines data preparation as 'an iterative and agile process for exploring, combining, cleaning, and transforming raw data into curated datasets for data integration, data science, data discovery, and analytics/business intelligence (BI) use cases'. 

    A 2019 International Data Corporation (IDC) study reports that data workers spend a remarkable amount of time each week on data-related activities: 33% on data preparation compared to 32% on analytics (and, sadly, just 13% on data science). The top challenge cited by more than 30% of all data workers in this study was that 'too much time is spent on data preparation'.

    The variety of data sources, the multiplicity of data types, the enormity of data volumes, and the numerous uses for data analytics and business intelligence, all result in multiple data sources and complexity for each project. Consequently, today’s data workers often use numerous tools for DP success.

    Capabilities needed in data preparation tools

    Evidence in the Gartner Research report Market Guide for Data Preparation Tools shows that data preparation time and reporting of information discovered during DP can be reduced by more than half when DP tools are implemented.

    In the same research report, Gartner lists details of vendors and DP tools. The analyst firm predicts that the market for DP solutions will reach $1 billion this year, with nearly a third (30%) of IT organizations employing some type of self-service data preparation tool set.

    Another Gartner Research Circle Survey on data and analytics trends revealed that over half (54%) of respondents want and need to automate their data preparation and cleansing tasks during the next 12 to 24 months.

    To accelerate data understandings and improve trust, data preparation tools should have certain key capabilities, including the ability to:

    • Extract and profile data. Typically, a data prep tool uses a visual environment that enables users to interactively extract, search, sample, and prepare data assets.
    • Create and manage data catalogs and metadata. Tools should be able to create and search metadata as well as track data sources, data transformations, and user activity against each data source. It should also keep track of data source attributes, data lineage, relationships, and APIs. All of this enables access to a metadata catalog for data auditing, analytics/BI, data science, and other operational use cases.
    • Support basic data quality and governance features. Tools must be able to integrate with other tools that support data governance/stewardship and data quality criteria.
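    To make the catalog and lineage capability concrete, here is a toy metadata catalog (the structure is invented for illustration) that can trace any derived dataset back to its raw sources:

    ```python
    # Toy metadata catalog: register each source and derived dataset so
    # lineage can be traced back from any dataset to its raw sources.

    catalog = {}

    def register(name, source_of=None, attributes=()):
        catalog[name] = {"derived_from": source_of, "attributes": list(attributes)}

    register("crm_contacts", attributes=["email", "region"])
    register("sales_orders", attributes=["order_id", "email", "amount"])
    register("revenue_by_region", source_of=["crm_contacts", "sales_orders"],
             attributes=["region", "revenue"])

    def lineage(name):
        """Walk back through derived_from links to the raw sources."""
        parents = catalog[name]["derived_from"] or []
        raw = []
        for p in parents:
            raw.extend(lineage(p) or [p])
        return raw

    sources = lineage("revenue_by_region")
    ```

    Commercial DP tools add search, auditing, and API tracking on top of this basic source-and-lineage bookkeeping.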

    Keep an eye out for part 2 of this article, where we take a deeper dive into best practices for data preparation.

    Author: Wayne Yaddow

    Source: TDWI

  • Dealing with data preparation: best practices - Part 2

    Dealing with data preparation: best practices - Part 2

    If you haven't read yesterday's part 1 of this article, be sure to check it out before reading this article.

    Getting started with data preparation: best practices

    The challenge is getting good at DP. As a recent report by business intelligence pioneer Howard Dresner found, 64% of respondents constantly or frequently perform end-user DP, but only 12% reported they were very effective. Nearly 40% of data professionals spend half of their time prepping data rather than analyzing it.

    Following are a few of the practices that help assure optimal DP for your AI and BI projects. Many more can be found from data preparation service and product suppliers.

    Best practice 1: Decide which data sources are needed to meet AI and BI requirements

    Take these three general steps to data discovery:

    1. Identify the data needed to meet required business tasks.
    2. Identify potential internal and external sources of that data (and include its owners).
    3. Assure that each source will be available according to required frequencies.

    Best practice 2: Identify tools for data analysis and preparation

    It will be necessary to load data sources into DP tools so the data can be analyzed and manipulated. It’s important to get the data into an environment where it can be closely examined and readied for the next steps.

    Best practice 3: Profile data for potential and selected source data

    This is a vital (but often discounted) step in DP. A project must analyze source data before it can be properly prepared for downstream consumption. Beyond simple visual examination, you need to profile data, detect outliers, and find null values (and other unwanted data) among sources.
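    A minimal profiling pass over a single column might count nulls and flag outliers before the source is accepted; the data and the two-standard-deviation threshold here are arbitrary:

    ```python
    # Simple profiling sketch: count nulls and flag outliers in one column
    # before deciding whether the source is worth including in the project.

    column = [120, 115, None, 130, 5000, 125, 118, 122, None]

    values = [v for v in column if v is not None]
    null_count = column.count(None)

    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

    # Flag anything more than 2 standard deviations from the mean.
    outliers = [v for v in values if abs(v - mean) > 2 * std]
    ```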

    The primary purpose of this profiling analysis is to decide which data sources are even worth including in your project. As data warehouse guru Ralph Kimball writes in his book, The Data Warehouse Toolkit , 'Early disqualification of a data source is a responsible step that can earn you respect from the rest of the team'.

    Best practice 4: Cleansing and screening source data

    Based on your knowledge of the end business analytics goal, experiment with different data cleansing strategies that will get the relevant data into a usable format. Start with a small, statistically-valid sample to iteratively experiment with different data prep strategies, refine your record filters, and discuss the results with business stakeholders.

    When discovering what seems to be a good DP approach, take time to rethink the subset of data you really need to meet the business objective. Running your data prep rules on the entire data set will be very time consuming, so think critically with business stakeholders about which entities and attributes you do and don’t need and which records you can safely filter out.
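    The experiment-on-a-sample loop described above can be sketched as follows; the cleansing rules and field names are invented:

    ```python
    # Sketch of iterating on cleansing rules with a small sample first:
    # filter obviously bad records, normalise a field, then check how much
    # of the sample survives before running the rules on the full data set.

    raw = [{"email": "A@X.COM", "amount": 10},
           {"email": "", "amount": 25},
           {"email": "b@y.com", "amount": -5},
           {"email": "c@z.com", "amount": 40}]

    sample = raw[:3]  # in practice, draw a statistically valid random sample

    def clean(record):
        if not record["email"] or record["amount"] <= 0:
            return None  # filter out unusable records
        return {"email": record["email"].lower(), "amount": record["amount"]}

    cleaned = [c for r in sample if (c := clean(r)) is not None]
    survival_rate = len(cleaned) / len(sample)
    ```

    A low survival rate is exactly the kind of result to discuss with business stakeholders before committing to the full run.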

    Final thoughts

    Proper and thorough data preparation, conducted from the start of an AI/BI project, leads to faster, more efficient AI and BI down the line. DP steps and processes outlined here apply to whatever technical setup you are using, and they will get you better results.

    Note that DP is not a 'do once and forget' task. Data is constantly generated from multiple sources that may change over time, and the context of your business decisions will certainly change over time. Partnering with data preparation solution providers is an important consideration for the long-term capability of your DP infrastructure.

    Author: Wayne Yaddow

    Source: TDWI

  • Drawing value from data with BI: Data Discovery

    Drawing value from data with BI: Data Discovery

    'We are drowning in information but starved for knowledge', according to best-selling author John Naisbitt. Today’s businesses have the ability to collect an extraordinary amount of information on everything from customer buying patterns and feedback to supply chain management and marketing efforts. Are you drawing value from your data?

    It is nearly impossible to draw value from the massive amount of data your business collects without a data discovery system in place. So, what is data discovery?

    Data discovery

    Data discovery is a term related to business intelligence technology. It is the process of collecting data from your various databases and silos, and consolidating it into a single source that can be easily and instantly evaluated. Once your raw data is converted, you can follow your train of thought by drilling down into the data with just a few clicks. Once a trend is identified, the software empowers you to unearth the contributing factors.

    For instance, BI enables you to explore the data by region, different employees, product type, and more. In a matter of seconds, you have access to actionable insights to make rapid, fact-based decisions in response to your discoveries. Without BI, discovering a trend is usually a matter of coincidence.
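    The drill-down itself is just successive aggregation; this sketch over made-up sales rows mirrors the region-then-product exploration described above:

    ```python
    # Drill-down sketch: total sales by region, then drill into one region
    # by product - the same click-through exploration a BI tool provides.

    sales = [
        {"region": "north", "product": "widget", "amount": 100},
        {"region": "north", "product": "gadget", "amount": 60},
        {"region": "south", "product": "widget", "amount": 40},
    ]

    def roll_up(rows, key):
        """Sum amounts grouped by the given attribute."""
        totals = {}
        for row in rows:
            totals[row[key]] = totals.get(row[key], 0) + row["amount"]
        return totals

    by_region = roll_up(sales, "region")
    north_by_product = roll_up([r for r in sales if r["region"] == "north"],
                               "product")
    ```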

    With data discovery, the user searches for specific items or patterns in a data set. Visual tools make the process fun, easy-to-use, swift, and intuitive. Visualization of data now goes beyond traditional static reports. BI visualizations have expanded to include geographical maps, pivot-tables, heat maps, and more, giving you the ability to create high-fidelity presentations of your discoveries.

    Discover trends you did not know were there

    With data discovery, executives are often shocked to discover trends they didn’t know were there. Michael Smith of the Johnston Corporation had this to say after implementing BI:

    'Five minutes into the demo, I had found items that didn't have the margin I was expecting, customers that didn't have the profitability I was expecting and vendors that weren't performing the way I expected. I realised that we were onto something that would be very impactful to our business'.

    These discoveries allow companies to discover unfavourable trends before they become a problem and take action to avoid losses.

    Take action now

    Many of the most successful companies today are using BI to inform their strategies and day-to-day operations. With relevant insights, a company can now make the most knowledgeable decisions about effective (digital) strategies to acquire, serve, and retain valuable customers. Having a mountain of data is useless unless it is converted into meaningful information. The ability to discover the truth behind your data will go a long way to guarantee your company achieves and maintains its competitive edge.

    Source: Phocas Software

  • Five Mistakes That Can Kill Analytics Projects

    Launching an effective digital analytics strategy is a must-do to understand your customers. But many organizations are still trying to figure out how to get business value from expensive analytics programs. Here are 5 common analytics mistakes that can kill any predictive analytics effort.

    Why predictive analytics projects fail

    Predictive Analytics is becoming the next big buzzword in the industry. But according to Mike Le, co-founder and chief operating officer at CB/I Digital in New York, implementing an effective digital analytics strategy has proven to be very challenging for many organizations. “First, the knowledge and expertise required to set up and analyze digital analytics programs is complicated,” Le notes. “Second, the investment for the tools and such required expertise could be high. Third, many clients see unclear returns from such analytics programs. Learning to avoid common analytics mistakes will help you save a lot of resources to focus on core metrics and factors that can drive your business ahead.” Here are 5 common mistakes that Le says cause many predictive analytics projects to fail.

    Mistake 1: Starting digital analytics without a goal

    “The first challenge of digital analytics is knowing what metrics to track, and what value to get out of them,” Le says. “As a result, we see too many web businesses that don’t have basic conversion tracking set up, or can’t link the business results with the factors that drive those results. This problem happens because these companies don’t set a specific goal for their analytics. When you do not know what to ask, you cannot know what you'll get. The purpose of analytics is to understand and to optimize. Every analytics program should answer specific business questions and concerns. If your goal is to maximize online sales, naturally you’ll want to track the order volume, cost-per-order, conversion rate and average order value. If you want to optimize your digital product, you’ll want to track how users interact with your product, the usage frequency and the churn rate of people leaving the site. When you know your goal, the path becomes clear.”
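    The four metrics Le names can be computed directly once basic conversion tracking is in place; the campaign numbers below are hypothetical:

    ```python
    # Core e-commerce metrics from hypothetical campaign totals.

    sessions = 5000       # tracked visits
    orders = 150          # completed purchases
    ad_spend = 3000.0     # total campaign cost
    revenue = 12000.0     # total order revenue

    conversion_rate = orders / sessions      # share of visits that convert
    cost_per_order = ad_spend / orders       # acquisition cost per order
    average_order_value = revenue / orders   # revenue per order
    ```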

    Mistake 2: Ignoring core metrics to chase noise

    “When you have advanced analytics tools and strong computational power, it’s tempting to capture every data point possible to ‘get a better understanding’ and ‘make the most of the tool,’” Le explains. “However, following too many metrics may dilute your focus on the core metrics that reveal the pressing needs of the business. I've seen digital campaigns that fail to convert new users, but the managers still set up advanced tracking programs to understand user behaviors in order to serve them better. When you cannot acquire new users, your targeting could be wrong, your messaging could be wrong or there is even no market for your product - those problems are much bigger to solve than trying to understand your user engagement. Therefore, it would be a waste of time and resources to chase fancy data and insights while the fundamental metrics are overlooked. Make sure you always stay focused on the most important business metrics before looking broader.”

    Mistake 3: Choosing overkill analytics tools

    “When selecting analytics tools, many clients tend to believe that more advanced and expensive tools can give deeper insights and solve their problems better,” Le says. “Advanced analytics tools may offer more sophisticated analytic capabilities than fundamental tracking tools. But whether your business needs all those capabilities is a different story. That’s why the decision to select an analytics tool should be based on your analytics goals and business needs, not on how advanced the tools are. There’s no need to invest a lot of money in big analytics tools and a team of experts when the advanced features of free tools like Google Analytics can already give you the answers you need.”

    Mistake 4: Creating beautiful reports with little business value

    “Many times you see reports that simply present a bunch of numbers exported from tools, or state some ‘insights’ that have little relevance to the business goal,” Le notes. “This problem is common in the analytics world because a lot of people create reports for the sake of reporting. They don’t think about why those reports should exist, what questions they answer and how those reports can add value to the business. Any report must be created to answer a business concern. Any metric that does not help answer business questions should be left out. Making sense of data is hard. Asking the right questions early will make that job much easier.”


    Mistake 5: Failing to detect tracking errors

    “Tracking errors can be devastating to businesses, because they produce unreliable data and misleading analysis,” Le cautions. “But many companies do not have the skills to set up tracking properly, and worse, to detect tracking issues when they happen. Many things can go wrong: a developer mistakenly removing the tracking pixels, incorrect values being transferred, the tracking code firing unreliably or multiple times, flawed tracking-rule logic, and so on. The difference can be so subtle that the reports look normal, or are only wrong in certain scenarios. Tracking errors easily go undetected because catching them takes a mix of marketing and tech skills. Marketing teams usually don’t understand how tracking works, and development teams often don’t know what ‘correct’ means. To tackle this problem, you should frequently check your data accuracy and look for unusual signs in reports. Analysts should take the extra step of learning the technical side of tracking, so they can better sense problems and raise smart questions for the technical team when the data looks suspicious.”
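    The advice to “look for unusual signs in reports” can be sketched as a simple anomaly check: flag days where an event count deviates sharply from its recent average, which can indicate a dropped or double-firing tracking pixel. The threshold and data below are assumptions, not a production monitoring rule.

    ```python
    # Flag indices whose daily count deviates more than `tolerance`
    # (as a fraction) from the trailing `window`-day mean.
    def flag_anomalies(daily_counts, window=7, tolerance=0.5):
        flagged = []
        for i in range(window, len(daily_counts)):
            baseline = sum(daily_counts[i - window:i]) / window
            if baseline and abs(daily_counts[i] - baseline) / baseline > tolerance:
                flagged.append(i)
        return flagged

    # Last day's count doubled - perhaps the pixel is firing twice.
    counts = [100, 98, 103, 99, 101, 97, 102, 210]
    print(flag_anomalies(counts))  # [7]
    ```

    In practice such a check would run daily against the raw event feed, so that both the marketing and technical teams see the same warning signal.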

    Author: Mike Le

    Source: Information Management

  • Gaining real value for your company with data analytics

    Gaining real value for your company with data analytics

    Experienced business managers know that reliable data is a requirement for success. Accessing complete and accurate data can help your team determine if your business is achieving its key performance indicators (KPIs).

    Data analysis is one of the most valuable practices for measuring business performance in today's competitive market. If you’re unable to gain a clear understanding of your business through data analysis, chances are you’re working within an outdated and limited data analysis reporting system.

    Regardless of your sector, having access to timely, quality data means the difference between generating static reports and generating true business intelligence (BI) that conveys critical information about your business.

    If you’re looking to get more out of your data and ensure your team is making decisions based on comprehensive reports that tell the whole story, consider taking your reporting and analysis in a new direction by implementing the following practices:

    Maintain a single source of truth

    When it comes to data analysis and producing accurate reports, access to consolidated data is one of the biggest pain points facing businesses today. The next time you touch base with your finance and IT teams, ask them how many data sources they have to manually combine to generate reports. Chances are that’s a job in itself.

    This silo-based data system may have worked well when your business was starting out; however, as a business expands and its needs become more complex, outdated solutions could easily stand in the way of profit.

    For many teams, it’s often only a matter of time before there are multiple versions of one spreadsheet being passed around among colleagues, compromising data integrity. A single, modern platform can ensure your data is processed in a seamless, efficient environment that keeps everyone on the same page.
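    The “single source of truth” idea can be sketched in a few lines: consolidate per-department records, keyed by customer ID, into one unified view instead of passing separate spreadsheets around. The field names and data here are hypothetical.

    ```python
    # Merge records from several departmental sources into one dict per customer ID.
    def consolidate(*sources):
        unified = {}
        for source in sources:
            for record in source:
                unified.setdefault(record["customer_id"], {}).update(record)
        return unified

    finance = [{"customer_id": 1, "balance": 120.0}]
    sales = [{"customer_id": 1, "last_order": "2021-03-01"},
             {"customer_id": 2, "last_order": "2021-02-11"}]

    view = consolidate(finance, sales)
    print(view[1])  # one record combining finance and sales fields
    ```

    A real platform adds validation, audit history, and access control on top, but the core value is the same: everyone queries the same consolidated view.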

    Aim for real-time data 

    Staying competitive means understanding your business and the needs of your customers in real time. When it’s time to run reports, where do you and your team have to go to access all of that data? Is your finance team searching for data, and making corrections along the way, instead of meeting deadlines and producing up-to-date, dynamic reports?

    Fast access to data means having the ability to collect and analyze critical data on demand. Enterprise Resource Planning (ERP) systems are an excellent way to store data, and plenty of businesses may already have a reporting system in place that “just works”.

    Of course, the job gets done, but consider that keeping your data in ERPs may also be preventing access to the full power of data analytics. This is the difference between actionable data that your team can analyze and use to generate business intelligence and static data that doesn’t reflect your business's current state.

    KPIs vs. metrics

    When it comes to KPIs vs. metrics, it's important to know what you’re measuring and what you’re missing. Every industry has specific metrics that business managers must pay close attention to in order to understand whether their business is succeeding. Different reports detail P&L, customer information and sales. A single spreadsheet can contain valuable information about a business.

    However, some business managers may not realize that they are missing the opportunity to perform deeper data analysis beyond preparing financial statements simply because they lack the most modern tools that can show them how their whole business is performing.

    Once you have a more accurate picture of your business, you and your team may decide it’s time to reset your KPIs. New intelligence could mean new goals.

    Redefine collaboration

    Over the past year, countless businesses have had to switch gears, moving into a full telework environment. Automation can help your business overcome the limits of this environment where resources may also be stretched thin.

    Most managers would agree: Scrambling to find missing data at 5 p.m. is not putting your team’s collective experience and skills to good use. Instead, hand that work over to a platform so your team can focus on collaboration and find new synergies between departments. Revisit workflow with your team to gain a better understanding of where the barriers lie.

    Aim for a truly inclusive workflow that encourages all team members to contribute rather than solely relying on a few people who seem to hold the secrets to generating reports only the finance team can fully understand and utilize.

    By capitalizing on the subject matter expertise of your individual team members across your organization, business managers can use data to gain a clear picture of not only your P&L through financial statements but also your company’s potential for growth.

    Source: Phocas Software

  • Graph Analytics keeps growing in popularity and possibilities

    Graph Analytics keeps growing in popularity and possibilities

    Graph technology continues to be the fastest-growing segment of data management. The benefit: the ability to offer deeper, real-time insights on data, enabling better business outcomes. A number of graph solution providers are continuing to innovate by taking their technology to the cloud. Specifically, we’re seeing enterprise-class, pay-as-you-go graph analytics solutions in the cloud based on Amazon Web Services.

    Take TigerGraph for example. This company offers a scalable graph database for the enterprise, and is accelerating its cloud strategy with the availability of its platform as a pay-as-you-go offering on Amazon Web Services (AWS). This move broadens the company’s global reach by providing AWS users with instant access to the world’s fastest and most scalable graph database.

    TigerGraph also achieved advanced partner status in the AWS Partner Network (APN). To obtain this status, TigerGraph’s platform passed AWS’ stringent technical certification process. In addition, TigerGraph had to validate its expertise through a wide range of enterprise references that demonstrated strong customer value. TigerGraph’s customers run applications in AWS for some of the largest brands in the world across financial services, healthcare, and retail.

    With the launch of the pay-as-you-go offering, customers will be able to enjoy a friction-free experience for using TigerGraph’s fast and powerful graph database, without cumbersome acquisition and deployment processes. AWS users can now get an Amazon Machine Image (AMI), which includes the TigerGraph DB and GraphStudio SDK. TigerGraph’s launch in the AWS Marketplace provides an easy-to-use and powerful cloud-based solution with fast deployment and pay-as-you-go pricing.

    One of the key benefits of using the new TigerGraph AMI on AWS is the ease of getting started on the highly performant and scalable platform without having to manage underlying infrastructure. The platform makes it possible for anyone to quickly load data, select a graph algorithm from TigerGraph’s library and explore graph analytics within minutes. With TigerGraph’s speed and performance, users have the ability to go 10 or more levels of connection deep into their data and to compute and reveal insights based on multi-dimensional criteria in real time.
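    The “10 or more levels of connection deep” capability can be pictured with a plain breadth-first search to a fixed depth. This is a generic Python illustration of multi-hop traversal, not TigerGraph’s GSQL query language.

    ```python
    from collections import deque

    # Return all nodes reachable from `start` in at most `max_depth` hops.
    def connections_within(graph, start, max_depth):
        seen = {start}
        queue = deque([(start, 0)])
        reached = set()
        while queue:
            node, depth = queue.popleft()
            if depth == max_depth:
                continue  # do not expand past the depth limit
            for neighbor in graph.get(node, []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    reached.add(neighbor)
                    queue.append((neighbor, depth + 1))
        return reached

    g = {"a": ["b"], "b": ["c"], "c": ["d"]}
    print(sorted(connections_within(g, "a", 2)))  # ['b', 'c']
    ```

    A graph database does this traversal natively and at scale, which is why deep-link queries that are painful in relational SQL become fast and natural.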

    Author: Daniel Gutierrez

    Source: Insidebigdata


  • The CIO's responsibilities will also include customer experience

    The CIO's responsibilities will also include customer experience

    The role of the CIO is changing, as we all know. But increasingly, the technology leader is also taking the lead in optimizing the customer experience.

    IT is becoming ever more complex and the role of the CIO is constantly changing; everyone agrees on that. But what does the CIO of the future look like? The future CIO is the undisputed leader of digital transformation and a driver of innovation and growth, because he or she is able to connect technical expertise with organizational skills. But did you know that the CIO is also the person who is, or should be, responsible for the customer experience?

    CIO: the trusted operator and business cocreator

    Research by Deloitte shows that the role of the CIO is changing in several ways. The study distinguishes between the different roles a CIO can play. The trusted operator is the CIO who concentrates on efficiency, reliability, and cost. He or she delivers supporting technologies and aligns with the business strategy. Another role is the business cocreator, who spends most of his or her time steering the business strategy and enabling change, with effective execution of that strategy as the goal.

    Customer experience falls within the CIO's remit

    At Salesforce, we increasingly help organizations that expect their CIO to transform business processes such as the customer experience. The CIO is of course responsible for building (or commissioning) back-end systems, but he or she will also have to streamline business processes through technology. Looking at the Deloitte study, you can conclude that optimizing the customer experience is increasingly a CIO task, since it links to business processes and therefore to the trusted operator role.

    A good example comes from KONE, where the CIO is responsible for making the production and maintenance of escalators, elevators, and moving walkways ever smarter with the help of IoT and real-time technology. The system can detect malfunctions and automatically issue a work order, so a problem can be fixed before the customer even notices something is wrong. In addition, KONE's field service has access to accurate, up-to-date information and can use these insights to respond better to customer needs. This allows the company to deliver proactive, better service, improving the customer experience and producing more satisfied customers.

    The ultimate customer experience

    The ultimate customer experience is based on deep and broad customer insights; a unified customer view is essential for this. In many organizations, each department collects its own data and forms its own insights about the customer. The problem is that without a customer data platform, this data is fragmented and cannot be fully exploited. When systems and data sources do not work together, one department cannot access the valuable insights of another: insights that could help serve the customer even better. It is up to the CIO to integrate all the different data sources, so that every department has all customer information at any moment and can offer a better customer experience. This enables a salesperson, for example, to identify cross-selling and up-selling opportunities based on each customer's history.

    Once all customer data is available, it can be analyzed and used to make recommendations. Based on previously purchased products, you can, for example, find out whether and what someone is most likely to buy next. This in turn enables marketing to deliver the right message to the right customer at the right moment, so that the customer converts faster. That is the CIO's key role within the customer experience, one in which the CIO exercises and combines both the trusted operator and the business cocreator roles. So, CIO of the future: make sure you are both the trusted operator and the business cocreator, so that you can optimize the customer experience from start to finish.
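    The next-purchase recommendation described above can be sketched as a toy co-purchase count: suggest the item most often bought together with what the customer already owns. The basket data is invented for illustration.

    ```python
    from collections import Counter

    # Suggest the item most often co-purchased with items the customer owns.
    def recommend_next(history, customer_items):
        scores = Counter()
        owned = set(customer_items)
        for basket in history:
            if owned & set(basket):            # basket overlaps with owned items
                for item in basket:
                    if item not in owned:
                        scores[item] += 1      # count each co-purchased item
        return scores.most_common(1)[0][0] if scores else None

    baskets = [["phone", "case"], ["phone", "case", "charger"], ["laptop", "mouse"]]
    print(recommend_next(baskets, ["phone"]))  # 'case'
    ```

    Real customer data platforms use far richer models, but the principle is the same: unified purchase history is what makes the recommendation possible at all.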

    Auteur: Onno Tjeerdsma

    Bron: CIO

  • How automated data analytics can improve performance

    How automated data analytics can improve performance

    Data, data, data. Something very valuable to brands. They need it in order to make informed decisions and in the long term, make their brand grow. That part is probably common knowledge, right? What you are probably wondering is how big brands are choosing and using the right data analytics that will bring results. Find out the answer to that question here.

    Data analytics to learn more about brand performance

    More and more companies are investing in brand. The problem is that they don’t know if their investment is bringing results or not. Of course they can work off their gut feeling or some numbers here and there from Google Analytics or the like, but what does that really tell them about the impact of their brand campaigns? Not much. That’s why big brands are using MRP-based data analytics coming from brand tracking. They are using the precise and reliable data that advanced data science can bring them in order to make sure the decisions they make are indeed based on fact.

    Data analytics for risk management

    Following on from the last point of big brands needing precise data to make informed decisions, they also need such data for risk management. Being able to grow as a brand is not just about knowing who their customers are, their intention to buy their product, etc., it is also about being able to foresee any potential risks and knocking them out of the park before they can cause any damage. Take for instance UOB bank in Singapore, who have devised a risk management system based on big data.

    Data analytics to predict consumer behavior

    As much as big brands need to look into the future, they also need to look to the past. Historical data can do wonders for future growth. Data analytics can be used to pinpoint patterns in consumer behavior. Using the data, they can potentially predict when a certain market may take a nosedive, as well as markets on an upward trend that are worth investing money into right now.
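    One hedged way to “pinpoint patterns” in historical data is to compare a short-term moving average against a long-term one to label a market as trending up or down. The window sizes and sales figures below are illustrative assumptions.

    ```python
    # Label a series 'up', 'down', or 'flat' by comparing two trailing averages.
    def trend(series, short=3, long=6):
        short_avg = sum(series[-short:]) / short
        long_avg = sum(series[-long:]) / long
        if short_avg > long_avg:
            return "up"
        if short_avg < long_avg:
            return "down"
        return "flat"

    monthly_sales = [100, 102, 101, 105, 110, 118]
    print(trend(monthly_sales))  # 'up'
    ```

    A nosedive shows up the same way in reverse: the short-term average sinking below the long-term one is an early warning worth investigating.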

    Data analytics for better marketing

    A combination of data analytics looking at the past, present, and future of a big brand can make for better marketing, and in turn, more profit. By using data analytics to identify consumer needs and purchasing patterns, big brands can target with more personalized marketing, refine the overall consumer experience, and develop better products. Pay attention in your everyday life and you can already see examples of such data being used to market a product at you. A product you Googled once now appearing in your Facebook feed? Retargeting. Emails sounding like they are speaking directly to your needs? That’s because they are, since there are more than a few email marketing approaches. Data analytics was used to figure out exactly what you need.

    There is one important trend occurring across the different ways that big brands are using data analytics to bring results. They all aim to understand consumers, in particular, the brands’ target audience. Whether that be what consumers think of their brand now, how they reacted toward them in the past, and how brands think consumers will act in the future because of detected patterns.

    So, how are big brands using data analytics that will bring results? They are using them in a way that will help them better understand the consumer. 

    Author: Steve Habazin

    Source: Insidebigdata

  • How data analytics is affecting the insurance industry

    How data analytics is affecting the insurance industry

    Data analytics in the insurance industry is transforming the way insurance businesses operate. Here's why that is important.

    Technology has had a profound impact on the insurance industry recently. Insurers are relying heavily on big data as the number of insurance policyholders grows. Big data analytics can help solve many of the data issues that insurance companies face, but the process can be daunting, especially for insurance companies that have not yet adjusted to it.

    Effect of big data analytics on customer loyalty

    One of the reasons why some insurance companies attract more customers than others is that they can provide the things their customers need. The more they deliver what customers expect, the more loyalty customers reciprocate.

    Instead of aggregating one policy from their insurer at a time, customers may get all of their insurance policies in a single, central dashboard. Even if people solicit an anonymous car insurance quote from a different company that is lower than others, they will still stick with a company they are fiercely loyal to. Insurers, in turn, need to consider other factors, such as whether they have been unfairly prejudicing customers based on characteristics like gender or race. Big data may be able to help address this.

    Big data analytics can be very useful in acquiring all of the necessary data in a short amount of time. This means that insurance companies will know what their customers want and will offer these wants immediately. Insurance companies will also have the ability to provide personalized plans depending on their customer’s needs.

    Big data analytics in fraud cases

    One of the biggest issues that insurance companies face nowadays is fraud. According to industry findings, 1 out of 10 claims is fraudulently filed. This is an alarming rate, especially given the number of policyholders an insurance company may have. Some consumers file fraudulent claims sloppily, which makes it easier for the company to seek restitution and prosecute the offenders before they can drive premiums up for other drivers. Others are done meticulously, and people get away with them.

    With big data analytics, a large amount of data can be checked in a short amount of time. This includes a variety of big data solutions, such as social network analysis and telematics. It is the biggest weapon insurers have against insurance fraud.


    Subrogation cases

    A large amount of data is needed and received for subrogation cases. The data can come from police records, medical records, and even notes about the cases. Through big data analytics, it is possible to extract the phrases that show that the cases under investigation are subrogation cases.

    Settlement cases

    There are a lot of customers who complain that lawsuit settlements often take a long time, because a lot of analysis needs to be done. Big data analytics can help settle claims almost instantly. It also becomes possible to check and analyze the history of each customer's claims. This can reduce labor costs, as employees do not have to put all of their time into checking and finalizing every piece of data regarding a claim. It can also get payouts to customers faster, which means that customer satisfaction will greatly increase.

    Checking more complex cases

    Some people have acquired an anonymous car insurance quote and taken out insurance purely in order to file claims and extract money from the insurance company. Some cases are obvious frauds, and the authentic ones can be immediately analyzed with big data analytics. Yet some cases are so complex that it takes a lot of checking to see whether the data received coincides with what the customer claims. Big data analytics uses data mining techniques that allow the various claims to be categorized and scored according to their importance, and in some cases even settled accordingly.
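    The categorize-and-score idea can be sketched as a toy triage rule: assign each claim a risk score from a few weighted flags, then bucket high-scoring claims for manual review. The flags, weights, and threshold are invented for illustration, not an actual insurer's model.

    ```python
    # Score a claim from a few hypothetical risk flags.
    def score_claim(claim):
        score = 0
        if claim.get("filed_within_days_of_policy", 999) < 30:
            score += 3   # claim filed very soon after buying the policy
        if claim.get("amount", 0) > 10000:
            score += 2   # unusually large payout requested
        if claim.get("prior_claims", 0) > 2:
            score += 2   # frequent claimant
        return score

    # Split claims into those needing manual review and routine ones.
    def triage(claims, threshold=4):
        review = [c for c in claims if score_claim(c) >= threshold]
        routine = [c for c in claims if score_claim(c) < threshold]
        return review, routine

    claims = [
        {"id": 1, "filed_within_days_of_policy": 10, "amount": 15000, "prior_claims": 0},
        {"id": 2, "filed_within_days_of_policy": 400, "amount": 800, "prior_claims": 1},
    ]
    review, routine = triage(claims)
    print([c["id"] for c in review])  # [1]
    ```

    Production systems learn such weights from historical fraud outcomes rather than hand-coding them, but the triage structure is the same.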

    Some common issues in using big data analytics

    It is always important for insurance companies to consider both the good and the bad details about using analytics. Some of the good things have been tackled above. These are just some concerns that you need to be familiar with:

    • You still need multiple tools to process the data, which can be problematic, as data may get lost along the way.
    • Hiring too many data analysts when a few would be enough.
    • Failing to unify the gathered information.

    Take note of these issues so that they can be avoided.

    With all of the things that big data analytics can do, it is not surprising that a lot of insurance companies will need to start using it soon. It can be integrated little by little so that it is not too overwhelming for everyone involved. The sooner this can be done, the better - not only for the customers but for the insurance company as a whole.

    Big data will address countless insurance industry challenges

    The insurance industry is more dependent on big data than many other sectors. Their entire business model is built around actuarial analyses. As a result, they will need to rely on big data to solve many of the challenges that have plagued them for years. Big data will also help them fight fraud and process lawsuit settlements more quickly.

    Author: Diana Hope

     Source: Smart Data Collective

  • How the data-based gig economy affects all markets

    How the data-based gig economy affects all markets

    Data is infinite. Any organization that wants to grow at a meaningful pace would be wise to learn how to leverage the vast amount of data available to drive growth. Just ask the top five companies in the world today: Apple, Amazon, Google, Facebook, and Microsoft. All these technology giants either process or produce data.

    Companies like these with massive stockpiles of data often find themselves surrounded by other businesses that use that data to operate. Salesforce is a great example: each year at its Dreamforce conference in San Francisco, hundreds of thousands of attendees and millions of viewers worldwide prove just how many jobs the platform has created.

    Other companies are using vast amounts of information from associated companies to enhance their own data or to provide solutions for their clients to do so. When Microsoft acquired LinkedIn, for instance, it acquired 500 million user profiles and all of the data that each profile has generated on the platform. All ripe for analysis.

    With so much growth evolving from a seemingly infinite ocean of data, tomorrow’s leading companies will be those that understand how to capture, connect, and turn information into actionable insight. Unless they’re already among the 10 largest organizations, the problem most companies face is a shortage of highly skilled talent that can do it for them. Enter the data scientist.

    More data, more analysts

    The sheer amount of data at our fingertips isn’t the only thing that’s growing. According to an Evans Data report, more than 6 million developers across the world are officially involved in analyzing big data. Even traditionally brick-and-mortar retail giant Walmart plans to hire 2,000 tech experts, including data scientists, for that specific purpose.

    Companies old and new learned long ago that data analysis is vital to understanding customers’ behavior. Sophisticated data analytics can reveal when customers are likely to buy certain products and what marketing methods would be effective in certain subgroups of their customer base.

    Outside of traditional corporations, companies in the gig economy are relying even more on data to utilize their resources and workforce more efficiently. For example, Uber deploys real-time user data to determine how many drivers are on the road at any given time, where more drivers are needed, and when to enact a surge charge to attract more drivers.
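    The supply-and-demand logic described for Uber can be sketched as a simple price multiplier that rises when ride requests outstrip available drivers. The formula and cap below are assumptions for illustration, not Uber's actual algorithm.

    ```python
    # Return a price multiplier >= 1.0, capped to avoid runaway pricing.
    def surge_multiplier(requests, drivers, cap=3.0):
        if drivers == 0:
            return cap                      # no supply at all: maximum surge
        ratio = requests / drivers          # demand-to-supply ratio
        return min(max(1.0, ratio), cap)    # never below 1.0, never above cap

    print(surge_multiplier(requests=150, drivers=100))  # 1.5
    print(surge_multiplier(requests=50, drivers=100))   # 1.0
    ```

    The interesting part is the feedback loop: a higher multiplier attracts more drivers, which lowers the ratio, which lowers the multiplier again - all driven by real-time data.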

    Data scientists are in demand and being hired by the thousands. Some of the most skilled data scientists are going the freelance route because their expertise allows them to choose more flexible work styles. But how can data scientists who aren’t interested in becoming full-time, in-house hires ensure that the companies for which they freelance are ready for their help?

    The data-based gig economy

    Gartner reports that the number of freelance data scientists will grow five times faster than that of traditionally employed ones by next year. The data-based gig economy can offer access to top talent on flexible schedules. But before data scientists sign on for a project, they should check to see that companies are prepared in the following areas:

    • Companies need to understand their data before they decide what to do with it. That data could include inventory, peak store hours, customer data, or other health metrics.
    • Next, businesses should have streamlined the way they collect and store their data to make it easy to analyze. Use of a CRM platform is a good indicator of preparedness at this stage.
    • Finally, companies need to be able to act on the insights they glean. After freelancers are able to use organizations’ collected and organized data to find valuable connections and actionable insights, those organizations should have a process for implementing the discoveries.

    Today’s organizations need data in order to be successful, and they need data scientists to make use of that data. In order for both parties to thrive in this era, companies need to have the right strategies in place before they invest in freelance talent. When they do, freelance data scientists will have the opportunity to gather critical knowledge from the data and use their talents to drive innovation and success.

    Author: Marcus Sawyerr

    Source: Insidebigdata

  • How to use data science to get the most useful insights out of your data

    How to use data science to get the most useful insights out of your data

    Big data has been touted as the answer to many of the questions and problems businesses have encountered for years. Granular touch-points should simplify making predictions, solving problems, and anticipating the big picture down the road. The theory behind data science rests on the law of large numbers; similar to quantum physics, when we try to analyze data lakes to draw a conclusion, the result can only be a probability. Data cannot simply be read; it’s like a code that needs to be cracked.

    There’s an incredible amount of insight that can be gleaned from this type of information, including using consumer data to better inform strategies and bottom lines. But the number of businesses that actually take actionable steps based on their data is minimal. So how can companies ensure they’re effectively managing the data they collect in order to improve business practices?

    Identify what you’re looking to learn

    Too many companies invest heavily in software and people in a quest for big data and analytics without truly defining the problems that they’re looking to solve. Business leaders expect to instantly throw a wide net over all datasets, but they won’t necessarily get something useful in return.

    Take, for example, a doctor that spent over a year and a half implementing a new system that was supposed to give his colleagues meaningful medical insights.

    After collecting the data without truly defining the problem they wanted to solve, they ended up with the following insight: “Those who have had cancer have had a cancer test.” This, obviously, is a true statement culled from the data. The problem is it’s useless information.

    The theory behind data science was never meant for small data sets, and scaling to do so comes with a host of issues and irregularities; however, more data doesn’t necessarily mean better insights. Knowing what questions to ask is as important for a company as having the best tools for thorough data analysis.

    Prepare your data to be functional

    They say practice makes perfect, but with data science, practice makes permanent if you’re doing it the wrong way.

    The systems that companies use to keep track of data don’t have a lot of validation. Once you start diving into big data for insights, you realize there’s a whole layer of “sanitization” and transformation that needs to happen before you can start running reports and gleaning useful information.

    We’ve seen major companies doing data migration, but with an accuracy rate of 53%. Imagine if you went to the doctor mentioned in the previous section and he admitted his recommendations were only 53% correct. We can make a big bet you’re not going to that doctor anymore.

    To get quality data, you have to understand what quality data looks like. The human element and the machine have to work together; there needs to be an actionable balance. Data sources are constantly in flux, grabbing new inputs from the outside world, so ensuring a useful level of quality on incoming data is critical, or you’ll get questionable results.
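    The “sanitization” layer described above can be pictured as a minimal validation gate: check each incoming record against a few rules before it reaches reporting. The field names and rules are hypothetical.

    ```python
    # Return a list of problems found in one record (empty list = clean).
    def validate(record):
        problems = []
        if not record.get("customer_id"):
            problems.append("missing customer_id")
        amount = record.get("amount")
        if not isinstance(amount, (int, float)) or amount < 0:
            problems.append("invalid amount")
        return problems

    # Split records into clean rows and rejected rows with reasons.
    def sanitize(records):
        clean, rejected = [], []
        for r in records:
            issues = validate(r)
            (clean if not issues else rejected).append((r, issues))
        return [r for r, _ in clean], rejected

    rows = [{"customer_id": 7, "amount": 19.5}, {"customer_id": None, "amount": -3}]
    clean, rejected = sanitize(rows)
    print(len(clean), len(rejected))  # 1 1
    ```

    Keeping the rejected rows with their reasons, rather than silently dropping them, is what lets humans and the machine share the balancing act the paragraph describes.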

    Depend on a reliable tech solution

    Once you have a clear path of checks and balances to ensure you’re on the right track, establishing a minimum viable product — potentially with a more efficient outsourced team — is what will truly drive actionable results. It makes sure the assumptions and projections derived from the insights are continually up to date, and looks from different angles to anticipate major trend changes.

    It’s important to see the big picture, but also be able to change a model’s behavior if it’s not delivering the most valuable insights. Whatever solution you settle on might not necessarily be the most sophisticated, but as long as it’s providing the answers to the right questions, it will be more impactful than something complex and obscure.

    When companies employ tools to untangle their stores of data without having a deep understanding of the limitations of data science, they risk making decisions based on faulty predictions, resulting in detriment to their organization. That means higher costs, incorrect success metrics and errors across marketing initiatives.

    Data science is still evolving very quickly. Although we will never get to the point that we can predict everything accurately, we will get a better understanding of problems to provide even more useful insights from data.

    Author: Luming Wang

    Source: Insidebigdata

  • Inergy and Frontin join forces

    Inergy and Frontin join forces

    Inergy, a software company active in performance management and data analytics, and Frontin, a supplier of budgeting software for municipalities, joined forces on 5 June 2020. The two companies will continue together under the name Inergy. The product names PAUW, BUIG and INFO will remain on the market.

    Mathijs van Houweninge, CEO of Inergy: “The combination of Frontin with Inergy is a big step forward in realizing our ambitions in the government market. The combination with Frontin fits well within our portfolio and our strategy for (semi-)government, which is aimed, among other things, at supporting municipalities in working in a more data-driven way and having better management information in key domains such as business operations, the social domain and public space.

    With the LIAS product suite, Inergy is the market leader in planning and control for municipalities. PAUW and BUIG complement it well and are used by virtually all Dutch municipalities. I am delighted that Dirk Jans, the figurehead of Frontin, will remain involved with Inergy after the transaction. After all, he is nationally recognized as the leading expert on the Gemeentefonds.”

    Dirk Jans: “I am proud of our products and pleased with this step. This partnership broadens the services we provide to our customers. I can’t wait to explain the added value throughout the country.”


    Frontin has three online products. In 2001 it launched the Product Algemene Uitkering op het Web, or PAUW for short. This internet application supports municipalities in determining their contributions from the Gemeentefonds, the roughly 31-billion-euro municipal fund from which the 355 Dutch municipalities receive money from the national government.

    Municipalities receive a bundled payment (BUIG) from the national government to fund benefit payments and wage cost subsidies. Frontin BUIG is the product with which municipalities gain insight into the composition of their BUIG budgets and their financial outlook for the coming years.

    INFO is a knowledge platform for municipalities in the field of 'financial relations between governments'. Market experts comment on the relevant news, and the information is then made accessible in a user-friendly way via an app, a newsletter and a website.


    The LIAS product suite offers broad planning and control functionality for local governments, provinces, water boards and the education sector. With LIAS, Inergy is the market leader among Dutch municipalities. LIAS helps municipalities set goals, analyze results and make reports available online to a large group of stakeholders, from policy officers to council members and citizens.

    Inergy also offers full-service, non-stop data and analytics based on the most effective technologies available today. Inergy provides all relevant services for building and maintaining data analytics solutions and strives for long-lasting, successful customer relationships based on a partnership vision and co-creation. Its motto: perform better by handling data smartly.

    Watch the announcement by Mathijs van Houweninge and Dirk Jans here.


    Source: Inergy

  • Key components of developing the right data strategy

    Key components of developing the right data strategy

    What does your company do? 

    That was a trick question. It doesn’t matter what you think your company does, it’s going to have to turn into a data company soon, if it hasn’t started already, in addition to continuing to provide your core product or service. This may sound daunting, but it’s a good challenge to have and one that will ultimately improve your offering, delight your customers, increase stickiness and adoption, and keep you competitive in a changing data landscape. 

    In this article you will read a brief overview of a data strategy's key components: what a data strategy has to encompass, vital considerations when dealing with data, and who the main players are when it comes to executing your data strategy.

    Data strategies for the uninitiated

    First off, 'So, what even is a data strategy anyway?' Everyone knows that data is important for organizations to make money, but just having a bunch of data is useless without a data strategy. A data strategy deals with all aspects of your data: where it comes from, where it’s stored, how you interact with it, who gets to see what, and who is ultimately in charge of it. This sounds like a tall order and you may be thinking 'Oh man! Is that my job?' Depending on your company’s level of data maturity, it might not be any one person or department’s job (yet). But you do need to start coming up with answers to all of these tough questions.

    “Everybody is going to assume that somebody else is taking care of the data, and the result is, nobody does”. - Charles Holive, Managing Director for Sisense’s Strategy Consulting Business.

    That’s a bad situation, and you definitely need to know who’s in charge of what data. However, one of the first questions you need to answer as you build your strategy is 'So, what do we want to do with all this data? Why? And how will this make us money/delight our customers?' Those answers ultimately have to come from the business unit that has the idea for making money/delighting customers in the first place: 'Internal data is owned by the function that creates it. It all sits within IT, but sales should own sales data, marketing should own the marketing data…' 

    These departments should also own the efforts to use that data to create new revenue, engagement, etc. A common misconception about data strategies is that they should be all-encompassing, top-down initiatives handed down by an all-seeing, all-knowing Chief Data Officer (more on this later). In reality, you can, and should, build your strategy piece by piece, with the process driven by the areas that have the data in the first place. Whatever the initiative is (surfacing user data to inform them about their buying habits, etc.), the department with the data and the idea for using it should drive it. This increases ownership within the department and prevents the 'whose job is this?' question.

    Diversifying your data

    Once you’ve got your initiative in mind, it’s important to think about what data you need for it. The two main kinds of data your company has will be the data you generate and own and the data your customers generate, which you are only the custodians of (they own it). Whatever you plan on doing with data, this is the time to make sure that you are legally within your rights (consult your company’s legal department, counsel, etc.) and make sure that your user agreement contracts are properly worded to allow you to do what you want with the data you have. 

    There’s a third type of data your company can and should be thinking about for your data projects: third-party data, which can be used to add context to your datasets. More and more companies want to augment the context of their data. In healthcare, for instance, a hospital only has access to about 2% of the data on its patients, which is created while they are physically in the hospital. It is missing the other 98% of the data that is generated everywhere else: eating habits, buying habits, and other information that could be useful in providing better care.

    As the outlook on data shifts from a company-centric to an ecosystem-spanning view, more and more companies will buy, sell, trade, and partner with other companies for access to the data they want and need to augment their datasets, deliver more value, and maintain a dominant position in their industries.

    Key players for implementing your data strategy

    Now that you know where the data strategy starts, who’s responsible for implementing it at the department level, and how to safely and responsibly use the data you’ve got, it’s time to talk about the key players within your organization who will help keep everything running smoothly. These are the business unit stakeholders, data professionals pulling the data together, and maybe the Chief Data Officer if your organization has one. The first one, we already covered: whoever came up with the idea for how to use your data (and whatever data you can get access to) should own the execution of that plan.

    They’ll need support from your company’s data experts: the IT department and data engineers (if you have them). These folks will walk the team executing the plan through the specifics of where the data is and how to access it. Additionally, they’ll make sure that the company has the analytics platform needed to pull it all together and present meaningful insights to your users. They may even be instrumental, along with product team members, in helping create embedded analytics that will live right inside your product or service.

    Lastly, we should discuss the Chief Data Officer (CDO). As previously discussed, this person is not the be-all-end-all of your data strategy. Many businesses, right now, may not even have a CDO, but when you do get one, they will wear a lot of hats within the organization. Their first job will be to look at all the data your company has and how it’s all being used and make sure that the processes in place make sense and are working. They will also check in with legal and make sure that data is being used in a way that’s compliant and that all user agreements are properly worded to protect users and the company. The CDO will also look for ways to augment your data holdings (through buying, partnering, etc.) to keep expanding the ways your company can use data to increase revenue. 

    Data strategies and culture

    A final, vital aspect of the CDO’s role is a cultural one: they have to assess the organization and make sure that everyone using data has a mindset that prioritizes the security of the data, but also the opportunity that it represents for the company. Every company is becoming a data company and the financial incentives are too huge to ignore: 'The market for monetizing data and insights is getting so big. Depending on what you read, it’s between 20 and 36 billion dollars over the next three or four years'.

    Business teams need to understand this and be serious about getting the most out of their data. Dragging your feet or being half-hearted about it will not do: 'If someone says ‘the way I’ve made money before is the way I will make money tomorrow,’ I say ‘well, I’m not going to invest in your company.’ I know five years from now, someone’s going to get to your data and create much more value than you do with your transactions'. 

    Encouraging a culture of experimentation is key to finding new ways to use data to drive revenue and keep your company competitive. Charles suggested finding ways to make building new apps and projects with data as easy as possible, so that people across the company can build quickly and fail quickly, to find their way to solutions that will ultimately pay off for users and the company. 

    What will your company do?

    By now your head is probably spinning with all the potential challenges and opportunities of your data strategy (whether you had one when you started reading this article or not). If your team isn’t doing stuff with data right now, start asking the hard questions as to why that is and how you can change it. If your company doesn’t have the tools to build the analytics functionality you need, figure out how to get them. Whatever you have in your imagination, start building it. If you don’t, someone else will. 

    Author: Jack Cieslak

    Source: Sisense

  • Self-service reporting as part of a company-wide data analytics solution

    Self-service reporting as part of a company-wide data analytics solution

    In today’s fast-paced business environment, it can be difficult to predict and prepare for the future. This is why many companies are arming themselves with on-demand reporting. Self-service reporting allows users to produce reports and visualizations on-the-go.

    Whether you want to stay ahead of your competition, increase profits or improve performance, a quality data analytics solution is a must. The following three indicators strongly suggest that you are ready to implement a data analytics solution that provides self-service reporting:

    1. Reports lead to more reports

    Traditional reporting is often frustrating and time-consuming. Waiting multiple days for IT to generate a report frequently results in outdated information, further delaying decision-making. In addition, IT reports are static. They may answer some high-level summary questions but lack the ability to answer additional questions on a granular level. When more information is needed, you find it necessary to go back to IT and request additional reports.

    Self-service data analytics enables anyone, even non-technical users, to access, query, and generate reports on demand, such as during business meetings. The nature of dynamic reporting means that if more information is needed, users can quickly drill down for more detail.

    2. Desire for visual charts

    Would visualizations help you present complex data with better clarity and efficiency? A graphical depiction of numbers presents your data in a way that people can easily digest, retain, and recall. Like a well-told story, a visualization allows you to set the scene and explain what happened, why it matters and how users can turn it into something actionable.

    With clear visualizations, it is easier to guide stakeholders from a broad overview down to the granular details. Graphic representations of data make it easy to see existing patterns and project future trends, which can help drive decision-making. Depending on your needs, visualizations might be simple, such as bar charts, pie charts, and maps, or more complex models such as waterfalls, funnels, gauges, and many other components. Whatever the case, users should be able to build a customized dashboard to fit any scenario.

    3. People in the company are already doing it

    You may know some colleagues who are already using analytics at work. Many department heads and top employees understand that the IT department is stretched, yet they have important projects to deliver. As such, they may have already adopted an easy-to-use analytics solution for their personal use. Ask around or take notice at your next business meeting and you are likely to find resourceful employees who are already using self-service analytics to quickly make informed decisions.

    A study by the Harvard Business Review revealed that 'high-performing, customer-oriented workforces' have a high prevalence of employees using the 'bring your own technology' idea. For instance, American big-box retailer Walmart realized that employees were using their phones to help them at work, and embraced the trend by creating an employee app that helps staff locate items for customers within the store. So implementing a company-wide data analytics solution may not be difficult at all; perhaps you already have many users and advocates.

    Source: Phocas Software

  • Software picks the best candidate

    Interviewing job applicants is a waste of time. Anyone with enough historical data and the right computational models can distill from a stack of CVs exactly who is best suited for a given vacancy. Better still: with enough data, a recruitment specialist can predict how good someone will be at a job without ever having met them.

    A sophisticated computational model

    For most companies the above is a distant-future scenario, but the technology already exists, researcher Colin Lee argues in his dissertation. He received his PhD this month from the Rotterdam School of Management (Erasmus University) for research in which he used a sophisticated computational model to analyze patterns in more than 440,000 existing CVs and job applications. The model turns out to predict with 70% accuracy who will ultimately be invited for an interview, based on factors such as work experience, education level and skills.


    ‘Important predictors are the relevance of the work experience and the number of years of service. You can combine those in a formula and so determine the best match,’ says Lee. Although work experience is decisive, recruiters are otherwise not very consistent in what they let tip the balance, he concludes from the patterns. ‘We can recognize a common thread, but a lot seems to happen on the basis of intuition.’
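    Lee's actual model is not published here, but the general technique he describes (combining CV features into a formula that scores the best match) can be sketched as a logistic model. Everything below is invented for illustration: the synthetic "CVs" (experience relevance and seniority, both scaled to 0..1) and the assumed ground truth.

```python
import math
import random

# Illustrative sketch only: fit a logistic model that scores how likely a CV
# is to lead to an interview invitation. All data here is synthetic.
random.seed(0)

def synthetic_cv():
    relevance = random.random()   # relevance of work experience, 0..1
    seniority = random.random()   # years of service, scaled to 0..1
    # Assumed ground truth: relevance weighs heaviest, echoing Lee's findings
    p_invite = 1 / (1 + math.exp(-(4 * relevance + 2 * seniority - 3)))
    invited = 1 if random.random() < p_invite else 0
    return (relevance, seniority), invited

data = [synthetic_cv() for _ in range(2000)]

# Plain full-batch gradient descent on the logistic loss (no libraries)
w, b, lr = [0.0, 0.0], 0.0, 1.0
for _ in range(500):
    grad_w, grad_b = [0.0, 0.0], 0.0
    for (x1, x2), y in data:
        p = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        grad_w[0] += (p - y) * x1
        grad_w[1] += (p - y) * x2
        grad_b += p - y
    w[0] -= lr * grad_w[0] / len(data)
    w[1] -= lr * grad_w[1] / len(data)
    b -= lr * grad_b / len(data)

# A CV is predicted "invited" when its logit is non-negative
accuracy = sum(
    ((w[0] * x1 + w[1] * x2 + b) >= 0) == (y == 1) for (x1, x2), y in data
) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

    Even this toy version shows the pattern Lee describes: the fitted weight on experience relevance dominates, and accuracy tops out well below 100% because part of the outcome is noise (intuition, in the recruiters' case).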


    While Dutch companies are wary of giving 'big data' analysis a central role in recruitment and selection, that practice has been commonplace in Silicon Valley for years. Front-runners such as Google base their hiring policy first and foremost on hard data and algorithms built on successful hires from the past. ‘Companies are often extremely bad at recruiting and interviewing people. They sail on gut feeling and unfounded theories,’ Google’s head of human resources Laszlo Bock said last year in an interview with the FD.

    Can a company use data alone to find its way to the perfect candidate? In the Netherlands there is considerable distrust, and not only of the still unproven technology. Ethical questions also play a role, says Lee. ‘The future is that you can calculate exactly how someone will perform based on the parameters in their CV. That is frightening, because you write people off in advance.’

    Optimal match

    Recruitment software has in fact long been applied in less extreme forms, for example by large staffing agencies such as Randstad, USG and Adecco. They use special software to make a first preselection from hundreds or even thousands of CVs. This is done with so-called applicant tracking systems (ATS): filters that use both public data from social media and clients’ internal databases to recruit, or to determine whether an employee is the optimal ‘match’ in his current position.

    ‘We can often see better than the company itself whether everyone within it is in the right place,’ says Jan van Goch of Connexys, a maker of recruitment software. According to him, the main barrier to further development of such applications is not so much the technology as clients’ fear of privacy violations and liability. They often sit on mountains of valuable historical information about their applicants, but refuse to open it up for use in larger databases.


    Van Goch: ‘If all that information were brought together, we could match and recruit far more intelligently. Clients want that, but they don’t always give permission to use their own data, so they keep sitting on it, and that is a terrible waste. Some are afraid of being sued the moment the data ends up on the street, all the more so now that data storage legislation has been tightened.’

    Source: FD

  • Starting a BI project in 4 simple steps

    Starting a BI project in 4 simple steps

    What would it mean to you and your enterprise, if you could start getting useful business insights from your data in literally five days or less, using four simple steps?

    As exciting as this seems, it’s actually just what a good business intelligence platform should be able to do for you. While BI projects can be short term or long term, straightforward or sophisticated, they should all bring actionable results as soon as possible. Business moves fast nowadays, and there isn’t enough time for months of preparation, data modeling, IT platform planning, management decisions, and implementation.

    Fortunately, these four clear, do-able steps will allow you to publish your first BI dashboard in five days, keeping up with the pace of your business without needing specialist help or extensive resources.

    STEP 1: Map out your BI project with small, practical milestones (half-a-day)

    Why do certain BI projects fail? Often because they try to bite off more than they can chew. Start off by focusing on one insight of value, and your BI project can already be a success in just days. Afterwards, there will be plenty of opportunities to derive further insights, making sure each additional step brings you a measurable benefit.

    So, let’s begin! Here’s how to do step one:

    • Start with a standard business process you want to understand better or improve
    • Keep data sources few at first, with just 2-3 reports that hold the answers
    • Get an initial, useful result, before iterating to go deeper or wider into your business processes

    This also means using a business intelligence system that lets you start simply, and then scale to any level of BI that makes sense for your organization.

    Allowing half-a-day for step one, your BI project map will then look like the following for the rest of the week (the remaining 4.5 days):

    • Business planning to define useful questions to answer (step two, below)
    • Setting up your data model to bring your data sources together properly (step three)
    • Designing and publishing a dashboard to display the results (step four)

    Remember that as you progress with your BI projects, your BI tool should let you go beyond just automating any manual business reporting you are doing currently (Excel spreadsheets included). A little business rethinking may show you even more important questions to answer, for which your BI tool will then become even more valuable. That’s when you start reaching beyond the realm of standard reports and into the realm of BI.

    STEP 2: Collect requirements (half-a-day)

    To get your first successful BI project off the ground in five days, requirements should be modest. On the other hand, business, result, and technical requirements should be stated clearly and precisely enough to keep your BI project on track for success:

    • Business requirement: state the question that is to be answered. For example, 'what are the trends in the monthly revenues of the organization?' Or, 'which product lines can use more marketing budget to generate higher profits?'
    • Result requirement: decide how a result from the BI system should be displayed or communicated, so that the business teams involved can understand and act on it as quickly and as easily as possible
    • Technical requirement: what hardware and software will be needed for the BI project? If you can use standard PC hardware, for instance, you can meet technical requirements that much more easily. Sisense, for example, both runs and scales on a standard PC, handling up to terabytes or billions of rows of data with full BI functionality quickly and efficiently, as needed.

    STEP 3: Select and compile your data sources (2 days)

    Business intelligence needs inputs of data to produce outputs of results and business insights. Data can come from many different sources (some BI tools have built-in data connectors that make it easy to use data from different places). Remember, data must be correct to start with; otherwise, the end results will be flawed. Here’s your to-do list with detailed examples:

    • Select the data sources you want to use to answer your business question (see step two above). You might choose your organization’s sales database, existing Excel spreadsheets with financial data, Google Analytics data on the number and type of visits to your enterprise web site, or some combination of such data sources.
    • Understand the correlation between the data sources you want to use. For example, your sales database and your financial spreadsheets might both list your products: the sales database showing how well they are selling, and the spreadsheets showing how much they cost to make. Using the two data sources, your BI tool could show you how to maximize profit by putting more marketing resources on specific products.
    • Join the data from different sources for one version of the truth. Sisense lets you use simple 'drag and drop' to bring different data sources and tables into the same central, high-performance database, called an ElastiCube. Everybody then uses the same version of the collected data, avoiding arguments and allowing people to focus on the results and conclusions of the data analysis.
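    The 'one version of the truth' idea above can be sketched with Python's built-in sqlite3 standing in for a central store like an ElastiCube: sales data and cost spreadsheets are loaded into one database and joined, so everyone queries the same collected data. The table names and figures are invented for the example.

```python
# Sketch: join two data sources in one central database and answer the
# business question from step two (which products yield the most profit).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (product TEXT, units_sold INTEGER)")
con.execute("CREATE TABLE costs (product TEXT, unit_cost REAL, unit_price REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("widget", 120), ("gadget", 45)])
con.executemany("INSERT INTO costs VALUES (?, ?, ?)",
                [("widget", 2.0, 5.0), ("gadget", 7.5, 9.0)])

# One query over the joined data: profit per product, best first
rows = con.execute("""
    SELECT s.product, s.units_sold * (c.unit_price - c.unit_cost) AS profit
    FROM sales s JOIN costs c ON s.product = c.product
    ORDER BY profit DESC
""").fetchall()
for product, profit in rows:
    print(product, profit)  # widget 360.0, then gadget 67.5
```

    Because both teams query the same joined tables, the sales and finance numbers can no longer drift apart, which is exactly what the shared-database step is meant to guarantee.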

    STEP 4: Build and display your first BI dashboard (2 days)

    Remember the result requirement from step two above? In this final step, it’s time to create the displays that help your own users understand the results from the BI tool and the data it has analyzed.

    Sisense gives you numerous options to produce web-based dashboard displays, reports that can be distributed to groups of users, and interactive analytics to let users ask new questions and explore further. Here are some great dashboard templates by industry. Your goals in step four are:

    • Identify your target audience. Seek to understand, before trying to be understood! A business management audience may want more intuitive overviews and indications of trends, compared to a technical audience looking for more detail. So, use a corresponding approach to your dashboard.
    • Design your dashboard. Sisense provides options for graphs, charts, and filters that can also be accessed by dashboard viewers to make the dashboard as useful and as engaging as possible. Dashboards are also accessible using a standard web browser, meaning that your viewers do not have to use any additional plugin or download.
    • Information design. Common sense will play an important role here. Looking to show a trend over time? A line chart might be the simplest and most effective way. Or perhaps you want to show how overall sales of different products compare? A pie chart may be the right choice. If in doubt, remember the KISS principle (Keep It Simple, Stupid!).

    Actionable results from data using BI in one week

    By following the steps above, business users can start their business intelligence journey simply and effectively. They can also rapidly accomplish data management and analysis tasks that would otherwise have taken months of IT resources.

    Author: Elana Roth

    Source: Sisense

  • The art of looking beyond vanity metrics

    The art of looking beyond vanity metrics

    B2B marketers beware: Marketing vanity metrics are easy on the eyes but only skim the surface when it comes to actual value. Although vanity metrics may make you feel good about your marketing efforts, these surface-level metrics only reveal part of the story.

    But, fear not dear marketer! If you turn your attention to the metrics that matter, you can improve your marketing strategy and communicate the important insights to leadership.

    Before we get into it, here’s a quick definition of a vanity metric: a vanity metric is data that looks good at first glance, but provides little insight into business success, company revenue, and ROI.

    So, which data points are the common culprits? Examples of marketing vanity metrics include:

    • Page views
    • Downloads
    • Facebook likes
    • Twitter followers

    An alternative to marketing vanity metrics

    In order to communicate the value of marketing initiatives, marketers must home in on actionable metrics: metrics that can guide decision-making. These types of metrics are often referred to as engagement metrics. Engagement metrics can tell you more about what’s working, what’s not working, and what information you need to test further. In fact, 91% of marketers named engagement metrics, such as social media interactions, time on site, and bounce rate, as the number one way to measure success.

    But let’s face it, executives and board members can get stuck on marketing vanity metrics. So, how can you manage the ever-increasing expectations around marketing vanity metrics? Today, we take a closer look at three common marketing vanity metrics and explore the different ways to steer the conversation towards more meaningful metrics. Let’s jump right in!

    1. Social media followers

    Many marketers rely too heavily on their social media followers to measure their social media success. And we get it! All marketers want to see an increase in social media followers, but these numbers don’t necessarily equal an engaged audience.

    Think about it this way: you may have thousands of Twitter followers, but if only one of them engages with your social content regularly, what is your following really worth? On the other hand, you may have a small but dedicated following on LinkedIn, with your social posts often leading to big sales. Yes, your LinkedIn audience is smaller, but it turns out these users engage more with your content, ultimately bringing in more value. Just by digging into the data, you’ve zeroed in on actionable information to guide your social media efforts.
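    The comparison above is easy to make concrete by normalizing engagement by audience size per channel. The channel numbers below are invented; the point is that the per-follower rate, not the follower count, tells you where the value is.

```python
# Sketch: compare channels by engagement per follower per post, not by
# raw follower counts. All numbers are made up for illustration.
channels = {
    "twitter":  {"followers": 12000, "engagements": 30,  "posts": 60},
    "linkedin": {"followers": 800,   "engagements": 240, "posts": 60},
}

rates = {
    name: c["engagements"] / (c["posts"] * c["followers"])
    for name, c in channels.items()
}
for name, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {rate:.4%} engagements per follower per post")
```

    Here the much smaller LinkedIn audience comes out far ahead once the numbers are normalized, which is the actionable insight a raw follower count hides.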

    The next time higher-ups inquire about your social media following, be sure to shift the focus to more important engagement metrics. It’s important to note that your marketing and business goals will dictate which metrics are most important to your executive team. Here’s what we recommend:

    Brand awareness:

    An easy way to show brand awareness on social media is through the number of brand mentions or tags you receive. During your next marketing campaign or product launch, keep a close eye on branded keywords. Next, keep an eye on the competition’s branded keywords to reveal how often social media users interact with competing businesses. Use this information as a benchmark to measure and understand your own performance.

    Lead generation:

    When tracking lead generation, focus on conversions for maximum impact. As you review conversion data in your preferred analytics platform, take note of the social networks that deliver the highest number of qualified leads.

    Website traffic:

    If your goal is to generate website traffic from your social presence, look closely at metrics that demonstrate real social engagement. For instance, check out where your social media leads enter your website, track the pages they visit, and see where they drop off. Also, take a look at the specific posts and channels that garner the most clicks so you can scale your success and serve more content that resonates with your followers.

    Customer experience:

    If you use social media as a customer support channel, the number of followers you accumulate won’t give you any information about how you are doing. Instead, look at metrics like responsiveness or the ratio of questions asked to questions answered. Then, work to improve how many cases or complaints you solve.

    Event or webinar registrants:

    If your goal is to generate event participation, break your reports down by social channel. This shows you where users are the most active and engaged in your webinar or event. Simply include campaign tracking information in your social links.
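    Campaign tracking information usually means UTM parameters appended to each link. As an illustrative sketch (the URL and campaign name are hypothetical), one tagged link per channel lets registrations be broken down by source:

```python
from urllib.parse import urlencode

def utm_link(base_url, source, medium, campaign):
    """Append standard UTM campaign-tracking parameters to a link."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

# One link per social channel, so analytics can attribute each registrant.
for channel in ("twitter", "linkedin", "facebook"):
    print(utm_link("https://example.com/webinar", channel, "social", "spring-webinar"))
```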

    Content downloads:

    Not all content is created equal. For instance, a high conversion on gated content signals a high-quality piece of content. Use this metric to strategize on future content offerings and bring those insights to leadership.

    The list above is a good starting point to show the senior team how your social efforts meet specific business goals. Roll up your sleeves, and start tracking!

    2. Total app, product, or software downloads

    Total downloads. This number can be impressive on the surface, but it isn’t a clear way to gauge the impact your marketing efforts have on product adoption. Instead of looking at the total number of downloads, look at yearly and monthly download trends to reveal whether downloads are increasing or decreasing over time. Then, compare this timeline to a timeline of major marketing campaigns. That way, you can pinpoint which efforts had an impact on downloads and which did not.
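    The trend-versus-campaign comparison described above can be sketched in a few lines (all counts and campaign dates below are hypothetical):

```python
# Hypothetical monthly download counts and campaign launch months.
monthly_downloads = {
    "2023-01": 1200, "2023-02": 1150, "2023-03": 1900,
    "2023-04": 2100, "2023-05": 2050, "2023-06": 2000,
}
campaigns = {"spring-launch": "2023-03"}

def month_over_month_change(downloads):
    """Percent change in downloads from each month to the next."""
    months = sorted(downloads)
    return {
        months[i]: round(100 * (downloads[months[i]] - downloads[months[i - 1]])
                         / downloads[months[i - 1]], 1)
        for i in range(1, len(months))
    }

changes = month_over_month_change(monthly_downloads)
for name, month in campaigns.items():
    print(f"{name}: downloads changed {changes[month]:+.1f}% in {month}")
```

    Lining up the spike (or absence of one) with each campaign month is what turns a vanity total into an attribution story.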

    Another issue with total downloads is that the number doesn’t paint a complete picture of product usage or adoption. Instead, look at these key usage metrics for a clear understanding of how your customers and prospects engage with your offering:

    • Uninstall rate
    • Renewal rate
    • Trial conversion rate
    • Time users spend using the software
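    A couple of these usage rates are simple ratios; a minimal sketch with hypothetical counts:

```python
# Hypothetical counts pulled from product analytics.
installs = 5_000
uninstalls = 1_250
trials_started = 2_000
trials_converted = 300

# Uninstall rate: share of installs that were later removed.
uninstall_rate = 100 * uninstalls / installs
# Trial conversion rate: share of trials that became paying customers.
trial_conversion_rate = 100 * trials_converted / trials_started

print(f"Uninstall rate: {uninstall_rate:.1f}%")
print(f"Trial conversion rate: {trial_conversion_rate:.1f}%")
```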

    Although higher-ups and executives may only express interest in total downloads, it’s your job as a marketer to paint a more complete picture for them. For example, you could explain that total downloads are up after a recent marketing campaign, but usage metrics stayed flat. This indicates that your campaign was flawed in some way: maybe it didn’t give an accurate description of your product, or maybe the product was too difficult for users to figure out. These are important insights to highlight to upper management.

    3. Website pageviews

    A high number of pageviews is an ego boost, but pageviews are another metric to be wary of. When you report this data to management, it’s important to provide pageviews along with actionable engagement metrics to fully show user behavior. Focus on how users engage with your website content rather than how many pageviews each webpage garners. Important engagement metrics include:

    • Time spent on site
    • Unique users
    • Bounce rate
    • Pages per visitor
    • Conversion rate
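    Several of these engagement metrics can be derived directly from raw session logs. A minimal sketch, using hypothetical sessions (each a list of pages viewed in one visit):

```python
# Hypothetical session logs: pages viewed per visit.
sessions = [
    ["/home"],                        # single-page session counts as a bounce
    ["/home", "/pricing", "/signup"],
    ["/blog"],                        # bounce
    ["/home", "/blog"],
]

# Bounce rate: share of sessions that viewed only one page.
bounces = sum(1 for pages in sessions if len(pages) == 1)
bounce_rate = 100 * bounces / len(sessions)

# Pages per visitor: average pages viewed per session.
pages_per_visit = sum(len(pages) for pages in sessions) / len(sessions)

print(f"Bounce rate: {bounce_rate:.0f}%")
print(f"Pages per visit: {pages_per_visit:.2f}")
```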

    Some questions to think about when reviewing website analytics:

    • Which pages keep people engaged, and which ones do users abandon quickly?
    • Which elements and CTAs convert best?
    • Can you identify which traffic sources perform best and why?
    • Or, can you determine which campaigns generate the most traffic and why?
    • Is your website content mapped to your sales journey in a way that makes sense?
    • Can you pinpoint at which stage of the buyer’s journey users leave your website?

    Take an in-depth look at these engagement metrics to really focus your online marketing initiatives on engagement over pageviews. Use your findings to build best practices and reduce bounce rate to ultimately keep users coming back for more great content.

    Final thoughts on marketing vanity metrics

    While higher-ups may ask for marketing vanity metrics, it’s your job to refocus on data points that correlate to sales and revenue, improving your business’s KPIs.

    Know that you can still report on vanity metrics to management, but don’t spend much time there. Instead, focus the conversation on more actionable, advanced metrics, highlighting the value they offer your company.

    Source: Zoominfo

  • The benefits of analyzing the customer journey of your users

    The benefits of analyzing the customer journey of your users

    Skills related to User Experience (UX) design are in high demand: a recent LinkedIn study ranked them among the top 10 most in-demand skills of 2019. Finding qualified UX designers is tied with finding software engineers in terms of hiring priorities, according to a recent Adobe study. Within that UX bucket, designers who have skills related to data analytics and research are particularly sought after, with those qualities being named as must-haves.

    But the ability to analyze the user journey to create delightful experiences for end-users isn’t a skill that is exclusive to (nor required only by) UX professionals. For stakeholders across the spectrum of software development and delivery, from engineering to product management to marketing, access to interactive data visualizations of how the user moves through a task can help each group more successfully deliver on its own goals. And while access to this data may be expected in a cloud-based application, it’s equally (if not more) important for on-premise software publishers to enable this type of analysis in their products.

    By looking at data related to user flow (also known as 'path analytics'), product stakeholders begin to identify the series of steps it takes users to reach their goals. With a deep view into the steps surrounding a key task, several helpful pieces of information that may have been difficult or impossible to visualize become readily apparent: unanticipated actions, broken workflows, or shortcuts that power users have discovered that could be promoted or productized.

    Having this knowledge has benefits that extend beyond streamlining and optimizing the user interface. This insight can help better determine training requirements and guide users, and also provide points for comparison between old and new user interfaces that inform product development.

    How does user flow analysis work?

    It starts with choosing a 'hotspot' event to analyze. This can range from launching the application, to launching any event within it such as using a wizard, opening a menu, or accessing a particular feature. Next, pick a path direction within the hotspot to drill further into. This can be the start, the end, or somewhere in between. This is where it is crucial to understand the question you’re trying to answer. For instance, the hotspot would be the starting point if the goal is to understand where users go from a particular point, the steps taken, and whether that meets expectations. The hotspot would be the endpoint if you’re trying to answer a broader question about the value of the experience, such as the steps leading up to the user clicking on a link to explore upgraded functionality.

    Choose the number of steps to analyze and the number of events within each step, as well as any paths that you don’t want to look at. As you audit the events you have tagged, there are a couple of best practices you can follow.

    First, establish a naming convention for events that makes them easier to interpret in user flow reports and visualizations. Second, make sure that all high-value events are tagged, so that data on them is available as early as possible, ideally before a specific marketing campaign or product roadmap decision.
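    At its core, the hotspot analysis described above amounts to counting which tagged events follow (or precede) a chosen event across user sessions. A minimal sketch, with hypothetical event names and sessions:

```python
from collections import Counter

# Hypothetical tagged event streams, one list per user session.
sessions = [
    ["launch", "open_wizard", "step_1", "step_2", "finish"],
    ["launch", "open_menu", "open_wizard", "step_1", "abandon"],
    ["launch", "open_wizard", "step_1", "step_2", "finish"],
]

def next_steps(sessions, hotspot):
    """Count which events immediately follow the chosen hotspot event."""
    counts = Counter()
    for events in sessions:
        for i, event in enumerate(events[:-1]):
            if event == hotspot:
                counts[events[i + 1]] += 1
    return counts

# With "open_wizard" as the hotspot (starting point): where do users go next?
print(next_steps(sessions, "open_wizard"))
```

    Running the same count backwards (events preceding the hotspot) answers the endpoint-style questions, such as which steps led up to a conversion click.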

    Having a window into these user flows has several key benefits, as it enables the organization to:

    Validate design: Confirm that users are taking the path designed for them or identify if different workflows may produce a better result.

    Visualize the journey: Quickly navigate through path reports to see traffic patterns through events and relative popularity of next/previous steps with a single click. This includes the ability to filter reports to view paths of specific sets of users based on their properties, and exclude noise events such as system generated events that are not user-initiated for clean user paths. The best tools will enable chart-based analysis, and provide the ability to export the data to CSV for offline analysis.

    Verify campaign effectiveness: User flow analysis can also be applied to measuring the effectiveness of marketing campaigns being pushed out through in-application messaging, with the ability to see the path a user took after seeing that message. User flow analysis lends the ability not only to see click-throughs, but also drill down within that to see the exact path users took.

    Author: Victor DeMarines

    Source: Dataversity

  • The differences between data lakes and data warehouses: a brief explanation

    The differences between data lakes and data warehouses: a brief explanation

    When comparing data lake vs. data warehouse, it's important to know that these two things actually serve quite different roles. They manage data differently and serve their own types of functions.

    The market for data warehouses is booming. One study forecasts that the market will be worth $23.8 billion by 2030. Demand is growing at an annual pace of 29%.

    While there is a lot of discussion about the merits of data warehouses, not enough discussion centers around data lakes. 

    Both data warehouses and data lakes are used to store big data, but they are not the same. A data warehouse is a repository for filtered, structured data that has already been processed for a particular use, while a data lake is a massive pool of raw data whose purpose has not yet been defined.

    Many people confuse the two, but their only real similarity is the high-level principle of storing data. It is vital to know the difference, as they serve different purposes and need different sets of eyes to be adequately optimized. Moreover, a data lake may suit one company while a data warehouse is the better fit for another.

    This blog will explain the differences between the data warehouse and the data lake. Below are their notable characteristics.

    Data Lake

    • Data type: structured and unstructured, from many different data sources
    • Purpose: cost-efficient big data storage
    • Users: data engineers and data scientists
    • Tasks: storing data, as well as big data analytics such as real-time analytics and deep learning
    • Size: stores all data that might be utilized

    Data Warehouse

    • Data type: historical data that has been structured to fit a relational schema
    • Purpose: business decision analytics
    • Users: business analysts and data analysts
    • Tasks: read-only queries for summarizing and aggregating data
    • Size: stores only data pertinent to the analysis

    Data Type

    Data cleaning is a vital data skill, as data arrives in imperfect and messy forms. Raw data that has not been cleaned is known as unstructured data; this includes chat logs, pictures, and PDF files. Unstructured data that has been cleaned to fit a schema, sorted into tables, and defined by types and relationships is known as structured data. This is a vital distinction between data warehouses and data lakes.

    Data warehouses contain historical data that has been cleaned to fit a relational schema. Data lakes, on the other hand, store data from a wide array of sources such as real-time social media streams, Internet of Things devices, web app transactions, and user data. This data is sometimes structured, but often messy, as it is ingested straight from the source.


    Purpose

    When it comes to purpose and function, a data lake is used for cost-efficient storage of large amounts of data from many sources. Allowing data of any structure reduces cost, because it is flexible and scalable and does not have to fit a particular schema. Structured data, on the other hand, is easier to analyze because it is cleaner and conforms to a uniform schema to query against. By limiting data to a schema, a data warehouse is very useful for analyzing historical data to inform specific business decisions.

    You might notice that the two complement each other in a typical data workflow: ingested data is stored right away in the data lake, and once a particular business question arises, the relevant subset of data is extracted from the lake, cleaned, and exported.
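    That lake-to-warehouse workflow can be sketched in miniature (the record fields and filter below are hypothetical, purely to illustrate extract-and-clean):

```python
# Hypothetical raw records landed in the lake from several sources.
data_lake = [
    {"source": "web", "user": "a", "amount": "19.99", "note": "ok"},
    {"source": "iot", "reading": 42},            # not relevant to this question
    {"source": "web", "user": "b", "amount": "5.00"},
]

def extract_transactions(lake):
    """Pull the relevant subset, clean it, and shape it to the warehouse schema."""
    return [
        {"user": rec["user"], "amount": float(rec["amount"])}  # cast to a typed column
        for rec in lake
        if rec["source"] == "web"   # keep only records relevant to the question
    ]

warehouse_table = extract_transactions(data_lake)
print(warehouse_table)
```

    The lake keeps everything as it arrived; the warehouse ends up with only the filtered, typed rows the business question needs.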


    Users

    Each one has different applications, but both are very valuable for different users. Business analysts and data analysts often work in a data warehouse containing clearly relevant data that has already been processed for the job. A data warehouse requires a lower level of data science and programming skill to use.

    Engineers set up and maintain data lakes and integrate them into the data pipeline. Data scientists also work closely with data lakes, because they contain data of a broader and more current scope.


    Tasks

    Engineers use data lakes to store incoming data, but data lakes are not restricted to storage. Because unstructured data is scalable and flexible, it is well suited for data analytics: big data analytics can run on data lakes using tools such as Apache Spark and Hadoop. This is especially true for deep learning, which needs to scale with a growing volume of training data.

    Data warehouses are usually set to read-only for users, especially those who are primarily reading and aggregating data for insights. Since the data is already clean and archival, there is usually no need to update or insert data.


    Size

    When it comes to size, a data lake is much larger than a data warehouse, because it holds all data that may be pertinent to a business or organization. Data lakes frequently run to petabytes (a petabyte is 1,000 terabytes). A data warehouse, in contrast, is far more selective about which data is stored.

    Understand the Significance of Data Warehouses and Data Lakes

    If you are deciding between a data warehouse and a data lake, review the categories above to determine which one will meet your needs and fit your use case. If you are interested in a deeper dive into the differences, or want to learn how to build data warehouses, you can take some of the lessons offered online.

    Always keep in mind that sometimes you will want a combination of these two storage solutions, especially when developing data pipelines.

    Author: Liraz Postan

    Source: Smart Data Collective

  • The emergence of the Internet of Things and its possible impact on the fashion industry

    The emergence of the Internet of Things and its possible impact on the fashion industry

    The Internet of Things (IoT) is slowly but indisputably changing all the aspects of the fashion industry. This includes smart clothes, engaging and interactive customer experience, combining fashion and health, wearable technology and generating power through solar cells or kinetic energy. The possibilities are endless as this technology is being implemented in our daily clothing items providing us with many benefits even outside the fashion world.

    Health benefits

    Probably the most significant contribution society can notice in the fashion industry is health-related. Smart clothing has enormous potential to monitor and measure the health of the person wearing it. We've already scratched the surface with smartwatches, which are able to measure heart rate, help monitor diabetes, detect seizures, help with posture, and much more. Besides accessories, some fashion brands have focused on developing lines of smart clothes that will include an ECG and heart rate sensor. This smart clothing will send data to smartphones through an app, which will then help you analyze your health and seek medical advice if needed.

    Retail space customization

    The power of the IoT can even create a unique shopping experience for customers. In other words, the physical experience can be improved by leveraging technologies that bring shoppers' data from online platforms into the actual stores. With a deeper understanding of customer behavior, companies can increase their sales by giving their customers exactly what they need. With this technology, companies can track customer movements in the store once customers log into the app, and so understand their interest in various pieces. We can expect the technology in this area to keep growing, and customers will be able to enjoy a more focused, customized, and simpler shopping experience.

    Improved supply chain

    The ability to improve the supply chain and make it more effective is vital for ethical companies. With the help of the IoT, companies can tell their own stories in unique ways, even allowing customers to connect with the people who created the items they're wearing and say thank you to them. Moreover, this technology enables companies to tap into their shoppers' values and use them to improve the supply chain. The IoT also has the potential to solve yet another common challenge in fashion: inventory. Finding an efficient way to manage inventory and dispose of deadstock is a major problem, but with the IoT, companies can leverage new technologies and produce large quantities to order.

    Implementing emotions

    Fashion communicates emotions. It was only a matter of time until these two worlds became connected with the help of technology. Even so, hardly anybody expected to see new functionalities like regulating body temperature or detecting and relieving stress in our clothing items. When talking about emotions, the real challenge for these companies is to find apps that their consumers actually want and need. After all, we can't talk about full integration of the IoT in the fashion industry without emotions.

    Understanding which emotions consumers connect to a brand is what can tremendously improve your communication with them and, consequently, sales results. For instance, what do people feel when they see a picture of Swiss watches? Is it loyalty? Tradition? Security? Or something else? If loyalty is the most common emotion, how to use this emotion and implement it in all stages of the customer journey? Finding a specific emotion is the bridge between a brand and its customers.

    Sports player insight

    Sports apparel is a big part of the modern-day fashion industry, so it was only a matter of time until sports brands started to realize how much they can benefit from technological solutions. For example, there has been a rise in data analytics in football which provides extremely useful information on players' fitness level during a match. This way, coaches can get an insight into their players’ work rate and decide whether they need to be substituted or not.

    Football boots could be another item in sports fashion which has the ability to provide useful data thanks to the IoT. With embedded sensors that would measure every step of a player, coaches would also have data on the strength and angle of impact on the ball. This would be crucial when preparing football teams for big competitions as coaches would have vital information on time to make the right strategic decision.


    There is no telling what other areas of the fashion industry will be affected by the development of such powerful technology, but we can only assume it will be revolutionized completely. Having the ability to get information from consumers without wasting their time and adjusting the customer experience accordingly creates endless opportunities. This can improve our life quality as we gain valuable information on our health in such an easy and non-intrusive way. When talking about what the IoT can do for the fashion industry, the profit will significantly increase for various companies as their brands will be completely adjusted to customer's needs and customers will appreciate that.

    Source: Datafloq

  • The essence of centralizing analytics: a health system perspective

    Hospitals and health systems continue to invest in data analytics, but (too) often a fragmented, decentralized approach to analytics delivery models results in excessive costs, inefficiency and missed opportunities to improve patient care.

    A number of factors have coalesced in recent years to catalyze greater investment in healthcare analytics – the ongoing transition to new payment models under value-based care, a greater emphasis on the health of populations, and increasing competition. But also the explosion in available health data from electronic health records, laboratory test results, and wearable devices – to name a few.

    The momentum isn’t expected to slow down any time soon. A recent report from Zion Market Research predicts the global healthcare analytics market to grow to $68 billion in 2024 from approximately $20 billion in 2017, a compound annual growth rate of more than 19 percent.

    While there’s no question that providing organizations are busy writing checks to healthcare analytics vendors, there is some question about whether they’re getting an adequate bang for their bucks.

    For example, a Deloitte survey of U.S. hospitals and health systems with greater than $500 million in revenues found that fewer than half of respondents said their organization had a clear, integrated data analytics strategy, while about one in four didn’t have a data governance model in place at all. Even more problematic, about one in three reported that they didn’t know their organizations’ total analytics spend.

    Multiple vendors, no single source of truth

    A common cause of many of these issues is a decentralized approach to analytics in which data analysis happens in different business units that do not share assumptions, analytics methods or insights broadly. In contrast, under a centralized delivery model, an experienced team of data analysts report to one function at the enterprise level, even if they are assigned to serve different business units, based on strategic priorities set at the corporate level. This business-oriented team of analysts meets the need of organizational stakeholders while maintaining and developing in-house intelligence.

    In large part, a centralized analytics delivery model is important because it offers an improvement over the fragmented, incomplete data governance models that too many providers still use. For example, it’s not uncommon for large health systems to contract with multiple vendors to analyze population health risk for groups of patients with different conditions, such as diabetes and osteoarthritis, among others.

    This lack of a single source of truth in analytics can lead to different answers to the same question, such as conflicting guidance on levels of risk, and in turn, on the highest-priority patients to target for interventions. As a result of this fragmented and potentially conflicting information, when prioritizing care plans and interventions, the health system cannot build a consistent clinical profile with a 360-degree view of each patient that accounts for the same factors.

    This results in health system decision makers being left wondering which vendors’ information they should believe.

    Delivering analytics as a service across the organization

    In addition to the fragmentation of data, there are a number of common barriers that prevent hospitals from efficiently and cost-effectively deploying analytics across their organizations, including territorial disputes over data, unclear roles and responsibilities and competition for already-scarce resources.

    As with virtually all organizational transitions, success in centralizing analytics starts with buy-in at the top. Strong executive leadership must bring together talented people with deep experience in applying analytical expertise to solving pressing clinical and business issues.

    A best practice is to place a senior-level executive in charge of analytics, potentially in a Chief Data Officer role, to lead the organization’s centralization initiative. A key function of this role is to establish effective and comprehensive data governance practices, clearly defining what type of data the organization will collect, how the data is structured, who can access it, and how it gets reported and presented to different people in the organization, among other steps.

    Once the organization establishes a solid foundation for data, it will be ready to adopt a single analytics platform that delivers actionable information to decision makers. Today’s leading analytics platforms often employ machine-learning systems to automatically extract important insights that may not be otherwise apparent to human analysts.

    Ultimately, the aim is the creation of one internal, centralized professional services group within the organization that delivers analytics as a service to other stakeholders in the hospital. By structuring a hospital’s analytics functions this way, the organization can eliminate the fragmentation and cacophony of multiple systems that offer conflicting insights and prevent leadership from understanding the organization’s full analytics spend.

    Centralization in practice

    Already, prominent health systems like University of Michigan Health System (UMHS) and Beth Israel Deaconess Medical Center (BIDMC) have taken the leap to centralized analytics delivery models. UMHS, for example, has created comprehensive registries for population health and used them to generate predictive analytics that focus predominantly on chronic diseases. BIDMC, through its centralized analytics governance model, provides layers of decision support and analytics for its physicians, with the goal of understanding variations in cost and care to maximize quality, safety, and efficiency.

    In the future, the insights derived from centralized analytics delivery models are likely to help hospitals improve quality, lower costs, identify at-risk populations and better understand performance. For that to happen, however, hospitals and health systems must first overcome the fragmented, decentralized approach to analytics that prevents them from realizing the full value of their analytics investments.

    Source: Insidebigdata

  • The essence of using an organization-wide data analytics strategy

    The essence of using an organization-wide data analytics strategy

    Does your organization spend loads of time and money collecting and analyzing data without ever seeing the expected return?

    Some 60% of data and analytics projects fail to meet their objectives. Part of the problem is that you can now measure just about anything, which has caused our appetite for data to grow exponentially, often beyond what enterprise organizations' data and analytics teams can handle. Too often, talented people with the right tools can’t create meaningful outcomes because of cultural or organizational challenges.

    Here are some telltale signs that your data resources are being wasted.

    • Road to nowhere: When data and analytics teams are seen as order-takers, it can lead to a one-way stream of requests that overload resources and don’t reflect strategic needs.
    • Garbage in: A lack of standards around how data requests are made leads to disorder and inefficiency.
    • Static data in a dynamic world: Data is treated as a retrospective recording of historical measurements with little ability to draw insights or solve problems.
    • Data distrust: Data silos lead to a lack of transparency around who is producing data, what data is actually being used and how they’re doing it. Over time, this can make business leaders start to doubt the accuracy of their own organization’s information.

    In this environment, employees often try to satisfy their own data needs outside the company’s defined channels, which worsens the problem by creating more internal customers for the centralized data analytics team.

    With growing demand for data, you need to organize your data and analytics teams to reflect big-picture goals. Data resources should be assigned based on your organization’s strategic and operational needs rather than the frequently narrow requests of individuals. The goal is to become an organization where data and analytics partner with the business to create value over the long term.

    Your business objectives should drive any and all decisions you make toward organizing data and analytics teams. Data is not the end but rather the means to support the broader strategy.

    The long road toward organizing your data and analytics strategy can be simplified as a three-step process.

    • Organize your analytics resources around business processes.
    • Put money behind products that will help the whole enterprise.
    • Build a product-centric workflow that is transparent, manages the demand of data resources, and delivers on outcomes.

    Mapping your data resources to business processes will help your organization get the most out of its people. It’s also an eye-opening experience for many, revealing the shared needs across departments. Arranging your organization in this way also reduces waste in the form of redundant data reporting. Your people will also have more time to generate insights and spend less time and effort curating their own data marts.

    These newly formed 'analytics centers' subsequently govern the demand and prioritization of analytic products and can help to assess what the major data needs of the organization are. A side benefit is that your data and analytics teams will be empowered. Rather than fielding requests, they’ll start working on products that help the company succeed.

    Developing a long-term product roadmap for your data needs also requires someone to build consensus. The analytics product manager serves a critical role here, understanding the business objectives and translating them for technical teams.

    When analytics centers are enabled, a company will see a better return on its investment, as well as more manageable demand on its data and IT resources, without the overflow of one-off and redundant requests. The point isn’t to create a totally centralized data and analytics process. Rather, these analytics centers serve as spokes to the company’s enterprise data management (EDM) and IT hubs.

    The centers are also a resource to individual departments and teams, relaying their needs to EDM. This arrangement enables the data and analytics centers to filter through mountains of requests to find out what truly matters to the organization.

    Spending more isn’t the answer. Start by identifying the strategic aim of data, organizing analytics resources around them and building products that add lasting value.

    Author: BJ Fineman & Kurt Knaub

    Source: Information-management

  • The human impact of data literacy

    The human impact of data literacy

    What if I told you only 32% of business executives said that they’re able to create measurable value from data, and just 27% said their data and analytics projects produce actionable insights? Let me put it another way: How excited would you be if I said I made you some chocolate chip cookies, but I only put in 32% of the required sugar and 27% of the required flour?

    I sure hope you wouldn’t eat those cookies. The cookies would be underprepared and not correctly baked with all the necessary ingredients for tasty success. To make an analogy, there are companies creating data and analytics (think: making cookies) without the necessary cultural and organizational ingredients to derive the greatest value from their creations.

    To help others better understand how data literacy – properly and programmatically implemented – can encourage organizations to use these needed ingredients, I recently co-presented a webinar with Martha Bennett, VP and Principal Analyst, from Forrester, and Rishi Muchhala, Manager of Enterprise Intelligence, from Nemours Children’s Health System. The webinar had thousands of attendees, and we received many good questions. I’ve formulated them and provided detailed answers below.

    Question topic 1: What about the data culture of an organization?

    This was a recurring theme in each of the questions that were asked and for good reason. The number one obstacle to data literacy success has nothing to do with data, technology or the software you deploy; it has everything to do with your culture and the people in your organization. Now, how many of you reading this think changing a culture is easy? If so, trust me – it’s not.

    Changing a culture is definitely not easy. It involves changing the DNA of an organization, so that people embrace – not just accept – data. This means data fluency, data literacy, analytical competence and data mentoring must be encouraged and reinforced at multiple touchpoints throughout the organization. Part of the solution is convincing people at all levels that data is empowering.

    Question topic 2: What are key areas to focus on in a data literacy program?

    This question is very large in scope, and you could get lost trying to address all facets of a data literacy program. Below are a few key areas a data literacy program should concentrate on.

    • Leadership – For any data literacy program to succeed, it must have leadership buy-in. The leaders of any organization set the tone and agenda for cultural change, marking how to measure it, conveying its progress and extolling its virtues.
    • Tailored learning – Remember that each individual is at his or her own data literacy stage, and we cannot expect a program to succeed if we try to fit everyone into the same puzzle space. One size does not fit all – people learn at different speeds in different ways, and you should provide for differing learning experiences that nurture data literacy growth across that spectrum.
    • Curiosity, creativity and critical thinking – Work hard to foster the '3 Cs of Data Literacy', which form the foundational pillars of nearly all data literacy programs. People should have a strong desire to know and understand, as well as engage in divergent and novel thinking. This is more likely to occur when the tenets of such thinking are embedded in every part of a data literacy program.

    Mind you: I am not recommending that everyone go back to school, study statistics and so forth. But, I am saying we need a culture that encourages the questioning and challenging of assumptions.

    Question topic 3: Who should lead the data literacy effort in the company?

    This is another great question. I have been approached by people who wonder if a grassroots movement among the employee base is the key to data literacy success. I have been approached by people who wonder if it is the executive team that leads the charge. The short answer is both.

    In order for your data literacy program to succeed, you must have leadership and executive buy-in. By having buy-in from the executive team, you ensure the workforce understands the company is behind the data literacy initiative. Then, create excitement through grassroots work and data literacy evangelists. These two techniques help organizations drive a holistic and inclusive approach to data literacy.


    The human impact of data literacy cannot be overemphasized. A workforce and society empowered by data leads to smarter, better-informed decision making, which makes us less prone to errors, groupthink and orthodoxy. This means we will be more open to challenging others’ practices that are not supported by evidence and also more accepting of data-based feedback that challenges our own approaches. In short, as a society, increased data literacy can only help us grow, as professionals and people, enriching and deepening our perspectives.

    Author: Jordan Morrow

    Source: Qlik

  • The importance of ensuring AI has a positive impact on your organization

    The importance of ensuring AI has a positive impact on your organization

    Arijit Sengupta, founder and CEO of Aible, explains how AI is changing and why a single AI model is no longer smart business.

    There’s lots of buzz about artificial intelligence, but as Arijit Sengupta, founder and CEO of Aible, points out, “Everyone has heard a lot about AI, but the AI we’ve been hearing about is not the AI that delivers business impact.” Where is AI headed? Why is a single AI model no longer the right approach? How can your enterprise make the most of this technology?

Arijit Sengupta: AI needs to deliver context-specific recommendations at the moment a business user is making a decision. We’ve moved away from traditional analytics and BI, which look backwards, to a forward-looking technology. That’s a fundamental shift.

    What one emerging technology are you most excited about and think has the greatest potential? What’s so special about this technology?

    Context-specific AI has the greatest potential to change business for the better. The first generation of AI was completely divorced from the context of the business. It didn’t take into account the unique cost-benefit tradeoffs and capacity constraints of an enterprise. Traditional AI assumed that all costs and benefits were equal, but in business, the benefit of a correct prediction is almost never equal to the cost of a wrong prediction.

    For example, what if the benefit of winning a deal is 100 times the cost of unnecessarily pursuing a deal? You might be willing to pursue and lose 99 deals for a single win. An AI that only finds 1 win in 100 tries would be very inaccurate based on model metrics, although it would boost your net revenue. That’s what you want from AI.
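The arithmetic behind this tradeoff is easy to sketch. The numbers below are hypothetical (an assumed 100:1 benefit-to-cost ratio and invented win counts, not figures from Aible), but they show how a model that looks "inaccurate" by model metrics can still produce more net value than a conservative, high-accuracy one:

```python
# Hypothetical cost-benefit sketch: benefit of a win is 100x the cost of a pursuit.
BENEFIT_PER_WIN = 100.0
COST_PER_PURSUIT = 1.0

def net_value(pursuits, wins):
    """Net business value of a model's recommendations."""
    return wins * BENEFIT_PER_WIN - pursuits * COST_PER_PURSUIT

# A "highly accurate" conservative model: pursues 5 deals, wins 4 (80% hit rate).
conservative = net_value(pursuits=5, wins=4)    # 400 - 5 = 395.0

# An "inaccurate" aggressive model: pursues 100 deals, wins 10 (10% hit rate).
aggressive = net_value(pursuits=100, wins=10)   # 1000 - 100 = 900.0

print(conservative, aggressive)  # 395.0 900.0
```

Here the aggressive model is right only 10% of the time, yet it more than doubles the net value of the 80%-accurate conservative model, which is exactly why optimizing for accuracy alone can leave money on the table.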

    The second generation of AI has a laser focus on the specific business reality of a company. As Forrester and other analysts have pointed out, AI that focuses on data science metrics such as model accuracy often doesn’t deliver business impact.

    What is the single biggest challenge enterprises face today? How do most enterprises respond (and is it working)?

    Solving the last-mile problem of AI is the single biggest business challenge facing companies today. Right now, most business managers don’t have a way to understand how a predictive model would impact their business. That’s a fundamentally different question than finding out what the AI has learned.

    Just because I tell you how a car works doesn’t mean you know how to drive a car. In fact, in order to drive a car, you often don’t need to know all of the details about how a car works. In the first generation of AI, we obsessed over explaining how the car works in great detail. That’s what was considered “explainable AI.”

    What we are shifting to now is the ability for businesses to understand how the car affects their lives. Enterprises need to know how the AI affects their business outcomes under different business scenarios. Without this knowledge, you can’t get AI adopted because you’re asking business owners to play Russian roulette. You’re not giving them the information they need to understand how a given AI model will affect their KPI. You’re just giving them a few models and telling them to hope for the best.

    Is there a new technology in data or analytics that is creating more challenges than most people realize? How should enterprises adjust their approach to it?

    Traditional AI built on model accuracy can actually be incredibly harmful to a business. AI that’s trained to optimize model accuracy is often very conservative, and that can put a business on a death spiral. A conservative model will tell you to go after fewer and fewer customers so you’re assured of closing almost every deal you pursue, but many times that means you end up leaving a lot of money on the table and slowly destroying your business. AI that maximizes accuracy at the expense of business impact is worse than useless - it destroys value.

    What initiative is your organization spending the most time/resources on today? In other words, what internal project(s) is your enterprise focused on so that your company (not your customers) benefit from your own data or business analytics?

We’re an early-stage startup with a relatively small volume of data, but we believe in getting started with AI quickly rather than waiting to accumulate a ton of data. We first started using AI to predict which customers were likely to go from a first contact to a first meeting and which were likely to click on an email.

    Over time, we’ve collected more data and been able to optimize our marketing spending across different channels and figure out exactly which customers to focus on. If we had waited until we had a lot of data to get started, we wouldn’t have progressed as far as we have. By getting started with AI quickly, we were able to improve our AI process much faster.

    Where do you see analytics and data management headed in 2020 and beyond? What’s just over the horizon that we haven’t heard much about yet?

    Everyone has heard a lot about AI, but the AI we’ve been hearing about is not the AI that delivers business impact. The AI we’ve been hearing about is the AI of labs that’s abstracted from business realities.

    What’s just over the horizon that people are beginning to wake up to is that to get business impact, you have to have a very different kind of AI. Creating a single AI model doesn’t make any sense because business realities constantly change. What you need to do is create a portfolio of AI models that are tuned to different business realities. You need a different model if your cost to pursue a customer goes up 10 percent or if your average deal size goes up 20 percent. If you create a portfolio of AI models, your business will be much more resilient to change - and the only thing you can count on in business is change.

    Can you describe your solution and the problem it solves for enterprises?

    Aible’s AI platform ensures business adoption by giving users tools tailored to their existing skills and needs. Aible overcomes the last-mile problem by enabling end users to customize models and see how they affect the business. Aible lets you get started quickly with the data you have by fully automating the machine learning process; team members can contribute their unique business insights to AI projects. Uniquely, Aible delivers dynamically balanced AI models so you always deploy the right model at the right time. Aible ensures data security by running in your secure AWS or Azure account or on premises and never sees your data or trained models.

    Author: James E. Powell

    Source: TDWI

  • The top 10 benefits of Business Intelligence reporting

    The top 10 benefits of Business Intelligence reporting

    Big data plays a crucial role in online data analysis, business information, and intelligent reporting. Companies must adjust to the ambiguity of data, and act accordingly. Spreadsheets no longer provide adequate solutions for a serious company looking to accurately analyze and utilize all the business information gathered.

    That’s where business intelligence reporting comes into play and, indeed, is proving pivotal in empowering organizations to collect data effectively and transform insight into action.

    So, what is BI reporting advancing in a business? It provides the possibility to create smart reports with the help of modern BI reporting tools, and develop a comprehensive intelligent reporting practice. As a result, BI can benefit the overall evolution as well as the profitability of a company, regardless of niche or industry.

    To put the business-boosting benefits of BI into perspective, we’ll explore the benefits of business intelligence reports, core BI characteristics, and the fundamental functions companies can leverage to get ahead of the competition while remaining on the top of their game in today’s increasingly competitive digital market.

    Let’s get started by asking the question 'What is business intelligence reporting?'

    What is BI reporting?

    Business intelligence reporting, or BI reporting, is the process of gathering data by utilizing different software and tools to extract relevant insights. Ultimately, it provides suggestions and observations about business trends, empowering decision-makers to act.

    Online business intelligence and reporting are closely connected. If you gather data, you need to analyze and report on it, no matter which industry or sector you operate in.

Consequently, you can develop a more strategic approach to your business decisions and gather insights that would have otherwise remained overlooked. But let’s see in more detail what the benefits of these kinds of reporting practices are, and how businesses, from small companies to large enterprises, can develop profitable results.

    Benefits of business intelligence and reporting

There are a number of advantages a company can gain if it approaches its reporting correctly and strategically. The main goal of BI reports is to deliver comprehensive data that can be easily accessed and interpreted, and that provides actionable insights.

    Let’s see what the crucial benefits are:

    1. Increasing the workflow speed

Managers, employees, and important stakeholders are often stuck waiting for a comprehensive BI report from the IT department or SQL developers, especially if a company combines data from many different sources. The process can take days, which slows down the workflow: decisions cannot be made, analysis cannot be done, and the whole company is affected.

    Centralizing all the data sources into a single place, with data connectors that can provide one point of access for all non-technical users in a company, is one of the main benefits a company can have. The data-driven world doesn’t have to be overwhelming, and with the right BI tools, the entire process can be easily managed with a few clicks.

One additional element to consider is visualizing data. Since humans process visual information 60,000 times faster than text, workflows can be significantly accelerated by presenting intelligence as interactive, real-time visual data. All this information can be gathered into a single live dashboard that ultimately secures a fast, clear, simple, and effective workflow. This kind of report becomes visual, easily accessible, and dependable for gathering insights.

    2. Implementation in any industry or department

    Creating a comprehensive BI report can be a daunting task for any department, employee or manager. The goals of writing successful, smart reports include cost reduction and improvement of efficiency. One business report example can focus on finance, another on sales, the third on marketing. It depends on the specific needs of a company or department.

    For example, a sales report can act as a navigational aid to keep the sales team on the right track.

    A sales performance dashboard can give you a complete overview of sales targets and insights on whether the team is completing their individual objectives. Of course, the main goal is to increase customers’ lifetime value while decreasing acquisition costs. 

    Financial analytics can be kept under control with its numerous features that can remove complexities and establish a healthy and holistic overview of all the financial information a company manages.

It doesn’t stop here. Another business intelligence report sample can be applied to logistics, one of the sectors that can make the most out of business intelligence and analytics to easily track shipments, returns, sizes, or weights, to name just a few.

Enhancing the recruitment process with HR analytics tools can bring dynamic data under the umbrella of BI reporting, making feedback, interviews, applicant experience, and staffing analysis easier to process and act on.

    3. Utilization of real-time and historical data

    With traditional means of reporting, it is difficult to utilize and comprehend the vast amount of gathered data. Creating a simple presentation out of voluminous information can challenge even the most experienced managers. Reporting in business intelligence is a seamless process since historical data is also provided within an online reporting tool that can process and generate all the business information needed. Artificial intelligence and machine learning algorithms used in those kinds of tools can foresee future values, identify patterns and trends, and automate data alerts.

Another crucial factor to consider is the ability to utilize real-time data. The level of sophistication that reporting in BI projects can achieve cannot be matched by traditional means. A report written as a Word document will not provide the same information and benefit as real-time data analysis with built-in alarms that forewarn of any business anomaly; such supporting software consequently increases business efficiency and decreases costs. There is no need to establish a whole department to manage and implement this process: numerous presentation tools can help along the way.
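To illustrate the kind of alarm described above, a minimal anomaly check can flag a metric that drifts too far from its recent history. This z-score sketch is a deliberate simplification (real BI tools use more sophisticated detection), and the order counts are invented:

```python
import statistics

def anomaly_alert(history, latest, threshold=3.0):
    """Flag `latest` if it deviates more than `threshold` standard
    deviations from the historical mean (a basic z-score check)."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return False
    z = abs(latest - mean) / stdev
    return z > threshold

# Hypothetical daily order counts hovering around 100.
daily_orders = [102, 98, 105, 99, 101, 97, 103, 100]

print(anomaly_alert(daily_orders, 240))  # True  -- sudden spike, alarm fires
print(anomaly_alert(daily_orders, 101))  # False -- within the normal range
```

In a real dashboard this check would run automatically on each data refresh and trigger a notification instead of a print.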

    4. Customer analysis and behavioral prediction

There is no company in the world that doesn’t concentrate on its customers. They are ultimately the ones who provide revenue and determine whether a business will survive in the market.

Customers have also become more selective about what they buy and which brands they trust. They prefer brands “who can resonate between perceptual product and self-psychological needs.” If you can tap into their emotional needs and predict their behavior, you will stimulate purchases and provide a smooth customer experience. BI reports can combine those resources and provide a stimulating user experience. The key is to gather information and adjust to user needs and business goals, as shown in the picture below.

    Today there are numerous ways in which a customer can interact with a specific company. Chatbots, social media, emails, or direct interaction; the possibilities are endless.

The growth of these kinds of engagement has increased the number of communication touchpoints and, consequently, sources of data. All of the information gathered can provide a holistic overview of the customer, evaluate why a certain strategy worked or failed, connect the cause and effect of customer service reports, and, thus, improve business operations.

    5. Operational optimization and forecasting

    Every serious business uses key performance indicators to measure and evaluate success. There are countless KPI examples to select and adopt in a strategy, but only the right tracking and analysis can bring profitable results. Business intelligence and reporting are not just focused on the tracking part, but include forecasting based on predictive analytics and artificial intelligence that can easily help avoid making a costly and time-consuming business decision. Reporting in business intelligence is, therefore, highlighted from multiple angles that can provide insights that can otherwise stay overlooked.
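The forecasting side can be far more elaborate than the tracking side, but the core idea is simple. As a minimal stand-in for the predictive analytics described above (the function name and KPI values are invented for illustration), a moving-average forecast projects the next value of a KPI from its recent history:

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window`
    observations -- the simplest possible trend-based prediction."""
    recent = series[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly revenue KPI (in thousands).
monthly_revenue = [120, 125, 123, 130, 134, 138]

print(moving_average_forecast(monthly_revenue))  # (130 + 134 + 138) / 3 = 134.0
```

Production BI tools replace this with proper time-series models, but the principle is the same: turn tracked KPI history into a forward-looking estimate before a costly decision is made.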

    6. Cost optimization

Another important factor to consider is cost optimization. Every business needs to seriously consider its expenses and ROI (return on investment), yet costs and savings are often hard to measure. With business reporting software, you have access to clear data that small businesses and large enterprises alike can evaluate with just a few clicks.

    7. Informed strategic decision-making

    Whether you’re a CEO, an executive, or managing a small team, with great power comes great responsibility. As someone with corporate seniority, you will need to formulate crucial strategies and make important choices that have a significant impact on the business. Naturally, decisions and initiatives of this magnitude aren’t to be taken lightly. That’s where reporting business intelligence tools come in.

    Concerning senior decision-making or strategy formulation, it’s essential to use digital data to your advantage to guide you through the process. BI reporting dashboards are intuitive, visual, and provide a wealth of relevant data, allowing you to spot trends, identify potential strengths or weaknesses, and uncover groundbreaking insights with ease.

    Whether you need to streamline your budget, put together a targeted marketing campaign, improve an internal process, or anything else you can think of, leveraging BI will give you the ability to make swift, informed decisions and set actionable milestones or benchmarks based on solid information.

The customizable nature of modern data analytics tools means that it’s possible to create dashboards that suit your exact needs, goals, and preferences, improving the senior decision-making process significantly.

    8. Streamlined procurement processes

    One of the key benefits of BI-based reports is that if they’re arranged in a digestible format, they offer access to logical patterns and insights that will allow you to make key areas of your business more efficient. This is particularly true if you deal in a high turnover of goods or services. And if this is the case, it’s more than likely that you have some form of a procurement department.

    Your procurement processes are vital to the overall success and sustainability of your business, as its functionality will filter down through every core facet of the organization. Business intelligence reporting will help you streamline your procurement strategy by offering clear-cut visualizations based on all key functions within the department.

    Working with interactive dashboards will empower you to summarize your procurement department’s activities with confidence, which, in turn, will help you catalyze your success while building brand awareness. In the digital age, brand awareness is priceless to the continual growth of your organization.

    Another undeniable benefit of BI in the modern age.

    9. Enhanced data quality

    One of the most clear-cut and powerful benefits of data intelligence for business is the fact that it empowers the user to squeeze every last drop of value from their data.

In a digital business landscape where new data is created at a rapid rate, understanding which insights and metrics hold real value is a minefield. With so much information and so little time, intelligent data analytics can seem like an impossible feat.

    We’ve touched on this subject throughout this post, but enhanced data quality is such a powerful benefit that it’s worth exploring in its own right. To put this notion into a practical perspective, it’s important to consider the core features and functions of modern BI dashboards:

    • Non-restricted data access: Typically, cutting-edge data intelligence dashboards are accessible across a broad range of mobile devices for non-restricted 24/7 access to essential trends, metrics, and insights. This makes it possible to make informed data-driven decisions anytime, anywhere, increasing productivity in the process.
    • Purity: As modern BI tools operate using highly-visual and focused KPIs, you can take charge of your data, ensuring that the metrics you’re served are 100% relevant to the ongoing success of your business. These intuitive tools work as incredibly effective data curation and filtration systems. As a result, your decisions will be accurate, and you will never waste time on redundant data again.
    • Organizational inclusion: The accessible, seamless functionality of BI tools means that you don’t have to be technically-minded to reap the rewards of data intelligence. As it’s possible to customize each dashboard to the specific needs of your user with ease and extract meaningful insights from a wealth of dynamic KPIs, everyone within the organization can improve their direct performance with data analytics, something that will benefit the entire organization enormously. Today’s dashboards are inclusive and improve the overall value of your organization’s data.
    • Data storytelling capabilities: Our brains are wired to absorb compelling narratives. If you’re able to tell an inspiring, relevant story with your data, you can deliver vital information in a way that resonates with your audience, whether it’s employees or external stakeholders. Intelligence dashboards make data storytelling widely accessible. 

    10. Human resources and employee performance management

Last but certainly not least in our definitive rundown of BI benefits, we’re going to consider how BI-centric reports can assist performance management.

By gaining centralized access to performance-based KPIs, it’s easy to identify trends in productivity, compare relevant metrics, and home in on individual performance. In doing so, you can catalyze the success of your business in a big way. To put this into perspective, we’re going to look at human resources and employee performance management.

    In many ways, your employees are the lifeblood of your entire organization. If the talent within your organization is suffering, your business will, too. Keeping your staff engaged and motivated is vital.

    Role or department aside, if your employees are invested in their work, each other, and the core company mission, your business will continue to thrive. But how can reporting business intelligence software help with employee engagement and motivation?

By gaining access to dynamic visual data on individual as well as collective employee performance, it’s possible to offer training and support to your staff where needed, while implementing leaderboards to inspire everyone to work to the best of their abilities.

    Offering your employees tailored support and growth opportunities, showing that you care, and offering incentives will help you increase motivation exponentially. As a primary duty of the modern human resources department, having the insights to manage internal talent at your disposal is crucial. 

    The ability to interact with focused employee data will empower you to create strategies that boost performance, employee satisfaction, and internal cohesion in a way that gives you an all-important edge on the competition.

    Improved internal communication plays a pivotal role in employee performance and motivation. Find out how big screen dashboards can help improve departmental cohesion with our definitive guide to office dashboards.

     'Data that is loved tends to survive'. – Kurt Bollacker, a renowned computer scientist.

    Reporting in business intelligence: the future of a sustainable company

    Collecting data in today’s digitally-driven world is important, but analyzing it to its optimum capacity is even more crucial if a business wants to enjoy sustainable success in the face of constant change.

    Reporting and business intelligence play a crucial role in obtaining underlying figures to explain decisions and present data in a way that offers direct benefits to the business. As we mentioned earlier, there is no industry that isn’t currently affected by the importance of data and analysis. We have only scratched the surface with our top benefits which any company can take advantage of and bring positive business results.

    In this bold new world of data intelligence, businesses of all sizes can use BI tools to transform insight into action and push themselves ahead of the pack, becoming leaders in their field.

    Spotting business issues, with a BI solution that provides detailed business intelligence reports, can only create space for future development, cost reduction, and comprehensive analysis of the strategic and operational state of a company.

    Author: Sandra Durcevic

    Source: Datapine

  • The transformation of raw data into actionable insights in 5 steps

    The transformation of raw data into actionable insights in 5 steps

    We live in a world of data: there’s more of it than ever before, in a ceaselessly expanding array of forms and locations. Dealing with Data is your window into the ways organizations tackle the challenges of this new world to help their companies and their customers thrive.

    In a world of proliferating data, every company is becoming a data company. The route to future success is increasingly dependent on effectively gathering, managing, and analyzing your data to reveal insights that you’ll use to make smarter decisions. Doing this will require rethinking how you handle data, learn from it, and how data fits in your digital transformation.

    Simplifying digital transformation

    The growing amount and increasingly varied sources of data that every organization generates make digital transformation a daunting prospect. But it doesn’t need to be. At Sisense, we’re dedicated to making this complex task simple, putting power in the hands of the builders of business data and strategy, and providing insights for everyone. The launch of the Google Sheets analytics template illustrates this.

    Understanding how data becomes insights

    A big barrier to analytics success has been that typically only experts in the data field (data engineers, scientists, analysts and developers) understood this complex topic. As access to and use of data has now expanded to business team members and others, it’s more important than ever that everyone can appreciate what happens to data as it goes through the BI and analytics process. 

    Your definitive guide to data and analytics processes

    The following guide shows how raw data becomes actionable insights in 5 steps. It will navigate you through every consideration you might need to make about what BI and analytics capabilities you need, and every step of the way that leads to potentially game-changing decisions for you and your company.

    1. Generating and storing data in its raw state

    Every organization generates and gathers data, both internally and from external sources. The data takes many formats and covers all areas of the organization’s business (sales, marketing, payroll, production, logistics, etc.) External data sources include partners, customers, potential leads, etc. 

    Traditionally all this data was stored on-premises, in servers, using databases that many of us will be familiar with, such as SAP, Microsoft Excel, Oracle, Microsoft SQL Server, IBM DB2, PostgreSQL, MySQL, Teradata.

    However, cloud computing has grown rapidly because it offers more flexible, agile, and cost-effective storage solutions. The trend has been towards using cloud-based applications and tools for different functions, such as Salesforce for sales, Marketo for marketing automation, and large-scale data storage like AWS or data lakes such as Amazon S3, Hadoop and Microsoft Azure.

    An effective, modern BI and analytics platform must be capable of working with all of these means of storing and generating data.

    2. Extract, Transform, and Load: Prepare data, create staging environment and transform data, ready for analytics

For data to be properly accessed and analyzed, it must be taken from raw storage databases and, in some cases, transformed. In all cases the data will eventually be loaded into a different place where it can be managed and organized, using a package such as Sisense for Cloud Data Teams. Using data pipelines and data integration between data storage tools, engineers perform ETL (extract, transform, and load): they extract the data from its sources, transform it into a uniform format that enables it all to be integrated, and then load it into the repository they have prepared for their databases.

In the age of the cloud, the most effective repositories are cloud-based storage solutions like Amazon Redshift, Google BigQuery, Snowflake, Amazon S3, Hadoop, and Microsoft Azure. These huge, powerful repositories have the flexibility to scale storage capabilities on demand with no need for extra hardware, making them more agile and cost-effective, as well as less labor-intensive than on-premises solutions. They hold structured data from relational databases (rows and columns), semi-structured data (CSV, logs, XML, JSON), unstructured data (emails, documents, PDFs), and binary data (images, audio, video). Sisense provides instant access to your cloud data warehouses.

    3. Data modeling: Create relationships between data. Connect tables

    Once the data is stored, data engineers can pull from the data warehouse or data lake to create tables and objects organized in more easily accessible and usable ways. They create relationships between data and connect tables, modeling the data in a way that sets relationships which will later be translated into query paths for joins when a dashboard designer initiates a query in the front end. Then users, in this case BI and business analysts, can examine the data, create relationships, connect and compare different tables, and develop analytics from it.

    The combination of a powerful storage repository and a powerful BI and analytics platform enables such analysts to transform live Big Data from cloud data warehouses into interactive dashboards in minutes. They use an array of tools to help achieve this. Dimension tables include information that can be sliced and diced as required for customer analysis (date, location, name, etc.). Fact tables include transactional information, which we aggregate. The Sisense ElastiCube enables analysts to mash up any data from anywhere. The result: highly effective data modeling that maps out all the different places that a software or application stores information, and works out how these sources of data will fit together, flow into one another, and interact.
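
    To make the dimension/fact distinction concrete, here is a minimal star-schema sketch using SQLite. The table names and figures are invented for illustration and are not specific to any particular platform:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes to slice and dice by.
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, location TEXT)")
# Fact table: transactional amounts that we aggregate.
conn.execute("CREATE TABLE fact_sales (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO dim_customer VALUES (?,?,?)",
                 [(1, "Acme", "NY"), (2, "Bolt", "TX")])
conn.executemany("INSERT INTO fact_sales VALUES (?,?)",
                 [(1, 100.0), (1, 50.0), (2, 75.0)])

# The modeled relationship becomes a join path when a dashboard issues a query.
rows = conn.execute("""
    SELECT d.location, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d ON f.customer_id = d.id
    GROUP BY d.location ORDER BY d.location
""").fetchall()
print(rows)  # [('NY', 150.0), ('TX', 75.0)]
```

    The modeled relationship (`fact_sales.customer_id` to `dim_customer.id`) is exactly the kind of query path a front-end dashboard relies on without the business user ever writing the join.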

    After this, the process follows one of two paths:

    4. Building dashboards and widgets

    Now developers pick up the baton: they create dashboards so that business users can easily visualize data and discover insights specific to their needs. They also build actionable analytics apps, thereby integrating data insights into workflows by taking data-driven actions through analytic apps. And they define exploration layers, using an enhanced gallery of relationships between widgets.

    Advanced tools that help deliver insights include universal knowledge graphs and augmented analytics that use machine learning (ML)/artificial intelligence (AI) techniques to automate data preparation, insight discovery, and sharing. These drive automatic recommendations arising from data analysis and predictive analytics respectively. Natural language querying puts the power of analytics in the hands of even non-technical users by enabling them to ask questions of their datasets without needing code, and to tailor visualizations to their own needs.

    5. Embed analytics into customers’ products and services

    Extending analytics capabilities even further, developers can create applications that they embed directly into customers’ products and services, so that they become instantly actionable. This means that at the end of the BI and analytics process, when you have extracted insights, you can immediately apply what you’ve learned in real time at the point of insight, without needing to leave your analytics platform and use alternative tools. As a result, you can create value for your clients by enabling data-driven decision-making and self-service analysis. 

    With a package like Sisense for Product Teams, product teams can build and scale custom actionable analytic apps and seamlessly integrate them into other applications, opening up new revenue streams and providing a powerful competitive advantage.

    Author: Adam Murray

    Source: Sisense

  • Three objectives to guide your business's KPIs

    Three objectives to guide your business's KPIs

    Many data analytics vendors give users the ability to measure everything but offer little guidance, which can be overwhelming for new users. It is very important to determine the metrics that really matter to your business. To get started, establish your business's critical metrics, then learn to quickly identify areas of concern, so that your measurements meet the unique needs of your business.

    We have learned three objectives that serve as guideposts to help you decide what to measure. These guideposts are also a rubric to make sure that each functional area of the business is aligned toward overall success. In other words, every area of the business, like sales, inventory management, operations, and finance, is measuring core Key Performance Indicators (KPIs) that contribute to the overall success of the business. The three key objectives are improving customer experience, supporting company growth, and enhancing profitability. Excelling in these three areas will drive your business goals. Each of these objectives drives and supports the others and creates a framework for success.

    1. Improve customer experience

    When considering how to improve customer experience, it may be helpful to begin asking the following questions. What is the experience of your customer base? How would you measure that experience? Do you know what factors might be impacting your customers’ experience? Do you know how to measure those factors?

    Customer experience is critical to increasing your market share, which is difficult to do if your customers are leaving because they are dissatisfied. So, how can we make sure our customers have a great experience and want to keep us as their supplier? First, customers want their orders on time. They might, for example, need their order delivered to a job site so they can complete their work.

    In this case, a key metric is 'delivery in full, on time' (DIFOT). A gauge on your dashboard can quickly show you what percentage of your orders are delivered in full and on time. In just a few clicks you can go from a high-level summary to a detailed analysis of your data to see DIFOT rates by warehouse, category of products, individual products, and more to pinpoint the problem. Is it a shipping problem from a particular warehouse? Is there a problem with a product category? Do I have enough product in stock? This is a key element to a positive customer experience. To be sure you always have the right product in stock, create a KPI to measure 'stock outs' or priority items out of stock.
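
    A DIFOT gauge boils down to a simple ratio: the share of orders that were both complete and on time. A minimal sketch, assuming hypothetical order records with ordered/delivered quantities and due dates:

```python
from datetime import date

# Hypothetical order records: quantities ordered/delivered and due dates.
orders = [
    {"qty_ordered": 10, "qty_delivered": 10, "due": date(2021, 3, 1), "delivered": date(2021, 3, 1)},
    {"qty_ordered": 5,  "qty_delivered": 4,  "due": date(2021, 3, 2), "delivered": date(2021, 3, 2)},  # short
    {"qty_ordered": 8,  "qty_delivered": 8,  "due": date(2021, 3, 3), "delivered": date(2021, 3, 5)},  # late
    {"qty_ordered": 2,  "qty_delivered": 2,  "due": date(2021, 3, 4), "delivered": date(2021, 3, 4)},
]

def difot(orders):
    """Share of orders delivered both in full and on time."""
    hits = sum(1 for o in orders
               if o["qty_delivered"] >= o["qty_ordered"] and o["delivered"] <= o["due"])
    return hits / len(orders)

print(f"DIFOT: {difot(orders):.0%}")  # DIFOT: 50%
```

    Slicing the same calculation by warehouse or product category is then just a matter of filtering the records before computing the ratio.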

    2. Support company growth

    When considering ways to support the growth of your company, begin with the following questions: What are your top growth opportunities for new customers or new products? Are you aware of your biggest opportunities? Where might you have some risks? Can you quickly list these risks and opportunities? Growth is the key to business success. If you’re not increasing your share of the market, or at least keeping up with your competitors, then eventually you’re going to be out scaled. Maybe you have enough market share for the immediate future, but if you’re not striving to grow, then you are likely to be overtaken by your competition.

    It is important for sales managers to be alerted to 'customers in decline'. By having market analysts monitor customers whose sales have been declining for the last few months, your sales team will be able to quickly intervene before the sale is lost. Begin with the customers with the highest sales values to prevent the greatest losses. Another important alert is new customers and the product categories and individual products they are purchasing. The purchasing manager should pay attention to the sales trends for new products to ensure there is always enough stock on hand. 
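
    The 'customers in decline' alert described above can be sketched as a simple month-over-month check; the customer names and sales figures here are made up:

```python
# Hypothetical monthly sales per customer (most recent month last).
monthly_sales = {
    "Acme":  [900, 850, 700, 600],   # declining three months in a row
    "Bolt":  [200, 250, 240, 300],
    "Crest": [500, 480, 470, 450],   # declining three months in a row
}

def in_decline(series, months=3):
    """True if sales fell in each of the last `months` month-over-month steps."""
    recent = series[-(months + 1):]
    return all(b < a for a, b in zip(recent, recent[1:]))

# Sort flagged customers by their latest sales value, largest first,
# so the sales team intervenes where the most revenue is at risk.
at_risk = sorted((name for name, s in monthly_sales.items() if in_decline(s)),
                 key=lambda n: monthly_sales[n][-1], reverse=True)
print(at_risk)  # ['Acme', 'Crest']
```

    The same pattern inverted (consecutive increases on new products) would drive the stock-on-hand alert for the purchasing manager.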

    3. Enhance profitability

    What can you do today to move the needle on your profitability? This is a core objective for every business. In the beginning, a small company must focus its efforts on gaining volume. However, once a company has matured, it is in a position to make small, subtle changes that will have a tremendous impact on profit. 

    Improving profitability usually involves making small changes in highly repeated business processes, adapting to your environment. For instance, strategic price increases can improve your profit margin without risking sales volume. Improving delivery processes can reduce the cost of each truck leaving the warehouse. Minimizing deadstock frees up cash that can be used on other profitable investments. To monitor profitability, your sales manager can create a KPI to monitor margin trends, deadstock, and low turns.

    When you measure the right KPIs, your sales team will know which customers are at risk. Your accounting team will know to keep an eye on those customers’ accounts receivable. Your warehouse will know how it is performing against on-time delivery targets. In this way, each area of your company can work toward the same three objectives to drive your success.

    Source: Phocas Software

  • Top 4 e-mail tracking tools using big data

    Top 4 e-mail tracking tools using big data

    Big data is being incorporated in many aspects of e-mail marketing. It has made it surprisingly easy for organizations to track the performance of e-mail marketing campaigns in fascinating ways.

    How big data changes e-mail tracking

    No matter what your role is, if you work in the technology sector, you likely spend a large portion of your day dealing with e-mail in some way. You’re sending, reading, or reviewing e-mails, or you’re checking your inbox to see if anything else comes in. By some estimates, the average worker even spends 30 hours a week checking their e-mail.

    Despite being such a centrally important and frequent job function, most of us are flying blind. We don’t understand how much time we’re spending on e-mail, nor do we have a solid understanding of whether our efforts are productive. Fortunately, there are several new e-mail tracking software tools that employers and employees can use to keep a closer eye on these metrics.

    The problem is that earlier e-mail monitoring tools lacked the analytics capabilities managers needed to make empirically based decisions. Big data is making it easier for companies to get deeper insights.

    Why use e-mail tracking software tools that rely on big data?

    There are many potential applications for e-mail tracking software tools, but these are some of the most important:

    • Productivity analytics. Studying how you e-mail can alert you to the nuances of your e-mail habits, including how often you send e-mail, how long it takes you to write and read e-mail, and what your busiest days and times are. You’ll learn what your worst habits are, so you can correct them and use your time more efficiently, and you’ll learn to optimize your schedule to get more done each day.
    • Sales and response metrics. Many companies rely on sales or prospecting via e-mail, but if you aren’t gathering metrics like open rates and response rates, you may not be able to improve your process over time. E-mail tracking software can help you keep tabs on your progress, and may help you gather or organize information on your prospects at the same time.
    • Employee monitoring. Employees waste about 3 hours a day on unproductive activities, while most human resources departments only assume that 1 hour or less is wasted per day. Using some kind of e-mail tracking can help you measure your employees’ productivity, and help you balance workloads between multiple employees.
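
    As a toy example of the productivity metrics above, average response time can be computed directly from (received, replied) timestamp pairs; the data here is hypothetical, standing in for what a tracking tool would export from an inbox:

```python
from datetime import datetime

# Hypothetical (received, replied) timestamp pairs from an inbox export.
threads = [
    (datetime(2021, 5, 3, 9, 0),  datetime(2021, 5, 3, 9, 45)),
    (datetime(2021, 5, 3, 11, 0), datetime(2021, 5, 3, 13, 30)),
    (datetime(2021, 5, 4, 8, 15), datetime(2021, 5, 4, 8, 30)),
]

def avg_response_minutes(pairs):
    """Mean time between receiving a message and sending the reply."""
    total = sum((replied - received).total_seconds() for received, replied in pairs)
    return total / len(pairs) / 60

print(f"Average response time: {avg_response_minutes(threads):.0f} min")  # 70 min
```

    Metrics like busiest day of the week or e-mails sent per day fall out of the same raw timestamps with different groupings.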

    Big data is at the root of all of these functions, and this makes it critical to control your data. It makes it easier for brands to get better insights.

    The best e-mail tracking software tools that leverage big data

    Some e-mail tracking tools focus exclusively on one e-mail function, like tracking sales or marketing campaigns. Others offer a more robust suite of features, allowing you to track your overall productivity.

    Whatever your goals are, these four tools are some of the best e-mail tracking apps you can get your hands on. They all rely on sophisticated big data analytics systems.

    1. EmailAnalytics

    First, we have EmailAnalytics, which can be thought of like Google Analytics for Gmail. This tool integrates with your Gmail or G Suite account and visualizes your e-mail activities into charts, graphs, and tables. It reports on metrics like average e-mail response time, e-mails sent, e-mails received, times and days of the week that are busiest for you, and how long your average e-mail threads tend to last. With the help of interactive data visuals and regular reports, you can quickly determine the weak points in your approach to e-mail (and resolve to fix them). The tool also enables managers to view reports for teams or employees, so you can monitor team e-mail productivity.

    2. Microsoft MyAnalytics

    Microsoft’s MyAnalytics isn’t quite as robust as EmailAnalytics, but it works quite well as a productivity tracker for Microsoft Outlook. With it, you can keep track of how you and your employees are spending the hours of your day, drawing in information from your e-mail inbox and calendar. If you’re spending too much time in meetings, or too much time on managing your inbox, you’ll be able to figure that out quickly and start making proactive changes to your scheduling and work habits.

    3. Streak

    Streak is another Gmail tool, and one that attempts to convert Gmail into a full-fledged CRM platform. With it, you can convert messages into leads and prospects across various pipelines, and track your progress with each new prospective sale. It also offers built-in collaboration tools, so your team can work together on a single project—and track each other’s efforts.

    4. Yesware

    Yesware is designed with salespeople and sales managers in mind, and it offers prescriptive sales analytics based on your e-mail activity. With it, you can track a number of metrics within your e-mail strategy, including open rates, click-through rates, and other forms of customer engagement. Over time, you’ll learn which strategies work best for your prospects, and can use those strategies to employ more effective sales techniques.

    Implementing these e-mail tracking software tools in your business can help you better understand how you and your employees are using e-mail, improve your sales process, and spend less time on this all-too-important communication medium. Just remember, while data visuals and reports can be helpful in improving your understanding, those insights are only truly valuable if you take action on them.

    Big data makes e-mail tracking more effective than ever

    Big data is changing the nature of e-mail marketing. Companies can use more nuanced data analytics capabilities to drive their decision-making models in fascinating ways.

    Author: Matt James

    Source: SmartDataCollective

  • Using big data to improve as a manufacturer

    Using big data to improve as a manufacturer

    Here's how to implement manufacturing analytics today, in a world where big data, business intelligence, and artificial intelligence are steadily expanding.

    Big data is everywhere, and it’s finding its way into a multitude of industries and applications. One of the most fascinating big data industries is manufacturing. In an environment of fast-paced production and competitive markets, big data helps companies rise to the top and stay efficient and relevant.

    Manufacturing innovation has long been an integral piece of our economic success, and it seems that big data allows for great industry gains. Improvements in efficiency, maintenance, decision-making and supply chain management are possible with the right data tools. Anything from staff schedules to machine performance can be improved with big data.

    Decreasing inefficiency with big data

    Manufacturers are always looking for ways to make marginal improvements in their systems and how they operate. This type of management can be complex, and with the many different steps of the supply chain, teasing out every last detail that could be improved can be challenging. Thankfully, with big data, manufacturing companies can competently manage supply chain details and spot every possible improvement available.

    Big data allows manufacturers to look at each discrete part of a supply process. This microscopic view of the supply chain can show managers new insights into how their process can be improved or tweaked. Big data can be used in different ways to cut down on supply chain inefficiencies. Individual machines, supply chain setup, and staffing, among others, are all components of a manufacturer’s efficiency.

    More and more manufacturers are closing gaps in inventory inefficiencies, too. For example, 72% of manufacturers consider real-time monitoring essential for modern inventory reconciliation.

    Managing supply and customization

    Taking the customer’s preferences into consideration when configuring the manufacturing processes is of extreme importance. The need for consumer customization is a challenge for supply chain managers. Cookie-cutter solutions don’t apply to consumers anymore. They want and need customized products and services. However, in most scenarios, added customization equals added costs. Big data can help bridge that gap of wanting to appease customers while making ends meet at the same time.

    With advanced data analytics, manufacturers can see customer data in real-time. This reduces the time required to make necessary adjustments to the product lines, cutting down on wasted time and improving overall efficiency.

    One of the largest effects of real-time monitoring in manufacturing is the ability to improve order-to-fulfillment cycle times. Building a robust data platform can transform the way manufacturers handle their customers and supplies. Not only are real-time results available, but big data can also provide demand forecasts to guide the production chain based on historical data sales trends in order to stay on top of the demand.

    Predictive maintenance

    One way to reduce the amount of downtime spent on fixing manufacturing machines is fixing the machines before they break. The ability to monitor manufacturing assets in order to predict necessary maintenance is another application for big data. The less time a machine is out of commission, the less money is being lost. With increased notice before a breakdown occurs, you can secure an easy win for your company’s return on investment: you’ll be able to form a strategy around those maintenance intervals and costs without having any negative surprises.

    Big data means using a wired or wireless connection to track machine utilization with greater accuracy to see the variables that could impact its performance. A manager can see what or who is performing optimally, giving the information needed when making business decisions.
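
    A minimal sketch of such a maintenance flag is a rule over tracked utilization and sensor data. The telemetry fields and thresholds below are invented for illustration; real predictive-maintenance systems use richer sensor streams and learned models rather than fixed limits:

```python
# Hypothetical telemetry: hours run since last service, plus a vibration
# reading as a simple proxy for wear.
machines = [
    {"id": "press-1", "hours_since_service": 480, "vibration_mm_s": 2.1},
    {"id": "press-2", "hours_since_service": 120, "vibration_mm_s": 1.0},
    {"id": "lathe-3", "hours_since_service": 300, "vibration_mm_s": 4.8},
]

SERVICE_INTERVAL_H = 400   # assumed manufacturer guidance
VIBRATION_LIMIT = 4.5      # assumed alarm threshold

def needs_maintenance(m):
    """Flag a machine before it breaks: overdue on hours OR abnormal vibration."""
    return (m["hours_since_service"] >= SERVICE_INTERVAL_H
            or m["vibration_mm_s"] >= VIBRATION_LIMIT)

due = [m["id"] for m in machines if needs_maintenance(m)]
print(due)  # ['press-1', 'lathe-3']
```

    Even this crude rule captures the economic point: scheduling the two flagged machines for service is far cheaper than waiting for either to fail mid-shift.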

    Improved strategic decision-making

    With all of the information available today, many decisions can be driven by big data. The power of advanced data collection and monitoring systems means increasingly little guesswork when it comes to overall management strategy. A well-structured data management system can connect supply line communication. There can be many areas within a manufacturing company that may not speak to each other effectively. If big data is applied to the process, information can be gathered and analyzed across departments and locations.

    With big data, there is less guessing and more data-backed action.

    Deconstructing big data in manufacturing

    There are several steps involved before big data can be utilized by parties within the manufacturing industry:

    • Gathering and storing data: The ability to gather data is essential in the big data process. Although many systems can gather data, accurate data is much harder to find. Once the data is gathered, it must be stored. Storing data is essential for keeping quality records of important business assets as well as for overall safety and auditability.
    • Cleaning and analyzing data: Gathering and storing data is not helpful when you can’t find the data you need to make decisions. Data cleaning makes an immense amount of data manageable at scale. Trends and patterns are easier to spot when the data is clean. Analyzing relevant data is what leads to strategic business decisions.
    • Data mining: The ability to find information fast and easily is of extreme importance in the manufacturing industry, since each decision can have a major impact on the bottom line. Advanced data mining allows a company to find the data they need exactly when they need it.
    • Data monitoring: A strong data monitoring system allows manufacturers to keep their business up to industry standards. The continual ability to monitor important data points that matter to your company is essential in having a competitive advantage.
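
    The cleaning step above can be sketched as a small routine that drops incomplete rows, coerces types, and removes duplicates. The sensor records here are invented; production pipelines would do the same with dedicated data-preparation tooling:

```python
# Hypothetical raw sensor readings as gathered: string values, gaps, duplicates.
raw = [
    {"machine": "press-1", "temp": "71.5"},
    {"machine": "press-1", "temp": "71.5"},   # duplicate record
    {"machine": "lathe-3", "temp": None},     # missing reading
    {"machine": "lathe-3", "temp": "68.0"},
]

def clean(rows):
    """Drop incomplete rows, coerce types, and de-duplicate."""
    seen, out = set(), []
    for r in rows:
        if r["temp"] is None:
            continue                      # incomplete: no value to analyze
        key = (r["machine"], r["temp"])
        if key in seen:
            continue                      # exact duplicate of an earlier row
        seen.add(key)
        out.append({"machine": r["machine"], "temp": float(r["temp"])})
    return out

print(clean(raw))
# [{'machine': 'press-1', 'temp': 71.5}, {'machine': 'lathe-3', 'temp': 68.0}]
```

    Once rows are complete, typed, and unique, the trend-spotting and mining steps that follow have a much smaller, more trustworthy dataset to work with.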


    Big data is certainly a buzzword within many industries, and for good reason. The ability to collect important data is priceless to a business and can easily lead them to a competitive advantage. However, the ability to use big data in an efficient and useful way in order to make business decisions is more challenging. Making sure there is a purpose behind all that data is necessary for taking advantage of all big data has to offer.

    Author: Megan Ray Nichols

    Source: SmartDataCollective

  • Using Business Intelligence in an optimal way to support business decisions

    Using Business Intelligence in an optimal way to support business decisions

    By utilizing a fact-based, real-time, singular version of the truth, business people are empowered to achieve and maintain a competitive edge through the use of industry specific business intelligence. Executives, CFOs, branch managers and your sales team have immediate access to crucial information to make fast and educated decisions.

    Data is available across core business processes such as industry trends, customer behavior, productivity, inventory, and detailed financial analysis. Business intelligence software extracts the information, transforming it into clear insights to enable actionable and strategic decision-making so people can readily achieve their goals.

    Better sales decisions

    Industry specific business intelligence enables companies to discover detailed sales trends based on their customers’ preferences, reactions to promotions, online shopping experiences, purchasing habits, and patterns and trends that affect sales. Leveraging customer buying habits permits a company to decide the best course of action to retain valuable customers and take advantage of missed sales opportunities.

    By drilling down to such comprehensive insights, a company can quickly decide which link-sell opportunities to increase or which products are best for cross-selling. By identifying customers in decline, a business can determine the best plan to reposition the product before they stop buying altogether. Sales managers are able to identify the best type of customers, where to find them, and determine the most effective acquisition and conversion strategies. By identifying bottom-buyers, a company may make decisions around the best promotional strategies or whether to let those customers go.

    Having a clear picture of sales trends also allows collaboration in marketing and management decision-making. 

    Better marketing strategy

    By monitoring trends to determine customer preferences, a company can quickly make strategic marketing decisions to best capitalize on their products or services. Data analytics software can identify promotional returns and analyze campaign outcomes. A company can now use it to decide how to prioritize campaigns, tailor promotions, and engage in social media to maximize marketing efforts. This enables a company to make decisions that will fine-tune their marketing strategies, reduce overhead and garner a better return on investment.

    Better business decisions 

    Data analytics allows Executives to make decisions based on statistical facts. Those facts can be used to guide choices about future company growth by evaluating a long-term view of the market and competition. Data analytics can help Executives decide how to streamline processes by using visualizations identifying the productivity in each area of the company, including employee management. By identifying actionable insights, a manager can determine the most effective strategies to improve employee productivity, streamline the recruitment process, or reduce employee turnover. Data analytics allows Executives to funnel all of the facts into making crucial operational decisions.

    Better inventory decisions

    Using data analytics to identify problem areas and opportunities allows a company to make decisions that will refine their inventory management. For example, the decision to reduce excess inventory also reduces the cost to maintain it. With better visibility, a company can make better decisions about how much to order and when. Knowing a product’s ordering patterns along with the best times, prices, and quantities to buy also allows managers to change pricing tiers to increase profit margins and capitalize on every opportunity.
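
    One common way to decide 'how much to order and when' is a reorder-point rule: reorder when on-hand stock falls to expected demand over the supplier lead time plus a safety buffer. A minimal sketch with hypothetical figures:

```python
# Hypothetical product parameters for a reorder-point rule of thumb.
daily_demand = 12        # average units sold per day (from sales history)
lead_time_days = 7       # supplier lead time
safety_stock = 20        # buffer against demand spikes

# Reorder when on-hand stock falls to: demand over lead time + safety stock.
reorder_point = daily_demand * lead_time_days + safety_stock
print(reorder_point)  # 104
```

    The analytics value is in keeping `daily_demand` current from real sales data rather than a guess, so the reorder point tracks actual ordering patterns.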

    Better financial decisions

    Data analytics offers an up-to-date view of a company’s financial picture. A Manager can view profit and loss, general ledger, and balance sheet figures through features such as Phocas' Financial Statements. Top-notch BI will allow businesses to drill all the way down to individual transactions to get instant answers to revenue opportunities and cost concerns. By examining incoming and outgoing finances of the present and past, a business can make decisions based on the company’s future financial status. Breaking down revenue by location evaluates the strength of product lines by branch. For example, a business may decide to remove a specialty item from one location and increase its promotion in another. Customizing the dashboard allows Executives to track key performance indicators (KPIs) to enable effective financial oversight and management.

    By analyzing data and monitoring critical business operations, a company is well positioned for successful strategic decision-making based on factual insights and “one view of the truth.”

    Source: Phocas Software

  • What to expect from BI in 2020 according to Toucan Toco

    What to expect from BI in 2020 according to Toucan Toco

    In 2020, almost 1.7 MB of data will be generated every second. The potential of this data is endless. Toucan Toco, a specialist in data storytelling, sees five ways in which Business Intelligence, that is, collecting, analyzing, and presenting data in order to make better decisions, will change in 2020.

    AI in BI

    Artificial intelligence (AI) affects every area within organizations, and business intelligence is no exception. The great potential of this technology promises to augment human intelligence by revolutionizing the way we interact with business data and analytics. Have we already seen the best of AI in BI? Toucan Toco is certain we have not. It is clear that AI can process enormous amounts of data faster than humans can. Moreover, the technology offers a new perspective in business intelligence and makes it easier to obtain insights that previously went unnoticed. With the rise of explainable AI (often abbreviated as XAI), that is, explaining how artificial intelligence arrives at a particular outcome, it will not be long before AI decisions can be justified in an understandable way. And the expectation is that more critical developments will take place in the coming year(s). AI in BI is here to stay, and its impact will be felt well beyond 2020.

    Focus on data quality

    Data is the lifeblood of every company. There is, however, one essential caveat: if data is not accurate, up to date, consistent, and complete, this can not only lead to wrong decisions but even hurt profitability. IBM calculated that in the US alone, companies lose 3.1 trillion dollars every year due to poor data quality. Poor data quality is a problem that companies of all sizes have long suffered from, and it only gets worse as data sources become increasingly intertwined.
    The rise of Data Quality Management is bringing change. Data quality management is an integral process that combines technology, process, the right people, and organizational culture to deliver data that is not only accurate but also useful. Data Quality Management was one of the most popular focus areas in business intelligence in 2019. Every company wants to implement processes for optimizing data quality in order to apply business intelligence more effectively. In 2020 this focus will grow even further.

    Actionable analytics everywhere

    Traditionally, there has been a literal distance between where business intelligence data is collected and where BI insights emerge. However, to keep a grip on business workflows and processes, companies can no longer analyze data in one silo and take action in another. Fortunately, modern BI tools have evolved so that business data can be made available wherever users want to take action. These tools are merged with critical business processes and workflows through, for example, dashboard extensions and APIs. As a result, it is now easy to implement actionable analytics to accelerate the decision-making process. Business users can now view data, derive actionable insights from it, and implement them, all in one place. Most BI tools also offer mobile analytics to deliver insights anywhere, anytime. Although actionable analytics is one of the emerging trends in business intelligence, it is already widely popular, and this will only increase next year.

    Data storytelling becomes the norm

    Analyzing data is one thing; interpreting data and learning from it is another. It is precisely this interpretation and these insights from data that guide decision-making in the Business Intelligence process. Companies have realized that dashboard figures alone are meaningless if they are not accurately placed in context and cannot be interpreted. In the data-driven world, data storytelling is therefore becoming increasingly important. Storytelling adds context to statistics and provides the narrative needed to turn insights into action. In 2020, data storytelling will deepen the way companies use data to discover new insights.

    Data discovery enriched with data visualization

    Data discovery is a process in which data from multiple silos and databases is collected and merged into a single source to simplify analysis. This also helps companies achieve better alignment and collaboration between the people who prepare the data for analysis and the people who perform the analysis and extract insights from it. Data discovery systems make it ever easier for every employee to access data and pull out the information they need. Data visualization is also evolving, and has been extended with heatmaps and geographic functionality, among other things. Because data discovery and data visualization offer end users ever more, Toucan Toco expects that in 2020 organizations will make even better use of the data at their disposal and will thereby be able to uncover unexpected insights.

    Source: BI-platform

  • What is dark data? And how to deal with it

    What is dark data? And how to deal with it

    It’s easier than ever to collect data without a specific purpose, under the assumption that it may be useful later. Often, though, that data ends up unused and even forgotten because of several simple factors: The fact that the data is being collected isn’t effectively communicated to potential users within an organization. The repositories that hold the data aren’t widely known. Or perhaps there simply isn’t enough analysis capacity within the company to process it. This data that is collected but not used is often termed 'dark data'. 

    Dark data presents an organization with tremendous opportunities, as well as liabilities. If it is harnessed effectively, it can be used to produce insights that wouldn’t otherwise be available. With that in mind, it’s important to make this dark data accessible so it can power those innovative use cases.

    On the other hand, lack of visibility into all the data being collected within an organization can make it difficult to accurately manage costs, and easy to accidentally run afoul of retention policies. It can also hamper efforts to ensure compliance with regulations like the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).

    So what can be done to maximize the benefits of dark data and avoid these problems?

    Some best practices

    When dealing with dark data, the foremost best practice is to shine a spotlight on it by communicating to potential users within the organization what data is being collected.

    Secondly, organizations need to evaluate whether and for how long it makes sense to retain the data. This is crucial to avoid incurring potentially substantial costs collecting and storing data that isn’t being used and won’t be used in the future, and even more importantly to ensure that the data is being handled and secured properly.

    Perhaps the biggest challenge when working with dark data is simply getting access to it, as it’s often stored in siloed repositories close to where the data is being collected. Additionally, it may be stored in systems and formats that are difficult to query or have limited analytics capabilities.

    So the next step is to ensure that the data that is collected can actually be used effectively. The two main approaches are: (1) investing in tooling that can query the data where it is currently stored, and (2) moving the data into centralized data storage platforms. 

    I recommend combining these two approaches. First, adopt tools that provide the ability to discover, analyze, and visualize data from multiple platforms and locations via a single interface, which will increase data visibility and reduce the tendency to store the same data multiple times. Second, leverage storage platforms that can efficiently aggregate and store data that would otherwise be inaccessible, in order to reduce the number of data stores that must be tracked and managed.
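    A minimal sketch of the centralization approach, assuming two hypothetical silos (a CSV export from a legacy system and an application's SQLite database) aggregated into one queryable store; everything here is illustrative and in-memory:

```python
import csv
import io
import sqlite3

# Hypothetical silo 1: a CSV export from a legacy system (in-memory here).
csv_export = io.StringIO("event_id,status\n101,ok\n102,error\n")

# Hypothetical silo 2: an application's SQLite database.
app_db = sqlite3.connect(":memory:")
app_db.execute("CREATE TABLE events (event_id INTEGER, status TEXT)")
app_db.execute("INSERT INTO events VALUES (103, 'ok'), (104, 'ok')")

# Centralize: aggregate both silos into one store that is easy to query,
# tagging each row with its source so lineage isn't lost.
central = sqlite3.connect(":memory:")
central.execute(
    "CREATE TABLE all_events (event_id INTEGER, status TEXT, source TEXT)")
for row in csv.DictReader(csv_export):
    central.execute("INSERT INTO all_events VALUES (?, ?, 'csv_export')",
                    (int(row["event_id"]), row["status"]))
for event_id, status in app_db.execute("SELECT event_id, status FROM events"):
    central.execute("INSERT INTO all_events VALUES (?, ?, 'app_db')",
                    (event_id, status))

# Formerly dark data can now be reached through a single interface.
count = central.execute("SELECT COUNT(*) FROM all_events").fetchone()[0]
print(count)  # 4 rows, drawn from two silos
```

    Recording the source of each row keeps the aggregated store auditable, which matters for the retention and compliance concerns discussed above.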

    Considering the potential power and pitfalls that come with having dark data in your organization, it’s definitely worth the effort to bring it out of the dark.

    Author: Dan Cech

    Source: Insidebigdata

  • Why you should use data analytics to determine your pricing strategy

    Why you should use data analytics to determine your pricing strategy

    Every company must have a strategy to price their goods. Your pricing strategy is a fundamental component to your marketing process as pricing can increase sales or send customers to your competitors. Because a variety of factors such as product life cycle, competition, and customer perception can affect pricing decisions, it’s important to consider these when determining the best pricing strategy for your company. 

    Data analytics provides a clear, consolidated view of your pricing, allowing you to make sound pricing decisions. We've examined the three most common strategies: cost-plus, competitor-based, and value-based, and how data analytics can help manage each one across your customer base.

    Cost-plus pricing

    When people think of the term ‘pricing strategy’, cost-plus pricing is what comes to mind. This is the simplest form of pricing, as it is just a matter of pricing your products above cost: total all of your costs and add the margin you want on top to determine the price. The benefit of this strategy is that it requires little strategizing; there is very little data analysis or market research involved. Because of this, cost-plus pricing is considered a good starting point for a new company with little overhead.

    However, cost-plus pricing is harder to manage over time, because you may not be able to predict all of your costs, and costs can fluctuate. If, for example, your company calculates its costs and adds a 15% margin, this may work well for the first quarter. But if an unexpected cost comes up, such as a supplier raising their prices, your margin may be cut to 10%. A data analytics solution will help you manage these unforeseen costs: you can set up alerts that warn you when margins drop below a set threshold.
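    The arithmetic above can be sketched as a small helper; the cost figures, the 15% markup and the 10% alert threshold are the article's illustrative numbers, not real data:

```python
def cost_plus_price(costs, margin=0.15):
    """Price a product at total cost plus a target margin (markup on cost)."""
    return sum(costs) * (1 + margin)

def margin_alert(price, costs, threshold=0.10):
    """Flag when the realized margin on cost falls below a set threshold."""
    realized = (price - sum(costs)) / sum(costs)
    return realized < threshold

# 75 in total costs, priced with a 15% markup.
price = cost_plus_price([40.0, 25.0, 10.0])
print(round(price, 2))  # 86.25

# A supplier raises prices: costs grow to 82, squeezing the margin.
print(margin_alert(price, [40.0, 32.0, 10.0]))  # True: margin fell below 10%
```

    An analytics tool automates exactly this check across every product line, instead of one hand-calculated example.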

    Competitor-based pricing

    Rather than using costs as a benchmark, this strategy sets your prices according to your competitors’ pricing. This is common when companies are vying for the same government contract, for instance in health or construction. When you are in a market with a product that is not unique, or where prices are already established, it’s best to set your prices somewhere in the middle. Data analytics can also help you model tenders, so you can put forward desired volumes to receive the preferred price.

    On the other hand, if you are offering a better product with new features or more value, you should consider pricing your products higher than your competitors. Setting your prices below your competitors’ carries the same risk as cost-plus pricing, because it depends on your resources: are you able to withstand unexpected costs? If not, you risk impacting your profit margins. In any case, your pricing should stay close to your competitors’ if you’re in a highly competitive market.

    The drawback to competitor-based pricing is that you don’t have a strategy that addresses the unique needs and concerns of your company. By developing your own pricing strategy, you can focus on adding value by offering better products at the right price. Data analytics will show you your best-selling products, the markets they sell in and the customers who buy them, which helps drive a more efficient pricing policy.

    Value-based pricing

    Value-based pricing is setting your prices based on what your customers believe your product is worth and what they are willing to pay. The more value your product offers your customers, the more money they will be willing to pay. Rather than looking at your costs or competitors, value-based pricing requires you to look to your customers. By getting to know the people who decide whether to purchase your product, you ensure that you understand what your customers truly want, and that you are offering the most value for the best price.

    When determining the price point for a product, consider factors such as whether your product is different from your competitors. Will it help your customers to save time or money? Will it help your customers gain a competitive advantage? What features can you develop over time? Answers to these questions will help you determine your product’s value and whether your customers are willing to pay for it. Once you know your customers are willing to pay for your product, you can set a higher price point and then raise prices as you add more value.  The downside to value-based pricing is that it takes time. You must be willing to invest the time to get to know your customers and understand their needs to set effective prices this way. 

    Data analytics allows you to compare and assess different strategies

    With data analytics, you can price according to your target market. Analytics enables companies to dramatically improve profitability by developing optimal pricing strategies to win more contracts and offer the most value to customers. Combining pricing with analytics allows you to leverage your data to understand both the internal and external factors affecting profitability at a granular level.

    In spite of the wealth of data available, many companies are still in the dark when it comes to understanding their customers. Yet, facing growing complexity and a multi-channel business environment, companies must be able to answer fundamental questions such as 'Who is my most profitable customer?' and 'What is my most profitable product or region?' Answering these questions can help you understand your customers and their buying behaviors to create the most effective pricing strategies. In other cases, analytics can highlight your most unprofitable customers so you can realign their discounts or other incentives to increase profits. With analytics, you have a mechanism that functions as both a catalyst and a metrics engine for managing profitability. 
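    In the simplest case, questions like 'Who is my most profitable customer?' reduce to aggregating transaction data; a minimal sketch with made-up figures (the customers, regions and amounts are purely illustrative):

```python
import pandas as pd

# Hypothetical transaction data: revenue and cost per sale.
sales = pd.DataFrame({
    "customer": ["Acme", "Acme", "Globex", "Initech"],
    "region":   ["North", "North", "South", "North"],
    "revenue":  [1200.0, 800.0, 1500.0, 300.0],
    "cost":     [700.0, 500.0, 1350.0, 250.0],
})
sales["profit"] = sales["revenue"] - sales["cost"]

# Who is my most profitable customer?
by_customer = sales.groupby("customer")["profit"].sum()
print(by_customer.idxmax())  # Acme

# What is my most profitable region?
by_region = sales.groupby("region")["profit"].sum()
print(by_region.idxmax())  # North
```

    The same aggregation, sorted ascending, surfaces the unprofitable customers whose discounts or incentives may need realigning.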

    Source: Phocas Software

  • Wrangling and governing unstructured data

    Unstructured data is the common currency in this era of the Internet of Things (IoT), cognitive computing, mobility and social networks. It’s a core resource for businesses, consumers and society in general. But it’s also a challenge to manage and govern.

    Unstructured data’s prevalence

    How prevalent is unstructured data? Sizing it up can give us a good sense of the magnitude of the governance challenge. If we look at the world around us, we see billions of things becoming instrumented and interconnected, generating enormous volumes of data. In the Internet of Things, the value of things is measured not only by the data they generate, but also by the way those things securely respond to and interact with people, organizations and other things.

    If we look at public social networks such as Facebook, LinkedIn or Twitter, one of the tasks is to understand what the social network data contains, in order to extract valuable information that can then be matched and linked to master data. And mobile devices, enabled with the Global Positioning System (GPS), generate volumes of location data that is normally held in highly structured data sets; matching and linking it to master data profiles will be necessary.

    The volume of unstructured information is growing as never before, mostly because of the increase in unstructured information that is stored and managed by enterprises but not really well understood. Frequently, unstructured data is intimately linked to structured data: in our databases, in our business processes and in the applications that derive value from it all. In terms of where we store and manage it, the difference between structured and unstructured data is usually that the former resides in databases and data warehouses and the latter in everything else.

    In terms of format, structured data is generated by applications, while unstructured data is free-form. In addition, like structured data, unstructured data usually has metadata associated with it. But not always, and therein lies a key problem confronting enterprise information managers in their attempts to govern it all comprehensively.

    Governance of the structured-unstructured data link

    When considering the governance of unstructured data, a focus on the business processes that generate both the data itself and any accompanying metadata is important. Unstructured data, such as audio, documents, email, images and video, is usually created in a workflow or collaboration application, generated by a sensor or other device, or produced upon ingestion into some other system or application. At creation, unstructured data is often but not always associated with structured data, which has its own metadata, glossaries and schemata.

    In some industries, such as oil and gas or healthcare, we handle the unstructured data that streams from the sensors where it originated. In any case, unstructured data is usually created or managed in a business process that is linked to some structured entity, such as a person or asset. Consider several examples: 

    • An insurance claim with structured data in a claims processing application and associated documents such as police records, medical reports and car images
    • A mortgage case file with structured data in a mortgage processing application and associated applicant employment status and house assessment documents
    • An invoice with structured data in an asset management application and associated invoice documents
    • An asset with records managed across different applications and associated engineering drawings 

    Governance challenges enter the picture as we attempt to link all this structured and unstructured information together. That linkage, in turn, requires that we understand dependencies and references and find the right data, which is often stored elsewhere in the enterprise and governed by different administrators, under different policies and in response to different mandates.

    What considerations complicate our efforts to combine, integrate and govern structured and unstructured data in a unified fashion? We must know how we control this information, how it is exchanged across different enterprises, and which regulations and standards apply to securing its value and maintaining privacy.

    We also need to understand what we are going to do with the data we collect, because merely collecting data for future use, just in case, solves nothing. We can easily shift from competitive advantage to unmanageable complexity.

    Governance perspectives

    Across different industries, in a complicated ecosystem of connected enterprises, we handle different types of information that are exchanged, duplicated, anonymized and duplicated again. In analytics, we use predictive models to provide recommendations that result in critical decision making. We need to think about the lifecycle of these models and track the data sets used to develop them, as well as changes in ownership.

    How can governance be applied here? When we speak about information, integration and governance, we usually get different answers. Some, such as legal records managers, focus on unstructured data curation, document classification and retention to comply with internal policies and external legislation. Data warehouse IT groups, on the other hand, focus on structured and transactional data and its quality, to maintain the best version of the truth.

    But the business usually doesn’t care what type of information it is. What it wants to see is the whole picture, including all related information from structured, unstructured and other sources, with proper governance around it. Integrated metadata management has therefore become crucial.

    Data lifecycle governance environments

    To unify governance of structured and unstructured data, enterprises need to remove the borders between information silos. They also need to connect people and processes inside and outside the organization, and make every effort to create trusted, collaborative environments for effective information configuration and management.

    What should span all information assets, both structured and unstructured, is a consistent set of organizational policies, roles, controls and workflows focused on lifecycle data governance.

    Author: Elizabeth Koumpan

    Source: Big Data & Analytics Hub
