30 items tagged "data analytics"

  • 3 Important don'ts when entering the market with a new business

    Entering the market with a new business is an exciting experience, and your marketing strategy will play a crucial role in the success of your company. Here are three of the most common marketing mistakes new business owners make, and how you can avoid them.

    When most people think about starting a business, they often think too far ahead. Even though the long-term plan is important, it’s also important to take some time to think about your short-term marketing strategy.

    Instead of focusing heavily on niche specific marketing mistakes, we are going to take a look at mistakes that new business owners can make when working out how they are going to market their company and target their customers.

    1. Don't assume you need a complete website before you start marketing

    One of the first mistakes many business owners make is to believe they must have a complete website before they can start marketing to their audience. A useful trend among new startups is creating a ‘coming soon’ page for potential customers.

    A coming soon page gives customers a taste of what’s to come when your website and business officially launch. We often have it set in our minds that it is impossible to promote something that doesn’t yet exist, but that’s not true.

    2. Don’t forget a contact page

    Surprisingly, many new business owners forget about this crucial aspect of their website. Whether you’re creating a coming soon page or launching the full website on day one, you must have an easy way for customers to contact your business!

    There are a variety of free and premium contact form builders you can download for your website. These builders give you the freedom and flexibility to build custom contact pages so customers can communicate what they need help with regarding your product or service. You can also use this as an opportunity to discover your customers’ pain points as they pertain to your niche, so you can work on improving your company once you notice a recurring problem.

    Contact pages are relevant because they are a way for you to keep in contact with the people who buy your products or services, potentially adding them to your email marketing list. This strategy could open the door for future email offers and makes marketing new products to a test audience easy.

    3. Don’t be afraid to experiment

    When people start marketing their website to a broader audience, they often feel as if their way is the only right way. You have to be willing to question your notions of what customers want, and run split tests on your marketing campaigns and on-site ads to learn what your customers expect from your brand.

    For example, if you’re running a site about content marketing and keep pushing a free checklist for new bloggers that no one is responding to, maybe it’s time to think about other things that could benefit your target audience. You can run a split test with two ad campaigns, for instance, one offering the free checklist and one offering access to an SEO webinar, and see which one is the most attractive.

    The point here is that sometimes you have to put different options out there, see how customers respond, and use the resulting analytics data to determine where to take your business model next.
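
    As a sketch of that last step, the results of the two hypothetical campaigns can be compared with a standard two-proportion z-test to check that the difference is real rather than noise. The campaign numbers below are made up for illustration; this is just one simple way to read split-test data.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test statistic: did variant B convert at a
    significantly different rate than variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: checklist ad vs. SEO-webinar ad, 1,000 views each
z = two_proportion_z(conv_a=40, n_a=1000, conv_b=65, n_b=1000)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

    On these numbers |z| is about 2.5, above the 1.96 cutoff for 95% confidence, so the webinar offer genuinely outperforms the checklist rather than just looking better by chance.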

    Conclusion

    As a new business owner, your marketing strategy will likely change rapidly during the first few years of business. You’ll learn how to handle customer issues and how you can use their problems to build a better product.

    After you’ve nailed down a formula, it’s essential to keep your mind open and anticipate small changes while marketing. The small changes can and will add up to more significant changes over time. One final tip is that you must be prepared to evolve with your brand and your customers. Don’t get too comfortable or set in your ways.

    If the past decade has taught us anything, it’s that marketing is continually changing based on social media, customer perception and needs, and how you present your business to potential leads.

    Author: Thomas Griffin

    Source: Business.com

  • 4 Trends That Are Driving Business Intelligence Demands

    Many organizations have sung the praises of business intelligence for years, but many of those firms were not actually realizing the full benefits of it. That picture is beginning to change, as advanced analytics tools and techniques mature.

    The result is that 2016 will definitely be the ‘year of action’ that many research firms have predicted when it comes to data analytics. That, at least, is the view of Shawn Rogers, chief research officer at Dell Statistica, who believes “we are at a tipping point with advanced analytics.”

    If Rogers sounds familiar, it may be due to his early connection to Information Management. Rogers was, in fact, the founder of Information Management when it was originally called DM Review Magazine. He is now in his second year as chief research officer for Dell Statistica. “Prior to that I was an industry analyst. I worked for Enterprise Management Associates and I covered the business intelligence, data warehousing and big data space.”

    Rogers believes there are a number of key trends driving business intelligence today that are making it more useful for a greater number of organizations.

    “The maturity in the market has helped everyone evolve to a much more agile and flexible approach to advanced analytics. I think there are four things that are driving that which make it exciting,” Rogers says.

    “One of them is the new sophistication of users,” Rogers notes. “Users have become very comfortable with business intelligence. They want advanced insights into their business, so they’re starting to look at advanced analytics as that next level of sophistication.”

    “They’re certainly not afraid of it. They want it to be more consumable. They want it to be easier to get to. And they want it to move as fast as they are. The users are certainly making a change in the market,” Rogers says.

    The market is also benefitting from new technologies that are enhancing the capabilities of advanced analytics.

    “It now functions in a way that the enterprise functions,” Rogers explains. “Now the technology allows advanced analytics on all of the data within your environment to work pretty much at the speed of the business.”

    Certainly not insignificant is the economic advantage of more competition from data analytics tool vendors.

    “There are all kinds of solutions out there that cost less money. It has opened the door for a much wider group of companies to leverage the data in their enterprise and to leverage advanced analytics,” Rogers observes.

    “Lastly, the data is creating some fun pressure and opportunities. You have all these new data sources like social and things of that nature. But even more importantly we’re able to incorporate all of our data into our analysis,” Rogers says.

    “I know that when I was in the press and as an analyst I used to write a lot about the 80/20 rule of data in the enterprise – the 20 percent we could use and the 80 percent that was too difficult. Now with all these new technologies and their cost benefits we’re not ignoring this data. So we’re able to bring in what used to look like expensive and difficult-to-manage information, and we’re merging it with more traditional analytics.”

    “If you look at more sophisticated users, and economic advantage, and better technology, and new data, everything is changing,” Rogers says. “I think those four pieces are what are enabling advanced analytics to find a more critical home in the enterprise.”

    Finally, the other key trend driving the need for speed when it comes to analytics and business intelligence return on investment is where those investments are coming from. Increasingly they are not from IT, Rogers stresses.

    “I think there has been a big shift and most of the budgets now seem to be coming from the line of business – sales, marketing, finance, customer service. These are places where we’re seeing budgets fly with data-driven innovation,” Rogers says.

    “When you shift away from the technology side of innovation and move toward the business side, there is always that instant demand for action. I think that saturation of big data solutions, the saturation of analytics tools, and a shift from IT to the business stakeholder standpoint is creating the demand for action over just collecting data,” Rogers concludes.

    Source: Information Management

  • 5 Arguments that will convince sales people of the value of analytics

    Many sales reps have a certain way of doing things. Implementing new processes or adding new tools or technologies that attempt to change their habits can often be met with resistance.

    Sales reps rely on “tried-and-true” methods learned from predecessors, or they lean on their personal knowledge and experience to manage their customers and plan their approach with individual customers. Gut feel has been the leading driver of sales strategies for many years, but in today’s fast-paced and competitive environment, sales reps need every advantage they can get.

    A recent McKinsey article suggested, “driving sales growth today requires fundamentally different ways of working, as well as outstanding execution across large, decentralized sales teams and channel partners. While many sales leaders accept this reality in principle, they don’t put sufficient energy or focus into driving that level of change. Advances in digital and analytics, however, mean that sales leaders can now drive and scale meaningful changes that pay off today and tomorrow.”

    So, if you’re a sales rep who doesn’t think you need data analytics, here are five reasons why you do:

    1. There are always more sales opportunities than you think 

    This alone should steer your team toward data analytics. Data can uncover trends in your customers’ buying behavior that can help you identify gaps in their ordering. In addition, your customers’ data can also reveal upsell or cross-sell opportunities that can help you increase your sales volume across a much wider swath of products, without impacting any of your existing sales. While your gut feel may tell you to spend more time with a customer, data can help you understand why, pointing you to new complementary products that can quickly grow your sales.

    2. It is critical to uncover challenges before they impact your bottom line

    There is a good chance one or more of your customers purchase products from other suppliers. What if that same customer started to buy less from you and more from that other supplier that recently entered the market? What if that decline occurred over several months? Would you even know? These are difficult questions to ask and answer, but if you’re like many sales people, you have dozens of customers that you are working with and a slow decline in sales with a single customer may go unnoticed. With data at your fingertips, from your laptop to your mobile device, you can constantly monitor your customers' purchasing habits, and ask questions about negative trends before they start to impact your company’s bottom line and your paycheck.
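
    A slow decline like the one described above can be flagged automatically. As a minimal sketch, using hypothetical monthly order totals rather than any particular BI product's output, a least-squares slope over a customer's recent months is enough to surface a downward trend:

```python
def monthly_trend(totals):
    """Least-squares slope of a customer's monthly order totals.
    A clearly negative slope flags a slow decline worth a sales call."""
    n = len(totals)
    mean_x = (n - 1) / 2                  # mean of month indices 0..n-1
    mean_y = sum(totals) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(totals))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Hypothetical customer: six months of orders slipping a little each month
orders = [5200, 5100, 4900, 4700, 4600, 4300]
slope = monthly_trend(orders)
if slope < 0:
    print(f"Orders declining by roughly {-slope:.0f} per month - worth a call")
```

    On these numbers the slope is about -177 per month: no single month looks alarming on its own, but the trend does, which is exactly the pattern a busy rep tends to miss.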

    3. Retaining customers is easier than finding new ones

    This is related to number two, but it deserves its own bullet point. Retention is a simple business reality that makes your business data even more important. Underserved customers are underserved for a variety of reasons. Perhaps they are new and got lost in the shuffle, or turnover at the sales rep position has left them without support for a period of time. Perhaps they have made several large purchases over the last year and deserve better pricing, or they were once loyal customers whose sales have slowly declined and are at risk of leaving for a competitor. Engaging these at-risk customers requires that you recognize the signs before they take their business elsewhere.

    4. It will make your life easier 

    Access to data analytics has often been limited to the IT team or specially trained individuals. Data analytics turns raw data into actionable intelligence. No more reading outdated spreadsheets, or guessing where your next sale will come from or what information to share with your customer during your next sales meeting. Business intelligence software is designed to help you quickly mine value from data so you can make the right decision for you and your customers. Rows and columns of data are now presented in charts, graphs and tables that you can click to uncover transaction-level details that bring to the surface the accounts that need your attention the most. Data analytics helps you eliminate the guesswork from your job and focus on which customers you can help the most while also helping you achieve your sales goals.

    5. It helps you prepare to perform

    Imagine going into a customer meeting with the customer’s entire order history at your fingertips, or an understanding of their recent commitment to a certain brand, style or size of product. How will that information shape your next product presentation or sales proposal? You can turn your customers into data advocates by reviewing weekly reports with them about their engagement with you. Could that information help them improve efficiencies, capitalize on sales promotions or recognize holes in their own ordering? As you share and use your data to help them, you show them that you are committed to their success as well as your own.

    Data analytics is a powerful tool for salespeople who are looking to maximize their performance, grow sales and retain customers. Implementing analytics delivers faster revenue growth at the same or better margins while also improving customer satisfaction. If you’re not using data to drive your business, there’s no better time than the present to start.

    Source: Phocas Software

  • Are you aware of the value of your data?

    While most executives understand that their data is an asset, many haven’t harnessed the valuable insights accessible with a data analytics solution. The immense amount of data you generate may seem impossible to understand, but data analytics will transform it into clear, actionable information. Another way of looking at data: if you closed your doors tomorrow, what would the new owner be most interested in? The products on your shelves, or the insights into your customers?

    Better understand your customers

    Leveraging your data can help you better understand your customers. For instance, you can create robust customer profiles that include information such as sector, job title, geographical locations, channels they use, and preferences. Identify their purchasing behaviors such as what they are buying, what they aren’t, when, how often, in what quantity, and their lifetime value.

    Understanding your customers enables your sales team to recognize new cross- and up-selling opportunities and recognize your top performing accounts. Knowing your best customers means you can reinforce those relationships by periodically rewarding them with a special promotion for products they like. Another benefit of analytics is the ability to identify when a customer is declining. By analyzing customer buying habits and visit frequency, your team can quickly detect a reduction in order frequency or volume, and make a sales call to find out if there is a problem.

    Transactional data keeps an eye on product sales

    Transactional data such as time, place, price, discount, and payment method is generated at the point of sale. This data can help you measure the success of your various product lines. Analyzing your transactional data can tell you whether a product is gaining traction with your target customer base, or it can reveal an unexpected dip in sales.

    While it’s important to determine which products aren’t selling as expected, it’s equally important to identify the products with high conversion rates. It may be that the price point is too low, for example. Finally, your transactional data can help you identify trends such as seasonal buying patterns. Knowing when sales increase due to the season can help you better manage the trend. If you know that sales for a particular product line typically increase in October, you can prepare for this by adjusting your stock level to meet the upcoming rise in demand.
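
    The seasonal pattern described above can be made concrete with a simple seasonal index: average each calendar month across years and divide by the overall mean. The sales figures below are hypothetical and real analytics solutions use more robust methods, but the idea is the same:

```python
from collections import defaultdict

def seasonal_index(monthly_sales):
    """Average sales per calendar month across years, divided by the
    overall mean. Values well above 1.0 mark seasonal peaks."""
    by_month = defaultdict(list)
    for (year, month), total in monthly_sales.items():
        by_month[month].append(total)
    overall = sum(monthly_sales.values()) / len(monthly_sales)
    return {m: (sum(v) / len(v)) / overall for m, v in by_month.items()}

# Hypothetical sales for one product line over two years (units per month)
sales = {(2022, 9): 100, (2022, 10): 180, (2022, 11): 110,
         (2023, 9): 104, (2023, 10): 196, (2023, 11): 110}
idx = seasonal_index(sales)
print(f"October index: {idx[10]:.2f}")  # well above 1.0: stock up beforehand
```

    An October index far above 1.0 across both years is the kind of recurring signal that tells you to adjust stock levels ahead of the rise in demand.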

    Be more strategic

    Even though many companies have adopted data analytics to guide their decision making, many others still rely on traditional approaches. Without realizing it, they are a step behind their competition. Companies that use a data analytics solution to extract the value from their data have greater success: a study from the MIT Center for Digital Business found that companies that adopt a data-driven culture have 4% higher productivity and 6% higher profits. Data-driven companies rely on hard, verifiable data to back up their decision-making rather than making decisions based on intuition and gut feel alone. An analytics solution can show you where to strategically deploy your business resources so you can gain a competitive advantage.

    Manage costs

    A major business resource is your capital. Managing your costs enables you to make the most profitable investments. Data analytics can help you lower costs companywide. For instance, analytics can help you track shipments and optimize deliveries to lower your shipping costs. Your marketing team can use analytics to trim marketing costs by creating targeted marketing campaigns and assessing their effectiveness. Finally, data analytics can help you improve employee performance and operational efficiencies across your various departments.

    To remain competitive in our data-driven economy, your business decisions must be based on credible evidence rather than on subjective experience. Data analytics helps companies achieve their goals by identifying fact-based, actionable insights so executives can develop effective strategies for each area of the business. 

    Source: Phocas Software

  • Big Data on the cloud makes economic sense

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness

    Photo caption: For almost 10 years, only the biggest of technology firms such as Alphabet Inc.’s Google and Amazon.com Inc. used data analytics on a scale that justified the idea of ‘big’ in Big Data. Now more and more firms are warming up to the concept. Photo: Bloomberg

    On 27 September, enterprise software company SAP SE completed the acquisition of Altiscale Inc.—a provider of Big Data as-a-Service (BDaaS). The news came close on the heels of data management and analytics company Cloudera Inc. and data and communication services provider CenturyLink Inc. jointly announcing BDaaS services. Another BDaaS vendor, Qubole Inc., said it would offer a big data service solution for the Oracle Cloud Platform.

    These are cases in point of the growing trend to offer big data analytics using a cloud model. Cloud computing allows enterprises to pay for software modules or services used over a network, typically the Internet, on a monthly or periodical basis. It helps firms save relatively larger upfront costs for licences and infrastructure. Big Data analytics solutions enable companies to analyse multiple data sources, especially large data sets, to take more informed decisions.

    According to research firm International Data Corporation (IDC), the global big data technology and services market is expected to grow at a compound annual growth rate (CAGR) of 23.1% over 2014-2019, and annual spending is estimated to reach $48.6 billion in 2019.

    MarketsandMarkets, a research firm, estimates the BDaaS segment will grow from $1.8 billion in 2015 to $7 billion in 2020. There are other, even more optimistic estimates: research firm Technavio, for instance, forecasts this segment to grow at a CAGR of 60% from 2016 to 2020.
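
    For readers who want to sanity-check such projections, a compound annual growth rate follows directly from the start value, end value and number of years. A quick sketch using the MarketsandMarkets figures quoted above:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# MarketsandMarkets figures quoted above: $1.8 billion (2015) to $7 billion (2020)
rate = cagr(1.8, 7.0, 5)
print(f"Implied CAGR: {rate:.1%}")  # → Implied CAGR: 31.2%
```

    Running the quoted figures through the formula gives roughly 31% a year, which makes it easy to see how differently the research firms are sizing the same market.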

    Where does this optimism stem from?

    For almost 10 years, it was only the biggest of technology firms such as Alphabet Inc.’s Google and Amazon.com Inc. that used data analytics on a scale that justified the idea of ‘big’ in Big Data. In industry parlance, three key attributes are often used to understand the concept of Big Data. These are volume, velocity and variety of data—collectively called the 3Vs.

    Increasingly, not just Google and its rivals, but a much wider swathe of enterprises are storing, accessing and analysing a mountain of structured and unstructured data. The trend is necessitated by growing connectivity, falling cost of storage, proliferation of smartphones and huge popularity of social media platforms—enabling data-intensive interactions not only among ‘social friends’ but also among employers and employees, manufacturers and suppliers, retailers and consumers—virtually all sorts of connected communities of people.

    A November 2015 IDC report predicts that by 2020, organisations that are able to analyse all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their less analytically oriented peers.

    The nascent nature of BDaaS, however, is causing some confusion in the market. In a 6 September article on Nextplatform.com, Prat Moghe, founder and chief executive of Cazena—a services vendor—wrote that there is confusion regarding the availability of “canned analytics or reports”. According to him, vendors (solutions providers) should be carefully evaluated, and aspects such as moving data sets between different cloud and on-premises systems, ease of configuration of the platform, etc., need to be kept in mind before making a purchase decision.

    “Some BDaaS providers make it easy to move datasets between different engines; others require building your own integrations. Some BDaaS vendors have their own analytics interfaces; others support industry-standard visualization tools (Tableau, Spotfire, etc.) or programming languages like R and Python. BDaaS vendors have different approaches, which should be carefully evaluated,” he wrote.

    Nevertheless, the teething troubles are likely to be far outweighed by the benefits that BDaaS brings to the table. The key drivers, according to the IDC report cited above, include digital transformation initiatives being undertaken by a lot of enterprises; the merging of real life with digital identity as all forms of personal data become available in the cloud; availability of multiple payment and usage options for BDaaS; and the ability of BDaaS to put more analytics power in the hands of business users.

    Another factor that will ensure growth of BDaaS is the scarcity of skills in cloud as well as analytics technologies. Compared with individual enterprises, cloud service providers such as Google, Microsoft Corp., Amazon Web Services and International Business Machines Corp. (IBM) can attract and retain talent more easily and for longer durations.

    Manish Mittal, managing principal and head of global delivery at Axtria, a medium-sized Big Data analytics solutions provider, says the adoption of BDaaS in India is often driven by business users. While the need is felt by both chief information officers and business leaders, he believes that the latter often drive adoption as they feel more empowered in the organisation.

    The potential for BDaaS in India can be gauged from Axtria’s year-on-year business growth of 60% for the past few years—and there are several niche big data analytics vendors currently operating in the country (besides large software companies).

    Mittal says that the growth of BDaaS adoption will depend on how quickly companies tackle the issue of improving data quality.

    Source: livemint.com, October 10, 2016
  • Competencies and capabilities for success with (big) data analytics

    For companies across all industries, big data analytics is of great value. This value arises, among other things, from a sharper focus on the customer and from improved processes. Yet it is not easy to extract that value right away. Many organizations underestimate the costs, complexity and competencies required to get to that point.

    Big data analytics

    Big data analytics helps analyze data sets that are generally much larger and more varied than the data types found in traditional business intelligence or data warehouse environments. The goal of big data analytics is to identify hidden patterns, unknown correlations, market trends, customer preferences and other informative business insights.

    Why is success with big data hard to achieve?

    Success with big data is not a given. Many organizations struggle with deploying big data in several respects. The following aspects can be distinguished:

    • Big data analytics is seen as a technology project rather than as a transformation that takes place on several fronts within the organization.
    • The vendor ecosystem is fragmented and changes rapidly.
    • New technologies and architectures demand new skills from users.
  • Data analytics: From studying the past to forecasting the future

    To compete in today's marketplace, it is critical that executives have access to an accurate and holistic view of their business. The key to sifting through a massive amount of data to gain this level of transparency is a robust analytics solution. As technology constantly evolves, so too do data analytics solutions.

    This blog discusses three types of data analytics and the emerging role of artificial intelligence (AI) in processing the data:

    Descriptive analytics

    As the name suggests, descriptive analytics describes what happened in the past. This is accomplished by taking raw historical data, whether from five minutes or five years ago, and presenting an easy-to-understand, accurate view of past patterns or behaviors. By understanding what happened, we can better understand how it might influence the future. Many businesses use descriptive analytics to understand customer buying patterns, sales year-over-year, historical cost-to-serve, supply chain patterns, financials, and much more.

    Predictive analytics

    This is the ability to accurately forecast or predict what could happen moving forward. Understanding the likelihood of future outcomes enables the company to better prepare based on probabilities. This is accomplished by taking the historical data from your various silos, such as CRM, ERP, and POS, and combining it into one single version of the truth. This enables users to identify trends in sales and forecast demand on the supply chain, purchasing and inventory levels based on a number of variables.

    Prescriptive Analytics

    This solution is the newest evolution in data analytics. It takes the previous iterations to the next level by revealing possible outcomes and prescribing courses of action. In addition, this solution will also show why an outcome is likely to happen. Prescriptive analytics answers the question: What should we do? Although this is a relatively new form of analytics, larger retail companies are successfully using it to optimize customer experience, production, purchasing and inventory in the supply chain to make sure the right products are being delivered at the right time. In the stock market, prescriptive analytics can recommend when to buy or sell to optimize your profit.

    All three categories of analytics work together to provide the guidance and intelligence to optimize business performance.

    Where AI fits in

    As technology continues to advance, AI will become a game-changer by making analytics substantially more powerful. A decade ago, analytics solutions only provided descriptive analytics. As the amount of data generated increased, solutions started to add predictive analytics. As AI evolves, data analytics solutions are also changing and becoming more sophisticated. BI software vendors are currently positioning themselves to be the first to market with an AI offering that enhances prescriptive analytics.

    AI can help sales-based organizations by providing specific recommendations that sales representatives can act on immediately. Insight into customer buying patterns will allow prescriptive analytics to suggest products to bundle, which ultimately increases order size and reduces delivery costs and the number of invoices.

    Predictive ordering has enabled companies to send products you need before you order them. For example, some toothbrush or razor companies will send replacement heads in this way. They predict when the heads will begin to fail and order the replacement for you. 

    Improving data analytics for your business

    If you are considering enhancing your data analytics capability and adding artificial intelligence, we encourage you to seek out a software vendor that offers industry-matched data analytics that is easy and intuitive for everyone to use. This means pre-built dashboards, scorecards and alerts developed with the standard KPIs for your industry.

    The next step is collaborating to customize the software to fit your business and augmenting it with newer predictive analytics and machine learning-based AI.

    Source: Phocas Software

  • Data conscious consumers: What about them and how to target them?

    Data is big news these days. Tech behemoths like Google and Facebook make a lot of money from our data, and people are growing increasingly aware of how valuable their data is to companies of all sizes.

    But as data becomes increasingly valuable for companies, consumers are starting to question how much data they are prepared to give away. Many people are becoming uncomfortable with the idea of giving away their personal information.

    It’s easy to understand their concerns. Huge data breaches reach the headlines on a regular basis; it seems like every week, a large and respected brand loses millions of passwords or credit card details.

    As consumers become warier about handing over their data, this poses a challenge for brands. How can you persuade your customers that it’s in their interests to hand over their data? And how can you market to them more effectively as a result?

    Focus on the value exchange

    If a consumer sees little value in handing over their data, they will be far less likely to do so. As such, your focus should be on trading data for something of value.

    This idea has been around for a long time. Every time you sign up for an email list in return for a voucher or free eBook, this is the value exchange at work. Some companies use the concept of gated content whereby the consumer is given access to valuable content on a website in return for their data.

    One of the most common ways that companies use this value exchange is to provide a better experience for the consumer in return for their data. In this case, the consumer may provide an app with permission to access their location, and the app then provides them with directions or specific products based on where they are.

    In short, value exchange needs to be evident in some form. You need to convince your customers that they will enjoy a better experience or receive something of value in return for their data.

    Understand different types of data consciousness

    Consumers are all different, and they have different ideas about how their data should be used. Some consumers are perfectly happy to hand over their data, while others hold the opposite view.

    Brands need to understand the differences between consumers before they can start marketing to them effectively. Consumers can broadly be separated into three groups:

    1. Data unconcerned: These consumers do not care how their data is used, and they hand over their data readily.

    2. Data pragmatists: These consumers are more guarded about their data, but they are willing to give it away if they can see a clear value exchange.

    3. Data fundamentalists: These consumers are not willing to give away their data under any circumstances.

    According to research from the Data & Marketing Association (DMA), the percentages of the population in each group are roughly as follows:

    • Data unconcerned: 25%
    • Data pragmatists: 50%
    • Data fundamentalists: 25%

    Clearly, when it comes to your marketing efforts, you want to target the 75% of the population in the ‘Data unconcerned’ and ‘Data pragmatists’ groups.

    So how should you do this? There are three key principles to focus on.

    1. Hyper-Personalization

    Personalization has long been an important concept in marketing. But these days, businesses need to go beyond basic information like the consumer’s name and location. The focus should be on hyper-personalization.

    Hyper-personalization uses data like browsing behavior, purchasing behavior and real-time data to change the message you send to your customers.

    The first thing you will need to do is collect the data. You need quality data to personalize effectively, and that means you need to know the types of people who buy specific products, how much they spend, the types of models they are interested in, which brands they like, and more. Look at Spotify’s annual ‘Wrapped’ campaign, where the company sends users a roundup of their year’s listening. At first sight it’s a fun, quirky way to see user data in action. But it also shows exactly which data is being collected.

    Context also comes into it. Factors like location, the weather, important events, seasons and real-time pricing can all be used in your messaging.

    You could launch a browser abandonment campaign where you target people who were looking at a product but did not make a purchase, perhaps offering them a discount if they buy it now.
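    As an illustration, a browser-abandonment audience like the one described above can be selected with a few lines of code. This is a minimal sketch with hypothetical event fields (`user`, `action`, `product`, `ts`), not the API of any particular analytics platform:

```python
from datetime import datetime, timedelta

def abandonment_targets(events, now, window_hours=48):
    """Return user IDs who viewed a product recently but never purchased it.

    `events` is a list of dicts with hypothetical fields:
    {"user": str, "action": "view" | "purchase", "product": str, "ts": datetime}
    """
    cutoff = now - timedelta(hours=window_hours)
    viewed, bought = set(), set()
    for e in events:
        key = (e["user"], e["product"])
        if e["action"] == "view" and e["ts"] >= cutoff:
            viewed.add(key)
        elif e["action"] == "purchase":
            bought.add(key)
    # Users who viewed something in the window but never bought that product
    return sorted({user for (user, product) in viewed - bought})
```

    The resulting list of users could then feed a follow-up campaign, perhaps with the discount offer mentioned above.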

    In short, the more personalized you can make your messaging, the more effective it will be.

    2. Convenience

    Beyond personalization, businesses also need to focus on the convenience of their messaging. Hyper-personalized communications need to be delivered through the right channels at the right times.

    This means gathering data about the engagement techniques that work best for different types of consumers, and then using these to provide greater convenience for them.

    3. Relevance

    With all this collecting of data, there is a genuine concern that your business will annoy your customers and they will opt out of your communications.

    As a result, it is necessary to be careful about how data is gathered and how consumers are contacted. Data conscious consumers will have strict preferences about how they want to be communicated with. This preference data is essential to avoid alienating them.

    Set up a preference center where customers are asked how they want to communicate and which types of messages they want to receive. This can be done at the sign-up stage or later if preferred, perhaps by sending an email requesting the information.

    Brands must also work to clarify the value of signing up to a service. If you can’t explain why you want their details, you shouldn’t have them. Use the Abercrombie & Fitch app for inspiration. Users who download it are rewarded with points that can add up to substantial discounts, and there’s a clear correlation between performing an action (i.e. registering) and receiving a gift.

    Give consumers the option to opt-out of communications as well. Consumers may decide they don’t want your weekly newsletter, but rather than unsubscribing from all your communications, they may want to keep the promotions. Having the option to choose different types of emails they want to receive can be helpful in this case.

    By managing customers’ data preferences effectively, businesses can ensure the right messages get to them more often.

    Gather data and use it wisely

    Marketing is changing all the time. Customers are more data conscious than ever, and this shows no signs of changing. To reach the right people with the right messages at the right time, you need to focus on gathering as much data as possible, without annoying your customers, and understanding your customers’ preferences.

    That way, you can continue to reach them with (hyper-)personalized marketing messages that generate sales for your company.

    Author: Jason Lark

    Source: Dataversity

  • Dealing with data preparation: best practices - Part 1

    Dealing with data preparation: best practices - Part 1

    IBM is reporting that data quality challenges are a top reason why organizations are reassessing (or ending) artificial intelligence (AI) and business intelligence (BI) projects.

    Arvind Krishna, IBM’s senior vice president of cloud and cognitive software, stated in a recent interview with the Wall Street Journal that 'about 80% of the work with an AI project is collecting and preparing data. Some companies aren’t prepared for the cost and work associated with that going in. And you say: ‘Hey, wait a moment, where’s the AI? I’m not getting the benefit.’ And you kind of bail on it'.

    Many businesses are not prepared for the cost and effort associated with data preparation (DP) when starting AI and BI projects. To compound matters, hundreds of data and record types and billions of records are often involved in a project’s DP effort.

    However, data analytics projects are increasingly imperative to organizational success in the digital economy, hence the need for DP solutions.

    What is AI/BI data preparation?

    Gartner defines data preparation as 'an iterative and agile process for exploring, combining, cleaning, and transforming raw data into curated datasets for data integration, data science, data discovery, and analytics/business intelligence (BI) use cases'. 

    A 2019 International Data Corporation (IDC) study reports that data workers spend a remarkable amount of time each week on data-related activities: 33% on data preparation compared to 32% on analytics (and, sadly, just 13% on data science). The top challenge, cited by more than 30% of all data workers in this study, was that 'too much time is spent on data preparation'.

    The variety of data sources, the multiplicity of data types, the enormity of data volumes, and the numerous uses for data analytics and business intelligence, all result in multiple data sources and complexity for each project. Consequently, today’s data workers often use numerous tools for DP success.

    Capabilities needed in data preparation tools

    Evidence in the Gartner Research report Market Guide for Data Preparation Tools shows that data preparation time and reporting of information discovered during DP can be reduced by more than half when DP tools are implemented.

    In the same research report, Gartner lists details of vendors and DP tools. The analyst firm predicts that the market for DP solutions will reach $1 billion this year, with nearly a third (30%) of IT organizations employing some type of self-service data preparation tool set.

    Another Gartner Research Circle Survey on data and analytics trends revealed that over half (54%) of respondents want and need to automate their data preparation and cleansing tasks during the next 12 to 24 months.

    To accelerate data understandings and improve trust, data preparation tools should have certain key capabilities, including the ability to:

    • Extract and profile data. Typically, a data prep tool uses a visual environment that enables users to interactively extract, search, sample, and prepare data assets.
    • Create and manage data catalogs and metadata. Tools should be able to create and search metadata as well as track data sources, data transformations, and user activity against each data source. It should also keep track of data source attributes, data lineage, relationships, and APIs. All of this enables access to a metadata catalog for data auditing, analytics/BI, data science, and other operational use cases.
    • Support basic data quality and governance features. Tools must be able to integrate with other tools that support data governance/stewardship and data quality criteria.
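    To make the catalog and lineage capability above concrete, here is a deliberately minimal sketch of such a metadata store. The class and method names are illustrative, not drawn from any vendor's tool:

```python
class MetadataCatalog:
    """Toy metadata catalog: records each source's attributes and the
    transformations applied to it, so lineage can be queried later."""

    def __init__(self):
        self.sources = {}  # name -> {"attributes": [...], "lineage": [...]}

    def register(self, name, attributes):
        """Add a data source and its known attributes to the catalog."""
        self.sources[name] = {"attributes": list(attributes), "lineage": []}

    def record_transform(self, name, description):
        """Append a transformation step to the source's lineage history."""
        self.sources[name]["lineage"].append(description)

    def lineage(self, name):
        """Return the ordered list of transformations applied to a source."""
        return list(self.sources[name]["lineage"])
```

    A real DP tool would add search, user-activity tracking, and API exposure on top of a store like this.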

    Keep an eye out for part 2 of this article, where we take a deeper dive into best practices for data preparation.

    Author: Wayne Yaddow

    Source: TDWI

  • Dealing with data preparation: best practices - Part 2

    Dealing with data preparation: best practices - Part 2

    If you haven't read yesterday's part 1 of this article, be sure to check it out before reading on.

    Getting started with data preparation: best practices

    The challenge is getting good at DP. As a recent report by business intelligence pioneer Howard Dresner found, 64% of respondents constantly or frequently perform end-user DP, but only 12% reported they were very effective. Nearly 40% of data professionals spend half of their time prepping data rather than analyzing it.

    Following are a few of the practices that help assure optimal DP for your AI and BI projects. Many more can be found from data preparation service and product suppliers.

    Best practice 1: Decide which data sources are needed to meet AI and BI requirements

    Take these three general steps to data discovery:

    1. Identify the data needed to meet required business tasks.
    2. Identify potential internal and external sources of that data (and include its owners).
    3. Assure that each source will be available according to required frequencies.
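    The three discovery steps above can be captured in something as simple as a source inventory that is checked against what owners can actually deliver. A minimal sketch, with hypothetical source names and frequencies:

```python
# Hypothetical inventory: required sources, their owners, and needed frequency.
REQUIRED_SOURCES = {
    "crm_contacts":  {"owner": "sales-ops", "frequency": "daily"},
    "web_analytics": {"owner": "marketing", "frequency": "hourly"},
    "erp_orders":    {"owner": "finance",   "frequency": "daily"},
}

def unavailable_sources(available):
    """Flag required sources that are missing or delivered less often than needed.

    `available` maps source name -> delivery frequency the owner can offer.
    """
    rank = {"hourly": 0, "daily": 1, "weekly": 2}  # lower = more frequent
    problems = []
    for name, req in REQUIRED_SOURCES.items():
        got = available.get(name)
        if got is None:
            problems.append(f"{name}: not available")
        elif rank[got] > rank[req["frequency"]]:
            problems.append(f"{name}: needs {req['frequency']}, offered {got}")
    return sorted(problems)
```

    Running a check like this early surfaces sources whose owners cannot meet the required frequency before DP work begins.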

    Best practice 2: Identify tools for data analysis and preparation

    It will be necessary to load data sources into DP tools so the data can be analyzed and manipulated. It’s important to get the data into an environment where it can be closely examined and readied for the next steps.

    Best practice 3: Profile data for potential and selected source data

    This is a vital (but often discounted) step in DP. A project must analyze source data before it can be properly prepared for downstream consumption. Beyond simple visual examination, you need to profile data, detect outliers, and find null values (and other unwanted data) among sources.
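    A first pass at this profiling step can be as simple as counting nulls and flagging z-score outliers per numeric column. This is a toy sketch of the idea, not a substitute for a real profiling tool:

```python
from statistics import mean, stdev

def profile_column(values, z_threshold=3.0):
    """Profile one numeric column: null count and simple z-score outliers."""
    nulls = sum(1 for v in values if v is None)
    nums = [v for v in values if v is not None]
    if len(nums) < 2:
        return {"nulls": nulls, "outliers": []}
    mu, sigma = mean(nums), stdev(nums)
    # Guard against sigma == 0 (constant column) before dividing
    outliers = [v for v in nums if sigma and abs(v - mu) / sigma > z_threshold]
    return {"nulls": nulls, "outliers": outliers}
```

    Profiling every candidate column this way gives an early, cheap signal of which sources are worth keeping.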

    The primary purpose of this profiling analysis is to decide which data sources are even worth including in your project. As data warehouse guru Ralph Kimball writes in his book, The Data Warehouse Toolkit, 'Early disqualification of a data source is a responsible step that can earn you respect from the rest of the team'.

    Best practice 4: Cleanse and screen source data

    Based on your knowledge of the end business analytics goal, experiment with different data cleansing strategies that will get the relevant data into a usable format. Start with a small, statistically valid sample to iteratively experiment with different data prep strategies, refine your record filters, and discuss the results with business stakeholders.

    Once you discover what seems to be a good DP approach, take time to rethink the subset of data you really need to meet the business objective. Running your data prep rules on the entire data set will be very time consuming, so think critically with business stakeholders about which entities and attributes you do and don't need and which records you can safely filter out.
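    The sampling approach described above might look like this in practice. The fraction and minimum size are illustrative defaults, and the fixed seed makes each experiment reproducible:

```python
import random

def prep_sample(records, fraction=0.05, seed=42, min_size=100):
    """Draw a reproducible random sample for iterating on data prep rules.

    Keeps at least `min_size` records (but never more than exist), so small
    data sets still yield a workable sample.
    """
    k = max(min_size, int(len(records) * fraction))
    k = min(k, len(records))
    rng = random.Random(seed)  # fixed seed -> identical sample on every run
    return rng.sample(records, k)
```

    Because the seed is fixed, every iteration of a cleansing rule runs against the same records, which makes before/after comparisons with stakeholders meaningful.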

    Final thoughts

    Proper and thorough data preparation, conducted from the start of an AI/BI project, leads to faster, more efficient AI and BI down the line. DP steps and processes outlined here apply to whatever technical setup you are using, and they will get you better results.

    Note that DP is not a 'do once and forget' task. Data is constantly generated from multiple sources that may change over time, and the context of your business decisions will certainly change over time. Partnering with data preparation solution providers is an important consideration for the long-term capability of your DP infrastructure.

    Author: Wayne Yaddow

    Source: TDWI

  • Drawing value from data with BI: Data Discovery

    Drawing value from data with BI: Data Discovery

    'We are drowning in information but starved for knowledge', according to best-selling author John Naisbitt. Today’s businesses have the ability to collect an extraordinary amount of information on everything from customer buying patterns and feedback to supply chain management and marketing efforts. Are you drawing value from your data?

    It is nearly impossible to draw value from the massive amount of data your business collects without a data discovery system in place. So, what is data discovery?

    Data discovery

    Data discovery is a term related to business intelligence technology. It is the process of collecting data from your various databases and silos, and consolidating it into a single source that can be easily and instantly evaluated. Once your raw data is converted, you can follow your train of thought by drilling down into the data with just a few clicks. Once a trend is identified, the software empowers you to unearth the contributing factors.

    For instance, BI enables you to explore the data by region, employee, product type, and more. In a matter of seconds, you have access to actionable insights to make rapid, fact-based decisions in response to your discoveries. Without BI, discovering a trend is usually a matter of coincidence.

    With data discovery, the user searches for specific items or patterns in a data set. Visual tools make the process fun, easy-to-use, swift, and intuitive. Visualization of data now goes beyond traditional static reports. BI visualizations have expanded to include geographical maps, pivot-tables, heat maps, and more, giving you the ability to create high-fidelity presentations of your discoveries.

    Discover trends you did not know were there

    With data discovery, executives are often shocked to discover trends they didn’t know were there. Michael Smith of the Johnston Corporation had this to say after implementing BI:

    'Five minutes into the demo, I had found items that didn't have the margin I was expecting, customers that didn't have the profitability I was expecting and vendors that weren't performing the way I expected. I realised that we were onto something that would be very impactful to our business'.

    These insights allow companies to spot unfavourable trends before they become a problem and take action to avoid losses.

    Take action now

    Many of the most successful companies today are using BI to inform their strategies and day-to-day operations. With relevant insights, a company can now make the most knowledgeable decisions about effective (digital) strategies to acquire, serve, and retain valuable customers. Having a mountain of data is useless unless it is converted into meaningful information. The ability to discover the truth behind your data will go a long way to guarantee your company achieves and maintains its competitive edge.

    Source: Phocas Software

  • Five Mistakes That Can Kill Analytics Projects

    Launching an effective digital analytics strategy is a must-do to understand your customers. But many organizations are still trying to figure out how to get business value from expensive analytics programs. Here are 5 common analytics mistakes that can kill any predictive analytics effort.

    Why predictive analytics projects fail

    Predictive Analytics is becoming the next big buzzword in the industry. But according to Mike Le, co-founder and chief operating officer at CB/I Digital in New York, implementing an effective digital analytics strategy has proven to be very challenging for many organizations. “First, the knowledge and expertise required to setup and analyze digital analytics programs is complicated,” Le notes. “Second, the investment for the tools and such required expertise could be high. Third, many clients see unclear returns from such analytics programs. Learning to avoid common analytics mistakes will help you save a lot of resources to focus on core metrics and factors that can drive your business ahead.” Here are 5 common mistakes that Le says cause many predictive analytics projects to fail.

    Mistake 1: Starting digital analytics without a goal

    “The first challenge of digital analytics is knowing what metrics to track, and what value to get out of them,” Le says. “As a result, we see too many web businesses that don’t have basic conversion tracking set up, or can’t link the business results with the factors that drive those results. This problem happens because these companies don’t set a specific goal for their analytics. When you do not know what to ask, you cannot know what you'll get. The purpose of analytics is to understand and to optimize. Every analytics program should answer specific business questions and concerns. If your goal is to maximize online sales, naturally you’ll want to track the order volume, cost-per-order, conversion rate and average order value. If you want to optimize your digital product, you’ll want to track how users interact with your product, the usage frequency and the churn rate of people leaving the site. When you know your goal, the path becomes clear.”

    Mistake 2: Ignoring core metrics to chase noise

    “When you have advanced analytics tools and strong computational power, it’s tempting to capture every data point possible to ‘get a better understanding’ and ‘make the most of the tool,’” Le explains. “However, following too many metrics may dilute your focus on the core metrics that reveal the pressing needs of the business. I've seen digital campaigns that fail to convert new users, but the managers still set up advanced tracking programs to understand user behaviors in order to serve them better. When you cannot acquire new users, your targeting could be wrong, your messaging could be wrong, or there may even be no market for your product - those problems are much bigger to solve than trying to understand user engagement. Therefore, it would be a waste of time and resources to chase fancy data and insights while the fundamental metrics are overlooked. Make sure you always stay focused on the most important business metrics before looking broader.”

    Mistake 3: Choosing overkill analytics tools

    “When selecting analytics tools, many clients tend to believe that more advanced and expensive tools can give deeper insights and solve their problems better,” Le says. “Advanced analytics tools may offer more sophisticated analytic capabilities over some fundamental tracking tools. But whether your business needs all those capabilities is a different story. That's why the decision to select an analytics tool should be based on your analytics goals and business needs, not by how advanced the tools are. There’s no need to invest a lot of money on big analytics tools and a team of experts for an analytics program while some advanced features of free tools like Google Analytics can already give you the answers you need.”

    Mistake 4: Creating beautiful reports with little business value

    “Many times you see reports that simply present a bunch of numbers exported from tools, or state some ‘insights’ that have little relevance to the business goal,” Le notes. “This problem is so common in the analytics world, because a lot of people create reports for the sake of reporting. They don’t think about why those reports should exist, what questions they answer and how those reports can add value to the business. Any report must be created to answer a business concern. Any metrics that do not help answer business questions should be left out. Making sense of data is hard. Asking the right questions early will help.”

    Mistake 5: Failing to detect tracking errors

    “Tracking errors can be devastating to businesses, because they produce unreliable data and misleading analysis,” Le cautions. “But many companies do not have the skills to set up tracking properly, and worse, to detect tracking issues when they happen. There are many things that can go wrong, such as a developer mistakenly removing the tracking pixels, transferring incorrect values, the tracking code firing unstably or multiple times, incorrect tracking-rule logic, etc. The difference could be so subtle that the reports look normal, or are only wrong in certain scenarios. Tracking errors easily go undetected because detecting them takes a mix of marketing and tech skills. Marketing teams usually don’t understand how tracking works, and development teams often don’t know what ‘correct’ means. To tackle this problem, you should frequently check your data accuracy and look for unusual signs in reports. Analysts should take an extra step to learn the technical aspect of tracking, so they can better sense the problems and raise smart questions for the technical team when the data looks suspicious.”
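    One practical way to frequently check data accuracy, as Le suggests, is to reconcile analytics-reported totals against a backend system of record. A minimal sketch, with hypothetical metric names; a double-firing pixel, for example, shows up as a large gap on order counts:

```python
def tracking_sanity_checks(analytics, backend, tolerance=0.05):
    """Compare analytics-reported totals against the backend system of record.

    Returns human-readable issues for metrics whose relative gap exceeds
    `tolerance`, or which are missing from the backend entirely.
    """
    issues = []
    for metric, reported in analytics.items():
        actual = backend.get(metric)
        if actual is None:
            issues.append(f"{metric}: missing from backend")
            continue
        if actual == 0:
            if reported != 0:
                issues.append(f"{metric}: backend zero but analytics reports {reported}")
            continue
        gap = abs(reported - actual) / actual
        if gap > tolerance:
            issues.append(f"{metric}: {gap:.1%} gap (analytics {reported} vs backend {actual})")
    return issues
```

    Running a check like this on a schedule turns subtle tracking bugs into visible alerts instead of quietly corrupted reports.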

    Author: Mike Le

    Source: Information Management

  • Graph Analytics keeps growing in popularity and possibilities

    Graph Analytics keeps growing in popularity and possibilities

    Graph continues to be the fastest growing segment of data management. The benefit: deeper, real-time insights on data, enabling better business outcomes. A number of graph solution providers are continuing to innovate by taking their technology to the cloud. Specifically, we’re seeing enterprise-class, pay-as-you-go graph analytics solutions in the cloud based on Amazon Web Services.

    Take TigerGraph for example. This company offers a scalable graph database for the enterprise, and is accelerating its cloud strategy with the availability of its platform as a pay-as-you-go offering on Amazon Web Services (AWS). This move broadens the company’s global reach by providing AWS users with instant access to the world’s fastest and most scalable graph database.

    TigerGraph also achieved advanced partner status in the AWS Partner Network (APN). To obtain this status, TigerGraph’s platform passed AWS’ stringent technical certification process. In addition, TigerGraph had to validate its expertise through a wide range of enterprise references that demonstrated strong customer value. TigerGraph’s customers run applications in AWS for some of the largest brands in the world across financial services, healthcare, and retail.

    With the launch of the pay-as-you-go offering, customers will be able to enjoy a friction-free experience for using TigerGraph’s fast and powerful graph database, without cumbersome acquisition and deployment processes. AWS users can now get an Amazon Machine Image (AMI), which includes the TigerGraph DB and GraphStudio SDK. TigerGraph’s launch in the AWS Marketplace provides an easy-to-use and powerful cloud-based solution with fast deployment and pay-as-you-go pricing.

    One of the key benefits of using the new TigerGraph AMI on AWS is the ease of getting started on the highly performant and scalable platform without having to manage underlying infrastructure. The platform makes it possible for anyone to quickly load data, select a graph algorithm from TigerGraph’s library and explore graph analytics within minutes. With TigerGraph’s speed and performance, users have the ability to go 10 or more levels of connection deep into their data and to compute and reveal insights based on multi-dimensional criteria in real time.

    Author: Daniel Gutierrez

    Source: Insidebigdata

  • The CIO's remit will also include customer experience

    The CIO's remit will also include customer experience

    The role of the CIO is changing, as we all know. But increasingly, the technology leader is also taking the lead in optimizing the customer experience.

    IT is becoming ever more complex and the role of the CIO is constantly changing; everyone agrees on that. But what does the CIO of the future look like? The future CIO is the undisputed leader of digital transformation and a driver of innovation and growth, because the CIO of the future is well equipped to connect technical expertise with organizational skills. But did you know that the CIO is also the person who is, or should be, responsible for the customer experience?

    CIO: the trusted operator and business cocreator

    Research by Deloitte shows that the role of the CIO is changing in several ways. The study distinguishes between different roles the CIO can play. The trusted operator is the CIO who focuses on efficiency, reliability, and cost. He or she delivers supporting technologies and aligns with the business strategy. Another role is the business cocreator, who spends most of his or her time steering the business strategy and enabling change, with the goal of executing the strategy effectively.

    Customer experience falls within the CIO's remit

    At Salesforce, they increasingly help organizations that expect their CIO to transform business processes such as the customer experience. Naturally, he or she is responsible for building (or commissioning) back-end systems, but the CIO will also have to streamline business processes through technology. Looking at Deloitte's research, you can conclude that optimizing the customer experience is increasingly a task for the CIO, since it links to business processes and thus to the role of the trusted operator.

    A good example comes from KONE, where the CIO is responsible for making the production and maintenance of escalators, elevators, and moving walkways ever smarter with the help of IoT and real-time technology. The system can detect malfunctions and automatically issue a work order, so a problem can be fixed before the customer even notices that something is wrong. In addition, KONE's field service has access to accurate, up-to-date information and can use these insights to respond better to customer needs. As a result, the company can provide proactive, better service, which improves the customer experience and yields more satisfied customers.

    The ultimate customer experience

    The ultimate customer experience is based on deep and broad customer insights; a unified customer view is essential for this. In many organizations, each department collects its own data and forms its own insights about the customer. The problem, however, is that without a customer data platform this data is fragmented and cannot be fully exploited. When systems and data sources do not work together, one department cannot access the valuable insights of another, insights that could help serve the customer even better. It is up to the CIO to integrate all the different data sources, so that every department has all customer information available at any moment and can offer a better customer experience. This enables a salesperson, for example, to identify cross-selling and up-selling opportunities based on each customer's history.

    When all customer data is available, it can be analyzed and recommendations can be made based on it. Using previously purchased products, for example, you can find out whether, and what, someone is most likely to buy next. This in turn enables marketing to deliver the right message to the right customer at the right moment, so that they convert faster. That is the CIO's important role within the customer experience, in which the CIO exercises and combines both the trusted operator and the business cocreator roles. So, CIO of the future: make sure you are both the trusted operator and the business cocreator, so that you can optimize the customer experience from start to finish.

    Author: Onno Tjeerdsma

    Source: CIO

  • How data analytics is affecting the insurance industry

    How data analytics is affecting the insurance industry

    Data analytics in the insurance industry is transforming the way insurance businesses operate. Here's why that is important.

    Technology has had a profound impact on the insurance industry recently. Insurers are relying heavily on big data as the number of insurance policyholders grows. Big data analytics can help solve a lot of the data issues insurance companies face, but the process is a bit daunting. It can be challenging for insurance companies that have not yet adjusted to it.

    Effect of big data analytics on customer loyalty

    One of the reasons why some insurance companies get more customers than others is that they can provide the things their customers need. The more they deliver what customers expect, the more loyalty customers give in return.

    Instead of managing one policy from their insurer at a time, customers may get all of their insurance policies in a single, central dashboard. Even if people obtain an anonymous car insurance quote from a different company that is lower than others, they will still stick with a company they are fiercely loyal to. This means insurers need to consider other factors, such as whether they have been unfairly prejudicing customers based on characteristics like gender or race. Big data may be able to help address this.

    Big data analytics can be very useful in acquiring all of the necessary data in a short amount of time. This means that insurance companies will know what their customers want and will offer these wants immediately. Insurance companies will also have the ability to provide personalized plans depending on their customer’s needs.

    Big data analytics in fraud cases

    One of the biggest issues insurance companies face nowadays is fraud. According to industry findings, 1 out of 10 claims is fraudulently filed. This is an alarming rate, especially given the number of policyholders an insurance company may have. Some consumers filing fraudulent claims do so sloppily, which makes it easier for the company to seek restitution and prosecute the offenders before they can drive up premiums for other drivers. Others are meticulous and get away with it.

    With big data analytics, a large amount of data can be checked in a short amount of time. This includes a variety of big data solutions, such as social network analysis and telematics. This is the biggest weapon insurers have against insurance fraud.
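    At its simplest, claim screening of the kind described here combines a handful of red-flag rules into a risk score. The fields and weights below are purely illustrative; real insurers combine far more signals, including network analysis and telematics data:

```python
def fraud_score(claim):
    """Toy rule-based risk score for an insurance claim (higher = riskier)."""
    score = 0
    # Claim filed very soon after taking out the policy
    if claim.get("days_since_policy_start", 9999) < 30:
        score += 3
    # Unusually frequent claimant
    if claim.get("prior_claims", 0) >= 3:
        score += 2
    # Claim amount far above what is typical for this kind of claim
    if claim.get("amount", 0) > 10 * claim.get("typical_amount", claim.get("amount", 1)):
        score += 3
    # No police report filed where one would be expected
    if claim.get("police_report") is False:
        score += 1
    return score
```

    High-scoring claims would then be routed to human investigators, while low-scoring ones could be settled automatically.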

    Subrogation

    Subrogation cases require and generate a large amount of data, which can come from police records, medical records, and even case notes. Through big data analytics, it is possible to extract the phrases that indicate a case under investigation is a subrogation case.
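    As an illustration, phrase extraction over claim notes can be as simple as a keyword scan. The phrase list below is invented for the sketch, not an industry lexicon; real systems use natural language processing rather than literal matching.

```python
# Hypothetical sketch: scan free-text claim notes for phrases that suggest
# subrogation potential (i.e., a third party may be liable).
SUBROGATION_PHRASES = [
    "other driver at fault",
    "third party liable",
    "rear-ended by",
    "defective product",
]

def is_subrogation_candidate(note):
    note = note.lower()
    return any(phrase in note for phrase in SUBROGATION_PHRASES)

notes = [
    "Insured states she was rear-ended by a delivery van at a red light.",
    "Hail damage to roof, no other parties involved.",
]

candidates = [n for n in notes if is_subrogation_candidate(n)]
print(len(candidates))  # 1
```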

    Settlement cases

    Many customers complain that lawsuit settlements take a long time, because a lot of analysis needs to be done. With the use of big data analytics, claims can be settled far more quickly. It also becomes possible to check and analyze the history of the claims and the claims history of each customer. This can help reduce labor costs, as employees do not have to put all of their time into checking and finalizing each piece of data regarding a claim. It also gets payouts to customers faster, which means that customer satisfaction will greatly increase.

    Checking more complex cases

    Some people obtain an anonymous car insurance quote and take out a policy purely in order to file claims and extract money from the insurance company. Obviously fraudulent cases, like clearly authentic ones, can be analyzed immediately with big data analytics. Yet some cases are so complex that it takes a lot of checking to see whether the data received coincides with what the customer claims. Big data analytics uses data mining techniques that allow the various claims to be categorized and scored according to their importance. Some tools will even allow the claims to be settled accordingly.

    Some common issues in using big data analytics

    It is always important for insurance companies to consider both the benefits and the drawbacks of using analytics. Some of the benefits have been covered above. Here are some concerns you need to be familiar with:

    • You still need to use multiple tools to process the data, which can be problematic, as data may get lost along the way.
    • Hiring too many data analysts when a few would be enough.
    • Not unifying the gathered information.

    Take note of these issues so that they can be avoided.

    With all of the things that big data analytics can do, it is not surprising that a lot of insurance companies will need to start using it soon. It can be integrated little by little so that it will not be too overwhelming for everyone involved. The sooner this can be done, the better, not only for the customers but for the insurance company as a whole.

    Big data will address countless insurance industry challenges

    The insurance industry is more dependent on big data than many other sectors. Their entire business model is built around actuarial analyses. As a result, they will need to rely on big data to solve many of the challenges that have plagued them for years. Big data will also help them fight fraud and process lawsuit settlements more quickly.

    Author: Diana Hope

    Source: Smart Data Collective

  • How the data-based gig economy affects all markets

    How the data-based gig economy affects all markets

    Data is infinite. Any organization that wants to grow at a meaningful pace would be wise to learn how to leverage the vast amount of data available to drive growth. Just ask the top five companies in the world today: Apple, Amazon, Google, Facebook, and Microsoft. All these technology giants either process or produce data.

    Companies like these with massive stockpiles of data often find themselves surrounded by other businesses that use that data to operate. Salesforce is a great example: each year at its Dreamforce conference in San Francisco, hundreds of thousands of attendees and millions of viewers worldwide prove just how many jobs the platform has created.

    Other companies are using vast amounts of information from associated companies to enhance their own data or to provide solutions for their clients to do so. When Microsoft acquired LinkedIn, for instance, it acquired 500 million user profiles and all of the data that each profile had generated on the platform, all ripe for analysis.

    With so much growth evolving from a seemingly infinite ocean of data, tomorrow’s leading companies will be those that understand how to capture, connect, and turn information into actionable insight. Unless they are already among the world’s ten largest organizations, most companies face a shortage of the highly skilled talent needed to do this. Enter the data scientist.

    More data, more analysts

    The sheer amount of data at our fingertips isn’t the only thing that’s growing. According to an Evans Data report, more than 6 million developers across the world are officially involved in analyzing big data. Even traditionally brick-and-mortar retail giant Walmart plans to hire 2,000 tech experts, including data scientists, for that specific purpose.

    Companies old and new learned long ago that data analysis is vital to understanding customers’ behavior. Sophisticated data analytics can reveal when customers are likely to buy certain products and what marketing methods would be effective in certain subgroups of their customer base.

    Outside of traditional corporations, companies in the gig economy are relying even more on data to utilize their resources and workforce more efficiently. For example, Uber deploys real-time user data to determine how many drivers are on the road at any given time, where more drivers are needed, and when to enact a surge charge to attract more drivers.
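    The supply-and-demand logic behind surge pricing can be sketched in a few lines. The thresholds and multipliers below are invented for illustration; Uber’s actual algorithm is proprietary and far more sophisticated.

```python
# Hypothetical sketch of supply/demand-based surge pricing.
def surge_multiplier(ride_requests, available_drivers):
    """Raise the price multiplier as demand outstrips driver supply."""
    if available_drivers == 0:
        return 3.0                 # cap the surge when no drivers are free
    ratio = ride_requests / available_drivers
    if ratio <= 1.0:
        return 1.0                 # enough drivers: no surge
    return min(3.0, round(1.0 + 0.5 * (ratio - 1.0), 2))

print(surge_multiplier(80, 100))   # 1.0 (supply exceeds demand)
print(surge_multiplier(200, 100))  # 1.5 (demand is double the supply)
```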

    Data scientists are in demand and being hired by the thousands. Some of the most skilled data scientists are going the freelance route because their expertise allows them to choose more flexible work styles. But how can data scientists who aren’t interested in becoming full-time, in-house hires ensure that the companies for which they freelance are ready for their help?

    The data-based gig economy

    Gartner reports that the number of freelance data scientists will grow five times faster than that of traditionally employed ones by next year. The data-based gig economy can offer access to top talent on flexible schedules. But before data scientists sign on for a project, they should check to see that companies are prepared in the following areas:

    • Companies need to understand their data before they decide what to do with it. That data could include inventory, peak store hours, customer data, or other business health metrics.
    • Next, businesses should have streamlined the way they collect and store their data to make it easy to analyze. Use of a CRM platform is a good indicator of preparedness at this stage.
    • Finally, companies need to be able to act on the insights they glean. After freelancers are able to use organizations’ collected and organized data to find valuable connections and actionable insights, those organizations should have a process for implementing the discoveries.

    Today’s organizations need data in order to be successful, and they need data scientists to make use of that data. In order for both parties to thrive in this era, companies need to have the right strategies in place before they invest in freelance talent. When they do, freelance data scientists will have the opportunity to gather critical knowledge from the data and use their talents to drive innovation and success.

    Author: Marcus Sawyerr

    Source: Insidebigdata

  • Key components of developing the right data strategy

    Key components of developing the right data strategy

    What does your company do? 

    That was a trick question. It doesn’t matter what you think your company does, it’s going to have to turn into a data company soon, if it hasn’t started already, in addition to continuing to provide your core product or service. This may sound daunting, but it’s a good challenge to have and one that will ultimately improve your offering, delight your customers, increase stickiness and adoption, and keep you competitive in a changing data landscape. 

    In this article you will read a brief overview of a data strategy's key components: what a data strategy has to encompass, vital considerations when dealing with data, and who the main players are when it comes to executing your data strategy.

    Data strategies for the uninitiated

    First off, 'So, what even is a data strategy anyway?' Everyone knows that data is important for organizations to make money, but just having a bunch of data is useless without a data strategy. A data strategy deals with all aspects of your data: where it comes from, where it’s stored, how you interact with it, who gets to see what, and who is ultimately in charge of it. This sounds like a tall order and you may be thinking 'Oh man! Is that my job?' Depending on your company’s level of data maturity, it might not be any one person or department’s job (yet). But you do need to start coming up with answers to all of these tough questions.

    “Everybody is going to assume that somebody else is taking care of the data, and the result is, nobody does.” - Charles Holive, Managing Director for Sisense’s Strategy Consulting Business.

    That’s a bad situation, and you definitely need to know who’s in charge of what data. However, one of the first questions you need to answer as you build your strategy is 'So, what do we want to do with all this data? Why? And how will this make us money/delight our customers?' Those answers ultimately have to come from the business unit that has the idea for making money/delighting customers in the first place: 'Internal data is owned by the function that creates it. It all sits within IT, but sales should own sales data, marketing should own the marketing data…' 

    These departments should also own the efforts to use that data to create new revenue, engagement, etc. A common misconception about data strategies is that they should be all-encompassing, top-down initiatives handed down by an all-seeing, all-knowing Chief Data Officer (more on this later). In reality, you can, and should, build your strategy piece by piece, with the process driven by the areas that have the data in the first place. Whatever the initiative is (surfacing user data to inform them about their buying habits, etc.), the department with the data and the idea for using it should drive it. This increases ownership within the department and prevents the 'whose job is this?' question.

    Diversifying your data

    Once you’ve got your initiative in mind, it’s important to think about what data you need for it. The two main kinds of data your company has will be the data you generate and own and the data your customers generate, which you are only the custodians of (they own it). Whatever you plan on doing with data, this is the time to make sure that you are legally within your rights (consult your company’s legal department, counsel, etc.) and make sure that your user agreement contracts are properly worded to allow you to do what you want with the data you have. 

    There’s a third type of data your company can and should be thinking about for your data projects: third-party data, which can be used to add context to your datasets. More and more companies want to augment the context of their data. In healthcare, for instance, a hospital only has access to about 2% of the data on its patients, which is created while they are physically in the hospital. The other 98% is generated everywhere else: patients' eating habits, buying habits, and more, some of which could be useful in providing better care.

    As the outlook on data shifts from a company-centric to an ecosystem-spanning view, more and more companies will buy, sell, trade, and partner with other companies for access to the data they want and need to augment their datasets, deliver more value, and maintain a dominant position in their industries.

    Key players for implementing your data strategy

    Now that you know where the data strategy starts, who’s responsible for implementing it at the department level, and how to safely and responsibly use the data you’ve got, it’s time to talk about the key players within your organization who will help keep everything running smoothly. These are the business unit stakeholders, data professionals pulling the data together, and maybe the Chief Data Officer if your organization has one. The first one, we already covered: whoever came up with the idea for how to use your data (and whatever data you can get access to) should own the execution of that plan.

    They’ll need support from your company’s data experts: the IT department and data engineers (if you have them). These folks will walk the team executing the plan through the specifics of where the data is and how to access it. Additionally, they’ll make sure that the company has the analytics platform needed to pull it all together and present meaningful insights to your users. They may even be instrumental, along with product team members, in helping create embedded analytics that will live right inside your product or service.

    Lastly, we should discuss the Chief Data Officer (CDO). As previously discussed, this person is not the be-all-end-all of your data strategy. Many businesses, right now, may not even have a CDO, but when you do get one, they will wear a lot of hats within the organization. Their first job will be to look at all the data your company has and how it’s all being used and make sure that the processes in place make sense and are working. They will also check in with legal and make sure that data is being used in a way that’s compliant and that all user agreements are properly worded to protect users and the company. The CDO will also look for ways to augment your data holdings (through buying, partnering, etc.) to keep expanding the ways your company can use data to increase revenue. 

    Data strategies and culture

    A final, vital aspect of the CDO’s role is a cultural one: they have to assess the organization and make sure that everyone using data has a mindset that prioritizes the security of the data, but also the opportunity that it represents for the company. Every company is becoming a data company and the financial incentives are too huge to ignore: 'The market for monetizing data and insights is getting so big. Depending on what you read, it’s between 20 and 36 billion dollars over the next three or four years'.

    Business teams need to understand this and be serious about getting the most out of their data. Dragging your feet or being half-hearted about it will not do: 'If someone says ‘the way I’ve made money before is the way I will make money tomorrow,’ I say ‘well, I’m not going to invest in your company.’ I know five years from now, someone’s going to get to your data and create much more value than you do with your transactions'. 

    Encouraging a culture of experimentation is key to finding new ways to use data to drive revenue and keep your company competitive. Charles suggested finding ways to make building new apps and projects with data as easy as possible, so that people across the company can build quickly and fail quickly, to find their way to solutions that will ultimately pay off for users and the company. 

    What will your company do?

    By now your head is probably spinning with all the potential challenges and opportunities of your data strategy (whether you had one when you started reading this article or not). If your team isn’t doing stuff with data right now, start asking the hard questions as to why that is and how you can change it. If your company doesn’t have the tools to build the analytics functionality you need, figure out how to get them. Whatever you have in your imagination, start building it. If you don’t, someone else will. 

    Author: Jack Cieslak

    Source: Sisense

  • Self-service reporting as part of a company-wide data analytics solution

    Self-service reporting as part of a company-wide data analytics solution

    In today’s fast-paced business environment, it can be difficult to predict and prepare for the future. This is why many companies are arming themselves with on-demand reporting. Self-service reporting allows users to produce reports and visualizations on-the-go.

    Whether you want to stay ahead of your competition, increase profits or improve performance, a quality data analytics solution is a must. The following three indicators strongly suggest that you are ready to implement a data analytics solution that provides self-service reporting:

    1. Reports lead to more reports

    Traditional reporting is often frustrating and time-consuming. Waiting multiple days for IT to generate a report frequently results in outdated information, further delaying decision-making. In addition, IT reports are static. They may answer some high-level summary questions but lack the ability to answer follow-up questions at a granular level. When more information is needed, you find it necessary to go back to IT and request additional reports.

    Self-service data analytics enables anyone, even non-technical users, to access, query, and generate reports on demand, such as during business meetings. The nature of dynamic reporting means that if more information is needed, users can quickly drill down for more detail.

    2. Desire for visual charts

    Would visualizations help you present complex data with better clarity and efficiency? A graphical depiction of numbers presents your data in a way that people can easily digest, retain, and recall. Like a well-told story, a visualization allows you to set the scene and explain what happened, why it matters and how users can turn it into something actionable.

    With clear visualizations, it is easier to guide stakeholders from a broad overview down to the granular details. Graphic representations of data make it easy to see existing patterns and project future trends, which can help drive decision-making. Depending on your needs, visualizations might be as simple as bar charts, pie charts, and maps, or more complex models such as waterfalls, funnels, and gauges. Whatever the case, users should be able to build a customized dashboard to fit any scenario.

    3. People in the company are already doing it

    You may know some colleagues who are already using analytics at work. Many department heads and top employees understand that the IT department is stretched, yet they have important projects to deliver. As such, they may have already adopted an easy-to-use analytics solution for their personal use. Ask around or take notice at your next business meeting and you are likely to find resourceful employees who are already using self-service analytics to quickly make informed decisions.

    A study by the Harvard Business Review revealed that 'high-performing, customer-oriented workforces' have a high prevalence of employees embracing the 'bring your own technology' idea. For instance, American big-box retailer Walmart realized that employees were using their phones to help them at work. Consequently, Walmart embraced this trend by creating an employee app so staff can help customers locate items within the store. Implementing a company-wide data analytics solution may therefore not be difficult at all; you may already have many users and advocates.

    Source: Phocas Software

  • Software picks the best job applicant

    Interviewing job applicants is a waste of time. Anyone with enough historical data and the right computational models can distill with precision from a stack of CVs who is best suited for a given vacancy. Better still: with enough data, a recruitment specialist can predict how good someone will be at a job without ever having seen that person.

    A sophisticated computational model

    For most companies the above is a distant future scenario, but the technology already exists, argues researcher Colin Lee in his dissertation. He received his PhD this month from the Rotterdam School of Management (Erasmus University) for research in which he used a sophisticated computational model to analyze patterns in more than 440,000 existing CVs and job applications. The model turns out to predict with 70% accuracy who will actually be invited to an interview, based on factors such as work experience, education level, and skills.

    Intuition

    'Important predictors are the relevance of the work experience and the number of years of service. You can combine those in a formula and thus determine the best match,' says Lee. Although work experience is decisive, recruiters are otherwise not very consistent in what they let tip the scales, he concludes from the patterns. 'We can recognize a common thread, but a lot seems to happen on the basis of intuition.'
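    The kind of model described here, predicting whether a CV leads to an interview invitation from features such as relevance of work experience and years of service, can be sketched as a logistic model. The coefficients below are invented for illustration and are not taken from Lee's research.

```python
import math

def invite_probability(relevance, years_experience):
    """Toy logistic model: P(invited to interview | CV features).

    relevance is assumed to be a 0-1 match score against the vacancy;
    the weights are illustrative, not fitted to real application data.
    """
    z = -3.0 + 4.0 * relevance + 0.2 * years_experience
    return 1 / (1 + math.exp(-z))

strong = invite_probability(relevance=0.9, years_experience=8)  # ~0.90
weak = invite_probability(relevance=0.2, years_experience=1)    # ~0.12
print(strong > 0.5, weak > 0.5)  # True False
```

    A real system would fit these weights to historical application outcomes, which is essentially what Lee's analysis of 440,000 CVs did at much larger scale.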

    Suspicion

    While Dutch companies are wary of giving the analysis of 'big data' a central role in recruitment and selection, that practice has been commonplace in Silicon Valley for years. Pioneers such as Google base their hiring policy first and foremost on hard data and algorithms, built on successful hires from the past. 'Companies are often extremely bad at recruiting and interviewing people. They go on gut feeling and unfounded theories,' said Google's head of human resources Laszlo Bock in an interview with the FD last year.

    Can a company use data alone to find its way to the perfect candidate? In the Netherlands there is considerable suspicion, and not only about the still unproven technology. Ethical questions also play a role, says Lee. 'The future is that you can calculate exactly how someone will perform based on the parameters in their CV. That is frightening, because you write people off in advance.'

    The optimal match

    Recruitment software has long been applied in less extreme forms, for example by large staffing agencies such as Randstad, USG, and Adecco. They use special software to make a first preselection from hundreds, or even thousands, of CVs. This is done with so-called applicant tracking systems (ATS): filters that use both public data on social media and clients' internal databases to recruit, or to determine whether an employee is the optimal 'match' in his current role.

    'Often we can see better than the company itself whether everyone within that company is working to their full potential,' says Jan van Goch of Connexys, a maker of recruitment software. According to him, the main barrier to further development of these kinds of applications is not so much the technology as clients' fear of privacy breaches and liability. They often sit on mountains of valuable historical information about their applicants, but refuse to unlock it for use in larger databases.

    Legislation

    Van Goch: 'If all that information came together, we could match and recruit much more intelligently. Clients want that, but they do not always give permission to use their own data, so they keep sitting on it, and that is a terrible shame. Some are afraid of being sued the moment the data ends up on the street, all the more since the legislation on data storage has been tightened.'

    Source: FD

  • Starting a BI project in 4 simple steps

    Starting a BI project in 4 simple steps

    What would it mean to you and your enterprise, if you could start getting useful business insights from your data in literally five days or less, using four simple steps?

    As exciting as this seems, it’s actually just what a good business intelligence platform should be able to do for you. While BI projects can be short term or long term, straightforward or sophisticated, they should all bring actionable results as soon as possible. Business moves fast nowadays, and there isn’t enough time for months of preparation, data modeling, IT platform planning, management decisions, and implementation.

    Fortunately, these four clear, do-able steps will allow you to publish your first BI dashboard in five days, keeping up with the pace of your business without needing specialist help or extensive resources.

    STEP 1: Map out your BI project with small, practical milestones (half-a-day)

    Why do certain BI projects fail? Often because they try to bite off more than they can chew. Start off by focusing on one insight of value, and your BI project can already be a success in just days. Afterwards, there will be plenty of opportunities to derive further insights, making sure each additional step brings you a measurable benefit.

    So, let’s begin! Here’s how to do step one:

    • Start with a standard business process you want to understand better or improve
    • Keep data sources few at first, with just 2-3 reports that hold the answers
    • Get an initial, useful result, before iterating to go deeper or wider into your business processes

    This also means using a business intelligence system that lets you start simply, and then scale to any level of BI that makes sense for your organization.

    Allowing half a day for step one, your BI project map will then look like the following steps for the rest of the week (the 4.5 days left):

    • Business planning to define useful questions to answer (step two, below)
    • Setting up your data model to bring your data sources together properly (step three)
    • Designing and publishing a dashboard to display the results (step four)

    Remember that as you progress with your BI projects, your BI tool should let you go beyond just automating any manual business reporting you are doing currently (Excel spreadsheets included). A little business rethinking may show you even more important questions to answer, for which your BI tool will then become even more valuable. That’s when you start reaching beyond the realm of standard reports and into the realm of BI.

    STEP 2: Collect requirements (half-a-day)

    To get your first successful BI project off the ground in five days, requirements should be modest. On the other hand, business, result, and technical requirements should be stated clearly and precisely enough to keep your BI project on track for success:

    • Business requirement: state the question that is to be answered. For example, 'what are the trends in the monthly revenues of the organization?' Or, 'which product lines can use more marketing budget to generate higher profits?'
    • Result requirement: decide how a result from the BI system should be displayed or communicated, so that the business teams involved can understand and act on it as quickly and as easily as possible
    • Technical requirement: what hardware and software will be needed for the BI project? If you can use standard PC hardware, for instance, you can meet technical requirements that much more easily. Sisense, for example, both runs and scales on a standard PC, handling up to terabytes or billions of rows of data with full BI functionality quickly and efficiently, as needed.

    STEP 3: Select and compile your data sources (2 days)

    Business intelligence needs inputs of data to produce outputs of results and business insights. Data can come from many different sources (some BI tools have built-in data connectors that make it super easy to use data from different places). Remember, the data must be correct to start with. Otherwise, the end results will be flawed. Here’s your to-do list with detailed examples:

    • Select the data sources you want to use to answer your business question (see step two above). You might choose your organization’s sales database, existing Excel spreadsheets with financial data, Google Analytics data on the number and type of visits to your enterprise web site, or some combination of such data sources.
    • Understand the correlation between the data sources you want to use. For example, your sales database and your financial spreadsheets might both list your products: the sales database showing how well they are selling, and the spreadsheets showing how much they cost to make. Using the two data sources, your BI tool could show you how to maximize profit by putting more marketing resources on specific products.
    • Join the data from different sources for one version of the truth. Sisense lets you use simple 'drag and drop' to bring different data sources and tables into the same central, high-performance database, called an ElastiCube. Everybody then uses the same version of the collected data, avoiding arguments and allowing people to focus on the results and conclusions of the data analysis.
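    Under the hood, 'one version of the truth' starts with a join on a shared key. Here is a minimal, library-free sketch of combining the sales and cost sources from the example above; the product names and figures are invented.

```python
# Hypothetical records: unit sales and prices from the sales database,
# unit costs from the finance spreadsheet, joined on the product key.
sales_units = {"widget": 1200, "gadget": 450}
unit_price = {"widget": 5.0, "gadget": 12.0}
unit_cost = {"widget": 3.0, "gadget": 7.5}

profit = {
    product: sales_units[product] * (unit_price[product] - unit_cost[product])
    for product in sales_units.keys() & unit_cost.keys()
}
print(profit["widget"], profit["gadget"])  # 2400.0 2025.0
```

    In a BI tool (or with a pandas merge) the same join happens at scale, but the principle, matching rows from different sources on a common key, is identical.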

    STEP 4: Build and display your first BI dashboard (2 days)

    Remember the result requirement from step two above? In this final step, it’s time to create the displays that help your own users understand the results from the BI tool and the data it has analyzed.

    Sisense gives you numerous options to produce web-based dashboard displays, reports that can be distributed to groups of users, and interactive analytics to let users ask new questions and explore further. Here are some great dashboard templates by industry. Your goals in step four are:

    • Identify your target audience. Seek to understand, before trying to be understood! A business management audience may want more intuitive overviews and indications of trends, compared to a technical audience looking for more detail. So, use a corresponding approach to your dashboard.
    • Design your dashboard. Sisense provides options for graphs, charts, and filters that can also be accessed by dashboard viewers to make the dashboard as useful and as engaging as possible. Dashboards are also accessible using a standard web browser, meaning that your viewers do not have to use any additional plugin or download.
    • Information design. Common sense will play an important role here. Looking to show a trend over time? A line chart might be the simplest and most effective way. Or perhaps you want to show how overall sales of different products compare? A pie chart may be the right choice. If in doubt, remember the KISS principle (Keep It Simple, Stupid!).
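    The rules of thumb above can be captured in a tiny helper. The mapping is a common-sense default for the chart choices described, not a prescription:

```python
def suggest_chart(question):
    """Map a business question to a sensible default chart type."""
    q = question.lower()
    if "trend" in q or "over time" in q:
        return "line chart"   # change over time
    if "share" in q or "proportion" in q:
        return "pie chart"    # parts of a whole
    return "bar chart"        # default: comparing categories

print(suggest_chart("Revenue trend by month"))        # line chart
print(suggest_chart("Market share of each product"))  # pie chart
```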

    Actionable results from data using BI in one week

    By following the steps above, business users can start their business intelligence journey simply and effectively. They can also rapidly accomplish data management and analysis tasks that would otherwise have taken months of IT resources.

    Author: Elana Roth

    Source: Sisense

  • The art of looking beyond vanity metrics

    The art of looking beyond vanity metrics

    B2B marketers beware: Marketing vanity metrics are easy on the eyes but only skim the surface when it comes to actual value. Although vanity metrics may make you feel good about your marketing efforts, these surface-level metrics only reveal part of the story.

    But, fear not dear marketer! If you turn your attention to the metrics that matter, you can improve your marketing strategy and communicate the important insights to leadership.

    Before we get into it, here’s a quick definition of a vanity metric: a vanity metric is data that looks good at first glance, but provides little insight into business success, company revenue, and ROI.

    So, which data points are the common culprits? Examples of marketing vanity metrics include:

    • Page views
    • Downloads
    • Facebook likes
    • Twitter followers

    An alternative to marketing vanity metrics

    In order to communicate the value of marketing initiatives, marketers must home in on actionable metrics: metrics that can guide decision-making. These types of metrics are often referred to as engagement metrics. Engagement metrics can tell you more about what’s working, what’s not working, and what information you need to test further. In fact, 91% of marketers named engagement metrics, such as social media interactions, time on site, and bounce rate, as the number one way to measure success.
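    Two of the engagement metrics named above, bounce rate and time on site, fall straight out of raw session data. Here is a minimal sketch; the session records are invented for illustration.

```python
# Each record is one visit; a 'bounce' is a single-page session.
sessions = [
    {"pages_viewed": 1, "seconds_on_site": 5},
    {"pages_viewed": 4, "seconds_on_site": 180},
    {"pages_viewed": 2, "seconds_on_site": 95},
    {"pages_viewed": 1, "seconds_on_site": 8},
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)
avg_time_on_site = sum(s["seconds_on_site"] for s in sessions) / len(sessions)

print(f"{bounce_rate:.0%}", avg_time_on_site)  # 50% 72.0
```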

    But let’s face it, executives and board members can get stuck on marketing vanity metrics. So, how can you manage the ever-increasing expectations around marketing vanity metrics? Today, we take a closer look at three common marketing vanity metrics and explore the different ways to steer the conversation towards more meaningful metrics. Let’s jump right in!

    1. Social media followers

    Many marketers rely too heavily on their social media followers to measure their social media success. And we get it! All marketers want to see an increase in social media followers, but these numbers don’t necessarily equal an engaged audience.

    Think about it this way: you may have thousands of Twitter followers but if only one of them engages with your social content regularly, what is your following really worth? On the other hand, you may have a small but dedicated following on LinkedIn with your social posts often leading to big sales. Yes, your LinkedIn audience is smaller, but it turns out these users engage more with your content, ultimately bringing in more value. Just by digging into the data, you’ve zeroed in on actionable information to guide your social media efforts.

    The next time higher-ups inquire about your social media following, be sure to shift the focus to more important engagement metrics. It’s important to note that your marketing and business goals will dictate which metrics are most important to your executive team. Here’s what we recommend:

    Brand awareness:

    An easy way to show brand awareness on social media is through the number of brand mentions or tags you receive. During your next marketing campaign or product launch, keep a close eye on branded keywords. Next, keep an eye on the competition’s branded keywords to reveal how often social media users interact with competing businesses. Use this information as a benchmark to measure and understand your own performance.

    Lead generation:

    When tracking lead generation, focus on conversions for maximum impact. As you review conversion data in your preferred analytics platform, take note of the social networks that deliver the highest number of qualified leads.

    Website traffic:

    If your goal is to generate website traffic from your social presence, look closely at metrics that demonstrate real social engagement. For instance, check out where your social media leads enter your website, track the pages they visit, and where they drop off. Also, take a look at the specific posts and channels that garner the most clicks so you can scale your success and serve more content that resonates with your followers.

    Customer experience:

    If you use social media as a customer support channel, the number of followers you accumulate won’t give you any information about how you are doing. Instead, look at metrics like the ratio of questions asked to questions answered, or responsiveness. Then, work to improve how many cases or complaints you solve.

    Event or webinar registrants:

    If your goal is to generate event participation, break your reports down by social channel. This shows you where users are the most active and engaged in your webinar or event. Simply include campaign tracking information in your social links.
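
    Campaign tracking information usually takes the form of UTM query parameters appended to the links you share. As a rough sketch (the URL and parameter values here are illustrative, not from any specific platform):

```python
from urllib.parse import urlencode, urlparse

def tag_social_link(base_url, source, campaign, medium="social"):
    """Append UTM parameters so an analytics platform can attribute
    registrations to the right social channel and campaign."""
    params = urlencode({
        "utm_source": source,      # e.g. "linkedin", "twitter"
        "utm_medium": medium,
        "utm_campaign": campaign,  # e.g. "spring-webinar"
    })
    separator = "&" if urlparse(base_url).query else "?"
    return f"{base_url}{separator}{params}"

# Hypothetical event page; one tagged link per social channel.
link = tag_social_link("https://example.com/webinar", "linkedin", "spring-webinar")
```

    With one tagged link per channel, your analytics platform can break registrations down by `utm_source` automatically.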

    Content downloads:

    Not all content is created equal. For instance, a high conversion on gated content signals a high-quality piece of content. Use this metric to strategize on future content offerings and bring those insights to leadership.

    The list above is a good starting point to show the senior team how your social efforts meet specific business goals. Roll up your sleeves, and start tracking!

    2. Total app, product, or software downloads

    Total downloads. This number can be impressive on the surface, but it isn’t a clear way to gauge the impact your marketing efforts have on product adoption. Instead of looking at the total number of downloads, look to yearly and monthly download trends to reveal whether downloads are increasing or decreasing over time. Then, compare this timeline to a timeline of major marketing campaigns. That way, you can pinpoint which efforts had an impact on downloads and which did not.
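
    The comparison of download trends against a campaign timeline can be sketched in a few lines. This is an illustrative example with made-up numbers and a hypothetical campaign log, not a specific analytics product:

```python
def monthly_trend(downloads):
    """downloads: {month: total downloads}, in chronological order.
    Returns the month-over-month change, so spikes and dips stand out."""
    months = list(downloads)
    return {later: downloads[later] - downloads[earlier]
            for earlier, later in zip(months, months[1:])}

# Made-up download counts and a hypothetical campaign log.
downloads = {"2023-01": 900, "2023-02": 950, "2023-03": 1400, "2023-04": 1350}
campaigns = {"2023-03": "product-launch webinar"}

trend = monthly_trend(downloads)
for month, change in trend.items():
    note = f"  <- {campaigns[month]}" if month in campaigns else ""
    print(f"{month}: {change:+d}{note}")
```

    Lining the two series up this way makes it visible which campaigns coincided with a jump in downloads and which did not.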

    Another issue with total downloads is that they don’t paint a complete picture of product usage or adoption. Instead, look at these key usage metrics for a clear understanding of how your customers and prospects engage with your offers:

    • Uninstall rate
    • Renewal rate
    • Trial conversion rate
    • Time users spend using the software

    Although higher-ups and executives may only express interest in total downloads, it’s your job as a marketer to paint a more complete picture for them. For example, you could explain that total downloads are up after a recent marketing campaign, but usage metrics stayed level. This indicates that your campaign was faulty in some way. Maybe you didn’t give an accurate description of your product, or maybe it was too difficult for users to figure out. These are important insights to highlight to upper management.

    3. Website pageviews

    A high number of pageviews is an ego boost, but pageviews are another metric to be wary of. When you report this data to management, it’s important to provide pageviews along with actionable engagement metrics to fully show user behavior. Focus on how users engage with your website content rather than how many pageviews each webpage garners. Important engagement metrics include:

    • Time spent on site
    • Unique users
    • Bounce rate
    • Pages per visitor
    • Conversion rate
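
    These engagement metrics can be computed directly from raw session logs. A minimal sketch, assuming a simple session record with hypothetical field names:

```python
def engagement_metrics(sessions):
    """Aggregate the engagement metrics listed above from raw
    session records (field names are hypothetical)."""
    n = len(sessions)
    return {
        "unique_users": len({s["user"] for s in sessions}),
        "bounce_rate": sum(s["pages"] == 1 for s in sessions) / n,  # single-page visits
        "pages_per_visit": sum(s["pages"] for s in sessions) / n,
        "avg_seconds_on_site": sum(s["seconds"] for s in sessions) / n,
        "conversion_rate": sum(s["converted"] for s in sessions) / n,
    }

sessions = [  # illustrative data only
    {"user": "a", "pages": 1, "seconds": 20,  "converted": False},
    {"user": "b", "pages": 4, "seconds": 310, "converted": True},
    {"user": "b", "pages": 3, "seconds": 150, "converted": False},
    {"user": "c", "pages": 1, "seconds": 15,  "converted": False},
]
metrics = engagement_metrics(sessions)
```

    In practice an analytics platform computes these for you, but knowing the definitions helps when reconciling numbers across tools.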

    Some questions to think about when reviewing website analytics:

    • Which pages keep people engaged, and which ones do users abandon quickly?
    • Which elements and CTAs convert best?
    • Can you identify which traffic sources perform best and why?
    • Or, can you determine which campaigns generate the most traffic and why?
    • Is your website content mapped to your sales journey in a way that makes sense?
    • Can you pinpoint at which stage of the buyer’s journey users leave your website?

    Take an in-depth look at these engagement metrics to really focus your online marketing initiatives on engagement over pageviews. Use your findings to build best practices and reduce bounce rate to ultimately keep users coming back for more great content.

    Final thoughts on marketing vanity metrics

    While higher-ups may ask for marketing vanity metrics, it’s your job to refocus on data points that correlate to sales and revenue, improving your business’s KPIs.

    Know that you can still report on vanity metrics to management, but don’t spend much time there. Instead, focus the conversation on more actionable, advanced metrics, highlighting the value they offer your company.

    Source: Zoominfo

  • The benefits of analyzing the customer journey of your users

    The benefits of analyzing the customer journey of your users

    Skills related to User Experience (UX) design are in high demand. They are among the top 10 most in-demand skills in 2019, as ranked by a recent LinkedIn study. Finding qualified UX designers is tied with finding software engineers in terms of hiring priorities, according to a recent Adobe study. Within that UX bucket, designers who have skills related to data analytics and research are particularly sought after, with those qualities being named as must-haves.

    But the ability to analyze the user journey to create delightful experiences for end-users isn’t just a skill that is exclusive to (nor required only by) UX professionals. For stakeholders across the spectrum of software development and delivery, access to interactive data visualizations on how the user is moving through a task can help each group more successfully deliver on their own goals. From engineering, to product management, to marketing. And while access to this data may be expected in a cloud-based application, it’s equally (if not more) important for on-premise software publishers to enable this type of analysis in their products.

    By looking at data related to user flow (also known as ‘path analytics’), product stakeholders begin to identify the series of steps it takes users to reach their goals. With a deep view into the steps surrounding a key task, several helpful pieces of information that may have been difficult or impossible to visualize now become readily apparent: things like unanticipated actions, broken workflows, or shortcuts that power users have discovered that could be promoted or productized.

    Having this knowledge has benefits that extend beyond streamlining and optimizing the user interface. This insight can help better determine training requirements and guide users, and also provide points for comparison between old and new user interfaces that inform product development.

    How does user flow analysis work?

    It starts with choosing a ‘hotspot’ event to analyze. This can range from launching the application, to launching any event within it such as using a wizard, opening a menu, or accessing a particular feature. Next, pick a path direction within the hotspot to drill further into. This can be the start, the end, or somewhere in between. This is where it is crucial to understand the question you’re trying to answer. For instance, the hotspot would be the starting point if the goal is to understand where users go from a particular point, the steps taken, and whether that meets expectations. The hotspot would be the endpoint if you’re trying to answer a broader question about the value of the experience, such as the steps leading up to the user clicking on a link to explore upgraded functionality.

    Choose the number of steps to analyze, and the number of events within each step, as well as any paths that you don’t want to look at.

    As you audit the events you have tagged, there are a couple of best practices you can follow.

    First, make sure to have a naming convention for events that makes interpreting them easier in user flow reports and visualizations. Second, make sure that all of the high-value events are tagged, so that data starts accumulating as early as possible, ideally before a specific marketing campaign or product roadmap decision depends on it.
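
    A naming convention is easiest to keep consistent when it is enforced programmatically at tagging time. A minimal sketch, assuming a hypothetical area.object.action convention:

```python
import re

# One possible convention: area.object.action, lower-case, so related
# events sort together in user flow reports ("checkout.coupon.applied").
EVENT_NAME = re.compile(r"^[a-z]+(\.[a-z_]+){2}$")

def is_valid_event_name(name):
    """Reject event names that would be hard to interpret in reports."""
    return bool(EVENT_NAME.match(name))
```

    A check like this can run in code review or CI so malformed event names never reach the analytics pipeline.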

    Having a window into these user flows has several key benefits, as it enables the organization to:

    Validate design: Confirm that users are taking the path designed for them or identify if different workflows may produce a better result.

    Visualize the journey: Quickly navigate through path reports to see traffic patterns through events and relative popularity of next/previous steps with a single click. This includes the ability to filter reports to view paths of specific sets of users based on their properties, and exclude noise events such as system generated events that are not user-initiated for clean user paths. The best tools will enable chart-based analysis, and provide the ability to export the data to CSV for offline analysis.

    Verify campaign effectiveness: User flow analysis can also be applied to measuring the effectiveness of marketing campaigns being pushed out through in-application messaging, with the ability to see the path a user took after seeing that message. User flow analysis lends the ability not only to see click-throughs, but also drill down within that to see the exact path users took.

    Author: Victor DeMarines

    Source: Dataversity

  • The emergence of the Internet of Things and its possible impact on the fashion industry

    The emergence of the Internet of Things and its possible impact on the fashion industry

    The Internet of Things (IoT) is slowly but indisputably changing all the aspects of the fashion industry. This includes smart clothes, engaging and interactive customer experience, combining fashion and health, wearable technology and generating power through solar cells or kinetic energy. The possibilities are endless as this technology is being implemented in our daily clothing items providing us with many benefits even outside the fashion world.

    Health benefits

    Probably one of the most significant contributions our society can notice in the fashion industry is health-related. Smart clothing has an enormous potential to monitor and measure the health of a person who is wearing these items. We've already scratched the surface with smartwatches which are able to measure heart rate and diabetes, detect a seizure, help with posture, and much more. Besides accessories, some fashion brands have focused on developing lines of smart clothes that will include an ECG and heart rate sensor. This smart clothing will send data to smartphones through an app which will then help you to analyze your health and seek medical advice if needed.

    Retail space customization

    The power of the IoT can even create a unique shopping experience for customers. In other words, the physical experience can be improved by leveraging technologies which use shoppers' data on online platforms to use it in the actual stores. With a deeper understanding of customer behavior, companies can increase their sales results by giving their customers exactly what they need. With this technology, companies can track customer movements in the store once they log into the app. This way, they can understand their interest across various pieces. We can expect the technology in this area will only grow, and customers will be able to enjoy a more focused, customized, and simpler shopping experience.

    Improved supply chain

    The ability to improve the supply chain and make it more effective is vital for ethical companies. With the help of the IoT, companies can tell their own stories in unique ways, even allowing customers to connect with the people who created the items they’re wearing and thank them. Moreover, this technology enables companies to tap into their shoppers’ values and use them to improve the supply chain. The IoT also has the potential to solve another common challenge in fashion: inventory. Finding an efficient way to manage inventory and dispose of deadstock is a major problem, but with the IoT, companies can leverage new technologies to produce to order rather than holding large quantities of stock.

    Implementing emotions

    Fashion communicates emotions. It was only a matter of time until these two worlds became connected with the help of technology. However, hardly anybody expected to see new functionalities like regulating body temperature or detecting and relieving stress built into our clothing items. When talking about emotions, the real challenge for these companies is to find apps that their consumers actually want and need. After all, we can’t talk about full integration of the IoT in the fashion industry without emotions.

    Understanding which emotions consumers connect to a brand is what can tremendously improve your communication with them and, consequently, sales results. For instance, what do people feel when they see a picture of Swiss watches? Is it loyalty? Tradition? Security? Or something else? If loyalty is the most common emotion, how to use this emotion and implement it in all stages of the customer journey? Finding a specific emotion is the bridge between a brand and its customers.

    Sports player insight

    Sports apparel is a big part of the modern-day fashion industry, so it was only a matter of time until sports brands started to realize how much they can benefit from technological solutions. For example, there has been a rise in data analytics in football which provides extremely useful information on players' fitness level during a match. This way, coaches can get an insight into their players’ work rate and decide whether they need to be substituted or not.

    Football boots could be another item in sports fashion which has the ability to provide useful data thanks to the IoT. With embedded sensors that would measure every step of a player, coaches would also have data on the strength and angle of impact on the ball. This would be crucial when preparing football teams for big competitions as coaches would have vital information on time to make the right strategic decision.

    Conclusion

    There is no telling what other areas of the fashion industry will be affected by the development of such powerful technology, but we can only assume it will be revolutionized completely. Having the ability to get information from consumers without wasting their time, and adjusting the customer experience accordingly, creates endless opportunities. This can improve our quality of life as we gain valuable information on our health in such an easy and non-intrusive way. As for what the IoT can do for the fashion industry itself, profits will increase significantly for many companies as their brands become fully adjusted to customers’ needs, and customers will appreciate that.

    Source: Datafloq

  • The essence of centralizing analytics: a health system perspective

    Hospitals and health systems continue to invest in data analytics, but (too) often a fragmented, decentralized approach to analytics delivery models results in excessive costs, inefficiency and missed opportunities to improve patient care.

    A number of factors have coalesced in recent years to catalyze greater investment in healthcare analytics – the ongoing transition to new payment models under value-based care, a greater emphasis on the health of populations, and increasing competition. But also the explosion in available health data from electronic health records, laboratory test results, and wearable devices – to name a few.

    The momentum isn’t expected to slow down any time soon. A recent report from Zion Market Research predicts the global healthcare analytics market to grow to $68 billion in 2024 from approximately $20 billion in 2017, a compound annual growth rate of more than 19 percent.

    While there’s no question that providing organizations are busy writing checks to healthcare analytics vendors, there is some question about whether they’re getting an adequate bang for their bucks.

    For example, a Deloitte survey of U.S. hospitals and health systems with greater than $500 million in revenues found that fewer than half of respondents said their organization had a clear, integrated data analytics strategy, while about one in four didn’t have a data governance model in place at all. Even more problematic, about one in three reported that they didn’t know their organization’s total analytics spend.

    Multiple vendors, no single source of truth

    A common cause of many of these issues is a decentralized approach to analytics in which data analysis happens in different business units that do not share assumptions, analytics methods or insights broadly. In contrast, under a centralized delivery model, an experienced team of data analysts report to one function at the enterprise level, even if they are assigned to serve different business units, based on strategic priorities set at the corporate level. This business-oriented team of analysts meets the need of organizational stakeholders while maintaining and developing in-house intelligence.

    In large part, a centralized analytics delivery model is important because it offers an improvement over the fragmented, incomplete data governance models that too many providers still use. For example, it’s not uncommon for large health systems to contract with multiple vendors to analyze population health risk for groups of patients with different conditions, such as diabetes and osteoarthritis among others.

    This lack of a single source of truth in analytics can lead to different answers to the same question, such as conflicting guidance on levels of risk, and in turn, on the highest-priority patients to target for interventions. As a result of this fragmented and potentially conflicting information, when prioritizing care plans and interventions, the health system cannot build a consistent clinical profile with a 360-degree view of each patient that accounts for the same factors.

    This results in health system decision makers being left wondering which vendors’ information they should believe.

    Delivering analytics as a service across the organization

    In addition to the fragmentation of data, there are a number of common barriers that prevent hospitals from efficiently and cost-effectively deploying analytics across their organizations, including territorial disputes over data, unclear roles and responsibilities and competition for already-scarce resources.

    As with virtually all organizational transitions, success in centralizing analytics starts with buy-in at the top. Strong executive leadership must bring together talented people with deep experience in applying analytical expertise to solving pressing clinical and business issues.

    A best practice is to place a senior-level executive in charge of analytics, potentially in a Chief Data Officer role, to lead the organization’s centralization initiative. A key function of this role is to establish effective and comprehensive data governance practices, clearly defining what type of data the organization will collect, how the data is structured, who can access it, and how it gets reported and presented to different people in the organization, among other steps.

    Once the organization establishes a solid foundation for data, it will be ready to adopt a single analytics platform that delivers actionable information to decision makers. Today’s leading analytics platforms often employ machine-learning systems to automatically extract important insights that may not be otherwise apparent to human analysts.

    Ultimately, the aim is the creation of one internal, centralized professional services group within the organization that delivers analytics as a service to other stakeholders in the hospital. By structuring a hospital’s analytics functions this way, the organization can eliminate the fragmentation and cacophony of multiple systems that offer conflicting insights and prevent leadership from understanding the organization’s full analytics spend.

    Centralization in practice

    Already, prominent health systems like University of Michigan Health System (UMHS) and Beth Israel Deaconess Medical Center (BIDMC) have taken the leap to centralized analytics delivery models. UMHS, for example, has created comprehensive registries for population health and used them to generate predictive analytics that focus predominantly on chronic diseases. BIDMC, through its centralized analytics governance model, provides layers of decision support and analytics for its physicians, with the goal of understanding variations in cost and care to maximize quality, safety, and efficiency.

    In the future, the insights derived from centralized analytics delivery models are likely to help hospitals improve quality, lower costs, identify at-risk populations and better understand performance. For that to happen, however, hospitals and health systems must first overcome the fragmented, decentralized approach to analytics that prevents them from realizing the full value of their analytics investments.

    Source: Insidebigdata

  • The essence of using an organization-wide data analytics strategy

    The essence of using an organization-wide data analytics strategy

    Does your organization spend loads of time and money collecting and analyzing data without ever seeing the expected return?

    Some 60% of data and analytics projects fail to meet their objectives. Part of the problem is that you can now measure just about anything, which has caused our appetite for data to grow exponentially, often beyond what an enterprise organization’s data and analytics teams can handle. Too often, talented people with the right tools can’t create meaningful outcomes because of cultural or organizational challenges.

    Here are some telltale signs that your data resources are being wasted.

    • Road to nowhere: When data and analytics teams are seen as order-takers, it can lead to a one-way stream of requests that overload resources and don’t reflect strategic needs.
    • Garbage in: A lack of standards around how data requests are made leads to disorder and inefficiency.
    • Static data in a dynamic world: Data is treated as a retrospective recording of historical measurements with little ability to draw insights or solve problems.
    • Data distrust: Data silos lead to a lack of transparency around who is producing data, what data is actually being used and how they’re doing it. Over time, this can make business leaders start to doubt the accuracy of their own organization’s information.

    In this environment, employees often try to satisfy their own data needs outside the company’s defined channels, which worsens the problem by creating more internal customers for the centralized data analytics team.

    With growing demand for data, you need to organize your data and analytics teams to reflect big-picture goals. Data resources should be assigned based on your organization’s strategic and operational needs rather than the frequently narrow requests of individuals. The goal is to become an organization where data and analytics partner with the business to create value over the long term.

    Your business objectives should drive any and all decisions you make toward organizing data and analytics teams. Data is not the end but rather the means to support the broader strategy.

    The long road toward organizing your data and analytics strategy can be simplified as a three-step process.

    • Organize your analytics resources around business processes.
    • Put money behind products that will help the whole enterprise.
    • Build a product-centric workflow that is transparent, manages the demand of data resources, and delivers on outcomes.

    Mapping your data resources to business processes will help your organization get the most out of its people. It’s also an eye-opening experience for many, revealing the shared needs across departments. Arranging your organization in this way also reduces waste in the form of redundant data reporting. Your people will also have more time to generate insights and spend less time and effort curating their own data marts.

    These newly formed 'analytics centers' subsequently govern the demand and prioritization of analytic products and can help to assess what the major data needs of the organization are. A side benefit is that your data and analytics teams will be empowered. Rather than fielding requests, they’ll start working on products that help the company succeed.

    Developing a long-term product roadmap for your data needs also requires someone to build consensus. The analytics product manager serves a critical role here, understanding the business objectives and translating them for technical teams.

    When analytics centers are enabled, a company will see a better return on its investment, as well as more manageable demand on its data and IT resources, without the overflow of one-off and redundant requests. The point isn’t to create a totally centralized data and analytics process. Rather, these analytics centers serve as spokes to the company’s enterprise data management (EDM) and IT hubs.

    The centers are also a resource to individual departments and teams, relaying their needs to EDM. This arrangement enables the data and analytics centers to filter through mountains of requests to find out what truly matters to the organization.

    Spending more isn’t the answer. Start by identifying the strategic aim of data, organizing analytics resources around them and building products that add lasting value.

    Author: BJ Fineman & Kurt Knaub

    Source: Information-management

  • Three objectives to guide your business’s KPIs

    Three objectives to guide your business’s KPIs

    Many data analytics vendors give users the ability to measure everything but offer little guidance. This can be overwhelming for new users. It is very important to determine the metrics that really matter to your business. To get started, establish a set of critical metrics, and then learn how to quickly identify areas of concern specific to the unique needs of your business.

    We have learned three objectives that serve as guideposts to help you decide what to measure. These guideposts are also a rubric to make sure that each functional area of the business is aligned toward overall success. In other words, every area of the business, like sales, inventory management, operations, and finance, is measuring core Key Performance Indicators (KPIs) that contribute to the overall success of the business. The three key objectives are improving customer experience, supporting company growth, and enhancing profitability. Excelling in these three areas will drive your business goals. Each of these objectives drives and supports the others, creating a framework for success.

    1. Improve customer experience

    When considering how to improve customer experience, it may be helpful to begin asking the following questions. What is the experience of your customer base? How would you measure that experience? Do you know what factors might be impacting your customers’ experience? Do you know how to measure those factors?

    Customer experience is critical to increasing your market share. However, this is difficult to do if your customers are leaving because they are dissatisfied. So, how can we make sure our customers have a great experience and want to keep us as their supplier? First, customers want their orders on time. It might be they need their order delivered to a job site so they can complete their work.

    In this case, a key metric is 'delivery in full, on time' (DIFOT). A gauge on your dashboard can quickly show you what percentage of your orders are delivered in full and on time. In just a few clicks you can go from a high-level summary to a detailed analysis of your data to see DIFOT rates by warehouse, category of products, individual products, and more to pinpoint the problem. Is it a shipping problem from a particular warehouse? Is there a problem with a product category? Do I have enough product in stock? This is a key element to a positive customer experience. To be sure you always have the right product in stock, create a KPI to measure 'stock outs' or priority items out of stock.
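
    As an illustration, the DIFOT percentage can be computed from order records like this (the field names are assumptions, not any particular system’s schema):

```python
def difot_rate(orders):
    """Share of orders delivered both in full and on time.
    Field names are illustrative, not a specific system's schema."""
    hits = sum(o["shipped_qty"] >= o["ordered_qty"] and o["on_time"]
               for o in orders)
    return hits / len(orders)

orders = [  # made-up order records
    {"ordered_qty": 10, "shipped_qty": 10, "on_time": True},
    {"ordered_qty": 5,  "shipped_qty": 4,  "on_time": True},   # short-shipped
    {"ordered_qty": 8,  "shipped_qty": 8,  "on_time": False},  # late
    {"ordered_qty": 2,  "shipped_qty": 2,  "on_time": True},
]
rate = difot_rate(orders)
```

    The same records can then be grouped by warehouse or product category to drill into where the misses come from.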

    2. Support company growth

    When considering ways to support the growth of your company, begin with the following questions: What are your top growth opportunities for new customers or new products? Are you aware of your biggest opportunities? Where might you have some risks? Can you quickly list these risks and opportunities? Growth is the key to business success. If you’re not increasing your share of the market, or at least keeping up with your competitors, then eventually you’re going to be out scaled. Maybe you have enough market share for the immediate future, but if you’re not striving to grow, then you are likely to be overtaken by your competition.

    It is important for sales managers to be alerted to 'customers in decline'. By having market analysts monitor customers whose sales have been declining for the last few months, your sales team will be able to quickly intervene before the sale is lost. Begin with the customers with the highest sales values to prevent the greatest losses. Another important alert is new customers and the product categories and individual products they are purchasing. The purchasing manager should pay attention to the sales trends for new products to ensure there is always enough stock on hand. 

    3. Enhance profitability

    What can you do today to move the needle on your profitability? This is a core objective for every business. In the beginning, a small company must focus its efforts on gaining volume. However, once a company has matured, it is in a position to make small, subtle changes that will have a tremendous impact on profit. 

    Improving profitability usually involves making small changes in highly repeated business processes, adapting to your environment. For instance, strategic price increases can improve your profit margin without risking sales volume. Improving delivery processes can reduce the cost of each truck leaving the warehouse. Minimizing deadstock frees up cash that can be used on other profitable investments. To monitor profitability, your sales manager can create a KPI to monitor margin trends, deadstock, and low turns.

    By measuring the right KPIs, your sales team will know which customers are at risk. Your accounting team will know to keep an eye on those customers’ accounts receivable. Your warehouse will know how it’s performing against on-time delivery targets. In this way, each area of your company can work toward the same three objectives to drive your success.

    Source: Phocas Software

  • Top 4 e-mail tracking tools using big data

    Top 4 e-mail tracking tools using big data

    Big data is being incorporated in many aspects of e-mail marketing. It has made it surprisingly easy for organizations to track the performance of e-mail marketing campaigns in fascinating ways.

    How big data changes e-mail tracking

    No matter what your role is, if you work in the technology sector, you likely spend a large portion of your day dealing with e-mail in some way. You’re sending, reading, or reviewing e-mails, or you’re checking your inbox to see if anything else comes in. By some estimates, the average worker even spends 30 hours a week checking their e-mail.

    Despite being such a centrally important and frequent job function, most of us are flying blind. We don’t understand how much time we’re spending on e-mail, nor do we have a solid understanding of whether our efforts are productive. Fortunately, there are several new e-mail tracking software tools that employers and employees can use to keep a closer eye on these metrics.

    The problem is that previous e-mail monitoring tools lacked the analytics capabilities managers needed to make empirically grounded decisions. Big data is making it easier for companies to get deeper insights.

    Why use e-mail tracking software tools that rely on big data?

    There are many potential applications for e-mail tracking software tools, but these are some of the most important:

    • Productivity analytics. Studying how you e-mail can alert you to the nuances of your e-mail habits, including how often you send e-mail, how long it takes you to write and read e-mail, and what your busiest days and times are. You’ll learn what your worst habits are so you can correct them, use your time more efficiently, and optimize your schedule to get more done each day.
    • Sales and response metrics. Many companies rely on sales or prospecting via e-mail, but if you aren’t gathering metrics like open rates and response rates, you may not be able to improve your process over time. E-mail tracking software can help you keep tabs on your progress, and may help you gather or organize information on your prospects at the same time.
    • Employee monitoring. Employees waste about 3 hours a day on unproductive activities, while most human resources departments only assume that 1 hour or less is wasted per day. Using some kind of e-mail tracking can help you measure your employees’ productivity, and help you balance workloads between multiple employees.
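
    The sales and response metrics above reduce to simple ratios over a send log. A minimal sketch, assuming each send record carries boolean `opened` and `replied` flags (a hypothetical layout, not any specific tool's API):

    ```python
    def email_metrics(log):
        """Compute basic campaign metrics from a list of send records.

        log: list of dicts with boolean 'opened' and 'replied' flags per recipient.
        """
        sent = len(log)
        opened = sum(1 for r in log if r["opened"])
        replied = sum(1 for r in log if r["replied"])
        return {
            "sent": sent,
            "open_rate": opened / sent if sent else 0.0,
            "response_rate": replied / sent if sent else 0.0,
        }
    ```

    Tracking these ratios per campaign over time is what lets you tell whether a change to your outreach process actually improved results.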

    Big data is at the root of all of these functions, and this makes it critical to control your data. It makes it easier for brands to get better insights.

    The best e-mail tracking software tools that leverage big data

    Some e-mail tracking tools focus exclusively on one e-mail function, like tracking sales or marketing campaigns. Others offer a more robust suite of features, allowing you to track your overall productivity.

    Whatever your goals are, these four tools are some of the best e-mail tracking apps you can get your hands on. They all rely on sophisticated big data analytics systems.

    1. EmailAnalytics

    First, we have EmailAnalytics, which can be thought of like Google Analytics for Gmail. This tool integrates with your Gmail or G Suite account and visualizes your e-mail activities into charts, graphs, and tables. It reports on metrics like average e-mail response time, e-mails sent, e-mails received, times and days of the week that are busiest for you, and how long your average e-mail threads tend to last. With the help of interactive data visuals and regular reports, you can quickly determine the weak points in your approach to e-mail (and resolve to fix them). The tool also enables managers to view reports for teams or employees, so you can monitor team e-mail productivity.

    2. Microsoft MyAnalytics

    Microsoft’s MyAnalytics isn’t quite as robust as EmailAnalytics, but it works quite well as a productivity tracker for Microsoft Outlook. With it, you can keep track of how you and your employees are spending the hours of your day, drawing in information from your e-mail inbox and calendar. If you’re spending too much time in meetings, or too much time managing your inbox, you’ll be able to figure that out quickly and start making proactive changes to your scheduling and work habits.

    3. Streak

    Streak is another Gmail tool, and one that attempts to convert Gmail into a full-fledged CRM platform. With it, you can convert messages into leads and prospects across various pipelines, and track your progress with each new prospective sale. It also offers built-in collaboration tools, so your team can work together on a single project—and track each other’s efforts.

    4. Yesware

    Yesware is designed with salespeople and sales managers in mind, and it offers prescriptive sales analytics based on your e-mail activity. With it, you can track a number of metrics within your e-mail strategy, including open rates, click-through rates, and other forms of customer engagement. Over time, you’ll learn which strategies work best for your prospects, and can use those strategies to employ more effective sales techniques.

    Implementing these e-mail tracking software tools in your business can help you better understand how you and your employees are using e-mail, improve your sales process, and spend less time on this all-too-important communication medium. Just remember, while data visuals and reports can be helpful in improving your understanding, those insights are only truly valuable if you take action on them.

    Big data makes e-mail tracking more effective than ever

    Big data is changing the nature of e-mail marketing. Companies can use more nuanced data analytics capabilities to drive their decision-making models in fascinating ways.

    Author: Matt James

    Source: SmartDataCollective

  • Using big data to improve as a manufacturer

    Using big data to improve as a manufacturer

    Here's how to implement manufacturing analytics today, in a world where big data, business intelligence, and artificial intelligence are steadily expanding.

    Big data is everywhere, and it’s finding its way into a multitude of industries and applications. One of the most fascinating big data industries is manufacturing. In an environment of fast-paced production and competitive markets, big data helps companies rise to the top and stay efficient and relevant.

    Manufacturing innovation has long been an integral piece of our economic success, and it seems that big data allows for great industry gains. Improvements in efficiency, maintenance, decision-making and supply chain management are possible with the right data tools. Anything from staff schedules to machine performance can be improved with big data.

    Decreasing inefficiency with big data

    Manufacturers are always looking for ways to make marginal improvements in their systems and how they operate. This type of management can be complex, and with the many different steps of the supply chain, teasing out every last detail to improve can be challenging. Thankfully, with big data, manufacturing companies can competently manage supply chain details in order to oversee any possible improvements available.

    Big data allows manufacturers to look at each discrete part of a supply process. This microscopic view of the supply chain can show managers new insights into how their process can be improved or tweaked. Big data can be used in different ways to cut down on supply chain inefficiencies. Individual machines, supply chain setup, and staffing, among others, are all components of a manufacturer’s efficiency.

    More and more manufacturers are closing gaps in inventory inefficiencies, too. For example, 72% of manufacturers consider real-time monitoring essential for modern inventory reconciliation.

    Managing supply and customization

    Taking the customer’s preferences into consideration when configuring the manufacturing processes is of extreme importance. The need for consumer customization is a challenge for supply chain managers. Cookie-cutter solutions don’t apply to consumers anymore. They want and need customized products and services. However, in most scenarios, added customization equals added costs. Big data can help bridge that gap of wanting to appease customers while making ends meet at the same time.

    With advanced data analytics, manufacturers can see customer data in real-time. This reduces the time required to make necessary adjustments to the product lines, cutting down on wasted time and improving overall efficiency.

    One of the largest effects of real-time monitoring in manufacturing is the ability to improve order-to-fulfillment cycle times. Building a robust data platform can transform the way manufacturers handle their customers and supplies. Not only are real-time results available, but big data can also provide demand forecasts to guide the production chain based on historical data sales trends in order to stay on top of the demand.

    Predictive maintenance

    One way to reduce the amount of downtime spent on fixing manufacturing machines is fixing the machines before they break. The ability to monitor manufacturing assets in order to predict necessary maintenance is another application for big data. The less time a machine is out of commission, the less money is being lost. With increased notice before a breakdown occurs, you can secure an easy win for your company’s return on investment: you’ll be able to form a strategy around those maintenance intervals and costs without having any negative surprises.

    Big data means using a wired or wireless connection to track machine utilization with greater accuracy to see the variables that could impact its performance. A manager can see what or who is performing optimally, giving the information needed when making business decisions.

    Improved strategic decision-making

    With all of the information available today, many decisions can be driven by big data. The power of advanced data collection and monitoring systems means increasingly little guesswork when it comes to overall management strategy. A well-structured data management system can connect supply line communication. There can be many areas within a manufacturing company that may not speak to each other effectively. If big data is applied to the process, information can be gathered and analyzed across departments and locations.

    With big data, there is less guessing and more data-backed action.

    Deconstructing big data in manufacturing

    There are several steps involved before big data can be utilized by parties within the manufacturing industry:

    • Gathering and storing data: The ability to gather data is essential in the big data process. Although many systems can gather data, accurate data is much harder to find. Once the data is gathered, it must be stored. Storing data is essential for keeping quality records of important business assets as well as for overall safety and auditability.
    • Cleaning and analyzing data: Gathering and storing data is not helpful when you can’t find the data you need to make decisions. Data cleaning allows the immense amount of data to become more scalable. Trends and patterns are easier to spot when the data is clean. Analyzing relevant data is what leads to strategic business decisions.
    • Data mining: The ability to find information fast and easily is of extreme importance in the manufacturing industry, since each decision can have a major impact on the bottom line. Advanced data mining allows a company to find the data they need exactly when they need it.
    • Data monitoring: A strong data monitoring system allows manufacturers to keep their business up to industry standards. The continual ability to monitor important data points that matter to your company is essential in having a competitive advantage.
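
    To make the cleaning and analyzing steps concrete, here is a minimal sketch for sensor readings from the factory floor. The record layout and the plausibility bounds are illustrative assumptions, not a prescription for any particular system:

    ```python
    def clean(readings):
        """Drop records with missing or implausible values before analysis.

        Assumes each reading is a dict with a 'machine' name and a numeric
        'value'; the 0-200 plausibility range is an illustrative assumption.
        """
        return [r for r in readings
                if r.get("value") is not None and 0 <= r["value"] <= 200]

    def analyze(readings):
        """Summarise cleaned sensor data per machine: sample count and mean value."""
        summary = {}
        for r in readings:
            stats = summary.setdefault(r["machine"], {"n": 0, "total": 0.0})
            stats["n"] += 1
            stats["total"] += r["value"]
        return {m: {"n": s["n"], "mean": s["total"] / s["n"]}
                for m, s in summary.items()}
    ```

    In practice the gathering and storing steps would feed a database rather than an in-memory list, but the shape of the pipeline (gather, clean, analyze, monitor) is the same.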

    Conclusion

    Big data is certainly a buzzword within many industries, and for good reason. The ability to collect important data is priceless to a business and can easily lead them to a competitive advantage. However, the ability to use big data in an efficient and useful way in order to make business decisions is more challenging. Making sure there is a purpose behind all that data is necessary for taking advantage of all big data has to offer.

    Author: Megan Ray Nichols

    Source: SmartDataCollective

  • Why you should use data analytics to determine your pricing strategy

    Why you should use data analytics to determine your pricing strategy

    Every company must have a strategy to price their goods. Your pricing strategy is a fundamental component to your marketing process as pricing can increase sales or send customers to your competitors. Because a variety of factors such as product life cycle, competition, and customer perception can affect pricing decisions, it’s important to consider these when determining the best pricing strategy for your company. 

    Data analytics provides a clear, consolidated view of your pricing, allowing you to make sound pricing decisions. We've examined the three most common strategies: cost-plus, competitor-based, and value-based, and how data analytics can help manage each one across your customer base.

    Cost plus pricing

    When people think of the term ‘pricing strategy’, cost-plus pricing is what comes to mind. This is the simplest form of pricing as it is just a matter of pricing your products above cost. Simply total all of your costs and add the margin you want on top to determine the price. The benefit of this strategy is that there is no strategizing. There is very little data analysis or market research involved. Due to this, cost-plus pricing has been considered a good starting point for a new company with little overhead.

    However, cost-plus pricing is harder to manage over time as you may not be able to predict all of your costs since costs can fluctuate. If, for example, your company calculates your costs and adds a 15% margin, this may work well for the first quarter. But if some unexpected cost comes up, such as a supplier raising their prices, your margin may be cut to 10%. A data analytics solution will help manage these unforeseen costs and you can set up alerts to advise when margins drop beyond a set threshold.
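
    The arithmetic in the example above is easy to automate. Below is a minimal sketch of cost-plus pricing plus a margin alert; the threshold and product data are purely illustrative:

    ```python
    def cost_plus_price(unit_cost, margin=0.15):
        """Price a product at cost plus a target margin (15% markup on cost here)."""
        return round(unit_cost * (1 + margin), 2)

    def realized_margin(price, unit_cost):
        """Actual margin after cost fluctuations, as a fraction of cost."""
        return (price - unit_cost) / unit_cost

    def margin_alerts(products, threshold=0.10):
        """Flag products whose realized margin has dropped below the threshold.

        products: dict mapping product name -> (current price, current unit cost).
        """
        return [name for name, (price, cost) in products.items()
                if realized_margin(price, cost) < threshold]
    ```

    A product priced at cost plus 15% whose supplier then raises the unit cost can quietly slip below a 10% margin; the alert surfaces exactly that case.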

    Competitor based pricing

    Rather than using costs as a benchmark, this strategy is based on setting your prices according to your competitors’ pricing. This is common when companies are vying for the same government contract in health or construction. When you are in a market with a product that is not unique or where prices are already established, it’s best to set your prices somewhere in the middle, but data analytics can help you model tenders so you can put forward desired volumes to receive the preferred price.

    On the other hand, if you are offering a better product with new features or more value, you should consider pricing your products higher than your competitors. Setting your prices below your competitors is similar to cost-plus pricing in that it depends on your resources. Are you able to withstand unexpected costs? If not, you risk impacting your profit margins. In any case, your pricing should be close to your competitors’ if you’re in a highly competitive market.

    The drawback to competitor-based pricing is that you don’t have a strategy that addresses the unique needs and concerns of your company. By developing your own pricing strategy, you can focus on adding value by offering better products at the right price. Data analytics will allow you to determine your best-selling products, the markets they sell in, and the customers who buy them, all of which helps drive a more efficient pricing policy.

    Value based pricing

    Value-based pricing is setting your prices based on what your customers believe your product is worth and what they are willing to pay. The more value your product offers your customers, the more money they will be willing to pay. Rather than looking at your costs or competitors, value-based pricing requires you to look to your customers. By getting to know the people who decide whether to purchase your product, you ensure that you understand what your customers truly want, and that you are offering the most value for the best price.

    When determining the price point for a product, consider factors such as whether your product is different from your competitors. Will it help your customers to save time or money? Will it help your customers gain a competitive advantage? What features can you develop over time? Answers to these questions will help you determine your product’s value and whether your customers are willing to pay for it. Once you know your customers are willing to pay for your product, you can set a higher price point and then raise prices as you add more value.  The downside to value-based pricing is that it takes time. You must be willing to invest the time to get to know your customers and understand their needs to set effective prices this way. 

    Data analytics allows you to compare and assess different strategies

    With data analytics, you can price according to your target market. Analytics enables companies to dramatically improve profitability by developing optimal pricing strategies to win more contracts and offer the most value to customers. Combining pricing with analytics allows you to leverage your data to understand both the internal and external factors affecting profitability at a granular level.

    In spite of the wealth of data available, many companies are still in the dark when it comes to understanding their customers. Yet, facing growing complexity and a multi-channel business environment, companies must be able to answer fundamental questions such as 'Who is my most profitable customer?' and 'What is my most profitable product or region?' Answering these questions can help you understand your customers and their buying behaviors to create the most effective pricing strategies. In other cases, analytics can highlight your most unprofitable customers so you can realign their discounts or other incentives to increase profits. With analytics, you have a mechanism that functions as both a catalyst and a metrics engine for managing profitability. 
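
    Answering 'Who is my most profitable customer?' is, at its simplest, a group-and-rank over invoice data. The sketch below is illustrative only, assuming invoice records carry `revenue`, `cost`, and a dimension field such as `customer`:

    ```python
    def profitability_by(invoices, dimension):
        """Rank a dimension (customer, product, region) by total profit.

        invoices: list of dicts with 'revenue', 'cost', and the dimension field.
        Returns (name, profit) pairs, most profitable first; negative entries
        are the unprofitable accounts whose discounts may need realigning.
        """
        totals = {}
        for inv in invoices:
            key = inv[dimension]
            totals[key] = totals.get(key, 0.0) + inv["revenue"] - inv["cost"]
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    ```

    Running the same function with `dimension="product"` or `dimension="region"` answers the other fundamental questions from the paragraph above.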

    Source: Phocas Software

  • Wrangling and governing unstructured data

    Unstructured data is the common currency in this era of the Internet of Things (IoT), cognitive computing, mobility and social networks. It’s a core resource for businesses, consumers and society in general. But it’s also a challenge to manage and govern.

    Unstructured data’s prevalence

    How prevalent is unstructured data? Sizing it up can give us a good sense for the magnitude of the governance challenge. If we look at the world around us, we see how billions of things become instrumented and interconnected, generating tons of data. In the Internet of Things, the value of things is measured not only by the data they generate, but also by the way those things securely respond to and interact with people, organizations and other things.

    If we look into public social networks such as Facebook, LinkedIn or Twitter, a key task is understanding what the social network data contains so that valuable information can be extracted, then matched and linked to the master data. And mobile devices, enabled with the Global Positioning System (GPS), generate volumes of location data that is normally contained in very structured data sets. Matching and linking it to master data profiles will be necessary.

    The volume of unstructured information is growing as never before, mostly because of the increase in unstructured information that is stored and managed by enterprises but not really well understood. Frequently, unstructured data is intimately linked to structured data: in our databases, in our business processes and in the applications that derive value from it all. In terms of where we store and manage it, the difference between structured and unstructured data is usually that the former resides in databases and data warehouses and the latter in everything else.

    In format, structured data is generated by applications, and unstructured data is free form. In addition, like structured data, unstructured data usually has metadata associated with it. But not always, and therein lies a key problem confronting enterprise information managers in their attempts to govern it all comprehensively.

    Governance of the structured-unstructured data link

    When considering the governance of unstructured data, a focus on the business processes that generate both the data itself and any accompanying metadata is important. Unstructured data, such as audio, documents, email, images and video, is usually created in a workflow or collaboration application, generated by a sensor or other device, or produced upon ingestion into some other system or application. At creation, unstructured data is often but not always associated with structured data, which has its own metadata, glossaries and schemata.

    In some industries, such as oil and gas or healthcare, we handle the unstructured data that streams from the sensors where it originated. In any case, unstructured data is usually created or managed in a business process that is linked to some structured entity, such as a person or asset. Consider several examples: 

    • An insurance claim with structured data in a claims processing application and associated documents such as police records, medical reports and car images
    • A mortgage case file with structured data in a mortgage processing application and associated applicant employment status and house assessment documents
    • An invoice with structured data in an asset management application and associated invoice documents
    • An asset with records managed across different applications and associated engineering drawings 

    Governance challenges enter the picture as we attempt to link all this structured and unstructured information together. That linkage, in turn, requires that we understand dependencies and references and find the right data, which is often stored elsewhere in the enterprise and governed by different administrators, under different policies and in response to different mandates.

    What considerations complicate our efforts to combine, integrate and govern structured and unstructured data in a unified fashion? We must know how we control this information, how it is exchanged across different enterprises and what are the regulations and standards to secure delivery of its value and maintain privacy.

    We also need to understand what we are going to do with the data that we collect because just collecting data for future use, just in case, is not the solution for any problems. We can easily shift from competitive advantage to unmanageable complexity.

    Governance perspectives

    Across different industries in a complicated ecosystem of connected enterprises, we handle different types of information that is exchanged, duplicated, made anonymous and duplicated again. In analytics we handle predictive models to provide recommendations resulting in critical decision making. We need to think about models’ lifecycle and track the data sets used to develop such models as well as ownership changes.

    How can governance be applied here? When we speak about information, integration and governance, we usually get different answers. Some, such as a legal record manager, focus on unstructured data curation, document classification and retention to comply with internal policies and external legislation. On the other hand, data warehouse IT groups focus on structured and transactional data and its quality to maintain the best version of the truth.

    But the business usually doesn’t care what type of information it is. What it wants to see is the whole picture: all related information from structured, unstructured and other sources, with proper governance around it. This makes integrated metadata management crucial.

    Data lifecycle governance environments

    To unify governance of structured and unstructured data, enterprises need to remove borders between information silos. In addition, organizations need to be connecting people and processes inside and outside the organization. And they need to make every effort to create trusted and collaborative environments for effective information configuration and management.

    What should span all information assets, both structured and unstructured, is a consistent set of organizational policies, roles, controls and workflows focused on lifecycle data governance.

    Author: Elizabeth Koumpan

    Source: Big Data & Analytics Hub
