8 items tagged "BI"

  • 11 BI tools to consider in 2021

    With more and more data at our fingertips, it’s getting harder to focus on the information relevant to our problems and present it in an actionable way. That’s what business intelligence is all about.

    BI tools make it simpler to corral the right data and visualize it in ways that enable us to understand what it means. But how simple that process is, and how you can visualize the data, depends on the tool, so picking the right one for your needs is important.

    Here we round up 11 popular, highly regarded BI tools to help you uncover what your organization’s data can tell you about your business.

    Top 11 business intelligence tools

    • Board
    • Domo
    • Dundas BI
    • Microsoft Power BI
    • MicroStrategy
    • Oracle Analytics Cloud
    • Qlik
    • SAS
    • Sisense
    • Tableau
    • Tibco


    Board

    Board International combines three tools in one: BI, predictive analytics and performance management. While it aims to offer something for everyone, it predominantly focuses on finance-oriented BI. It has modules for finance (planning, consolidation), HR (skills mapping, workforce planning), marketing (social media analysis, loyalty and retention monitoring), supply chain (delivery optimization, supplier management), sales (cross-selling and up-selling analysis) and IT (KPIs, service levels). The company is Swiss, but the software is available in English, Spanish, Chinese, Japanese, French, German and Italian. The latest version of its platform has replaced its multidimensional online analytical processing (MOLAP) approach with an in-memory calculation engine.

    • Target audience: The whole enterprise but usually enters via the finance department
    • Notable features: Language support
    • Pricing: License fee per user varies according to role


    Domo

    Domo is a cloud-based platform focused on business-user-deployed dashboards and ease of use. It offers business intelligence tools tailored to various industries (such as financial services, health care, manufacturing and education) and roles (including CEOs, sales, BI professionals and IT workers). CIOs might start by checking out how it handles data from AWS, Jira, GitHub, or New Relic before looking at how over 500 other integrations can help the rest of the enterprise.

    • Target audience: CEOs, sales and marketing, BI professionals
    • Notable features: Robust mobile interface
    • Pricing: On request

    Dundas BI

    Dundas BI from Dundas Data Visualization is used predominantly for creating dashboards and scorecards, the company’s historic strengths, but it can also perform standard and ad-hoc reporting. Analysis and visualization are performed through a web interface that can adapt to users’ skills: Power users and standard users see different features. The latest version adds a new in-memory engine, natural language queries, point-and-click trend analysis, support for Linux, and an application development environment for customized analytic applications. Dundas BI has been tailored for 19 industries, including clean tech, mining and construction, in addition to the usual suspects such as banking and healthcare. It sells to large enterprises but specializes in embedded BI.

    • Target audience: C-suite, HR, finance, sales, marketing, customer service
    • Notable features: Flexible, HTML5 interface that adapts to any device
    • Pricing: Based on concurrent users, not named users, with no tie to number of servers or cores

    Microsoft Power BI

    With the Power BI Desktop app for Windows, users can analyze and visualize data from local or cloud sources, publishing their reports to the Power BI platform. It offers data preparation, visual-based discovery, interactive dashboards, and augmented analytics. The free Power BI Desktop version suits isolated users; the Pro version makes collaborative analysis easier, for a monthly fee, leveraging Microsoft Teams, Office 365 and SharePoint to control access to raw data and published reports. For enterprises that want to go all-in, the Premium tier makes self-service data prep possible with prebuilt connectors to corporate data held in Microsoft Dynamics 365, Azure SQL Data Warehouse, or third-party sources such as Salesforce.

    • Target audience: Microsoft shops
    • Notable features: Mobile app allows touch-screen annotation of reports
    • Pricing: Desktop: free; Pro: $9.99 per user, per month; Premium: $4,995 per dedicated cloud compute and storage resource


    MicroStrategy

    MicroStrategy targets the enterprise BI market in a broad range of industries with cloud, on-premises and hybrid deployment options. It features a drag-and-drop interface to help users create customized data visualizations and build personalized, real-time dashboards. MicroStrategy touts federated analytics that allow customers to leverage existing investments in data sources such as Tableau, Qlik and Power BI, and blend the data to build reports and insights. It also features an enterprise semantic graph, which indexes data assets to enrich data silos with location intelligence and real-time telemetry. MicroStrategy offers a free, 30-day full platform trial.

    • Target audience: The whole enterprise
    • Notable features: Digital credentials allow you to replace physical ID cards, passwords and tokens in favor of mobile role-based identity badges tied to smartphones
    • Pricing: By request

    Oracle Analytics Cloud

    Oracle has spent the past several years bulking out its Oracle Analytics Cloud offering, launched in 2014 as an outgrowth of its flagship Business Intelligence Enterprise Edition suite. In 2020, it added a Cloud HCM offering to provide self-service workforce analytics to HR executives, analysts and line-of-business leaders. Oracle has focused on making its cloud offering intuitive and user-friendly, with powerful reporting and machine learning features. Key features include data preparation, data connectors, visualizations, predictive analytics, a native mobile app, and support for embedded analytics.

    • Target audience: Users in midsize to large enterprises
    • Notable features: Conversational analytics support natural language queries; can automatically generate natural language explanations to explain visualizations and trends
    • Pricing: Enterprise: $80 per user, per month, or $2.1506 per Oracle compute unit (OCPU) per hour; Professional: $16 per user, per month, or $1.0753 per OCPU per hour; Professional – Bring Your Own License (BYOL): $0.3226 per OCPU per hour; Enterprise – BYOL: $0.3226 per OCPU per hour

    Qlik Sense

    Qlik’s goal is to give anyone in the enterprise access to all its data — subject, of course, to corporate data governance policies. All that data should be enough to bog down most database engines, but Qlik says its Associative Engine can associate every piece of data with every other piece to make it easier to search for connections. The Associative Engine now has AI and machine learning capabilities that offer context-aware insight suggestions thanks to the Qlik cognitive engine. Qlik Sense, the self-service tool to access that analytical capability, comes in cloud and on-premises versions.

    • Target audience: The whole enterprise
    • Notable features: Associative Engine can analyze all your data, on the fly
    • Pricing: Limited versions: free; Qlik Sense Business: $30 per user, per month; Qlik Sense Enterprise SaaS: Analyzer: $40 per month; Professional: $70 per month; Analyzer Capacity Packs, $100 per month

    SAS Visual Analytics

    SAS’s take on BI is its Visual Analytics tool, offered via its cloud and microservices-based SAS Viya platform. It aims to automatically highlight key relationships in data: The latest version adds automated suggestions for relevant factors, along with insights expressed via visualizations and natural language. Other features include sentiment analysis for extracting data from social media and other texts, automatic generation of charts, mapping, and self-service data preparation. Deployment can be on premises, in public or private clouds, or on the Cloud Foundry platform as a service.

    • Target audience: Users across large enterprises
    • Notable features: Automated analysis functions
    • Pricing: On request


    Sisense

    Sisense’s BI software stack covers everything from the database through ETL and analytics to visualization, and it claims its In-Chip database engine is even faster than in-memory databases. It is best known for embedded BI uses. The latest version adds new machine learning capabilities. Sisense is available on premises or in the cloud. There are solutions for finance, marketing, sales, HR and IT, as well as customer service and operations and logistics departments. Sisense also makes it possible to offer the analytics tools to users outside the enterprise by embedding them in web applications. Sisense acquired Periscope Data in September 2019 and is in the process of integrating advanced analytics capabilities gained through the acquisition.

    • Target audience: Typically SMEs
    • Notable features: Fully web-based client, including for data prep
    • Pricing: On demand, based on an annual fee for software and service


    Tableau

    With Tableau, Tableau Software covers all the bases: You can run its software on premises, choose a public cloud, or opt to have it fully hosted by Tableau. It offers tailored versions for over a dozen industries, including banking, healthcare and manufacturing, with support for financial, HR, IT, marketing and sales departments, although that’s almost par for the course these days. Tableau’s capabilities include mapping and analysis of surveys and time series data. Its latest trick draws on natural language processing to let users describe what they want to see, rather than clicking and dragging to create formulaic queries.

    • Target audience: Midsize and larger enterprises
    • Notable features: Tableau draws on natural language processing to enable users to say what they want to see
    • Pricing: Each deployment needs at least one Tableau Creator ($70 per month); for Tableau Server (on-premises or public cloud), others can be Viewers ($12 per month, minimum 100) or Explorers ($35 per month, minimum 5); alternately, with Tableau Online (fully hosted by Tableau), others can be Viewers ($15 per month, minimum 100) or Explorers ($42 per month, minimum 5)


    Tibco Spotfire

    Tibco Spotfire is a self-service, AI-powered data visualization platform for dashboards, interactive visualization, data preparation, and workflow. The platform offers machine learning-based data preparation capability to support building complex data models. It is deployed across many verticals, including financial services, energy, manufacturing, consumer packaged goods, government, travel and logistics, healthcare, and life sciences. The latest version adds support for Python.

    • Target audience: Analysts and citizen data scientists
    • Notable features: The ability to use data science techniques, geo-analytics, and real-time streaming data using natural language query and natural language generation
    • Pricing: Pricing for Tibco Spotfire Platform (on your servers) and Tibco Cloud Spotfire Enterprise (private service) is available on request; Spotfire for Amazon Web Services starts at $0.99 per hour; Tibco Cloud Spotfire is $125 per month or $1,250 per year for analyst seats, $65 per month or $650 per year for business author seats, $25 per month or $250 per year for consumer seats, and $25 per month or $250 per year for 25GB of library storage

    Authors: Peter Sayer & Thor Olavsrud

    Source: CIO

  • Artificial intelligence: Can Watson save IBM?

    The history of artificial intelligence has been marked by seemingly revolutionary moments — breakthroughs that promised to bring what had until then been regarded as human-like capabilities to machines. The AI highlights reel includes the “expert systems” of the 1980s and Deep Blue, IBM’s world champion-defeating chess computer of the 1990s, as well as more recent feats like the Google system that taught itself what cats look like by watching YouTube videos.

    But turning these clever party tricks into practical systems has never been easy. Most were developed to showcase a new computing technique by tackling only a very narrow set of problems, says Oren Etzioni, head of the AI lab set up by Microsoft co-founder Paul Allen. Putting them to work on a broader set of issues presents a much deeper set of challenges.
    Few technologies have attracted the sort of claims that IBM has made for Watson, the computer system on which it has pinned its hopes for carrying AI into the general business world. Named after Thomas Watson Sr, the chief executive who built the modern IBM, the system first saw the light of day five years ago, when it beat two human champions on an American question-and-answer TV game show, Jeopardy!
    But turning Watson into a practical tool in business has not been straightforward. After setting out to use it to solve hard problems beyond the scope of other computers, IBM in 2014 adapted its approach.
    Rather than just selling Watson as a single system, its capabilities were broken down into different components: each of these can now be rented to solve a particular business problem, a set of 40 different products such as language-recognition services that amount to a less ambitious but more pragmatic application of an expanding set of technologies.
    Though it does not disclose the performance of Watson separately, IBM says the idea has caught fire. John Kelly, an IBM senior vice-president and head of research, says the system has become “the biggest, most important thing I’ve seen in my career” and is IBM’s fastest growing new business in terms of revenues.
    But critics say that what IBM now sells under the Watson name has little to do with the original Jeopardy!-playing computer, and that the brand is being used to create a halo effect for a set of technologies that are not as revolutionary as claimed.

    “Their approach is bound to backfire,” says Mr Etzioni. “A more responsible approach is to be upfront about what a system can and can’t do, rather than surround it with a cloud of hype.”
    Nothing that IBM has done in the past five years shows it has succeeded in using the core technology behind the original Watson demonstration to crack real-world problems, he says.

    Watson’s case
    The debate over Watson’s capabilities is more than just an academic exercise. With much of IBM’s traditional IT business shrinking as customers move to newer cloud technologies, Watson has come to play an outsized role in the company’s efforts to prove that it is still relevant in the modern business world. That has made it key to the survival of Ginni Rometty, the chief executive who, four years after taking over, is struggling to turn round the company.
    Watson’s renown is still closely tied to its success on Jeopardy! “It’s something everybody thought was ridiculously impossible,” says Kris Hammond, a computer science professor at Northwestern University. “What it’s doing is counter to what we think of as machines. It’s doing something that’s remarkably human.”

    By divining the meaning of cryptically worded questions and finding answers in its general knowledge database, Watson showed an ability to understand natural language, one of the hardest problems for a computer to crack. The demonstration seemed to point to a time when computers would “understand” complex information and converse with people about it, replicating and eventually surpassing most forms of human expertise.
    The biggest challenge for IBM has been to apply this ability to complex bodies of information beyond the narrow confines of the game show and come up with meaningful answers. For some customers, this has turned out to be much harder than expected.
    The University of Texas’s MD Anderson Cancer Center began trying to train the system three years ago to discern patients’ symptoms so that doctors could make better diagnoses and plan treatments.
    “It’s not where I thought it would go. We’re nowhere near the end,” says Lynda Chin, head of innovation at the University of Texas’ medical system. “This is very, very difficult.” Turning a word game-playing computer into an expert on oncology overnight is as unlikely as it sounds, she says.

    Part of the problem lies in digesting real-world information: reading and understanding reams of doctors’ notes that are hard for a computer to ingest and organise. But there is also a deeper epistemological problem. “On Jeopardy! there’s a right answer to the question,” says Ms Chin, but in the medical world there are often just well-informed opinions.
    Mr Kelly denies IBM underestimated how hard challenges like this would be and says a number of medical organisations are on the brink of bringing similar diagnostic systems online.

    Applying the technology
    IBM’s initial plan was to apply Watson to extremely hard problems, announcing in early press releases “moonshot” projects to “end cancer” and accelerate the development of Africa. Some of the promises evaporated almost as soon as the ink on the press releases had dried. For instance, a far-reaching partnership with Citibank to explore using Watson across a wide range of the bank’s activities quickly came to nothing.
    Since adapting in 2014, IBM now sells some services under the Watson brand. Available through APIs, or programming “hooks” that make them available as individual computing components, they include sentiment analysis — trawling information like a collection of tweets to assess mood — and personality tracking, which measures a person’s online output using 52 different characteristics to come up with a verdict.

    At the back of their minds, most customers still have some ambitious “moonshot” project they hope that the full power of Watson will one day be able to solve, says Mr Kelly; but they are motivated in the short term by making improvements to their business, which he says can still be significant.
    This more pragmatic formula, which puts off solving the really big problems to another day, is starting to pay dividends for IBM. Companies like Australian energy group Woodside are using Watson’s language capabilities as a form of advanced search engine to trawl their internal “knowledge bases”. After feeding more than 20,000 documents from 30 years of projects into the system, the company’s engineers can now use it to draw on past expertise, like calculating the maximum pressure that can be used in a particular pipeline.
    To critics in the AI world, the new, componentised Watson has little to do with the original breakthrough and waters down the technology. “It feels like they’re putting a lot of things under the Watson brand name — but it isn’t Watson,” says Mr Hammond.
    Mr Etzioni goes further, claiming that IBM has done nothing to show that its original Jeopardy!-playing breakthrough can yield results in the real world. “We have no evidence that IBM is able to take that narrow success and replicate it in broader settings,” he says. Of the box of tricks that is now sold under the Watson name, he adds: “I’m not aware of a single, super-exciting app.”

    To IBM, though, such complaints are beside the point. “Everything we brand Watson analytics is very high-end AI,” says Mr Kelly, involving “machine learning and high-speed unstructured data”. Five years after Jeopardy! the system has evolved far beyond its original set of tricks, adding capabilities such as image recognition to expand greatly the range of real-world information it can consume and process.

    Adopting the system
    This argument may not matter much if the Watson brand lives up to its promise. It could be self-fulfilling if a number of early customers adopt the technology and put in the work to train the system to work in their industries, something that would progressively extend its capabilities.

    Another challenge for early users of Watson has been knowing how much trust to put in the answers the system produces. Its probabilistic approach makes it very human-like, says Ms Chin at MD Anderson. Having been trained by experts, it tends to make the kind of judgments that a human would, with the biases that implies.
    In the business world, a brilliant machine that throws out an answer to a problem but cannot explain itself will be of little use, says Mr Hammond. “If you walk into a CEO’s office and say we need to shut down three factories and sack people, the first thing the CEO will say is: ‘Why?’” He adds: “Just producing a result isn’t enough.”
    IBM’s attempts to make the system more transparent, for instance by using a visualisation tool called WatsonPaths to give a sense of how it reached a conclusion, have not gone far enough, he adds.
    Mr Kelly says a full audit trail of Watson’s decision-making is embedded in the system, even if it takes a sophisticated user to understand it. “We can go back and figure out what data points Watson connected” to reach its answer, he says.

    He also contrasts IBM with other technology companies like Google and Facebook, which are using AI to enhance their own services or make their advertising systems more effective. IBM is alone in trying to make the technology more transparent to the business world, he argues: “We’re probably the only ones to open up the black box.”
    Even after the frustrations of wrestling with Watson, customers like MD Anderson still believe it is better to be in at the beginning of a new technology.
    “I am still convinced that the capability can be developed to what we thought,” says Ms Chin. Using the technology to put the reasoning capabilities of the world’s oncology experts into the hands of other doctors could be far-reaching: “The way Amazon did for retail and shopping, it will change what care delivery looks like.”
    Ms Chin adds that Watson will not be the only reasoning engine that is deployed in the transformation of healthcare information. Other technologies will be needed to complement it, she says.
    Five years after Watson’s game show gimmick, IBM has finally succeeded in stirring up hopes of an AI revolution in business. Now, it just has to live up to the promises.

    Source: Financial Times

  • Comparing BI to ERP, which analytical tool fits your purpose?

    Since its conception in 1997, Business Intelligence (BI) has been compared to a number of analytical tools, from spreadsheets to Customer Relationship Management (CRM) tools to Enterprise Resource Planning (ERP) software. For some analytical tools, BI wins by a mile – yes, we're specifically talking about spreadsheets. For others, there is no clear winner.

    In this blog, we’d like to discuss ERP and its connection with BI to help you determine what works for your business.

    What is Enterprise Resource Planning (ERP)?

    Every department within an organisation typically has its own computer system, optimised to suit the way that department operates. ERP essentially combines these multiple systems into a single, integrated software program that runs off a single database. Of course, the integrated software is then divided into modules that are replicas of their older standalone counterparts.

    Gartner states: “ERP tools share a common process and data model, covering broad and deep operational end-to-end processes such as those found in finance, HR, distribution, manufacturing, service and the supply chain”.

    What are the key differences between BI and ERP?

    Strategic-level vs. operational-level analytics

    BI tools are typically leveraged by analysts for high-level discussions which involve strategic decisions. A BI tool accesses all of the data in your data warehouse, both strategic (financial measures such as revenue, profit and growth) and operational (daily sales performance). BI tools enable you to conduct in-depth analyses to generate comprehensive information that can deliver high-level insights. Essentially, BI is a step towards a companywide view of information.

    ERP, on the other hand, is an operational system chock full of operational and transactional data. It will give you an exact view of your business from an operational perspective, but it is not built to perform trend analyses or give you high-level overviews. It is a tool centred around delivering operational insights.

    OLAP vs. OLTP system

    BI is built as an Online Analytical Processing system (OLAP), to provide robust analytical capabilities, such as high-speed access to reports, dashboard management and the development of balanced scorecards. BI also comes with advanced analytical features that allow you to view data from different sources on one page, and in the format or perspective you need.

    ERP, on the other hand, is an Online Transaction Processing system (OLTP), used to record transactions as and when they take place. The data architecture of ERP software is designed to provide high-speed transaction recording, while keeping data space utilization at a minimum.
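
    The OLTP/OLAP split can be sketched with a toy example: the transactional side makes many small writes, one per business event, while the analytical side issues one aggregate query over the accumulated data. This is only an illustrative sketch using Python's built-in sqlite3 module; the table and figures are invented.

```python
import sqlite3

# In-memory database standing in for a transactional (OLTP) store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

# OLTP-style workload: many small writes, one transaction per event.
for row in [("North", "Widget", 120.0), ("North", "Gadget", 80.0),
            ("South", "Widget", 200.0), ("South", "Widget", 50.0)]:
    with conn:  # each insert is committed as its own transaction
        conn.execute("INSERT INTO sales VALUES (?, ?, ?)", row)

# OLAP-style workload: one analytical query aggregating across the data,
# the kind of roll-up a BI tool issues against a warehouse.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
).fetchall())
print(sorted(totals.items()))  # [('North', 200.0), ('South', 250.0)]
```

    A real warehouse would also restructure the data (star schemas, aggregates) for analytical speed, which this sketch omits.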

    Agility vs. efficiency

    Over the past few years, there has been a shift of focus in BI: organisations are moving from historical reporting to forecasting and forward planning. Through these future-centric capabilities, BI can help organisations become more agile, allowing them to make strategic-level decisions that take advantage of future conditions.

    ERP software, on the other hand, is built to deliver efficiencies to an organization. These efficiencies come in many forms: better interdepartmental communication, IT cost savings and business process efficiencies. Both Gartner and CIO.com believe that proper ERP implementation can improve an organisation’s overall performance.

    What does this mean for you?

    Before you begin choosing tools, first determine your organisation’s objectives. Once you know what you are trying to achieve, you can identify the right approach to help you achieve it.

    If you already have an in-depth understanding of your operational performance, look to BI to obtain strategic-level insights into your performance. If, instead, you need a better understanding of your operational performance and need to make operational improvements, ERP is the tool you need.

    Ultimately, both tools are geared towards business improvement and can deliver significant results.

    Source: Phocas Software

  • Distinguishing between advanced analytics and business intelligence

    Advanced analytics and business intelligence (BI) have more or less the same objective: use data to drive insights that inform business strategy. So what’s the difference? 

    What is business intelligence? 

    Business intelligence is an umbrella term for software and services that provide comprehensive yet straightforward insights about an organization’s current state. Think routine reporting or dashboarding, where data is clearly legible for stakeholders to understand month by month. Examples of business intelligence use cases abound, some of which include unifying data to better track marketing leads or to manage shipping operations across a fleet of trucks. Business intelligence is by no means easy, but it is grounded in practical, everyday uses of data. 
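
    The "routine reporting" idea can be sketched in a few lines: roll raw records up into the month-by-month figures a dashboard would display. The lead records below are invented.

```python
from collections import defaultdict

# Hypothetical lead records: (month, channel, leads closed).
leads = [
    ("2021-01", "email", 14), ("2021-01", "social", 9),
    ("2021-02", "email", 18), ("2021-02", "social", 12),
]

# A minimal "dashboard" roll-up: total leads per month across channels.
per_month = defaultdict(int)
for month, _channel, n in leads:
    per_month[month] += n

for month in sorted(per_month):
    print(f"{month}: {per_month[month]}")
# 2021-01: 23
# 2021-02: 30
```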

    What is advanced analytics? 

    Advanced analytics employs sophisticated tools and techniques that surpass traditional business intelligence capabilities. Like business intelligence, it is a wide-reaching term that involves many methods and lends itself to many possible use cases.

    Advanced analytics is not meant to replace business intelligence but to augment its efforts. It strives to ask deeper questions of the data, generating insights that not only indicate how the business is currently performing but where its future is headed. If we consider that business intelligence largely aims to point out strengths and weaknesses in current business processes, advanced analytics has the potential to make recommendations and predictions as to how to steer the organization forward. 

    5 advanced analytics techniques

    Let’s take a closer look at some of the techniques that fall under the category of advanced analytics. Rarely will organizations need to use all of these techniques at once as a part of their advanced analytics integration; rather, they are merely some of the many tools in the toolkit of a data professional. 

    1. Forecasting

    Forecasting is the technique of analyzing historical data to predict future outcomes. It considers prior trends to recommend how organizations should plan ahead, such as stocking more inventory for a historically popular sales day. Forecasts can be extremely accurate, but their reliability depends upon the relevance and availability of historical data, as well as the time period to be forecasted.
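
    As an illustration of the idea, here is a minimal sketch that fits a linear trend to (invented) historical sales via least squares and extrapolates one period ahead; production forecasting tools use far richer models that handle seasonality and uncertainty.

```python
# Fit a straight line y = intercept + slope * t to the history by
# least squares, then extrapolate to the next period.
def linear_forecast(history, periods_ahead=1):
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + periods_ahead)

sales = [100, 110, 120, 130]   # a perfectly linear (invented) history
print(linear_forecast(sales))  # 140.0
```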

    2. Machine learning

    Machine learning is the process of training a computer to predict outcomes without it being explicitly programmed to do so. Machine learning models are built to capture the desired behavior, and as a model is fed more and more training data, its accuracy in predicting outcomes increases. Data, and lots of it, is the key to effective machine learning models.
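
    The training idea can be illustrated with one of the simplest possible models, a nearest-centroid classifier: it "learns" each class by averaging its labelled examples, with no hand-coded decision rule, and more examples sharpen the centroids. The data points and labels below are invented.

```python
# "Train" by computing the centroid (mean point) of each class.
def train(samples, labels):
    centroids = {}
    for label in set(labels):
        pts = [s for s, l in zip(samples, labels) if l == label]
        centroids[label] = tuple(sum(c) / len(pts) for c in zip(*pts))
    return centroids

# Predict the class whose centroid is closest to the new point.
def predict(centroids, point):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda l: dist2(centroids[l], point))

X = [(1, 1), (1, 2), (2, 1), (8, 8), (9, 8), (8, 9)]
y = ["low-risk", "low-risk", "low-risk",
     "high-risk", "high-risk", "high-risk"]
model = train(X, y)
print(predict(model, (2, 2)))  # low-risk
print(predict(model, (7, 9)))  # high-risk
```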

    3. Data mining and pattern matching

    Data mining is the process of uncovering patterns in large batches of raw data for further analysis. Analysts often don’t know what’s in data warehouses or what they should be looking for; data mining techniques, such as pattern matching, help source the right data from data warehouses based upon connections in the data.
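
    A toy sketch of pattern matching over raw data: count which product pairs co-occur across transactions and keep only those that clear a minimum-support threshold. The basket data is invented, and real data-mining systems apply the same idea at vastly larger scale.

```python
from collections import Counter
from itertools import combinations

# Hypothetical raw transaction records (shopping baskets).
transactions = [
    {"bread", "milk"},
    {"bread", "milk", "eggs"},
    {"milk", "eggs"},
    {"bread", "milk", "butter"},
]

# Count every co-occurring pair across all baskets.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Surface only the patterns that clear a minimum-support threshold.
frequent = {pair: n for pair, n in pair_counts.items() if n >= 3}
print(frequent)  # {('bread', 'milk'): 3}
```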

    4. Semantic analysis

    Semantic analysis is the act of determining meaning from text data. By way of semantic analysis, computers can “read” full documents by analyzing their grammatical structure and the relationships between individual words. The technique is particularly useful for marketing teams analyzing social media data, or for customer service teams seeking to better understand the effectiveness of online customer support.
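
    A deliberately crude sketch of one small corner of this, lexicon-based sentiment scoring: real semantic analysis parses grammatical structure and word relationships, while this only counts signal words. The word lists are invented.

```python
# Tiny hypothetical sentiment lexicons.
POSITIVE = {"great", "helpful", "fast", "love"}
NEGATIVE = {"slow", "broken", "bad", "hate"}

def sentiment(text):
    # Score = positive word count minus negative word count.
    words = text.lower().split()
    score = (sum(w in POSITIVE for w in words)
             - sum(w in NEGATIVE for w in words))
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Support was fast and helpful"))    # positive
print(sentiment("The update left the app broken"))  # negative
```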

    5. Complex event processing

    Complex event processing is the act of aggregating huge volumes of data to help determine the cause-and-effect relationships for any given event. By matching incoming events against a pattern, complex event processing can shed light on what is happening.
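
    The pattern-matching idea can be sketched as a sliding window over an event stream, here flagging repeated failed logins from the same user; the event names, threshold, and window size are all invented for illustration.

```python
from collections import deque

# Flag a user when `threshold` failed logins from that user appear
# inside a sliding window of the most recent `window` events.
def detect(events, pattern_event="login_failed", threshold=3, window=5):
    recent = deque(maxlen=window)
    alerts = []
    for user, kind in events:
        recent.append((user, kind))
        fails = sum(1 for u, k in recent
                    if u == user and k == pattern_event)
        if fails >= threshold:
            alerts.append(user)
    return alerts

stream = [("alice", "login_failed"), ("bob", "login_ok"),
          ("alice", "login_failed"), ("alice", "login_failed"),
          ("bob", "login_ok")]
print(detect(stream))  # ['alice']
```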

    Benefits of advanced analytics

    It’s widely recognized that an advanced analytics integration offers a competitive edge. Just a few of the benefits that advanced analytics can deliver include: 

    • Better decision-making
      Advanced analytics delivers valuable insights that allow organizations to make better decisions, adjust their company strategy, and plan for the future. 
    • Saved costs
      Identifying overspend or leaking costs through advanced analytics can have a huge impact on the budget over time.
    • Increased innovation
      Through advanced analytics, organizations have developed innovative new products, processes, or sales/marketing strategies that have given them a leg up from the competition.

    Challenges of advanced analytics

    Many organizations encounter roadblocks along their advanced analytics journey, which prevent them from fully realizing these benefits. According to a 2018 McKinsey survey, “fewer than 20 percent [of companies] have maximized the potential and achieved advanced analytics at scale.” Some of the top challenges of advanced analytics include:

    • Cost
      Advanced data analytics will prove its ROI over time, but the upfront costs can be substantial. Investing in infrastructure and talent, as well as the time required for data strategy and deployment, can be intimidating for organizations to take on.
    • Working with data from multiple sources
      Effective analytics should employ as many data sources as necessary, but gathering and integrating all of these data sources can be challenging.
    • Inaccessible data
      Even after the appropriate amount of data is gathered and centralized, if that data isn’t made accessible to the analysts who need to use it, it will provide little value to the organization.
    • Skills shortage
      Data scientists and data engineers are costly resources and difficult to source. Though user-friendly technologies have lowered the barrier to advanced analytics, many organizations still want a foundational data science team.
    • Poor quality data
      Harvard Business Review called poor quality data “enemy number one” to machine learning initiatives, and that extends to all facets of advanced analytics. If data hasn’t been vetted to meet data quality standards or properly prepared for the requirements of the analysis at hand, it will only lead to faulty or misleading insights. 

    Data preparation & advanced analytics

    Data preparation accounts for up to 80% of total analytic time. It’s where analysts can encounter a minefield of analytic challenges. But it also presents the biggest opportunity for improvement. Succeed at data preparation and, odds are, you’ll encounter far fewer advanced data analytics challenges. 

    Traditional data preparation methods like extract, transform, and load (ETL) tools or hand-coding are time-consuming and bar analysts from the process of transforming their own data. Recently, organizations have invested in modern data preparation platforms as part of their advanced analytics integration, which allow organizations to:

    • Easily connect to a diverse range of data sources. 
    • Identify data quality issues through a visual interface. 
    • Involve non-technical analysts in the process of preparing data. 
    • Integrate structured and unstructured data of any size. 
    • Reduce the total time spent preparing data by up to 90%. 

    Author: Matt Derda

    Source: Trifacta

  • How BI helps shape the future of retail

    How BI helps shape the future of retail

    The author of this article, MicroStrategy's Nick Barth, joined DynamicAction and LEGO Group onstage at FT Future of Retail 2019 to discuss how to maximize the power of data-driven customer insights.

    The event discussion offered three main takeaways for retail enterprises looking to expand and optimize their data.

    Be smarter about how we look at data

    It's time to wean ourselves off of Excel. When all your data is in a spreadsheet, it's hard to find value within the rows and columns. There are much better ways of visualizing data today, and modern visualization tools benefit everyone in the enterprise with intelligence, not just the data-adept.

    Data lakes aren't the answer

    While a unified warehouse is challenging to achieve, data lakes haven't proven themselves to be better. Business users and analysts alike are referring to them as 'data swamps'. So how can enterprises turn disparate data from multiple sources, and at different levels of usability, into something that elevates the enterprise?

    The answer is an enterprise semantic layer: software that filters data from different sources and provides the governance necessary for a single version of the truth across the organization. When all users, from store managers to executives, are looking at the same numbers, you’re on your way to success as a retailer of the future.
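    One way to picture a semantic layer is as a shared set of governed metric definitions that every consumer resolves through, so a store manager's dashboard and an executive's report compute "net sales" identically. The Python sketch below is a deliberately tiny illustration under that assumption; the metric names and data feeds are invented, and real semantic layers add modeling, security, and caching on top.

```python
# Governed metric definitions: every consumer computes "net_sales" the
# same way, regardless of which raw source the rows came from.
METRICS = {
    "net_sales": lambda row: row["gross_sales"] - row["returns"],
}

def query(rows, metric):
    """Resolve a metric name through the shared definitions."""
    calc = METRICS[metric]
    return sum(calc(r) for r in rows)

pos_feed = [{"gross_sales": 1200.0, "returns": 100.0}]   # point of sale
ecom_feed = [{"gross_sales": 800.0, "returns": 50.0}]    # e-commerce
# Store-manager and executive dashboards call the same definition:
print(query(pos_feed + ecom_feed, "net_sales"))  # 1850.0
```

    Because the calculation lives in one place, changing the definition of a metric changes it everywhere at once, which is the "single version of the truth" the text describes.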

    Keep a pulse on your business

    Retail business leaders need to be in the know about vital KPIs such as store performance and customer experience, and they need this data to be in real time, accurate, and easily accessible. Self-service BI makes this possible, but those looking to the future should take advantage of new technologies to not only make data accessible, but to integrate it into the natural workflow of every employee.

    So what concrete benefits will these important tips deliver to your retail enterprise? Here are three of many:

    • Competitive pricing: Flag pricing discrepancies and deliver the best possible value to your customers.
    • Real-time store performance monitoring: Mobile dashboards can deliver important store and customer KPIs to every employee on the floor, helping them boost the customer experience and drive sales.
    • Inventory management: Never experience a stockout again by arming your store ops and supply chain managers with the accurate inventory data they need to take collective action.

    The discussion at FT Future of Retail made it clear that the retail industry needs to embrace data and analytics to keep up with both competitors and consumer demands.

    Author: Nick Barth

    Source: MicroStrategy

  • The most important BI trends for 2020

    The most important Business Intelligence trends for 2020

    Companies are in the midst of many profound changes: The amount of data available and the speed of producing new data has been increasing rapidly for years, and business models as well as process improvements increasingly rely on data and analytics.

    Against this backdrop, a key challenge is emerging: the efficient and, at the same time, innovative use of data is only possible when capabilities for, and the operationalization of, both analytics and data management are ensured. Many companies are already reaching their limits with a 'the more data the better' approach and cannot fully leverage the benefits they expect due to a lack of data quality or analytical skills.

    In addition, there has been an increased focus on data protection since the GDPR came into effect in 2018. Amid a huge flood of information, companies will have to find ways to handle data in a way that not only complies with legal requirements, but also helps to improve processes and make day-to-day business easier.

    This year we asked 2,865 users, consultants and vendors for their views on the most important BI trends. The BARC BI Trend Monitor 2020 illustrates which trends are currently regarded as important in addressing these challenges by a broad group of BI and analytics professionals. Their responses provide a comprehensive picture of regional, company and industry specific differences and offer up-to-the-minute insights into developments in the BI market and the future of BI. Our long-term comparisons also show how trends in business intelligence have developed, making it possible to separate hype from stable trends.

    BARC’s BI Trend Monitor 2020 reflects on the business intelligence and data management trends currently driving the BI market from a user perspective.

    Importance of Business Intelligence trends in 2020 (n=2,865), average rating on a 1-10 scale:

    1. Master data/data quality (MD/MQ) management: 7.3
    2. Data discovery/visualization: 6.9
    3. Establishing a data-driven culture: 6.9
    4. Data governance: 6.8
    5. Self-service BI: 6.5
    6. Data preparation by business users: 6.3
    7. Data warehouse modernization: 5.9
    8. Agile BI development: 5.8
    9. Real-time analytics: 5.6
    10. Advanced analytics/ML/AI: 5.5
    11. Big data analytics: 5.5
    12. Integrated platforms for BI and performance management: 5.2
    13. Embedded BI and analytics: 5.1
    14. Data storytelling: 5.1
    15. Mobile BI: 5.1
    16. Analytics teams/data labs: 5.0
    17. Using external/open data: 4.9
    18. Cloud for data and analytics: 4.9
    19. Data catalogs: 4.2
    20. Process mining: 4.1
    The most (and least) important BI trends in 2020

    We asked users, consultants and software vendors of BI and data management technology to give their personal rating of the importance of twenty trending topics that we presented to them.

    Data quality/master data management, data discovery/visualization and data-driven culture are the three topics BI practitioners identify as the most important trends in their work.

    At the other end of the spectrum, cloud for BI and analytics, data catalogs and process mining were voted as the least important of the twenty trends covered in BARC’s survey.

    What do these results tell us?

    While the two most important trends remained the same as last year with master data and data quality management in first position and data discovery in second, third spot is now occupied by establishing a data-driven culture. This trend, which was newly introduced last year and went straight into fifth place in the rankings, is seen as even more important this year. Self-service BI, on the other hand, went down to fifth place this year whereas data governance remains in fourth.

    All in all, these top five trends represent the foundation for organizations to manage their own data and make use of it. They also demonstrate that organizations are aware of the relevance of high-quality data and its effective use. These trends point to underlying structures being changed: organizations want to go beyond collecting as much data as possible and actively use data to improve their business decisions. This is also supported by data warehouse modernization, which is once again in seventh place this year.

    Some trends have slightly increased in importance since last year (e.g., real-time analytics and integrated platforms for BI and PM). However, they all climbed just one rank with the exception of establishing a data-driven culture, which jumped two places. Therefore, no huge shift can be observed in terms of upward trends.

    The opposite is the case for downward trends: Mobile BI fell from twelfth to fifteenth place this year, continuing a downward trend that started in 2017. It seems the mobile application of BI functions is no longer seen as important, either because it is now widely available or because requirements have shifted. Advanced analytics/machine learning/AI is ranked one place lower than last year (down from 9 to 10).

    More important than the difference of one rank however is the tendency behind this slight downward trend: In 2018, many hopes were based on new tools using machine learning and artificial intelligence so this topic might have been expected to rise. However, even if we refer to it as a stagnation in perceived importance rather than a 'real' downward trend, this result is surprising.

    Source: BI-Survey

  • The non-negligible value of truly understanding analytics

    The non-negligible value of truly understanding analytics

    Analytics is a term that many people have heard, but unfortunately not nearly as many go out of their way to truly understand in depth. Widely referred to as the 'fabric of the future', analytics is all about improving efficiency and ultimately creating better ways to do things in professional settings. Believe it or not, every thriving business, across practically every industry, has an analytics department, whether it is a legal practice focused on issues such as whether a head injury is eligible for workers’ compensation benefits, or a security company that serves as the underlying framework for a hotel empire, to name just two examples among many.

    There is a lot of work that goes into creating a thriving business, but thankfully modern marvels like analytics are seeking to create a much simpler world where it is easier to incorporate, maintain and track data. But underneath all that, why is analytics so important, and why should all business owners, and individuals in general for that matter, make it their business to understand analytics under the surface?

    Analytics history puts your company in a stronger competitive position

    A wise man once said that a smart person learns from his mistakes, but a smarter one learns from the mistakes of others. While you can examine your own analytics to build a more secure business plan and come up with strategies that target the results you desire, it is also worth studying the analytics of other companies in your area to determine where they went wrong, so you can avoid making the same mistakes and ending up with a similar failure. Conversely, looking at successes is equally important, because they can show you what those companies did right in order to get where they are today.

    At the end of the day, analytics creates a healthy combination of historical and (close to) real time data that merges to forge easy analysis. Having all this information at your disposal ultimately allows your company to not only survive, but positively thrive. Essentially, analytics gives you and your company a competitive edge that looks back on all the data collected at the company throughout the ages, and uses it to set up a cleaner and sounder foundation for ongoing longevity and success moving forward.

    Analytical insights highlight opportunities to reduce company expenses

    Aside from gaining an advantage on the business side of things, one of the most important reasons why every modern company should invest in an analytics approach is the potential for reduced expenses. When data insights (i.e. analytic algorithms) are created and built upon within the company, the company can more easily identify instances of wasteful spending, and can even, in some cases, avoid wasteful spending entirely, pivoting instead toward less costly opportunities to make money rather than spending it unnecessarily. Analytics can also be used to remove redundancies in the office or workflow and create a more cost-effective environment.

    The company will also be able to increase productivity by analyzing its operations and determining a more effective way of running things. Analytics can also identify which software or processes are outdated, and staying on top of such things is extremely important in this fast-paced world. Analytics therefore plays a big part in making sure a company runs smoothly and effectively.

    Advanced analytics provides a stronger understanding of consumers

    Any successful business in any industry can tell you that you must have an adequate understanding of your consumers if you hope to make it in your industry. Historically, the segmentation of consumers was decidedly basic. Today, however, sophisticated analytics (think artificial intelligence, for example) are making it easier than ever for companies to not only segment their consumers into categories, but to do so in much more specific terms than they ever have been before. This is the all-new frontier in business and consumer relationships, and analytics is leading the charge both confidently and enthusiastically. And what is even better? This is just the start. Analytics is only just now beginning to find its footing, meaning that bigger and better things are well on the way.

    When you understand your market, you are at an advantage to provide them with the services they require. You will be able to best give them what they want and when they want it. As a company that is able to provide top notch service along with whatever your clientele is after, your company will thrive. This is the reason data is becoming so valuable. Every company is prioritizing big data as it is able to help them serve their customers better and ultimately, is able to help their company grow in the direction that it needs to.

    Author: Rehan Ijaz

    Source: SmartDataCollective

  • Why you should implement automation in your business

    Why you should implement automation in your business

    When automation is done well, it accomplishes more than just saving time and money. It minimizes errors, improves productivity, increases employee satisfaction, and enhances the customer experience. When incorporated into a business strategy, employees get more done, in the same amount of time, allowing them to focus on the important objectives of their role.

    While automation may not be the latest advancement in technology, it will have the greatest impact on how we do business over the next decade. IT managers who fail to employ automation will likely lose their competitive advantage. Gartner estimates a 25-percent reduction in customer retention over the next year for companies that choose not to incorporate automation into their business strategy.

    What is automation and why do it?

    Automation enables the workflow to proceed without human oversight. Automation can be deployed in place of traditional manual systems such as entering purchase orders, customer service, data analysis, and reporting. Eventually, nearly all IT teams will automate some aspects of their businesses. As businesses grow, automation will expand customer service without increasing the number of employees. Successful automation enables your existing teams to manage additional customers with the same speed of service. Simply put, automation allows the company to accomplish more with less.

    Automation benefits both the company and its customers. Customers report an improved experience due to better consistency in order fulfilment, faster response times, and lowered costs. Improved customer experience will improve brand loyalty and increase customer lifetime value. Automation empowers companies to optimize the way they allocate internal resources to save money, and take advantage of new opportunities to increase sales. In other words, businesses are either saving or earning money when they automate.


    Automated reporting comes as part of the package with BI solutions. Users can access relevant and timely data on how the business is performing across all domains. By instantly converting raw data into actionable information, automated reporting eliminates the challenges associated with traditional forms of reporting. Now users can see what has happened, what is happening, and what is likely to happen in the future.

    Reports can be generated automatically at set times, such as every Monday morning for the weekly sales meeting. Reports may also be triggered by certain events, such as when sales figures fall outside a certain range. Self-service analytics also provides users with a customizable dashboard for on-demand reporting based on job role. A dashboard allows users to see what is happening in real time, and to drill down into the details to see the root cause of problems, as well as to identify new trends and opportunities. From sales teams to inventory managers, users have access to up-to-the-moment data from anywhere, on any device.
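    The two kinds of trigger described above, a fixed schedule and an event rule, can be sketched in a few lines of Python. The thresholds and the `should_send_report` function are hypothetical, not taken from any particular BI product.

```python
import datetime

def should_send_report(now, daily_sales, floor=10_000, ceiling=50_000):
    """Fire a report on a fixed schedule (Monday at 08:00) or when an
    event rule trips (daily sales outside the expected range)."""
    scheduled = now.weekday() == 0 and now.hour == 8  # Monday morning
    out_of_range = not (floor <= daily_sales <= ceiling)
    return scheduled or out_of_range

# 1 January 2024 was a Monday: the scheduled trigger fires.
print(should_send_report(datetime.datetime(2024, 1, 1, 8, 30), 20_000))  # True
# A Tuesday with sales in the expected range: no report.
print(should_send_report(datetime.datetime(2024, 1, 2, 9, 0), 20_000))  # False
```

    A real deployment would run a check like this from a scheduler and hand the positive cases to the reporting engine, but the two-trigger logic is the same.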


    The cloud offers cost savings as well as added security benefits. IT managers and CTOs work with SaaS providers to determine the level of access provided to users in their business. They can determine when devices should be able to access resources, and restrict permissions based on job roles. Security should be a priority: whether in a private cloud or dedicated SaaS, it is important to manage and minimize data breaches as best as possible. To ensure the ongoing security of customer data, regular independent vulnerability and penetration testing, along with a security incident response policy, are recommended.

    When companies embrace automation, employees have time to work on items that add genuine value to the business, allowing them to be more innovative and increase levels of motivation. Customers also benefit from improved service and experience.

    Source: Phocas Software
