6 items tagged "BI tools"

  • 11 BI tools to consider in 2021

    11 BI tools to consider in 2021

    With more and more data at our fingertips, it’s getting harder to focus on the information relevant to our problems and present it in an actionable way. That’s what business intelligence is all about.

    BI tools make it simpler to corral the right data and visualize it in ways that enable us to understand what it means. But how simple that process is, and how you can visualize the data, depends on the tool, so picking the right one for your needs is important.

    Here we round up 11 popular, highly regarded BI tools to help you uncover what your organization’s data can tell you about your business.

    Top 11 business intelligence tools

    • Board
    • Domo
    • Dundas BI
    • Microsoft Power BI
    • MicroStrategy
    • Oracle Analytics Cloud
    • Qlik
    • SAS
    • Sisense
    • Tableau
    • Tibco

    Board

    Board International combines three tools in one: BI, predictive analytics and performance management. While it aims to offer something for everyone, it predominantly focuses on finance-oriented BI. It has modules for finance (planning, consolidation), HR (skills mapping, workforce planning), marketing (social media analysis, loyalty and retention monitoring), supply chain (delivery optimization, supplier management), sales (cross-selling and up-selling analysis) and IT (KPIs, service levels). The company is Swiss, but the software is available in English, Spanish, Chinese, Japanese, French, German and Italian. The latest version of its platform has replaced its multidimensional online analytical processing (MOLAP) approach with an in-memory calculation engine.

    • Target audience: The whole enterprise but usually enters via the finance department
    • Notable features: Language support
    • Pricing: License fee per user varies according to role

    Domo

    Domo is a cloud-based platform focused on business-user-deployed dashboards and ease of use. It offers business intelligence tools tailored to various industries (such as financial services, health care, manufacturing and education) and roles (including CEOs, sales, BI professionals and IT workers). CIOs might start by checking out how it handles data from AWS, Jira, GitHub, or New Relic before looking at how over 500 other integrations can help the rest of the enterprise.

    • Target audience: CEOs, sales and marketing, BI professionals
    • Notable features: Robust mobile interface
    • Pricing: On request

    Dundas BI

    Dundas BI from Dundas Data Visualization is used predominantly for creating dashboards and scorecards, the company’s historic strengths, but it can also perform standard and ad-hoc reporting. Analysis and visualization are performed through a web interface that can adapt to users’ skills: Power users and standard users see different features. The latest version has a new in-memory engine, a new natural language query capability, and adds point-and-click trend analysis, support for Linux, and an application development environment for customized analytic applications. Dundas BI has been tailored for 19 industries, including clean tech, mining and construction, in addition to the usual suspects such as banking and healthcare. It sells to large enterprises but specializes in embedded BI.

    • Target audience: C-suite, HR, finance, sales, marketing, customer service
    • Notable features: Flexible, HTML5 interface that adapts to any device
    • Pricing: Based on concurrent users, not named users, with no tie to number of servers or cores

    Microsoft Power BI

    With the Power BI Desktop app for Windows, users can analyze and visualize data from local or cloud sources, publishing their reports to the Power BI platform. It offers data preparation, visual-based discovery, interactive dashboards, and augmented analytics. The free Power BI Desktop version suits isolated users; the Pro version makes collaborative analysis easier, for a monthly fee, leveraging Microsoft Teams, Office 365 and SharePoint to control access to raw data and published reports. For enterprises that want to go all-in, the Premium tier makes self-service data prep possible with prebuilt connectors to corporate data held in Microsoft Dynamics 365, Azure SQL Data Warehouse, or third-party sources such as Salesforce.

    • Target audience: Microsoft shops
    • Notable features: Mobile app allows touch-screen annotation of reports
    • Pricing: Desktop: free; Pro: $9.99 per user, per month; Premium: $4,995 per dedicated cloud compute and storage resource

    MicroStrategy

    MicroStrategy targets the enterprise BI market in a broad range of industries with cloud, on-premises and hybrid deployment options. It features a drag-and-drop interface to help users create customized data visualizations and build personalized, real-time dashboards. MicroStrategy touts federated analytics that allow customers to leverage existing investments in data sources such as Tableau, Qlik and Power BI, and blend the data to build reports and insights. It also features an enterprise semantic graph, which indexes data assets to enrich data silos with location intelligence and real-time telemetry. MicroStrategy offers a free, 30-day full platform trial.

    • Target audience: The whole enterprise
    • Notable features: Digital credentials allow you to replace physical ID cards, passwords and tokens in favor of mobile role-based identity badges tied to smartphones
    • Pricing: By request

    Oracle Analytics Cloud

    Oracle has spent the past several years building out its Oracle Analytics Cloud offering, launched in 2014 as an outgrowth of its flagship Business Intelligence Enterprise Edition suite. In 2020, it added a Cloud HCM offering to provide self-service workforce analytics to HR executives, analysts and line-of-business leaders. Oracle has focused on making its cloud offering intuitive and user-friendly, with powerful reporting and machine learning features. Key features include data preparation, data connectors, visualizations, predictive analytics, a native mobile app, and support for embedded analytics.

    • Target audience: Users in midsize to large enterprises
    • Notable features: Conversational analytics support natural language queries; can automatically generate natural language explanations to explain visualizations and trends
    • Pricing: Enterprise: $80 per user, per month, or $2.1506 Oracle compute unit (OCPU) per hour; Professional: $16 per user, per month, or $1.0753 OCPU per hour; Professional – Bring Your Own License (BYOL): $0.3226 OCPU per hour; Enterprise – BYOL: $0.3226 OCPU per hour

    Qlik Sense

    Qlik’s goal is to give anyone in the enterprise access to all its data — subject, of course, to corporate data governance policies. All that data should be enough to bog down most database engines, but Qlik says its Associative Engine can associate every piece of data with every other piece to make it easier to search for connections. The Associative Engine now has AI and machine learning capabilities that offer context-aware insight suggestions thanks to the Qlik cognitive engine. Qlik Sense, the self-service tool to access that analytical capability, comes in cloud and on-premises versions.

    • Target audience: The whole enterprise
    • Notable features: Associative Engine can analyze all your data, on the fly
    • Pricing: Limited versions: free; Qlik Sense Business: $30 per user, per month; Qlik Sense Enterprise SaaS: Analyzer: $40 per month; Professional: $70 per month; Analyzer Capacity Packs, $100 per month

    SAS Visual Analytics

    SAS’s take on BI is its Visual Analytics tool, offered via its cloud and microservices-based SAS Viya platform. It aims to automatically highlight key relationships in data: The latest version adds automated suggestions for relevant factors, along with insights expressed via visualizations and natural language. Other features include sentiment analysis for extracting data from social media and other texts, automatic generation of charts, mapping, and self-service data preparation. Deployment can be on premises, in public or private clouds, or on the Cloud Foundry platform as a service.

    • Target audience: Users across large enterprises
    • Notable features: Automated analysis functions
    • Pricing: On request

    Sisense

    Sisense’s BI software stack covers everything from the database through ETL and analytics to visualization, and it claims its In-Chip database engine is even faster than in-memory databases. It is best known for embedded BI uses. The latest version adds new machine learning capabilities. Sisense is available on premises or in the cloud. There are solutions for finance, marketing, sales, HR and IT, as well as customer service and operations and logistics departments. Sisense also makes it possible to offer the analytics tools to users outside the enterprise by embedding them in web applications. Sisense acquired Periscope Data in September 2019 and is in the process of integrating advanced analytics capabilities gained through the acquisition.

    • Target audience: Typically SMEs
    • Notable features: Fully web-based client, including for data prep
    • Pricing: On demand, based on an annual fee for software and service

    Tableau

    With Tableau, Tableau Software is covering all the bases: You can run its software on premises, choose a public cloud, or opt to have it fully hosted by Tableau. It offers tailored versions for over a dozen industries, including banking, healthcare and manufacturing, with support for financial, HR, IT, marketing and sales departments, although that’s almost par for the course these days. Tableau’s capabilities include mapping and analysis of surveys and time series data. Its latest trick draws on natural language processing to let users describe what they want to see, rather than clicking and dragging to build queries.

    • Target audience: Midsize and larger enterprises
    • Notable features: Tableau draws on natural language processing to enable users to say what they want to see
    • Pricing: Each deployment needs at least one Tableau Creator ($70 per month); for Tableau Server (on-premises or public cloud), others can be Viewers ($12 per month, minimum 100) or Explorers ($35 per month, minimum 5); alternatively, with Tableau Online (fully hosted by Tableau), others can be Viewers ($15 per month, minimum 100) or Explorers ($42 per month, minimum 5)

    Tibco

    Tibco Spotfire is a self-service, AI-powered data visualization platform for dashboards, interactive visualization, data preparation, and workflow. The platform offers machine learning-based data preparation capability to support building complex data models. It is deployed across many verticals, including financial services, energy, manufacturing, consumer packaged goods, government, travel and logistics, healthcare, and life sciences. The latest version adds support for Python.

    • Target audience: Analysts and citizen data scientists
    • Notable features: The ability to apply data science techniques, geo-analytics, and real-time streaming data through natural language query and natural language generation
    • Pricing: Pricing for Tibco Spotfire Platform (on your servers) and Tibco Cloud Spotfire Enterprise (private service) is available on request; Spotfire for Amazon Web Services starts at $0.99 per hour; Tibco Cloud Spotfire is $125 per month or $1,250 per year for analyst seats, $65 per month or $650 per year for business author seats, $25 per month or $250 per year for consumer seats, and $25 per month or $250 per year for 25GB of library storage

    Authors: Peter Sayer & Thor Olavsrud

    Source: CIO

  • Comparing BI software to reporting software: which tool is right for you?

    Comparing BI software to reporting software: which tool is right for you?

    In this report, we define and compare business intelligence software and reporting software to help you decide which one suits your business needs.

    Both business intelligence software and reporting software help businesses analyze their performance based on data-driven insights. But despite overlapping features and functionality, the two tools belong to distinct software categories. This report will identify the similarities and differences between the two software tools to help you understand which one is better suited for your requirements.

    What is business intelligence software?

    Business intelligence (BI) software helps organizations make operational decisions by analyzing historical performance data and other data sources. A BI tool collects raw data from internal and external sources and analyzes it via queries to get actionable insight. It uses interactive dashboards and other forms of visualization to present the data.

    BI tools allow performance management teams to use descriptive analytics and convert complex data into easily understandable visuals such as charts, graphs, infographics, and animations. BI reporting helps analyze data from various categories, including customers, finance, production, human resources, and contacts.
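
    The query step described above is, at bottom, an aggregation over raw records. A toy sketch (all table names and figures are invented) of how a BI tool might turn raw sales rows into a dashboard-ready KPI:

```python
import sqlite3

# Hypothetical raw sales records, as a BI tool might collect them
# from internal sources (all names and figures are made up).
rows = [
    ("North", "2021-01", 1200.0),
    ("North", "2021-02", 1450.0),
    ("South", "2021-01", 980.0),
    ("South", "2021-02", 1100.0),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Descriptive analytics: aggregate the raw data into a per-region KPI,
# the kind of summary a dashboard chart would then visualize.
kpi = con.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(kpi)  # [('North', 2650.0), ('South', 2080.0)]
```

    Real BI tools hide this query behind interactive dashboards and charts, but the descriptive-analytics core is the same aggregation.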

    What is reporting software?

    Reporting software allows real-time access to administrative and operational data to help build multidimensional reports, such as financial statements, operational reports, and progress reports, from diverse data sources. It assists in tracking the success of sales strategies, monitoring responsiveness to regulatory compliance concerns, and managing internal controls or security audits.

    Reporting software also supports the creation of custom reports per a business’s reporting needs. It allows managers and business leaders to measure and track performance metrics from departments such as sales, finance, human resources, and marketing. The software integrates with apps such as customer relationship management (CRM), enterprise resource planning (ERP), and business intelligence to create data summaries, dashboards, and data visualizations.

    What do they have in common?

    There are a few similarities between a business intelligence tool and a reporting tool. Both help operational teams create ad hoc reports, visualize data, and forecast business performance for strategic planning. The data reports generated by business intelligence and reporting tools include various similar elements such as charts, tables, key performance indicators (KPIs), images, hyperlinks, and drilldowns.

    Business intelligence and reporting software shared features:

    • Ad hoc reporting: Create business reports, which meet information requirements, on an as-needed basis.
    • Customizable dashboard: Alter the layout and content of dashboard graphs and charts to strategically track statistics and metrics.
    • Data visualization: Graphically represent data using elements such as charts, graphs and maps.

    Business intelligence features:

    • Collaboration tools: Provide a channel for team members to share media files, communicate, and work together.
    • Data connectors: Connect to big data sources. 
    • Key performance indicators: Evaluate company performance based on business metrics. 
    • Metadata management: Manage and track reference data of files (i.e., tags, title, comments, date of creation, file size). 
    • Performance metrics: Use a set of indicators to track how well an organization, a division, or a particular project is performing. 
    • Predictive analytics: Predict future data based on historical data sets. 
    • Publishing/sharing: Share and publish business intelligence data reports with the organization. 
    • Self-service data preparation: Access, combine, transform, and store data without the help of an IT department. 
    • Strategic planning: Visualize a preferred outcome at a high level, define goals, and identify specific steps to achieve them.

    Reporting software features:

    • Data import/export: Import and export information and reports to and from the program.
    • Drag and drop: Assemble applications and processes by dragging over and arranging pre-built components.
    • Forecasting: Form predictions based on past and present data/trends.
    • Search/filter: Search available resources to locate required information.
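
    To make the forecasting bullet concrete: at its simplest, trend-based forecasting fits a line to past values and extends it forward. A minimal sketch over invented monthly revenue figures (real reporting tools use far more robust models):

```python
# Naive linear-trend forecast over made-up monthly revenue figures.
revenue = [100.0, 110.0, 120.0, 130.0]  # past four months (hypothetical)

n = len(revenue)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(revenue) / n

# Ordinary least-squares slope and intercept for y = a + b * x.
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, revenue)) / sum(
    (x - x_mean) ** 2 for x in xs
)
a = y_mean - b * x_mean

forecast_next = a + b * n  # prediction for month five
print(forecast_next)  # 140.0
```

    With perfectly linear inputs the fitted trend extends exactly; real data is noisier, which is why production forecasting features layer on seasonality handling and confidence intervals.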

    Which tool is right for you?

    If your business needs to collect, track, publish, and share performance metrics, real-time updates, benchmarking, and visual data analytics, then a business intelligence platform is a better option. BI tools offer advanced insights for analyzing data, benchmarking, and analyzing business performance, thus facilitating data-driven decisions.

    A reporting tool, on the other hand, is a good option for micro-reporting requirements, such as reports on how many candidates appeared for an interview or how much revenue your business generated last month. Reporting tools can handle standard volumes of data and can draw limited data streams to produce the final reports.

    A BI platform is commonly used by teams that influence business decisions. Therefore, it might have a more targeted end-user base than reporting software, for which the audience is more general.

    Once you decide which software to use, head to our business intelligence software and reporting software category pages where you can find a sortable list of products, software reviews from verified users, and comprehensive buyers guides.

    If you wish to narrow down your search to only the most popular and highest-rated solutions, visit Capterra’s Shortlist reports for the top business intelligence software and reporting software tools—our reports are based on an analysis of thousands of user reviews.

    Author: Barkha Bali

    Source: Capterra

  • Get the most out of a data lake, avoid building a data swamp

    Get the most out of your data lake, avoid building a data swamp

    As an industry, we’ve been talking about the promise of data lakes for more than a decade. It’s a fantastic concept—to put an end to data silos with a single repository for big data analytics. Imagine having a singular place to house all your data for analytics to support product-led growth and business insight. Sadly, the data lake idea went cold for a while because early attempts were built on Hadoop-based repositories that were on-prem and lacked resources and scalability. We ended up with a “Hadoop hangover.”

    Data lakes of the past were known for management challenges and slow time-to-value. But the accelerated adoption of cloud object storage, along with the exponential growth of data, has made them attractive again.

    In fact, we need data lakes to support data analytics now more than ever. While cloud object storage first became popular as a cost-effective way to temporarily store or archive data, it has caught on because it is inexpensive, secure, durable, and elastic. It’s not only cost-effective but it’s easy to stream data in. These features make the cloud a perfect place to build a data lake—with one addressable exception.

    Data lake or data swamp?

    The economics, built-in security, and scalability of cloud object storage encourage organizations to store more and more data—creating a massive data lake with limitless potential for data analytics. Businesses understand that having more data (not less) can be a strategic advantage. Unfortunately, many data lake initiatives in recent history failed because the data lake became a data swamp—comprised of cold data that could not be easily accessed or used. Many found that it’s easy to send data to the cloud but making it accessible to users across the organization who can analyze that data and act on the insights from it is difficult. These data lakes became a dumping ground for multi-structured datasets, accumulating and collecting digital dust without a glimmer of the promised strategic advantage.

    Simply put, cloud object storage wasn’t built for general-purpose analytics—just as Hadoop wasn’t. To gain insights, data must be transformed and moved out of the lake into an analytical database such as Splunk, MySQL, or Oracle, depending on the use case. This process is complex, slow, and costly. It’s also a challenge because the industry currently faces a shortage of the data engineers who are needed to cleanse and transform data and build the data pipelines needed to get it into these analytical systems.
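
    That transform-and-move work is, in miniature, what a data-pipeline job looks like. A deliberately simplified sketch (the event format is invented, and SQLite stands in for the analytical database) of extracting messy records from a lake file and loading only the clean ones:

```python
import json
import sqlite3

# Hypothetical raw event lines as they might sit in cloud object storage.
raw_lines = [
    '{"user": "a", "action": "login", "ms": 120}',
    'not valid json',  # real lake data is messy; cleansing is part of the job
    '{"user": "b", "action": "search", "ms": 340}',
]

# Extract + transform: parse each line, drop bad records, keep needed fields.
events = []
for line in raw_lines:
    try:
        rec = json.loads(line)
        events.append((rec["user"], rec["action"], rec["ms"]))
    except (json.JSONDecodeError, KeyError):
        continue  # discard malformed records

# Load into an analytical database (SQLite here, purely for illustration).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user TEXT, action TEXT, ms INTEGER)")
con.executemany("INSERT INTO events VALUES (?, ?, ?)", events)
loaded = con.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(loaded)  # 2
```

    Multiply this by hundreds of sources, schemas, and failure modes and the cost and fragility the article describes become clear.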

    Gartner found that more than half of enterprises plan to invest in a data lake within the next 2 years despite these well-known challenges. There are an incredible number of use cases for the data lake, from investigating cyber-breaches through security logs to researching and improving customer experience. It’s no wonder that businesses are still holding onto the promise of the data lake. So how can we clean up the swamp and make sure these efforts don’t fail? And critically, how do we unlock and provide access to data stored in the cloud—the most significant barrier of all?

    Turning up the heat on cold cloud storage

    It’s possible (and preferable) to make cloud object storage hot for data analytics, but it requires rethinking the architecture. We need to make sure the storage has the look and feel of a database, in essence, turning cloud object storage into a high-performance analytics database or warehouse. Having “hot data” requires fast and easy access in minutes—not weeks or months—even when processing tens of terabytes per day. That type of performance requires a different approach to pipelining data, avoiding transformation and movement. The architecture needed is as simple as compressing, indexing, and publishing data to tools such as Kibana and/or Looker via well-known APIs in order to store once and move and process less.
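
    The compress-index-publish idea can be sketched in a few lines, with everything here invented for illustration: gzip the raw chunks as they land, keep a small index of which chunk holds which key, and decompress only the matching chunks at query time, so data is stored once and never bulk-moved:

```python
import gzip
import json

# Hypothetical log chunks, compressed as they land in object storage.
chunks = {}
index = {}  # key -> set of chunk ids, built once at ingest time
for chunk_id, records in enumerate([
    [{"host": "web-1", "msg": "ok"}, {"host": "web-2", "msg": "ok"}],
    [{"host": "db-1", "msg": "slow query"}],
]):
    chunks[chunk_id] = gzip.compress(json.dumps(records).encode())
    for rec in records:
        index.setdefault(rec["host"], set()).add(chunk_id)

def search(host):
    """Decompress only the chunks the index points at, never the whole store."""
    hits = []
    for chunk_id in index.get(host, ()):
        for rec in json.loads(gzip.decompress(chunks[chunk_id])):
            if rec["host"] == host:
                hits.append(rec)
    return hits

print(search("db-1"))  # [{'host': 'db-1', 'msg': 'slow query'}]
```

    A production platform does this at vastly larger scale with sharded indexes and standard query APIs, but the principle is the same: the index, not an ETL copy, makes cold storage queryable.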

    One of the most important ways to turn up the heat on data analytics is by facilitating search. Specifically, search is the ultimate democratizer of data, allowing for self-service data stream selection and publishing without IT admins or database engineers. All data should be fully searchable and available for analysis using existing data tools. Imagine giving users the ability to search and query at will, easily asking questions and analyzing data with ease. Most of the better-known data warehouse and data lakehouse platforms don’t provide this critical functionality.

    But some forward-leaning enterprises have found a way. Take, for example, BAI Communications, whose data lake strategy embraces this type of architecture. In major commuter cities, BAI provides state-of-the-art communications infrastructure (cellular, Wi-Fi, broadcast, radio, and IP networks). BAI streams its data to a centralized data lake built on Amazon S3 cloud object storage, where it is secure and compliant with numerous government regulations. Using its data lake built on cloud object storage, which has been activated for analytics through a multi-API data lake platform, BAI can find, access, and analyze its data faster, more easily, and in a more cost-controlled manner than ever before. The company is using insights generated from its global networks over multiple years to help rail operators maintain the flow of traffic and optimize routes, turning data insights into business value. This approach proved especially valuable when the pandemic hit, since BAI was able to deeply understand how COVID-19 impacted public transit networks regionally, all around the world, so it could continue providing critical connectivity to citizens.

    Another example is Blackboard, the leader in education technology serving K–12 education, business, and government clients. Blackboard’s product development team typically used log analytics to monitor cloud deployments of the company’s SaaS-based learning management system (LMS) in order to troubleshoot application issues. But when COVID-19 hit, millions of students switched to online learning and those log volumes skyrocketed—product usage grew by 3,000% in 2020 when the world went virtual. Its custom-managed ELK (Elasticsearch, Logstash, Kibana) stacks and managed Elasticsearch service for centralized log management couldn’t support the new log volumes—at a time when that log data was most valuable. The Blackboard team needed to be able to analyze short-term data for troubleshooting but also long-term data for deeper analysis and compliance purposes. The Blackboard team moved its log data to a data lake platform running directly on Amazon S3 and serving analytics to end users via Kibana, which is included natively under the hood. The company now has day-to-day visibility of cloud computing environments at scale, app troubleshooting and alerting over long periods of time, root cause analysis without data retention limits, and fast resolution of application performance issues.
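
    Log analytics of this kind starts with parsing and aggregating raw log lines. A toy sketch (the log format is made up) that counts errors per service, the sort of signal that drives troubleshooting and root cause analysis:

```python
from collections import Counter

# Hypothetical application log lines; real formats vary per stack.
log_lines = [
    "2020-04-01T10:00:00 lms-api ERROR timeout on /submit",
    "2020-04-01T10:00:01 lms-api INFO request ok",
    "2020-04-01T10:00:02 grader ERROR out of memory",
    "2020-04-01T10:00:03 lms-api ERROR timeout on /submit",
]

# Triage often begins with a simple error count per service:
# the second whitespace-separated field here is the service name.
errors = Counter(
    line.split()[1] for line in log_lines if " ERROR " in line
)
print(errors.most_common())  # [('lms-api', 2), ('grader', 1)]
```

    At Blackboard’s scale the same aggregation runs over billions of lines, which is exactly why retention limits and ingest capacity became the bottleneck.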

    Now we’re cooking

    Cloud storage has the potential to truly democratize data analytics for businesses. There’s no better or more cost-effective place to store a company’s treasure trove of information. The trick is unlocking cloud object storage for analytics without data movement or pipelining. Many data lake, warehouse, and even lakehouse providers have the right idea, but their underlying architectures are based on 1970s computer science, making the process brittle, complex, and slow.

    If you are developing or implementing a data lake and want to avoid building a swamp—ask yourself these questions:

    • What business use cases or analytics questions should we be able to address with the data lake?
    • How will data get into the data lake?
    • How will users across the organization get access to the data in the lake?
    • What analytics tools need to be connected to the data lake to facilitate the democratization of insights?

    It is important to find a solution that allows you to turn up the heat in the data lake with a platform that is cost-effective, elastically scalable, fast, and easily accessible. A winning solution allows business analysts to query all the data in the data lake using the BI tools they know and love, without any data movement, transformation, or governance risk.

    Author: Thomas Hazel

    Source: Database Trends & Applications

  • How can you make use of business intelligence as a small business?

    How can you make use of business intelligence as a small business?

    Imagine having to find a pen while blindfolded, with no help and no idea where it might be. That is more or less what you are doing when you run your business without data and analytics.

    The importance of business intelligence

    Many small businesses are still in the dark when it comes to business intelligence and efficiently acquiring, analyzing and using data. They often assume this is reserved for large companies with enormous budgets. That is a shame, because it means they miss out on the many benefits of business intelligence, including the following:

    • It enables you to make better business decisions based on clear information instead of gut feeling
    • Business intelligence gives you a clear overview of your company, so you gain immediate insight into your organization’s strengths and bottlenecks.
    • With business intelligence you can get a better sense of the behavior of (potential) customers and the patterns in it. With this information you can ensure more satisfied customers.
    • Business intelligence helps your employees work more effectively and thus improves productivity. This keeps your company competitive!
    • Generate more revenue and lower your costs through better insight into your sales and business processes.

    In short, business intelligence leads to more revenue, lower costs, more satisfied customers and happier, more effective staff. Reason enough to think seriously about making greater use of data, analytics and business intelligence.

    Where do you start?

    I can hear you thinking: “That’s all well and good, but how do I start using business intelligence?”

    First, it is important to set one clear goal for your organization, along with the KPIs (key performance indicators) that are relevant to you. Given the versatility of business intelligence, it can be tempting to implement it across every part of your company in order to reap all the benefits listed above. Because of the complexity of business intelligence, that is a beginner’s mistake. You are better off starting with one specific area and expanding once that process runs smoothly.

    As a company, you need the following 3 things:

    • Excellent data sources
    • A business intelligence system
    • Human expertise

    First of all, it is important to collect the right data. Think of website data from Google Analytics, financial statements from your accounting system, customer information from the CRM, marketing and sales information, or one of your company’s databases. Without reliable data, business intelligence is pointless.

    Which business intelligence system suits you?

    Depending on your company and capabilities, there are several options available. If you have the budget and the people, you can handle the entire business intelligence process in-house. This is mainly an option for midsize to large companies.

    Smaller to midsize companies should generally look at a SaaS (Software as a Service) solution. These are built with simple integration with your data sources in mind. They often make it faster, easier and cheaper to gain insight into your business. They tend to be less flexible and robust than a custom solution, but for most companies they offer more than enough functionality.

    Some leading business intelligence services are Tableau, Cluvio and Google Data Studio. Most likely, one of these services offers everything that matters to your company.

    The human factor in business intelligence

    The platforms above make business intelligence considerably easier, but a specialist in this area remains necessary. If there is room for growth within your company, it may be a good idea to train someone to become a business intelligence professional. One way to do this is to have someone take the written course Business Intelligence Professional.

    Another training we highly recommend in the field of business intelligence is Data Science & Business Analytics. With this training, you or one of your colleagues will learn to interpret complex data and translate it into new ideas and innovations. Bringing this knowledge in-house ensures that your company can stay a step ahead of the competition by working smarter and making better decisions.

    Source: IMF Academy

  • Integration Will Accelerate Internet Of Things, Industrial Analytics Growth In 2017

    • Enabling real-time integration across on-premises and cloud platforms often involves integrating SAP, Salesforce, third-party and legacy systems. 2017 will be a break-out year for real-time integration between SAP, Salesforce, and third-party systems in support of Internet of Things and Industrial Analytics.
    • McKinsey Global Institute predicts that the Internet of Things (IoT) will generate up to $11T in value to the global economy by 2025
    • Predictive and prescriptive maintenance of machines (79%), customer/marketing related analytics (77%) and analysis of product usage in the field (76%) are the top three applications of Industrial Analytics in the next 1 to 3 years.

    Real-Time Integration Is the Cornerstone Of Industrial Analytics

Industrial Analytics (IA) describes the collection, analysis and usage of data generated in industrial operations and throughout the entire product lifecycle, applicable to any company that manufactures and sells physical products. It involves traditional methods of data capture and statistical modeling. Integration across legacy, third-party, Salesforce, and SAP systems is one of the most foundational technologies that Industrial Analytics relies on today and will rely on in the future. Real-time integration is essential for enabling connectivity between Internet of Things (IoT) devices, in addition to enabling improved methods for analyzing and interpreting data. One of the most innovative companies in this area is enosiX, a leading global provider of Salesforce and SAP integration applications and solutions. They're an interesting startup to watch and have successfully deployed their integration solutions at Bunn, Techtronic Industries, YETI Coolers and other leading companies globally.

A recently published study highlights just how foundational integration will be to Industrial Analytics and IoT. You can download the Industrial-Analytics-Report-2016-2017.pdf. The study was initiated and governed by the Digital Analytics Association e.V. Germany (DAAG), which runs a professional working group on the topic of Industrial Analytics. Research firm IoT Analytics GmbH was selected to conduct the study, and interviews with 151 analytics professionals and decision-makers in industrial companies were completed as part of it. Hewlett-Packard Enterprise and the data science service companies Comma Soft and Kiana Systems sponsored the research. All research and analysis steps required for the study, including interviewing respondents, data gathering, data analysis and interpretation, were conducted by IoT Analytics GmbH. Please see page 52 of the study for the methodology.

    Key Takeaways:

• With real-time integration, organizations will be able to increase revenue (33.1%), increase customer satisfaction (22.1%) and increase product quality (11%) using Industrial Analytics. The majority of industrial organizations see Industrial Analytics as a catalyst for future revenue growth, not primarily as a means of cost reduction. Upgrading existing products, changing the business model of existing products, and creating new business models are three typical approaches companies are taking to generate revenue from Industrial Analytics. Integration is the fuel that will drive Industrial Analytics in 2017 and beyond.

[Graphic: Biggest benefits of Industrial Analytics]

• For many manufacturers, the more pervasive their real-time SAP integration is, the more effective their IoT and Industrial Analytics strategies will be. Manufacturers adopting this approach to integration and enabling Industrial Analytics throughout their operations will be able to attain predictive and prescriptive maintenance of their machines (79%). This area of predictive maintenance is the most important application of Industrial Analytics in the next 1 to 3 years. Customer/marketing-related analytics (77%) and analysis of product usage in the field (76%) are the second- and third-most important. The following graphic provides an overview of the 13 most important applications of Industrial Analytics.

[Graphic: Most important applications of Industrial Analytics]
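Predictive maintenance, the top-ranked application above, boils down to flagging sensor readings that deviate from a machine's recent behavior before a failure occurs. As a minimal illustration (not taken from the study; the function name and data are hypothetical), a rolling-mean threshold check might look like this:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, k=2.0):
    """Flag readings that deviate more than k standard deviations
    from the mean of the preceding `window` readings."""
    flags = []
    for i, value in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < window:
            flags.append(False)  # not enough history to judge yet
            continue
        mu, sigma = mean(history), stdev(history)
        flags.append(sigma > 0 and abs(value - mu) > k * sigma)
    return flags

# Vibration readings from a hypothetical machine sensor; the spike
# at the end would trigger a maintenance inspection.
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 1.1, 0.95, 5.0]
print(flag_anomalies(readings))  # last reading flagged as anomalous
```

Real industrial deployments replace this simple threshold with trained models, but the pattern — compare live sensor data against a learned baseline in real time — is the same, which is why the report treats real-time integration of sensor and MES/SCADA data as foundational.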

    • 68% of decision-makers have a company-wide data analytics strategy, 46% have a dedicated organizational unit and only 30% have completed actual projects, further underscoring the enabling role of integration in their analytics and IoT strategies. The study found that out of the remaining 70% of industrial organizations, the majority of firms have ongoing projects in the prototyping phase.
  [Graphic: Data analytics strategy]
    • Business Intelligence (BI) tools, Predictive Analytics tools and Advanced Analytics Platforms will be pivotal to enabling industrial data analysis in the next five years. Business Intelligence Tools such as SAP Business Objects will increase in importance to industrial manufacturing leaders from 39% to 77% in the next five years. Predictive Analytics tools such as HPE Haven Predictive Analytics will increase from 32% to 69%. The role of spreadsheets used for industrial data analytics is expected to decline (i.e., 27% think it is important in 5 years vs. 54% today).

[Graphic: Advanced analytics and BI tools]

• The Industrial Analytics technology stack is designed to scale based on the integration of legacy systems, industrial automation apps and systems, and MES and SCADA systems, combined with sensor-based data. IoT Analytics GmbH defines the technology stack as four components: data sources, necessary infrastructure, analytics tools, and applications. The following graphic illustrates the technology stack and underscores how essential integration is to realizing the vision of Industrial Analytics.

[Graphic: Industrial Analytics technology stack]
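The four-layer stack described above can be sketched as a simple data model. The layer names come from the study; the example entries are illustrative assumptions, not items from the report:

```python
from dataclasses import dataclass, field

@dataclass
class IndustrialAnalyticsStack:
    """The four stack components defined by IoT Analytics GmbH."""
    data_sources: list = field(default_factory=list)    # layer 1: where data originates
    infrastructure: list = field(default_factory=list)  # layer 2: transport and storage
    analytics_tools: list = field(default_factory=list) # layer 3: analysis
    applications: list = field(default_factory=list)    # layer 4: business outcomes

# Hypothetical example of how a manufacturer might populate the layers.
stack = IndustrialAnalyticsStack(
    data_sources=["legacy ERP", "MES", "SCADA", "shop-floor sensors"],
    infrastructure=["message broker", "time-series store"],
    analytics_tools=["BI dashboards", "predictive models"],
    applications=["predictive maintenance", "field usage analysis"],
)
print(stack.applications[0])  # the top application named in the study
```

The point of the layered view is that each layer depends on integration with the one below it: without the data-source and infrastructure layers feeding it, the analytics layer has nothing to analyze.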

• Industrial Internet of Things (IIoT) and Industry 4.0 will rely on real-time integration to enable an era of shop-floor smart sensors that can make autonomous decisions and trade-offs regarding manufacturing execution. IoT Analytics GmbH predicts this will lead to smart processes and smart products that communicate within production environments and learn from their decisions, improving performance over time. The study suggests that Manufacturing Execution System (MES) agents will be vertically integrated into higher-level enterprise planning and product change management processes so that these organizations can synchronously orchestrate the flow of data, rather than go through each layer individually.

[Graphic: Game changer]

Source: business2community.com, 19 December 2016

• BI tools market news roundup, July 2019

BI tools market news roundup, July 2019

Throughout the month of May the market for BI platforms and analytics more or less ambled along, but early June brought breaking news. The BI community was startled by the acquisition of two promising BI tool vendors, and sobering failures came to light at two well-known big data specialists.

Looker and Tableau acquired

We start with the acquisitions. First up was Looker, a 'niche player' based in Santa Cruz in the US. Looker, best known for LookML as an alternative to SQL, 'disappeared' into the Google cloud for $2.6 billion. Barely recovered from that shock, the market then learned that Salesforce is swallowing Seattle-based 'leader' Tableau for the equivalent of $15.7 billion. Will customers still have anything left to choose from in the long run?

At this point it is hard to gauge the exact consequences for current and future customers, distributors, and other partners of the two acquisition targets. As usual there are plenty of reassuring words, with talk of synergy benefits, greater financial muscle, and enormous marketing opportunities for both Looker and Tableau, all under the wings of the two well-known tech giants.

Greater dependence on dominant market players

One thing is certain: this latest 'wave' of acquisitions in the BI world means even greater customer dependence on a few dominant market players, which generally does not strengthen customers' negotiating position. Moreover, it remains to be seen to what extent the acquiring parties are willing to keep investing in innovation of the acquired BI players' product and service portfolios, and how long Google and Salesforce will continue to invest in the Looker and Tableau brand names. Zooming in on the announced acquisitions, a number of details stand out.

Capitalizing on digital transformation

Salesforce is paying the existing shareholders of the publicly listed Tableau in shares, while Google is paying the shareholders of the privately funded Looker in cash. The transaction values also differ considerably: Tableau fetches roughly six times more than Looker. That makes sense, since Looker was until recently still in its Series E funding round while Tableau is more mature; publicly listed since 2013, Tableau has had longer to build a higher market capitalization.

Data is the foundation

Salesforce says that with Tableau it has a rock-solid proposition. The corporate cultures align and, according to both companies, are strongly focused on customer success. Together they hope to act as a 'supercharger' for digital transformation: the number 1 in the CRM market and the number 1 in end-to-end analytics platforms are combining their platforms to help customers with that transformation. 'Data is the foundation of every digital transformation,' Salesforce states. Tableau CEO Adam Selipsky expects that, with Salesforce's help, he can serve millions of new end users with his platform and 'actionable insight'.

The potential market of companies struggling with digital transformation is estimated by research firm IDC at no less than $1.8 trillion.

Not just seeing data, but understanding it

In short: the vendors mainly seem to be counting their chickens before they hatch. Whether end users will benefit directly remains to be seen. Salesforce expects a lot from Customer 360 and Einstein, its own platform, which already provides customers with AI-based insights for sales and marketing. Combined with Tableau's intuitive self-service analytics platform, it hopes to reach an even broader audience of customers and end users at every stage of the customer journey and at every touchpoint. The mission is to help users not just see data, but understand it.

Everything looks like business as usual

In sixteen years Tableau has built a customer base of some 86,000 customers worldwide, including Netflix, Verizon and Schneider Electric. As part of Salesforce, Tableau will continue to operate as an independent brand from the same Seattle headquarters, led by the current management team and the same CEO. 'Never change a winning team' appears to be the motto. Both parties also expect their communities, the 1.4 million Trailblazers and the 1-million-strong Tableau community, to do nothing but reinforce each other.

Companies hope for the famed leverage effect

Looker, too, expects to reap rewards after its acquisition by Google, which is formally part of parent company Alphabet. Once the transaction closes and the $2.6 billion purchase price is paid, Looker will be folded directly into Google Cloud.

Looker, founded in 2012 and headquartered in Santa Cruz, is looking forward above all to greater reach, and expects to have more resources and intellectual capital at its disposal to market its so-called unified platform for BI, big data applications, and embedded analytics. From Google Cloud, Looker will enable its customers to benefit from the leverage that comes from combining analytics, machine learning, and AI.

Looker has a customer base of some 1,700 companies and institutions, including well-known names such as Amazon, Etsy, IBM, Kickstarter, Lyft, Sony, Spotify and The Economist. For Google, the Looker deal is its largest acquisition since Nest, the smart 'learning' thermostat that, among other things, adapts the heating of your home to your living pattern and can be controlled remotely from your smartphone. Like Tableau, Looker expects the acquisition to help it better support companies in their digital transformation.

The acquisition of Looker by Google does not come entirely out of the blue. The two companies already had a partnership through which they served 350 joint customers, including Buzzfeed, Hearst, King and Yahoo!

Fast-growing demand for location data

Even before the announced Salesforce acquisition, Tableau introduced new mapping capabilities for smoother and smarter use of location data. Version 2019.2 offers functionality for analyzing geospatial data and new so-called authoring capabilities for content creators.

New features in Ask Data

The new release extends the Mapbox-powered background mapping technology with vector maps, letting users surface and analyze more detailed location data against background layers with more contextual information. According to the vendor, additional parameter settings also provide improved visual interactivity in the new release. There are also improved tools for building dashboards, and new features in Ask Data, Tableau's natural language processing (NLP) technology. The vendor sees growing demand for location data.

Analysis without interruption

According to a recent analyst report, 30% of all customer interactions in 2022 will be driven by real-time analysis of the customer's location, up from just 4% in 2017. Tableau's new vector maps offer more detailed information and smoother exploration than mapping solutions that use images: when users zoom in, zoom out, or pan, the software automatically rescales the map without having to load images, so users experience no interruption in their analysis flow.

Unicorn worth $1 billion has to close its doors

In its heyday, Santa Clara-based MapR was valued at more than $1 billion. The so-called 'unicorn' grew into an Apache Hadoop specialist, but must now close its doors unless a buyer can still be found. Despite the enormous hype, Hadoop never really took off: according to Gartner, only 14% of enterprises were using the open-source software, widely perceived as slow and complex, in 2017. The CEO has been laid off along with 122 employees.

Recent megamerger fails to deliver

At Cloudera, 'the enterprise data cloud company', the exodus has also begun. The CEO has been replaced, and the recent megamerger with rival Hortonworks has apparently not delivered the desired result. The financial results are a serious disappointment. Moreover, existing customers of the merger partners are postponing new investments because of the resulting uncertainty and the lack of a joint roadmap. Competition from the public cloud vendors is also hurting Cloudera.

Bad news for end users

However you look at it, this is all bad news for the customers and end users who once, in good faith, partnered with these on-paper promising vendors. The moral of the story: do your homework and try, as far as possible, to get a clear picture of the roadmap, the USPs, the quality of the management, reviews, user experiences, and so on. And last but not least, stay away from the hype as much as possible!

Author: Eric van der Steen

Source: Passionned
