11 items tagged "Google"

  • 2016 will be the year of artificial intelligence

    December is traditionally the time of year to look back, and New Year's Eve is of course the best day of all for that. At Numrush, however, we prefer to look ahead. We already did so in early December with our RUSH Magazine: in that Gift Guide we gave gift tips based on a number of themes we will be hearing a lot about in the coming year.

    One subject deliberately remained somewhat underexposed in our Gift Guide, partly because it is not something you give as a present, but also because it really transcends the various themes: artificial intelligence. That is nothing new, of course; a great deal has already happened in this field, but in the coming year its application will accelerate even further.

  • Big Data on the cloud makes economic sense

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness


    On 27 September, enterprise software company SAP SE completed the acquisition of Altiscale Inc.—a provider of Big Data as-a-Service (BDaaS). The news came close on the heels of data management and analytics company Cloudera Inc. and data and communication services provider CenturyLink Inc. jointly announcing BDaaS services. Another BDaaS vendor, Qubole Inc., said it would offer a big data service solution for the Oracle Cloud Platform.

    These are cases in point of the growing trend to offer big data analytics using a cloud model. Cloud computing allows enterprises to pay for software modules or services used over a network, typically the Internet, on a monthly or other periodic basis, sparing them the relatively large upfront costs of licences and infrastructure. Big Data analytics solutions enable companies to analyse multiple data sources, especially large data sets, to make more informed decisions.

    According to research firm International Data Corporation (IDC), the global big data technology and services market is expected to grow at a compound annual growth rate (CAGR) of 23.1% over 2014-2019, and annual spending is estimated to reach $48.6 billion in 2019.


    MarketsandMarkets, a research firm, estimates the BDaaS segment will grow from $1.8 billion in 2015 to $7 billion in 2020. There are other, even more optimistic estimates: research firm Technavio, for instance, forecasts this segment to grow at a CAGR of 60% from 2016 to 2020.
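As a quick sanity check on such forecasts, the growth rate implied by a pair of market-size figures can be computed directly. The short sketch below uses the MarketsandMarkets numbers quoted above; it is an illustration, not a figure taken from the cited reports:

```python
def implied_cagr(start_value, end_value, years):
    """Compound annual growth rate (in percent) implied by a start
    value, an end value and the number of years between them."""
    return ((end_value / start_value) ** (1 / years) - 1) * 100

# BDaaS from $1.8 billion (2015) to $7 billion (2020), i.e. 5 years
print(round(implied_cagr(1.8, 7.0, 5), 1))  # roughly 31% per year
```

That implied rate sits between IDC's 23.1% estimate for the wider market and Technavio's 60% forecast for the BDaaS segment.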

    Where does this optimism stem from?

    For almost 10 years, it was only the biggest of technology firms, such as Alphabet Inc.’s Google and Amazon.com Inc., that used data analytics on a scale that justified the idea of ‘big’ in Big Data. In industry parlance, three key attributes are often used to understand the concept of Big Data: the volume, velocity and variety of data—collectively called the 3Vs.

    Increasingly, not just Google and its rivals, but a much wider swathe of enterprises are storing, accessing and analysing a mountain of structured and unstructured data. The trend is necessitated by growing connectivity, falling cost of storage, proliferation of smartphones and huge popularity of social media platforms—enabling data-intensive interactions not only among ‘social friends’ but also among employers and employees, manufacturers and suppliers, retailers and consumers—virtually all sorts of connected communities of people.

    A November 2015 IDC report predicts that by 2020, organisations that are able to analyse all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their less analytically oriented peers.

    The nascent nature of BDaaS, however, is causing some confusion in the market. In a 6 September article on Nextplatform.com, Prat Moghe, founder and chief executive of Cazena—a services vendor—wrote that there is confusion regarding the availability of “canned analytics or reports”. According to him, vendors (solutions providers) should be carefully evaluated, and aspects such as moving data sets between different cloud and on-premises systems, ease of configuration of the platform, etc., need to be kept in mind before making a purchase decision.

    “Some BDaaS providers make it easy to move datasets between different engines; others require building your own integrations. Some BDaaS vendors have their own analytics interfaces; others support industry-standard visualization tools (Tableau, Spotfire, etc.) or programming languages like R and Python. BDaaS vendors have different approaches, which should be carefully evaluated,” he wrote.

    Nevertheless, the teething troubles are likely to be far outweighed by the benefits that BDaaS brings to the table. The key drivers, according to the IDC report cited above, include digital transformation initiatives being undertaken by many enterprises; the merging of real life with digital identity as all forms of personal data become available in the cloud; the availability of multiple payment and usage options for BDaaS; and the ability of BDaaS to put more analytics power in the hands of business users.

    Another factor that will ensure growth of BDaaS is the scarcity of skills in cloud as well as analytics technologies. Compared to individual enterprises, cloud service providers such as Google, Microsoft Corp., Amazon Web Services and International Business Machines Corp. (IBM) can attract and retain talent more easily and for longer durations.

    Manish Mittal, managing principal and head of global delivery at Axtria, a medium-sized Big Data analytics solutions provider, says the adoption of BDaaS in India is often driven by business users. While the need is felt by both chief information officers and business leaders, he believes that the latter often drive adoption as they feel more empowered in the organisation.

    The potential for BDaaS in India can be gauged from Axtria’s year-on-year business growth of 60% for the past few years—and there are several niche big data analytics vendors currently operating in the country (besides large software companies).

    Mittal says that the growth of BDaaS adoption will depend on how quickly companies tackle the issue of improving data quality.

    Source: livemint.com, October 10, 2016


  • Business Intelligence software comparison: Looker versus Power BI


    Looker and Microsoft Power BI are two popular business intelligence software solutions. Each solution has robust features that help companies make better decisions based on their data.

    Here, we compare Looker and Power BI on several factors.

    What is the difference between Looker and Power BI?

    • While both software solutions are deployed in the cloud, Power BI is only available in the Microsoft Azure environment. Looker, on the other hand, can be deployed in multiple cloud environments (e.g., Azure, Google Cloud, AWS).
    • Looker requires a custom price quote, but Power BI’s pricing is displayed on its website.
    • Both vendors offer similar customer support features. However, Looker has built-in live chat, while Power BI doesn’t.

    Product Overviews

    The following excerpts from our Looker and Power BI reviews summarize each software solution:


    Looker

    Google Cloud’s Looker Data Platform is a data-discovery platform that helps companies make better business decisions through real-time access to data. Data, no matter the size, can be analyzed within Looker’s 100% in-database and 100% browser-based platform. Looker analytics integrate with any SQL database or data warehouse, such as Amazon Athena, Greenplum and Microsoft Azure SQL Data Warehouse.

    The platform is accessible on any browser as well as on mobile devices – reports and data can be shared via email or URLs. Data can be integrated into different applications such as Google Docs, Excel, and even customer and third-party applications. It can also be manually exported as .csv or text files.

    Power BI

    Microsoft Power BI is a cloud-based business intelligence and analytics service that provides users with a full overview of their most critical data. Connecting to all of a company's data sources, Power BI simplifies data evaluation and sharing with scalable dashboards, interactive reports, embedded visuals and more.

    There are several versions. Power BI Desktop is an on-premises solution that connects to a cloud-based service to easily upload and share data and information across the company. Power BI Mobile has the same capabilities, but in an app, so users can have a 360-degree view of their company anytime, anywhere.

    Power BI has two additional versions with advanced features: Power BI Pro and Power BI Premium. Power BI Pro allows users to share insights with collaborators. They can analyze data by using conversational language (the Q&A feature) and get insights from pre-built visualization and reporting templates. Users can share data in other Microsoft platforms like Teams or Dynamics 365. Power BI Premium allows an organization’s internal or external users (e.g., clients) to view Power BI reports without a license. It also empowers users to prepare and transform big datasets.

    Features Comparisons

    Data Preparation


    Looker

    Looker connects with over 50 SQL-compliant databases (e.g., Google BigQuery, Amazon RedShift, Snowflake). Its proprietary LookML data modeling tool lets users develop custom data models. After connecting with a SQL database, Looker automatically creates a basic LookML model. Users can then build on that model by adding their own code or by selecting from over 100 pre-built Looker Blocks, as well as customizing the model with unique metrics. The LookML model can be saved for future use, so users won’t need to write SQL queries each time. LookML can also perform any data transformations at the time of the query, so users won’t need to manually store the data elsewhere.
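The value of a saved data model is that queries are generated from it rather than hand-written each time. The toy Python sketch below illustrates that idea only; it is not Looker's actual LookML syntax or API, and the table and field names are made up:

```python
# A saved "model" describes a table and its metrics once...
model = {
    "table": "orders",
    "dimensions": ["region"],
    "measures": {"total_revenue": "SUM(amount)"},
}

def query_from_model(m, measure):
    """Generate a SQL query from the saved model, so users never
    hand-write SQL for each question."""
    cols = ", ".join(m["dimensions"])
    return (f"SELECT {cols}, {m['measures'][measure]} AS {measure} "
            f"FROM {m['table']} GROUP BY {cols}")

print(query_from_model(model, "total_revenue"))
# SELECT region, SUM(amount) AS total_revenue FROM orders GROUP BY region
```

Once the model is defined, every new question reuses the same definitions, which is the point of tools like LookML.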

    Power BI

    Power BI connects with over 100 cloud-based and on-premises data sources (e.g., Excel, Salesforce, Azure SQL database). It includes automated daily refreshes, so data stays up to date. Note that the number of daily refreshes depends on the tier: Power BI Pro performs up to eight refreshes per day, while Power BI Premium refreshes data up to 48 times daily. Users can prepare their data with the Power Query tool. Power Query has prebuilt transformation functions (e.g., removing columns, filtering rows) that users can simply select from within the interface. If a user needs to add a new transformation function, Power Query automatically generates the M code, so the user doesn’t need to write any new code.
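As a rough analogy for what such a chain of prebuilt transformation functions does (this is plain Python with made-up sample rows, not Power Query's M language):

```python
rows = [
    {"name": "Acme", "region": "EU", "revenue": 120, "internal_id": 7},
    {"name": "Bolt", "region": "US", "revenue": 80,  "internal_id": 9},
]

def remove_columns(data, cols):
    """Drop the named columns from every row."""
    return [{k: v for k, v in r.items() if k not in cols} for r in data]

def filter_rows(data, predicate):
    """Keep only rows matching the predicate."""
    return [r for r in data if predicate(r)]

# Chain the steps, much as Power Query chains its prebuilt functions
cleaned = filter_rows(remove_columns(rows, {"internal_id"}),
                      lambda r: r["revenue"] > 100)
print(cleaned)  # [{'name': 'Acme', 'region': 'EU', 'revenue': 120}]
```

Each step is a small, selectable transformation, which is what lets non-programmers assemble a preparation pipeline in the interface.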

    Bottom Line

    Both software solutions connect with a variety of data sources, but their data transformation features differ. Looker users won’t need to perform any data transformation themselves, as the software handles that automatically, while Power BI users will need to perform some minimal transformation work manually.

    Data Visualizations and Dashboards


    Looker

    Looker has interactive visualizations and dashboards that let users drill down into the data at row-level detail. It includes a library of visualization templates in different formats (e.g., timelines, funnels, combined charts, treemaps), but users can build their own visualizations, too. It also includes drag-and-drop functionality for users to explore and ask questions of their data. Users can also take action or complete a task directly in their data, such as sending an email or a Slack chat, with the Actions feature, which has prebuilt integrations for third-party software systems.

    Power BI

    Users can create visualizations using the drag-and-drop tool by choosing from many templates from Microsoft and third-party providers or by creating their own. They can create mobile-optimized reports by selecting the Phone Layout button, and then adjusting the chart or graph. The Power BI Pro tier lets users share the reports with their external or internal colleagues. Those colleagues can add comments to the visualization, as well as subscribe for updates or alerts.

    Bottom Line

    Both software solutions have robust visualization and dashboard options, but Looker currently doesn’t let other users add comments to a visualization, while Power BI does. Note, however, that thanks to its Actions feature, Looker users can perform tasks in their data without logging onto their third-party system account; Power BI doesn’t have this functionality.

    Embedded Analytics


    Looker

    Companies can offer their clients and employees the opportunity to analyze data within their existing third-party application. Developers can build their app with Looker’s native integrations and public APIs, and they can also collaborate using Git version-control and workflow features. Companies can grow their revenue streams by offering product tiers that feature customization, ad hoc analysis, data granularity and admin reporting functionalities.

    Power BI

    With Power BI Embedded on Azure, developers and software vendors can insert Power BI into their software application using JavaScript SDK, and monitor performance and automations with REST APIs. Some of its features include custom-built visuals, the ability for end users to create and edit dashboards, and single- and multi-tenant deployment with row-level security.
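As a minimal sketch of the REST side, an embedding application typically requests an embed token for a specific report. The snippet below only constructs the endpoint path of that call (Power BI's GenerateToken endpoint); the workspace and report IDs are hypothetical placeholders, and authentication is omitted entirely:

```python
# Workspace and report IDs below are hypothetical placeholders
workspace_id = "00000000-0000-0000-0000-000000000001"
report_id = "00000000-0000-0000-0000-000000000002"

def embed_token_url(base, group_id, rep_id):
    """Build the endpoint path for requesting a report embed token
    (the GenerateToken call in Power BI's REST API)."""
    return f"{base}/groups/{group_id}/reports/{rep_id}/GenerateToken"

url = embed_token_url("https://api.powerbi.com/v1.0/myorg",
                      workspace_id, report_id)
print(url)
```

The token returned by a real call to this endpoint is what the JavaScript SDK then uses to render the report inside the host application.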

    Bottom Line

    Looker has a multi-cloud data platform, but Power BI can only be deployed in the Microsoft Azure environment.



    Pricing

    Looker

    Looker provides custom pricing that’s tailored to each organization based on deployment scale, number of users and other factors. Prospects will need to contact the vendor directly for a quote. Looker does offer discounts to nonprofits and schools.

    Power BI

    Power BI Desktop is free to individual users. Here’s the pricing for Power BI Pro and Power BI Premium:

    • Pro – The Pro version costs $9.99 per user, per month and includes a mobile app, the ability to publish and share reports, a 1 GB model size limit, eight data refreshes daily, the ability to connect to over 100 data sources, embedded APIs and controls, AI visuals, data security and encryption, metrics for content creation and publishing and up to 10 GB per user maximum storage. Pro is available for free for companies that have the Microsoft 365 E5 solution.
    • Premium (per user) – The Premium per user plan costs $20 per user, per month. It includes all of the features of the Pro plan, plus paginated reports, a 100 GB model size limit, 48 data refreshes daily, advanced AI features, XMLA endpoint read/write connectivity, data flows, the ability to analyze data stored in Azure Data Lake Storage, application lifecycle management and up to 100 TB of maximum storage.
    • Premium (per capacity) – This version starts at $4,995 per month, per dedicated cloud compute and storage resource. It includes all of the features of the Premium per user plan, plus on-premise reporting, a 400 GB model limit, multi-location deployment management, Bring Your Own Key (BYOK) and autoscale add-on.

    Bottom Line

    Looker prefers to provide custom quotes to prospective clients, while Microsoft displays its pricing up front, which may be preferable to some customers.

    Customer Support


    Looker

    Users can access Looker’s Help Center to:

    • Browse articles
    • Submit a support request

    In addition, the Looker Community allows users to network with other users and receive troubleshooting tips, ideas and best practices.

    Users can also email the support team or contact them via live chat directly from the Looker platform. Looker has the following response times based on severity (during normal business hours):

    • Severity 1 (critical) – Up to one hour
    • Severity 2 (urgent) – Up to four hours
    • Severity 3 (tolerable) and Severity 4 (non-critical) – Up to one business day

    Finally, Looker offers guided learning paths via its Looker Connect solution for users to grow their BI skills.

    Power BI

    Microsoft offers several resources:

    • FAQs
    • In-depth documentation on tools and features
    • Guided learning courses
    • A community forum
    • Samples of Power BI reports and dashboards

    In addition, Power BI Pro users can submit a support ticket.

    Bottom Line

    While both vendors provide customers with a help center and community forum, Looker has live in-app chat, and Power BI doesn’t offer this feature at all.

    Author: Melissa Pardo-Bunte

    Source: BetterBuys

  • DeepMind will use algorithms to predict blindness

    DeepMind, one of the subsidiaries of search giant Google and a company that conducts research into self-learning computers, is going to assist in research into blindness. DeepMind will collaborate with the British health service NHS to teach its technology to detect the first signs of blindness.

    To that end, DeepMind will be supplied with one million anonymised eye scans. The software will scan them and, based on the accompanying information, should learn which scans show an eye disease and which do not. The aim is for the software to eventually learn to recognise the first signs of eye diseases on its own.

    For now the focus is on two relatively common forms of blindness: age-related macular degeneration and diabetic retinopathy. People with diabetes, for example, are 25 times more likely to go blind than people without diabetes. Recognising such cases early could help prevent blindness.

    The head of the hospital's eye department, Professor Peng Tee Khaw, says the technology could help detect eye diseases in patients quickly. "These scans are incredibly detailed, more detailed even than any other scans we have of the body. We see images at the cellular level. But the problem is precisely that they provide so much data."

    That is where the idea of using DeepMind comes from. "I need all the experience of my entire life to be able to follow a patient's history. But patients rely on my experience to predict their future. If we can use self-learning technology, we could do this much better, because then I would have the experience of some 10,000 lives."

    Source: Techzine.nl

  • European Union to Scrutinize Usage of Big Data by Large Internet Companies

    The European Union is considering whether the way large Internet companies, such as Alphabet Inc.’s Google or Facebook Inc., collect vast quantities of data is in breach of antitrust rules, the bloc’s competition chief, Margrethe Vestager, said Sunday.

    “If a company’s use of data is so bad for competition that it outweighs the benefits, we may have to step in to restore a level playing field,” said Margrethe Vestager, European Commissioner for Competition, according to a text of her speech delivered at the Digital Life Design conference in Munich, Germany.

    “We continue to look carefully at this issue,” she said, adding that while no competition problems have yet been found in this area, “this certainly doesn’t mean we never will” find them in the future.

    Her comments highlight the increased focus that regulators give to the use of so-called big data—large sets of personal information that are increasingly important for digital businesses, even though people generally hand over the information voluntarily when they use free services.

    The data can help firms identify ways to make business operations more efficient. Companies are also collecting ever more data as a greater range of devices—from fitness trackers and smoke detectors to home-heating meters—are connected to the Web, a phenomenon known as the “Internet of Things.”

    “But if just a few companies control the data you need to satisfy customers and cut costs, that could give them the power to drive their rivals out of the market,” Ms. Vestager said.

    The concern is that huge data sets compiled by large Internet firms could give these companies an unfair advantage by essentially erecting barriers to new competition, some experts say. Incumbent firms would amass detailed profiles of their consumers that would allow them to target advertising with precision, while new rivals could find themselves too far behind to compete.

    This isn’t the first time Ms. Vestager has expressed interest in how companies use big data. On Sunday, she laid out some details about how the European Commission is looking into the issue.

    Ms. Vestager said the commission would be careful to differentiate between different types of data, since some forms of information can become obsolete quickly, making concerns of market dominance moot.

    She also said the EU would look into why some companies can’t acquire information that is as useful as the data that other competing firms have.

    “What’s to stop them [companies] from collecting the same data from their customers, or buying it from a data-analytics company?” she said.

    Lawyers representing U.S. tech firms have said previously that competition concerns over data are misguided. They said data isn’t exclusive since many different companies can hold the same information on people’s names, addresses and credit-card details, for example. It is also easy for consumers to switch between platforms, they said.

    As for how companies protect their consumers’ data, Ms. Vestager said that was beyond her scope and pointed to the new EU-wide data-privacy rules agreed late last year.

    Ms. Vestager also said she would publish a preliminary report in the middle of the year, as the next formal step in an investigation into whether Internet commerce companies, such as Amazon.com Inc., are violating antitrust rules by restricting cross-border trade.

    “With so much at stake, we need to act quickly when we discover problems,” she said, in reference to business contracts that aim to keep national markets separate.

    To start that debate, the commissioner said she would publish a paper before Easter outlining the views of relevant parties affected or involved in geo-blocking, a practice to discriminate via price or the range of goods a company offers based on a customer’s location.

    The commission in September launched a public questionnaire to gather more information about the practice of geo-blocking.

    Source: The Wall Street Journal

  • Google buys French image recognition startup Moodstocks

    Two weeks after Twitter acquired Magic Pony to advance its machine learning smarts for improving users’ experience of photos and videos on its platform, Google is following suit. Today, the maker of Android and search giant announced that it has acquired Moodstocks, a startup based out of Paris that develops machine-learning based image recognition technology for smartphones, and whose APIs for developers have been described as “Shazam for images.”

    Moodstocks’ API and SDK will be discontinued “soon”, according to an announcement on the company’s homepage. “Our focus will be to build great image recognition tools within Google, but rest assured that current paying Moodstocks customers will be able to use it until the end of their subscription,” the company noted.

    Terms of the deal were not disclosed and it’s not clear how much Moodstocks had raised: CrunchBase doesn’t note any VC money, although when we first wrote about the company back in 2010 we noted that it had raised $500,000 in seed funding from European investors. As a point of reference, Twitter paid a whopping $150 million in cash for its UK acquisition of Magic Pony the other week.

    While Magic Pony was young and acquired while still largely under the radar, Moodstocks has been around since 2008, all the while working around the basic premise of improving image recognition via mobile devices. “Our dream has been to give eyes to machines by turning cameras into smart sensors able to make sense of their surroundings,” the company writes in its acquisition/farewell/hello note.

    It looks like Moodstocks originally tried its hand at creating its own consumer apps, one of which was a social networking app of sorts: it let people snap pictures of media like books, and then add their own annotations about that media that would link up with other people’s annotations, by way of special image recognition behind the scenes that would match up the “fingerprint” in different people’s snaps.

    An interesting idea, but it didn’t take off, and so, as the company pivoted to offering its tech to other developers, at least one of its apps, Moodstocks Scanner, was turned into a tool for testing the SDK before implementing it in your own app.

    Google doesn’t specify whether it will be launching its own SDK for developers to incorporate more imaging services into apps, or whether it will be incorporating the tech solely into its own consumer-facing services. What it does say is that it will be bringing Moodstocks’ team — the startup was co-founded by Denis Brule and Cedric Deltheil — and the company’s tech into its R&D operation based in France.

    In a short statement, Vincent Simonet, who heads up that center, says Google sees Moodstocks’ work contributing to better image searches, a service that is of course already offered in Google but is now going to be improved. “We have made great strides in terms of visual recognition,” he writes (in French), “but there is still much to do in this area.”

    It’s not clear if Moodstocks’ work will remain something intended for smartphones or if it will be applied elsewhere. There are already areas where Moodstocks’ machine learning algorithms could be applied, for example in Google’s searches, to “learn” more about how to find images that are similar and/or related to verbal search terms. Google also could potentially use the tech in an existing app like Photos.

    Or it could make an appearance in a future product that has yet to be launched, although the more obvious use case, for smartphones, is already here: on a small handset with a touchscreen, users are generally less inclined to enter text, and they may be using their own (poor-quality) images to find similar ones. In both of these scenarios, a stronger visual recognition tool (say, snapping a picture of something and then using it as a search ‘term’) could come in handy.

    Google has made other acquisitions in France, including FlexyCore (also aimed at improving smartphone performance). It’s also made a number of acquisitions to improve its tech in imaging, such as JetPac and PittPatt for facial recognition. And other large tech companies are also buying up technology and talent in this area. Earlier this year, it emerged that Amazon had quietly acquired Orbeus, a startup that also develops photo recognition tech, with its service tapping AI and neural networks.

    Source: Techcrunch.com


  • Google buys Anvato to strengthen its cloud platform

    Google has acquired Anvato, a company that handles the encoding, editing, publishing and distribution of a wide range of videos across multiple platforms. The search giant wants to add Anvato to its cloud platform and implement the technology in its own services. How Google intends to do this is not yet known.

    What is known is that American television networks such as NBCUniversal, Fox Sports and MSNBC use Anvato's services to create and offer online videos. It is a service that Google's own cloud platform does not yet provide, so that is presumably the reason for the purchase.

    "Our teams will work together to provide cloud solutions that help companies in the media and entertainment industry scale their video infrastructure and deliver high-quality live and on-demand video content to consumers on any device – whether that's a smartphone, tablet or smart TV," said Google senior product manager Belwadi Srikanth in a statement.

    It is not known how much Google paid for the company. When it was founded in 2007, Anvato raised about $2.5 million in a funding round, but the company's value will have grown considerably since then.

    Source: Techzine nieuws

  • Google plans to offer telecom services

    Google plans to offer telecom services in the United States via the mobile networks of the telecom companies Sprint and T-Mobile US, according to sources close to Google.

    Google is said to have already reached agreements with Sprint and T-Mobile US on the use of their networks. If Google does enter the telecom market, it will be competing with providers that sell mobile phones running Google's own Android software. The largest telecom companies in the United States are AT&T and Verizon Communications, followed by Sprint and T-Mobile US.

    It is not yet clear how broad Google's telecom services will be, what they will cost or when sales will begin. Google may first launch the service on a limited scale in a number of American cities.

    Automatiseringsgids, 22 January 2015

  • Google: a prime example of an intelligent organization basing its decisions on data


    Google is a company in which fact-based decision-making is part of the DNA and where Googlers (as Google calls its employees) speak the language of data as part of their culture. At Google, the aim is for all decisions to be based on data, analytics and scientific experimentation.

    About Google

    Google is a multinational Internet and software corporation specializing in Internet search, cloud computing and advertising technologies, based in Mountain View, California, USA. Google’s mission is to organize the world’s information and make it universally accessible and useful. In line with this mission, Google is very serious about using information to inform its decisions.

    Data to inform decision-making

    In companies, data should be collected to answer the most important questions; unless you are clear about the questions you need to answer, data is pretty useless. At Google today, the aim is to start with questions and to be very clear about information needs at the outset. Its executive chairman Eric Schmidt says: “We run the company by questions, not by answers. So in the strategy process we’ve so far formulated 30 questions that we have to answer […] You ask it as a question, rather than a pithy answer, and that stimulates conversation. Out of the conversation comes innovation. Innovation is not something that I just wake up one day and say ‘I want to innovate.’ I think you get a better innovative culture if you ask it as a question.” There are thousands of examples of how Google applies this thinking, but let’s look at a case example from its HR department.

    Fact-based Decision-Making at Google

    Within its global HR function, Google has created a People Analytics Department that supports the organization in making HR decisions with data. One question Google wanted answered was: do managers actually matter? This is a question Google has wrestled with from the outset, as its founders questioned the contribution managers make. At some point they actually got rid of all managers and made everyone an individual contributor; this didn’t really work, and managers were brought back in.

    Project Oxygen

    Within the people analytics department Google has created a group called the Information Lab, which comprises social scientists who focus on longer-term questions, with the aim of conducting innovative research that transforms organizational practice within Google and beyond. This team took on the project of answering the question ‘Do managers matter?’, codenamed ‘Project Oxygen’. So the objectives and information needs were clearly defined.

    What Data to Use?

    The team first looked at the data sources that already existed: performance reviews (a top-down review of managers) and the employee survey (a bottom-up review of managers). The team plotted this data on a graph, which revealed that managers were generally perceived as good. The problem was that the data didn’t show much variation, so the team decided to split it into the top and bottom quartiles.
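The quartile-split step described above can be sketched in a few lines of Python. This is an illustration only: the manager names, scores and the `split_quartiles` helper are invented, not Google's actual method.

```python
# Hypothetical sketch of splitting managers into bottom and top quartiles
# by a combined review score, to create contrast in otherwise flat data.
def split_quartiles(scores):
    """Return (bottom_quartile, top_quartile) of a list of (manager, score)."""
    ranked = sorted(scores, key=lambda pair: pair[1])  # ascending by score
    q = max(1, len(ranked) // 4)                       # size of one quartile
    return ranked[:q], ranked[-q:]

# Invented example data: (manager id, combined review score)
managers = [("m1", 3.1), ("m2", 4.8), ("m3", 4.2), ("m4", 2.9),
            ("m5", 4.6), ("m6", 3.8), ("m7", 4.9), ("m8", 3.3)]
bottom, top = split_quartiles(managers)
```

With eight managers, `bottom` holds the two lowest-scoring managers and `top` the two highest, and the middle half is set aside, exactly the kind of contrast the analysis needed.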


    Using regression analysis, the team was able to show a big difference between these two groups in terms of team productivity, employee happiness and employee turnover. In summary, the teams with the better managers performed better, and their employees were happier and more likely to stay. While this confirmed that good managers do make a difference, it didn’t yet allow Google to act on the data. The next question they needed to answer was: what makes a good manager at Google? Answering this question would provide much more usable insights.
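The core of that comparison can be illustrated as a difference in group means per outcome metric. The numbers below are invented and Google's actual regression models are not public; this is only a minimal sketch of the idea.

```python
from statistics import mean

# Illustrative only: compare invented outcome metrics between teams led by
# bottom-quartile and top-quartile managers.
def group_gap(bottom, top):
    """Difference in group means (top minus bottom) for each outcome key."""
    return {k: mean(t[k] for t in top) - mean(b[k] for b in bottom)
            for k in top[0]}

bottom_teams = [{"productivity": 61, "happiness": 3.1, "turnover": 0.22},
                {"productivity": 58, "happiness": 2.9, "turnover": 0.26}]
top_teams    = [{"productivity": 74, "happiness": 4.2, "turnover": 0.09},
                {"productivity": 71, "happiness": 4.0, "turnover": 0.11}]
gaps = group_gap(bottom_teams, top_teams)
```

A positive gap on productivity and happiness and a negative gap on turnover is the pattern the Oxygen team found: better-managed teams perform better and lose fewer people.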

    New Data Collection

    So the team introduced two new data collections. The first was a ‘Great Managers Award’ through which employees could nominate managers they felt were particularly good. As part of the nomination, employees had to provide examples of behaviours that they felt showed the managers were good. The second data set came from interviews with the managers in each of the two quartiles (the managers didn’t know which quartile they were in) to understand what they were doing. The data from the interviews and from the Great Manager Award nominations was then coded using text analysis. From this, the analytics team was able to extract the top eight behaviours of a high-scoring manager as well as the top three reasons managers struggle in their role.
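The coding step can be illustrated with a simple keyword count over the free-text nominations. The behaviour codes and nomination texts below are invented, and real qualitative coding is considerably more sophisticated; this only sketches the shape of the technique.

```python
from collections import Counter

# Invented behaviour codes; real codebooks are built by the researchers.
CODES = ["coach", "empower", "listen", "vision", "technical"]

def code_texts(texts):
    """Count how often each behaviour code appears across the texts."""
    counts = Counter()
    for text in texts:
        words = text.lower().split()
        for code in CODES:
            counts[code] += sum(1 for w in words if w.startswith(code))
    return counts

nominations = [
    "She takes time to coach the team and listen to concerns",
    "A great coach with a clear vision",
    "Helps empower everyone and shares the vision",
]
top_behaviours = code_texts(nominations).most_common(3)
```

Ranking the coded behaviours by frequency is how a list like "the top eight behaviours of a good manager" falls out of thousands of nominations.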

    Using the Insights

    Google shared these insights with the relevant people in different ways, including a new manager communication that outlined the findings and expectations. But sharing the insights alone wasn’t enough; Google saw a need to act on them. Many concrete actions followed this analysis; here are some key ones:

    • Google started to measure people against these behaviours. For that purpose it introduced a new twice-yearly feedback survey
    • Google decided to continue with the Great Manager Award
    • Google revised the management training

    An Intelligent Organization

    Google is a great example of how good decision-making should be supported by good data and facts. Google clearly followed the five steps I outline in my book ‘The Intelligent Company: Five steps to success with Evidence-based Management’:

    1. Defining the objectives and information needs: ‘Do managers matter?’ and ‘What makes a good manager within Google?’
    2. Collecting the right data: using existing data from performance reviews and employee surveys, and creating new data sets from the award nominations and manager interviews.
    3. Analysing the data and turning it into insights: simple plotting of the results, regression analysis and text analysis.
    4. Presenting the Information: new communications to the managers
    5. Making evidence-based decisions: revising the training, measuring performance in line with the findings, introducing new feedback mechanisms.

    Author: Bernard Marr

    Source: Smart Data Collective

  • Modern Information Management: Understanding Big Data at Rest and in Motion

    Big data is the buzzword of the century, it seems. But, why is everyone so obsessed with it? Here’s what it’s all about, how companies are gathering it, and how it’s stored and used.


    What is it?

    Big data is simply large data sets that need to be analyzed computationally in order to reveal patterns, associations, or trends. This data is usually collected by governments and businesses on citizens and customers, respectively.

    The IT industry has had to shift its focus to big data over the last few years because of the sheer amount of interest being generated by big business. By collecting massive amounts of data, companies, like Amazon.com, Google, Walmart, Target, and others, are able to track buying behaviors of specific customers.

    Once enough data is collected, these companies then use the data to help shape advertising initiatives. For example, Target has used its big data collection initiative to help target (no pun intended) its customers with products it thought would be most beneficial given their past purchases.

    How Companies Store and Use It

    There are two ways that companies can use big data. The first way is to use the data at rest. The second way is to use it in motion.

    At Rest Data – Data at rest refers to information that’s collected and analyzed after the fact. It tells businesses what’s already happened. The analysis is done separately and distinctly from any actions that are taken upon conclusion of said analysis.

    For example, a retailer that wanted to analyze the previous month’s sales would use data at rest to look over the previous month’s sales totals. Then it would take those totals and make strategic decisions about how to move forward given what’s already happened.

    In essence, the company is using past data to guide future business activities. The data might drive the retailer to create new marketing initiatives, customize coupons, increase or decrease inventory, or to otherwise adjust merchandise pricing.

    Some companies might use this data to determine just how much of a discount is needed on promotions to spur sales growth.

    Some companies may use it to figure out how much they are able to discount in the spring and summer without creating a revenue problem later on in the year. Or, a company may use it to predict large sales events, like Black Friday or Cyber Monday.

    This type of data is batch processed, since there’s no need to have the data instantly accessible or “streaming live.” There is, however, a need to store large amounts of data and to process unstructured data. Companies often use a public cloud infrastructure because of the costs involved in storage and retrieval.
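A toy example of data at rest: all of last month's transactions already exist, so a batch job can aggregate them in a single pass after the fact. The records and field names below are invented for illustration.

```python
from collections import defaultdict

# Batch-style processing: the full data set is available before we start,
# so one pass over it produces the monthly totals the retailer needs.
def monthly_totals(transactions):
    totals = defaultdict(float)
    for record in transactions:
        totals[record["product"]] += record["amount"]
    return dict(totals)

# Invented sample of last month's completed transactions.
last_month = [
    {"product": "widget", "amount": 19.99},
    {"product": "gadget", "amount": 34.50},
    {"product": "widget", "amount": 19.99},
]
totals = monthly_totals(last_month)
```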

    Data In Motion – Data in motion refers to data that’s analyzed in real time. Like data at rest, it may be captured at the point of sale or at a contact point with a customer along the sales cycle. The difference between data in motion and data at rest is how the data is analyzed.

    Instead of batch processing and analysis after the fact, data in motion is typically handled in a bare metal cloud environment, an infrastructure of dedicated servers offering cloud-like features without virtualization.

    This allows for real-time processing of large amounts of data. Latency is also a concern for large companies, because they need to be able to manage and use the data quickly. This is why many companies send their IT professionals to Simplilearn Hadoop admin training and then follow up with cloud-based training and other database training, such as NoSQL.
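By contrast, data in motion can be sketched as a running aggregate that emits a result per event instead of waiting for a batch to complete. This is a minimal illustration with an invented event list, not a production streaming setup.

```python
# Streaming-style processing: each event is handled the moment it arrives,
# and the running total is available immediately after every event.
def stream_totals(events):
    running = 0.0
    for amount in events:   # in production this would be a live event feed
        running += amount
        yield running       # a result per event, no end-of-batch wait

snapshots = list(stream_totals([10.0, 5.0, 2.5]))
```

Here each yielded value is the state of the total at that instant, which is the low-latency behaviour the paragraph above describes.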


    Big Data For The Future

    Some awesome, and potentially frightening, uses for big data are on the horizon. For example, in February 2014 the Chicago Police Department sent uniformed officers to make notification visits to targeted individuals they had identified as potential criminals, using a computer-generated list based on data about those individuals’ backgrounds.

    Another possible use for big data is development of hiring algorithms. More and more companies are trying to figure out ways to hire candidates without trusting slick resume writing skills. New algorithms may eliminate job prospects based on statistics, rather than skillsets, however. For example, some algorithms find that people with shorter commutes are more likely to stay in a job longer.

    So, people who have long commutes are filtered out of the hiring process quickly.

    Finally, some insurance companies might use big data to analyze your driving habits and adjust your insurance premium accordingly. That might sound nice if you’re a good driver, but insurers know that driving late at night increases the risk for getting into an accident. Problem is, poorer people tend to work late shifts and overnights or second jobs just to make ends meet. The people who are least able to afford insurance hikes may be the ones that have to pay them.

    Source: Mobilemag

  • News overview of the BI tools market, July 2019

    Throughout May the market for BI platforms and analytics more or less idled along, but early June brought breaking news. The BI community was startled by the acquisition of two promising BI tool vendors, and embarrassing failures came to light at two well-known big data specialists.

    Looker and Tableau acquired

    We start with the acquisitions. First up was Looker, the ‘niche player’ based in Santa Cruz in the United States. Looker, best known for LookML as an alternative to SQL, ‘disappeared’ into the Google cloud for 2.6 billion dollars. Barely recovered from that shock, the market learned that Salesforce is swallowing Seattle-based ‘leader’ Tableau for the equivalent of 15.7 billion dollars. Will customers still have anything to choose from in the long run?

    At this point it is hard to estimate the exact consequences for current and future customers, distributors and other partners of the two acquisition targets. As usual there are plenty of reassuring words, with talk of synergy benefits, more financial muscle and enormous marketing opportunities for both Looker and Tableau, all under the wings of the two well-known tech giants.

    Greater dependence on dominant market players

    One thing is certain: this new ‘wave of acquisitions’ in the BI world means customers become even more dependent on a few dominant market players, which generally does not strengthen their negotiating position. Moreover, it remains to be seen to what extent the acquiring parties are willing to keep investing in innovation of the product and service portfolios of the acquired BI players, and how long Google and Salesforce will keep investing in the Looker and Tableau brand names. Zooming in on the announced acquisitions, a few details stand out.

    Responding to digital transformation

    Salesforce is paying the existing shareholders of the publicly listed Tableau in shares, while Google is paying the shareholders of the privately funded Looker in cash. The transaction values also differ considerably: Tableau fetches roughly six times more than Looker. That makes sense, because Looker was until recently still in its Series E funding round, while Tableau is more mature; it has been publicly listed since 2013 and has therefore had longer to build a higher market capitalization.

    Data is the foundation

    With Tableau, Salesforce says it has a rock-solid proposition. The corporate cultures match and are, according to them, strongly focused on customer success. Together they hope to act as a ‘supercharger’ for digital transformation: the number one in the CRM market and the number one in end-to-end analytics platforms are bundling their platforms to help customers with that transformation. ‘Data is the foundation of every digital transformation,’ Salesforce states. Tableau CEO Adam Selipsky expects that with Salesforce’s help he can serve millions of new end users with his platform and ‘actionable insight’.

    The potential market of companies struggling with digital transformation is estimated by market research firm IDC at no less than 1.8 trillion US dollars.

    Not just looking at data, but understanding it

    In short: the vendors mainly seem to be counting their riches in advance. Whether end users will directly benefit remains to be seen. Salesforce expects a lot from Customer 360 and Einstein, its own platform, with which it already provides customers with AI-based insights in sales and marketing. Combined with Tableau’s intuitive self-service analytics platform, it hopes to reach an even broader audience of customers and end users at every stage of the customer journey and at every touchpoint. The mission is to support users so that they not only see data, but also understand it.

    Everything looks like business as usual

    In sixteen years, Tableau has built a customer base of some 86,000 customers worldwide, including Netflix, Verizon and Schneider Electric. As part of Salesforce, Tableau will continue to operate as an independent brand from the same headquarters in Seattle, led by the current management team and the same CEO. “Never change a winning team” seems to be the motto. Both parties also expect their communities, the 1.4 million Trailblazers and the 1 million-strong Tableau community, to only reinforce each other.

    Companies hope for the famed leverage effect

    Looker also expects to reap the rewards of its acquisition by Google, which is formally part of parent company Alphabet. Once the transaction closes and the 2.6 billion dollar purchase price is paid, Looker will be folded directly into Google Cloud.

    Looker, founded in 2012 and headquartered in Santa Cruz, is looking forward above all to a larger reach, and expects to have more resources and intellectual capital at its disposal to market its so-called unified platform for BI, big data applications and embedded analytics. From the Google Cloud, Looker will enable its customers to benefit from the leverage created by combining analytics, machine learning and AI.

    Looker has a customer base of some 1,700 companies and institutions, including well-known brands such as Amazon, Etsy, IBM, Kickstarter, Lyft, Sony, Spotify and The Economist. For Google, Looker is its largest acquisition since Nest, the smart ‘learning’ thermostat that adapts the heating of your home to your living pattern and can be controlled remotely from your smartphone. Like Tableau, Looker expects the acquisition to help it better support companies in their digital transformation.

    The acquisition of Looker by Google does not come entirely out of the blue. The two companies already had a partnership in which they served 350 joint customers, including Buzzfeed, Hearst, King and Yahoo!

    Strongly growing demand for location data

    Even before the announced acquisition by Salesforce, Tableau introduced new mapping capabilities for smoother and smarter use of location data. Version 2019.2 offers functionality for analyzing geospatial data and new so-called authoring capabilities for dashboard authors.

    New features in Ask Data

    The new release extends the Mapbox-powered background mapping technology with vector maps. These let users surface and analyze more detailed location data on top of background layers with more contextual information. Additional parameter settings also provide improved visual interactivity in the new release, according to the vendor. Improved tools for composing dashboards have also become available, along with new features in Ask Data, Tableau’s natural language processing (NLP) technology. The software vendor sees a growing demand for location data.

    Analysis without interruption

    According to a recent analyst report, 30% of all customer interactions in 2022 will be driven by real-time analysis of the customer’s location, up from just 4% in 2017. The new vector maps in Tableau offer more detailed information and smoother exploration than mapping solutions that use images. When users zoom in, zoom out or pan, the software automatically adjusts the map without loading images, so users experience no interruption in their analysis flow.

    A 1-billion-dollar unicorn has to close its doors

    In its heyday, Santa Clara-based MapR was valued at more than 1 billion dollars. The ‘unicorn’ grew into an Apache Hadoop specialist, but must now close its doors unless a buyer is found. Despite the enormous hype, Hadoop never really took off: according to Gartner, in 2017 only 14% of enterprises used the open-source software, which is widely perceived as slow and complex. The CEO has been let go along with 122 employees.

    Recent mega-merger fails to deliver

    At Cloudera, ‘the enterprise data cloud company’, the exodus has also begun. The CEO has been replaced, and the recent mega-merger with rival Hortonworks has apparently not delivered the desired result. The financial results are a serious disappointment. Moreover, existing customers of the merger partners are postponing new investments because of the resulting uncertainty and the lack of a joint roadmap. Competition from the public cloud vendors is also hurting Cloudera.

    Bad news for end users

    However you look at it, this is all bad news for customers and end users who once signed on in good faith with these on-paper promising parties. The moral of the story: do your homework and try, as far as possible, to get a good picture of the roadmap, the USPs, the quality of management, reviews, user experiences, and so on. And last but not least, stay away from the hypes as much as possible!

    Author: Eric van der Steen

    Source: Passionned
