22 items tagged "Technology"

  • BC - Business & Competitive - Intelligence

    BC (Business & Competitive) Intelligence

    Business Intelligence is one of those terms that hardly allows a literal translation into Dutch. 'Bedrijfsintelligentie' might come close, but, like other forms of intelligence, it is difficult to pin down precisely. 'Business insight' or 'business understanding' perhaps come closer. Other authors add business or environmental scanning as alternative renderings.

    To properly understand Business and Competitive Intelligence, we use an analytical framework here (Table 1.1). It makes it possible to distinguish the different manifestations of BI and thus to apply the right variant to the right problem. Keep in mind that these are stereotypes! In practice, hybrid forms occur.

    The starting point is that BI is seen as an information process in which data is used to produce knowledge or insight.

     

                                Data on internal operations    Data on the business environment
    Management approach         A                              B
    Technological approach      C                              D

    Table 1.1

    The table distinguishes a management approach from a technological approach. BC Intelligence treats BI in terms of the business processes that need to be supported. There is also a technological perspective on BI, whose starting point is rather to exploit the possibilities that information technology offers for gaining business insight. The other axis of the schema distinguishes data on internal operations (internal data) from data on the business environment (external data). We deliberately speak of distinguished rather than separated categories: in practice the categories can be told apart but can hardly be separated. They cannot do without each other and are often supporting or complementary.

    Business Intelligence

     


    Although the distinction is arbitrary, and the term BI could just as well be reserved for the whole quadrant (with CI as a subset), many BI projects concern cells A and C.

    BI is therefore often about optimizing business processes, with the emphasis on acquiring business insight from data about the company itself. Such data typically yield knowledge about the current situation of the company, knowledge that is indispensable for strategy formation and for optimizing business results (think of Performance Management).

    The technological component of BI is represented by cell C. Unfortunately, this angle dominates at many service providers. The emphasis lies on setting up a technological infrastructure that enables adequate knowledge of the company and its performance. Cell C therefore covers ETL tools, data warehouses, and analytical applications.
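
    To make cell C concrete, here is a minimal ETL sketch in Python. It is only an illustration: the file name, table name, and column layout are hypothetical, and a production pipeline would add logging, scheduling, and incremental loads.

        import csv
        import sqlite3

        # Extract: read raw records from an operational export (hypothetical file).
        with open("sales_export.csv", newline="") as f:
            rows = list(csv.DictReader(f))

        # Transform: normalize dates, cast amounts, and drop malformed records.
        cleaned = []
        for r in rows:
            try:
                cleaned.append((r["order_id"], r["order_date"][:10], float(r["amount"])))
            except (KeyError, ValueError):
                continue  # a real pipeline would log and quarantine bad rows

        # Load: append the cleaned records to a warehouse fact table.
        con = sqlite3.connect("warehouse.db")
        con.execute("CREATE TABLE IF NOT EXISTS fact_sales "
                    "(order_id TEXT, order_date TEXT, amount REAL)")
        con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", cleaned)
        con.commit()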

    Editorial note from the BI-kring:

    In cell A we have so far hardly defined a category. In my view, Performance Management belongs there, so I would add that term. Possible key words: Key Performance Indicators, Performance Process Management, Organizational Performance, PDCA (Plan Do Check Act) Cycle, Performance Planning.

    As for cell C, we can refer to the text above: data warehousing and OLAP are the central elements there. Key words: databases, ETL (Extraction, Transformation and Load), architecture, data dictionary, metadata, data marts.

    With respect to OLAP, the key words are: analytical applications, reporting, queries, multidimensional schemas, spreadsheets, cubes, data mining.
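
    The multidimensional idea behind OLAP can be sketched in a few lines, here with pandas standing in for a cube engine; the fact table and its dimensions are made up for the example. Slicing, dicing, and drill-down are variations of this same aggregation over different subsets of dimensions.

        import pandas as pd

        # A tiny fact table (hypothetical data): one row per sale.
        sales = pd.DataFrame({
            "region":  ["North", "North", "South", "South"],
            "quarter": ["Q1", "Q2", "Q1", "Q2"],
            "revenue": [120.0, 135.0, 80.0, 95.0],
        })

        # A cube-style aggregation: revenue rolled up along two dimensions,
        # the kind of slice a reporting or dashboarding tool would render.
        cube = sales.pivot_table(index="region", columns="quarter",
                                 values="revenue", aggfunc="sum")
        print(cube)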

    Competitive Intelligence

     


    CI is the process in which data about the company's environment is transformed, through an information process, into 'strategic business insight'. Although the terms competitor intelligence and competitive intelligence have been in use since the 1980s, the approach already drew attention in the 1970s under the label 'environmental scanning'.

    CI plays an important role in strategic as well as other business processes. A company's performance, its competitive position, its possible future positions, and its capacity for innovation can only be determined with knowledge of the business environment.

    Editorial note from the BI-kring:

    Competitive Intelligence thus concerns all information provision organized to determine, assess, and change the competitive position of companies. It therefore touches directly on strategy, strategic intelligence, competitor analysis, competitive position, and all intelligence needed to properly assess the company's position within its environment.

    Organizing CI is still severely neglected in organizations. It proves difficult both to structure the necessary information processes and to execute them. Setting up a 'system' to realize this process attracts most of the attention, yet for many organizations it is still a bridge too far. A sound development approach, however, considerably increases the chances of success.

    Data about the business environment are often unstructured in nature and already present within the organization. The art is to make these data available for decision making. When the data are not available within the company, the techniques and instruments needed to unlock them differ from those used in BI. They range from document management systems to information agents that autonomously search the internet for interesting building blocks (data!). Text mining is used to structure and analyze the unstructured documents (in the case of the web: web content mining).
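
    As a rough illustration of what such an information agent does, the sketch below fetches a single page and reduces it to its most frequent terms, a first, crude step toward text mining. The URL is hypothetical, and a real agent would add crawling, deduplication, language handling, and entity extraction.

        from urllib.request import urlopen
        from html.parser import HTMLParser
        from collections import Counter
        import re

        class TextExtractor(HTMLParser):
            """Collect visible text, ignoring script and style blocks."""
            def __init__(self):
                super().__init__()
                self.skip = False
                self.chunks = []

            def handle_starttag(self, tag, attrs):
                if tag in ("script", "style"):
                    self.skip = True

            def handle_endtag(self, tag):
                if tag in ("script", "style"):
                    self.skip = False

            def handle_data(self, data):
                if not self.skip:
                    self.chunks.append(data)

        # Fetch one page (hypothetical URL) and strip it down to plain text.
        html = urlopen("https://example.com/press-release").read().decode("utf-8", "ignore")
        parser = TextExtractor()
        parser.feed(html)
        text = " ".join(parser.chunks)

        # Crude mining: the most frequent longer words hint at what the page is about.
        words = re.findall(r"[a-z]{4,}", text.lower())
        print(Counter(words).most_common(10))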

    Editorial note from the BI-kring:

    To support Competitive Intelligence adequately, and in particular to make primary data available to the process, collaboration tools are popular. These are knowledge-management-like systems and shareware applications that facilitate the sharing of data, information, and knowledge. Key words: knowledge management, shareware, SharePoint.

    Overview of data categories, BI-kring

    Cell A format

    • Performance Management

    Key words: Key Performance Indicators, Performance Process Management, Organizational Performance, PDCA (Plan Do Check Act) Cycle, Performance Planning

    Cell C format

    • Data warehousing

    Key words: databases, ETL (Extraction, Transformation and Load), architecture, data dictionary, metadata, data marts, Big Data

    • Online Analytical Processing

    Key words: analytical applications, reporting, queries, multidimensional schemas, spreadsheets, cubes, data mining, dashboarding

    Cell B format

    • Competitive Intelligence

    Key words: strategy, strategic intelligence, competitor analysis, competitive position, competitor intelligence, technological intelligence, environmental scanning, environmental intelligence

    • Content (Competitive Intelligence as a product)

    Key words:

    Cell D format

    • Collaboration

    Key words: knowledge management, shareware, SharePoint

    • Search methodologies

    Key words: document management systems, spider technology, unstructured information, information agents, text mining, content mining, search technologies

    Applies to the schema as a whole (intelligent organizations implement the entire model integrally)

    • Intelligent organization

    Key words: management information, intelligent organization, learning organization, organizational learning, intelligence change management

     

    Bron: Egbert Philips

     

  • Blockchain-based banking backend Vault OS from ex-Googler emerges from stealth mode

    Despite holding the vast majority of the world’s wealth (or perhaps because of that), banks aren’t exactly hotbeds of cutting-edge tech, often relying on decades-old systems for everyday tasks. ThoughtMachine, a company led by ex-Google engineer Paul Taylor, is looking to change that with a modern, fully integrated, blockchain-based banking operating system called Vault OS.

    The bombastic press release announcing the system’s emergence from two years of stealth development makes a lot of promises: the company “has solved the greatest challenge in fintech;” Vault OS is “100% future-proof,” “hugely flexible,” and “fixes broken banking forever.”

    Whether Vault OS is able to live up to its own hype is a question that will have to wait (legacy banking systems aren’t replaced overnight) — but it’s hard to deny that the problem is real and the solution, or at least what the company reveals of it, is compelling.

     

    ThoughtMachine’s Paul Taylor

    The main job of Vault OS is to perform the core function of a bank: essentially, maintaining a huge ledger. That’s something that a blockchain is uniquely suited to doing, of course, a fact that clearly did not escape Taylor, whose previous work led to the speech recognition software used by Google today.

    Each instance of the OS will run its own private blockchain and cryptographic ledger, hosted as a service by ThoughtMachine. Of course, whether banks will be willing to essentially permanently outsource their most fundamental operations is yet another big question.

    The benefits may be worth it: blockchains are secure, scalable, and versatile, and could conceivably replace legacy systems that limit or delay ordinary operations. Transactions would occur in real time, and are safely and centrally stored, allowing for deep data dives by both bankers and consumers. There’s even an API.
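
    ThoughtMachine has not published Vault OS’s internals, so the following is a generic illustration rather than the company’s actual design: the heart of a cryptographic ledger is that each entry commits to its predecessor through a hash, which makes silently rewriting past transactions detectable. A private blockchain adds, on top of this tamper evidence, agreement among a known set of nodes about which chain is authoritative.

        import hashlib
        import json
        import time

        def add_entry(chain, transaction):
            """Append a transaction, chaining it to the previous entry's hash."""
            prev_hash = chain[-1]["hash"] if chain else "0" * 64
            body = {"tx": transaction, "time": time.time(), "prev": prev_hash}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            chain.append({**body, "hash": digest})

        def verify(chain):
            """Recompute every hash; tampering with any past entry breaks the chain."""
            for i, entry in enumerate(chain):
                body = {k: entry[k] for k in ("tx", "time", "prev")}
                digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
                if digest != entry["hash"]:
                    return False
                if i > 0 and entry["prev"] != chain[i - 1]["hash"]:
                    return False
            return True

        ledger = []
        add_entry(ledger, {"from": "acct_1", "to": "acct_2", "amount": 100})
        add_entry(ledger, {"from": "acct_2", "to": "acct_3", "amount": 40})
        print(verify(ledger))  # True; mutate any past entry and this prints False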

    Naturally there are a ton of questions that must be answered, and assurances made, and regulations complied with, before any bank will touch this with a ten-foot pole. I’ve contacted ThoughtMachine with several — will they release code or a whitepaper for inspection? How is data migration handled? What’s the timescale for rollout? — and will update this post if they get back to me.

    Source: Techcrunch

  • Business Intelligence and beyond: predictions for 2016

    It’s been an interesting year for BI – and 2016 looks set to be no different.

    Here are some predictions on what we believe next year has in store, in particular for the data and analytics industry.

    1. Cannibalisation of the channel
    Next year will see many vendors looking to take back control rather than invest in their channel partners. The danger for the channel is that vendors will keep good deals for themselves or redirect services projects back in-house. Platforms such as Amazon Web Services and Microsoft Azure have grown exponentially this year, and a related risk is the continued trend of vendors developing hosted solutions on such platforms, cutting out their channel partners. In response, the channel needs to look for vendors with a transparent indirect strategy in place and form mutually beneficial relationships.

  • Business Intelligence is still hot…

    Business Intelligence outdated? Nothing could be further from the truth, as the Heliview conference ‘Decision making by smart technologies’, held last Tuesday in the Brabanthallen in Den Bosch, demonstrated.

    Two hundred client organizations listened to presentations by, among others, Rick van der Lans, Peter Jager, Frank de Nijs and Arent van ‘t Spijker. Alongside the familiar message, there was also plenty of news to be heard in Den Bosch.

    New technologies make much more possible. Social media and modern big data technology enable organizations to extract far more value from data. How to do that is, for many organizations, still a challenge. Applying the technology is not a goal in itself; the point is to create added value, whether by optimizing processes, by serving customers better through product development or, in the extreme, by making data the engine behind new business concepts or models. A precondition is a clear business vision (whether or not itself produced through intelligent use of data and information), which is important to prevent millions from being squandered on unfocused technology.

    For those present, the message was sometimes familiar, but sometimes also confronting. One thing is certain: the role of data and information in doing business intelligently is far from played out. Business Intelligence is alive and well.

    30 JANUARY 2015

  • Is the era of Excel now really over?

    29 May 2015

    The amount of data that marketers and sales managers have to work with has grown enormously in recent years, and the growth is exponential. With the rise of e-commerce, the introduction of customer loyalty programs, and the advent of beacons, the amount of data we receive from consumers only keeps increasing.

    In practice, however, many marketers still use Excel to manage their campaigns, business cases, and planning. If you want to compose a personal campaign for every customer, that is no longer tenable: Excel cannot be linked to online tools, version control is a major problem, and the content is by definition out of date the moment the file is opened. We predict that in 2015 marketers and sales managers will switch to online tools to manage their processes better and to use the resulting data directly to become more personal and relevant.

    Want to know more about these tools and the areas of expertise needed to implement them properly? Follow BI-kring over the coming months.

    BI-kring editorial team

  • Forrester: Insights to help prepare your organization for 5G


    5G presents immense innovation potential

    5G promises to usher in not just new use cases in every industry but also new business models.

    Some of the most relevant use cases across industries, such as those enabled by AR/VR and massive IoT, fit right into improving customer experience and digital transformation. As a change agent, 5G is among the most important technological enablers of this decade and the next. Therefore, investing in and taking a deep look at 5G is critical at this time.

    5G will develop rapidly through 2020 but is still maturing

    The 5G wireless arms race is fueled by this immense potential, so technology development is intense. Almost all current 5G announcements concern regional, siloed pilots and enhancements to 4G LTE rather than actual at-scale standalone 5G deployments. Manufacturers and operators have been aggressively pushing their 5G strategies. However, many challenges and uncertainties remain open: the cost of the network, monetization of use cases, regulatory hurdles and, most importantly, the lack of mature standards.

    2018-19 was a major leap in 5G standards, but beware the hype

    Through the 3GPP standards body, the industry had agreed to complete the non-standalone (NSA) implementation of 5G New Radio by December 2017, and this facilitated large-scale trials based on the specifications.

    Various sources cite numerous estimates about 5G. According to the International Telecommunications Union (ITU), commercial 5G networks are expected to start deployment after 2020. By 2025, the GSM Association (GSMA) expects 5G connections to reach 1.1 billion, which is estimated to be about 12 percent of total mobile connections. One Ericsson study estimates that 5G-enabled industry digitalization revenues for ICT players will be US$1.3 trillion in 2026. Still, current 5G reality is far from the profound expectations established by its proponents.

    Structuring your 5G thinking

    At Forrester, we have a deep bench of experts who are closely monitoring the developments and hype around 5G.

    Here is a simple framework:

    1. First, understand the background, technology, and the physical and business challenges behind practical implementations of 5G, to cut through the hype.
    2. There is a lot of talk about coverage in rural areas; in fact, bridging the digital divide is often touted as a big plus of 5G. However, every early investment and the motivation behind it suggest that, at least until 5G achieves deployment scale, the digital divide may get worse.
    3. Further, thoroughly assess your own 5G needs. Many current use cases probably do not need 5G, so clearly understanding and nailing down your use cases is essential.
    4. Understand how 5G will transform your network operations and affect your applications and customer experience.
    5. Finally, ask your service provider the right questions about 5G timelines, cost, strategy, coverage, and implementation, so you understand what to expect and can plan your investments in the coming months.

    Author: Abhijit Sunil

    Source: Forrester

  • Gartner: US government agencies falling behind digital businesses in other industries


    A Gartner survey of more than 500 government CIOs shows that government agencies are falling behind other industries when it comes to planned investments in digital business initiatives. Just 17% of government CIOs say they’ll be increasing their investments, compared to 34% of CIOs in other industries.

    What’s holding government agencies back? While Gartner notes that their CIOs demonstrate a clear vision for the potential of digital government and emerging technologies, almost half of those surveyed (45%) say they lack the IT and business resources required to execute. Other common barriers include lack of funding (39%), as well as a challenge organizations across all industries struggle with: culture and resistance to change (37%).

    Another key challenge is the ability to scale digital initiatives, where government agencies lag by 5% against all other industries. To catch up, government CIOs see automation as a potential tool. This aligns with respondents’ views on 'game-changing' technologies for government. The top five in order are:

    • Artificial intelligence (AI) and machine learning (27%)
    • Data analytics, including predictive analytics (22%)
    • Cloud (19%)
    • Internet of Things (7%)
    • Mobile, including 5G (6%)

    Of the more than 500 government respondents in Gartner’s survey, 10% have already deployed an AI solution, 39% say they plan to deploy one within the next one to two years, and 36% intend to use AI to enable automation, scale of digital initiatives, and reallocation of human resources within the next two to three years.

    Investing today for tomorrow's success

    When it comes to increased investment this year (2019), BI and data analytics (43%), cyber and information security (43%), and cloud services and solutions (39%) top the tech funding list.

    As previous and current digital government initiatives start to take hold, CIOs are seeing moderate improvements in their ability to meet the increasing demands and expectations of citizens. 65% of CIOs say that their current digital government investments are already paying off. A great example of this is the U.S. Department of Housing and Urban Development’s use of BI and data analytics to modernize its Grants Dashboard.

    Despite budget and cultural change challenges typically associated with digital government initiatives, make no mistake: many agencies are making great strides and are now competing or leading compared to other organizations and industries.

    There’s never been a better time to invest in game-changing technologies to both quickly catch up and potentially take the lead.

    Author: Rick Nelson

    Source: Microstrategy

  • How Big Data Is Changing Disruptive Innovation

    Much fanfare has surrounded the term “disruptive innovation” over the past few years. Professor Clayton M. Christensen has even re-entered the fold, clarifying what he means when he uses the term. Despite the many differences in application, most people agree on the following. Disruptive innovations are:

    • Cheaper (from a customer perspective)
    • More accessible (from a usability or distribution perspective)
    • Built on a business model with structural cost advantages (relative to existing solutions)

    The reason these characteristics of disruption are important is that when all three are present, it’s difficult for an existing business to respond to competition. Whether a company is saddled with fixed infrastructure, highly trained specialist employees, or an outmoded distribution system, adapting quickly to new environments is challenging when one or all of those things becomes obsolete. Firing hundreds of employees, upsetting your core business’ distribution partners, writing off billions of dollars of investment — these things are difficult for managers to even contemplate, and with good reason.

    Historically, the place we’ve looked for hints of oncoming disruptions has been in the low end of the market. Because disruptive products were cheaper, more accessible, and built on new technology architectures, they tended to be crummier than the existing highest-end solutions. Their cost advantage allowed them to reach customers who’d been priced out of an existing market; Apple originally made a computer that was cheap enough for students to learn on, a population that wouldn’t have dreamt of purchasing a DEC minicomputer. Sony famously made the transistor-based television popular based on its “portability.” No one knew that you could reasonably do that prior to the transistor. New technologies, combined with business model innovation, provide the structural cost advantage necessary to take large chunks of the market over time.

    But if you return to the definition above, the fact that low-end entry was typical of a disruptive approach was never core to the phenomenon; it was a byproduct. Why? Because any new entrant is hard pressed to deliver superior value to a mature market, where products have been refined over decades.

    But although the low-end approach was pretty common, it wasn’t what was holding incumbent firms captive. It was their own cost structures and their focus on driving marginal profit increases that kept those companies headed down the wrong paths. As long as making the right decision on a short-term basis (trying to drive more value out of outdated infrastructure) is the wrong decision on a long-term basis (failing to adopt new technology platforms), CEOs are destined to struggle.

    Unfortunately, the focus on the low-end approach to disruption is actually clouding our ability to spot the things that are cheaper, more accessible, and built on an advantaged cost structure. Specifically, it appears that data-enabled disruptors often confound industry pundits. To get a sense of the point, just look at a few highly contested examples.

    Is Uber disruptive? The wrong answer would be to say, “No, because their first product started in the high end of the market.” The right answer would be to acknowledge that the platform they ultimately launched allowed them to add lower-cost drivers (in the form of UberX) and offer cheaper, more accessible transportation options with a structural cost advantage over both taxi services and potentially even car ownership. The convenience of the app is only the most obvious, and easiest to copy, factor.

    Were Google’s Android phones disruptive to Nokia? The wrong answer would be to say “No, because the initial smartphones they launched were superior in feature quality to Nokia’s own phones that dominated the global landscape.” The right answer would be to acknowledge that the approach of creating an ecosystem of application development atop its platform allowed Google to build far more comprehensive solutions that were (on the whole) cheaper, more accessible, and structurally cost-advantaged over Nokia.

    Is 23andMe potentially disruptive to pharmaceutical companies? The wrong answer would be to say, “No, because they compete in completely different verticals,” one in ancestry and the other in drug development. The right answer would be to acknowledge that 23andMe has a vast amount of data that could enable it to start developing drugs in a cheaper, more accessible, and structurally advantaged model.

    In every one of these examples, the ultimate end is disruption. In every one of these examples, incumbent managers have a short-term incentive to ignore the challenge and make the best use of their existing infrastructure. Taxi companies tried to leverage regulation to preserve the value of their medallions and drivers. Nokia tried in vain to protect its closed ecosystem and preserve employment for its thousands of Symbian-focused staff members. And you can be certain that Merck, Pfizer, and Roche have strong incentives to make the best use of their high-end R&D functions before embracing the radically different path that 23andMe might take.

    And over the long term, each of these short-term decisions could lead to failure.

    The conversation misses that something new is going on in the world of innovation. With information at the core of most modern disruptions, there are new opportunities to attack industries from different angles. Uber built a platform in a fragmented limo market that let it come into transportation and logistics more broadly. Netflix captured your eyeballs through streaming video and used the data it had to blow up the content production process. Google mapped the world, and then took its understanding of traffic patterns and street layouts to build autonomous cars.

    There is no doubt that disruption is underway here. These players create products that are cheaper and more accessible than their peers. But it’s not necessarily starting at the low end of the market; it’s coming from orthogonal industries with strong information synergy. It’s starting where the source of data is, then building the information-enabled system to attack an incumbent industry.

    It’s time for executives, entrepreneurs, and innovators to stop quibbling over whether something satisfies the traditional path of disruption. Data-enabled disruption may represent an anomaly to the existing theory, but it’s here — and it’s here to stay. The waste laid to the taxi industry by Uber is evidence that the new solution had extraordinary cost advantages and that incumbents couldn’t respond. The new questions should be:

    • “How can you adapt in the face of this new type of competition?”
    • “How do you evaluate new threats?”
    • “What capabilities do you need and where do you get them, when data is a critical piece of any new disruption?”


    To succeed in this new environment, threatened businesses need a thoughtful approach to identifying potential threats combined with the will to make the right long-term investments — despite short-term profit incentives.

    Source: Harvard Business review

  • How To Build A Game-Changing Team For Your Business

    Getting a successful business up and running is a key skill for entrepreneurs. Building a team that can take it to the next level is another, and some might say one of the most difficult to master. The people they need to bring on board must be at ease with autonomy: entrepreneurial, driven, and able to apply their skills to a wide range of tasks. They also need to share the founder’s vision. Above all, to be part of a game-changing growth strategy, they have to be great team players. But where do you find these people, and how do you get them on board?

    Harness the team-building power of technology

    Does it matter if your next key player is based in London, New York, Tokyo or Rome? Not if you have access to the technology that can empower key hires to elevate the team and help the business achieve its goals from anywhere in the world. In-app advertising platform Tapdaq has just closed a $6.5 million Series A funding round, which will be used to further expand the company by hiring the best talent from across the globe.

    “We’re not into letting geography determine who we hire – we want to find the perfect person for the role,” says CEO and cofounder Ted Nash, a serial entrepreneur who has been creating online companies since he was 12 and was the first teenager in the world to achieve 1 million App Store downloads.

    He adds: “There are processes you need to put in place to make sure everyone’s doing the job that’s being asked of them and to maintain a strong company culture, but having a global approach to your workforce allows you to tap into top talent from across the world, freeing your company from geographical boundaries.”

    Target the power players

    Business intelligence company DueDil is growing rapidly, doubling in size to 80 people in 2015, which has involved building core functions, such as the sales team, from scratch. Getting the right people in place to lead those key areas is crucial to success, and the London-based firm recently appointed Nick Noyer as VP of marketing. Noyer was previously director of EMEA marketing and international strategy at Box, where he led market entry into Europe.

    DueDil cofounder and CEO Damian Kimmelman said: “We’ve found smart leaders who bring new skillsets to the company, which is important. But for me, it’s critical to look for someone who can show they are hungry to succeed, as I want people alongside me who have something to prove and are motivated by big challenges. If they have that single attribute, they tend to rise to the obstacles we face as a company as we scale.”

    Source : Forbes

  • Investing In Artificial Intelligence

    Artificial intelligence is one of the most exciting and transformative opportunities of our time. From my vantage point as a venture investor at Playfair Capital, where I focus on investing and building community around AI, I see this as a great time for investors to help build companies in this space. There are three key reasons.

    First, with 40 percent of the world’s population now online, and more than 2 billion smartphones being used with increasing addiction every day (KPCB), we’re creating data assets, the raw material for AI, that describe our behaviors, interests, knowledge, connections and activities at a level of granularity that has never existed.

    Second, the costs of compute and storage are both plummeting by orders of magnitude, while the computational capacity of today’s processors is growing, making AI applications possible and affordable.

    Third, we’ve seen significant improvements recently in the design of learning systems, architectures and software infrastructure that, together, promise to further accelerate the speed of innovation. Indeed, we don’t fully appreciate what tomorrow will look and feel like.

    We also must realize that AI-driven products are already out in the wild, improving the performance of search engines, recommender systems (e.g., e-commerce, music), ad serving and financial trading (amongst others).

    Companies with the resources to invest in AI are already creating an impetus for others to follow suit — or risk not having a competitive seat at the table. Together, therefore, the community has a better understanding and is equipped with more capable tools with which to build learning systems for a wide range of increasingly complex tasks.

    How Might You Apply AI Technologies?

    With such a powerful and generally applicable technology, AI companies can enter the market in different ways. Here are six to consider, along with example businesses that have chosen these routes:

    • There are vast amounts of enterprise and open data available in various data silos, whether web or on-premise. Making connections between these enables a holistic view of a complex problem, from which new insights can be identified and used to make predictions (e.g., DueDil*, Premise and Enigma).
    • Leverage the domain expertise of your team and address a focused, high-value, recurring problem using a set of AI techniques that extend the shortfalls of humans (e.g., Sift Science or Ravelin* for online fraud detection).
    • Productize existing or new AI frameworks for feature engineering, hyperparameter optimization, data processing, algorithms, model training and deployment (amongst others) for a wide variety of commercial problems (e.g., H2O.ai, Seldon* and SigOpt).
    • Automate the repetitive, structured, error-prone and slow processes conducted by knowledge workers on a daily basis using contextual decision making (e.g., Gluru, x.ai and SwiftKey).
    • Endow robots and autonomous agents with the ability to sense, learn and make decisions within a physical environment (e.g., Tesla, Matternet and SkyCatch).
    • Take the long view and focus on research and development (R&D) to take risks that would otherwise be relegated to academia — but due to strict budgets, often isn’t anymore (e.g., DNN Research, DeepMind and Vicarious).

    There’s more on this discussion here. A key consideration, however, is that the open sourcing of technologies by large incumbents (Google, Microsoft, Intel, IBM) and the range of companies productizing technologies for cheap means that technical barriers are eroding fast. What ends up moving the needle are proprietary data access/creation, experienced talent and addictive products.

    Which Challenges Are Faced By Operators And Closely Considered By Investors?

    I see a range of operational, commercial and financial challenges that operators and investors closely consider when working in the AI space. Here are the main points to keep top of mind:

    Operational

    • How to balance the longer-term R&D route with monetization in the short term? While more libraries and frameworks are being released, there’s still significant upfront investment to be made before product performance is acceptable. Users will often be benchmarking against a result produced by a human, so that’s what you’re competing against.
    • The talent pool is shallow: few have the right blend of skills and experience. How will you source and retain talent?
    • Think about balancing engineering with product research and design early on. Working on aesthetics and experience as an afterthought is tantamount to slapping lipstick onto a pig. It’ll still be a pig.
    • Most AI systems need data to be useful. How do you bootstrap your system without much data in the early days?

    Commercial

    • AI products are still relatively new in the market. As such, buyers are likely to be non-technical (or not have enough domain knowledge to understand the guts of what you do). They might also be new buyers of the product you sell. Hence, you must closely appreciate the steps/hurdles in the sales cycle.
    • How to deliver the product? SaaS, API, open source?
    • Include chargeable consulting, set up, or support services?
    • Will you be able to use high-level learnings from client data for others?

    Financial

    • Which type of investors are in the best position to appraise your business?
    • What progress is deemed investable? MVP, publications, open source community of users or recurring revenue?
    • Should you focus on core product development or work closely on bespoke projects with clients along the way?
    • Consider buffers when raising capital to ensure that you’re not going out to market again before you’ve reached a significant milestone. 

    Build With The User In The Loop

    There are two big factors that make involving the user in an AI-driven product paramount. One, machines don’t yet recapitulate human cognition. To pick up where software falls short, we need to call on the user for help. And two, buyers/users of software products have more choice today than ever. As such, they’re often fickle (the average 90-day retention for apps is 35 percent).

    Returning expected value out of the box is key to building habits (hyperparameter optimization can help; see the sketch after the list below). Here are some great examples of products that prove that involving the user in the loop improves performance:

    • Search: Google uses autocomplete as a way of understanding and disambiguating language/query intent.
    • Vision: Google Translate or Mapillary traffic sign detection enable the user to correct results.
    • Translation: Unbabel community translators perfect machine transcripts.
    • Email Spam Filters: Google, again, to the rescue.
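
    On the hyperparameter-optimization aside above: even tuning a single regularization knob before shipping can lift the out-of-the-box result users see first. A hedged sketch with scikit-learn; the model, grid, and synthetic data are stand-ins, not a recommendation.

        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import GridSearchCV

        # Synthetic stand-in for early product data (hypothetical).
        X, y = make_classification(n_samples=500, n_features=10, random_state=0)

        # Search a small grid so the model ships with sensible defaults,
        # instead of asking early users to tolerate a weak first impression.
        search = GridSearchCV(
            LogisticRegression(max_iter=1000),
            param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
            cv=5,
        )
        search.fit(X, y)
        print(search.best_params_, round(search.best_score_, 3))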

    We can even go a step further, I think, by explaining how machine-generated results are obtained. For example, IBM Watson surfaces relevant literature when supporting a patient diagnosis in the oncology clinic. Doing so improves user satisfaction and helps build confidence in the system to encourage longer-term use and investment. Remember, it’s generally hard for us to trust something we don’t truly understand.

    What’s The AI Investment Climate Like These Days?

    To put this discussion into context, let’s first look at the global VC market: Q1-Q3 2015 saw $47.2 billion invested, a volume higher than each of the full year totals for 17 of the last 20 years (NVCA).

    We’re likely to breach $55 billion by year’s end. There are roughly 900 companies working in the AI field, most of which tackle problems in business intelligence, finance and security. Q4 2014 saw a flurry of deals into AI companies started by well-respected and achieved academics: Vicarious, Scaled Inference, MetaMind and Sentient Technologies.

    So far, we’ve seen about 300 deals into AI companies (defined as businesses whose description includes such keywords as artificial intelligence, machine learning, computer vision, NLP, data science, neural network, deep learning) from January 1, 2015 through December 1, 2015 (CB Insights).

    In the U.K., companies like Ravelin*, Signal and Gluru* raised seed rounds. Approximately $2 billion was invested overall, albeit bloated by large venture debt or credit lines for consumer/business loan providers Avant ($339 million debt+credit), ZestFinance ($150 million debt), LiftForward ($250 million credit) and Argon Credit ($75 million credit). Importantly, 80 percent of deals were < $5 million in size, and 90 percent of the cash was invested into U.S. companies, versus 13 percent in Europe. Seventy-five percent of rounds were in the U.S.

    The exit market has seen 33 M&A transactions and one IPO. Six events were for European companies, one in Asia, and the rest were accounted for by American companies. The largest transactions were TellApart/Twitter ($532 million; $17 million raised), Elastica/Blue Coat Systems ($280 million; $45 million raised) and SupersonicAds/IronSource ($150 million; $21 million raised), which returned solid multiples of invested capital. The remaining transactions were mostly for talent, given that the median team size at the time of acquisition was seven people.

    Altogether, AI investments will have accounted for roughly 5 percent of total VC investments for 2015. That’s higher than the 2 percent claimed in 2013, but still tracking far behind competing categories like adtech, mobile and BI software.

    The key takeaway points are a) the financing and exit markets for AI companies are still nascent, as exemplified by the small rounds and low deal volumes, and b) the vast majority of activity takes place in the U.S. Businesses must therefore have exposure to this market.

    Which Problems Remain To Be Solved?

    Healthcare

    I spent a number of summers in university and three years in grad school researching the genetic factors governing the spread of cancer around the body. A key takeaway I left with is the following: therapeutic development is very challenging, expensive, lengthy and regulated, and ultimately offers a transient solution to treating disease.

    Instead, I truly believe that what we need to improve healthcare outcomes is granular and longitudinal monitoring of physiology and lifestyle. This should enable early detection of health conditions in near real time, driving down cost of care over a patient’s lifetime while consequently improving outcomes.

    Consider the digitally connected lifestyles we lead today. The devices some of us interact with on a daily basis are able to track our movements, vital signs, exercise, sleep and even reproductive health. We’re disconnected for fewer hours of the day than we’re online, and I think we’re less apprehensive to storing various data types in the cloud (where they can be accessed, with consent, by third-parties). Sure, the news might paint a different story, but the fact is that we’re still using the web and its wealth of products.

    On a population level, therefore, we have the chance to interrogate data sets that have never before existed. From these, we could glean insights into how nature and nurture influence the genesis and development of disease. That’s huge.

    Look at today’s clinical model. A patient presents at the hospital when they feel something is wrong. The doctor must conduct a battery of tests to derive a diagnosis. These tests address a single (often late-stage) time point, at which moment little can be done to reverse damage (e.g., in the case of cancer).

    Now imagine the future. In a world of continuous, non-invasive monitoring of physiology and lifestyle, we could predict disease onset and outcome, understand which condition a patient likely suffers from, and anticipate how they’ll respond to various therapeutic modalities. There are loads of applications for artificial intelligence here: intelligent sensors, signal processing, anomaly detection, multivariate classifiers, deep learning on molecular interactions...
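
    One of those pieces, anomaly detection on a vital-sign stream, can be sketched very simply. The data below are synthetic, and the rolling z-score rule is a deliberately crude stand-in for the multivariate classifiers mentioned above.

        import random
        import statistics

        # Synthetic resting heart-rate stream (hypothetical data): ~60 bpm noise,
        # with one anomalous spike appended at the end.
        random.seed(1)
        stream = [random.gauss(60, 2) for _ in range(200)] + [95.0]

        def detect_anomalies(values, window=50, z_threshold=4.0):
            """Flag readings that sit far outside the rolling baseline."""
            flagged = []
            for i in range(window, len(values)):
                baseline = values[i - window:i]
                mu = statistics.mean(baseline)
                sigma = statistics.stdev(baseline)
                if sigma > 0 and abs(values[i] - mu) / sigma > z_threshold:
                    flagged.append((i, round(values[i], 1)))
            return flagged

        print(detect_anomalies(stream))  # expect the 95 bpm spike at index 200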

    Some companies are already hacking away at this problem:

    • Sano: Continuously monitor biomarkers in blood using sensors and software.
    • Enlitic/MetaMind/Zebra Medical: Vision systems for decision support (MRI/CT).
    • Deep Genomics/Atomwise: Learn, model and predict how genetic variation influences health/disease and how drugs can be repurposed for new conditions.
    • Flatiron Health: Common technology infrastructure for clinics and hospitals to process oncology data generated from research.
    • Google: Filed a patent covering an invention for drawing blood without a needle. This is a small step toward wearable sampling devices.
    • A point worth noting is that the U.K. has a slight leg up on the data access front. Initiatives like the U.K. Biobank (500,000 patient records), Genomics England (100,000 genomes sequenced), HipSci (stem cells) and the NHS care.data program are leading the way in creating centralized data repositories for public health and therapeutic research.

    Enterprise Automation

    Could businesses ever conceivably run themselves? AI-enabled automation of knowledge work could cut employment costs by $9 trillion by 2020 (BAML). Coupled with efficiency gains worth $1.9 trillion driven by robots, I reckon there’s a chance of near-complete automation of core, repetitive business functions in the future.

    Think of all the productized SaaS tools that are available off the shelf for CRM, marketing, billing/payments, logistics, web development, customer interactions, finance, hiring and BI. Then consider tools like Zapier or Tray.io, which help connect applications and program business logic. These could be further expanded by leveraging contextual data points that inform decision making.

    Perhaps we could eventually re-imagine the new eBay, where you’ll have fully automated inventory procurement, pricing, listing generation, translation, recommendations, transaction processing, customer interaction, packaging, fulfillment and shipping. Of course, this is probably a ways off.

    I’m bullish on the value to be created with artificial intelligence across our personal and professional lives. I think there’s currently low VC risk tolerance for this sector, especially given shortening investment horizons for value to be created. More support is needed for companies driving long-term innovation, especially considering that far less is occurring within universities. VC was born to fund moonshots.

    We must remember that access to technology will, over time, become commoditized. It’s therefore key to understand your use case, your user, the value you bring and how it’s experienced and assessed. This gets to the point of finding a strategy to build a sustainable advantage such that others find it hard to replicate your offering.

    Aspects of this strategy may in fact be non-AI and non-technical in nature (e.g., the user experience layer). As such, there’s renewed focus on core principles: build a solution to an unsolved or poorly served, high-value, persistent problem for consumers or businesses.

    Finally, you must have exposure to the U.S. market, where the lion’s share of value is created and realized. We have an opportunity to catalyze the growth of the AI sector in Europe, but not without keeping close tabs on what works/doesn’t work across the pond.

    Source: TechCrunch

  • Machine intelligence will break through this year

    Sogeti’s Research Institute for New Technology (VINT) is kicking off the new year with a series of four research reports on ‘machine intelligence’.

    According to VINT, this umbrella term for artificial intelligence refers to the situation in which computers are able to match or even surpass human intelligence. After the mobile revolution, big data and the Internet of Things, VINT expects ‘machine intelligence’ to break through this year. The first report contains an ‘executive introduction’ describing the possibilities and market size of ‘machine intelligence’, as well as its overall impact on society. It is part of a four-part series that will appear over the course of 2016.

    “Cars that recognize pedestrians and respond appropriately, computers that make a medical diagnosis or provide legal advice based on existing case law: these are just a few examples of the new generation of ‘machine intelligence’ computers, in which the balance between human and machine shifts completely. This intelligence will soon be available to everyone. Every company in every sector will also have to realize that artificial intelligence can be deployed in a very different way than traditional IT,” says Menno van Doorn (photo), author and director of VINT at Sogeti.

    VINT has two explanations for the imminent breakthrough of artificial intelligence. First, there is a breakthrough in hardware: our phones are being made hyperintelligent with brain-inspired neuromorphic chips, helped by the extremely high computing power of this new generation of chips. The second explanation concerns the explosive growth of data. For decades there was simply not enough data to make progress with machine learning. Only now, in the current big data era in which huge volumes of data are available, can self-learning algorithms get to work. This self-learning capability of machines is crucial to this new form of intelligence.

    Four reports

    After the ‘executive introduction’ to artificial intelligence, the subsequent reports cover three topics. The second report addresses the rise of the Personal Digital Assistant, with a shift from apps to cogs (cognitive systems). Persuasive Technologies is central to the third report, which concerns influencing behavior through machines. The theme of the final report is how fear of artificial intelligence affects organizations that want to become smarter. Sander Duivestein, Jaap Bloem, Erik van Ommeren and Menno van Doorn are the authors of the first ‘executive introduction’. The second report is expected in April 2016. The first trend report can be downloaded via this link.

    Source: Executive-People

  • Making AI actionable within your organization


    It can be really frustrating to run a successful pilot or implement an AI system without it getting widespread adoption throughout your organization. Operationalizing AI is a common problem. It may seem that everyone else is using AI to make a huge difference in their business while you’re struggling to figure out how to operationalize the results you’ve gotten from trying a few AI systems.

    There has been so much advancement in AI, so how can you make this great technology actually translate into actionable business results?

    This problem touches enterprises of all kinds, from the biggest companies to mid-sized businesses.

    Here are a few quick pointers on how to turn your explorations in AI into AI practices leading to real results from investments.

    Pragmatic AI

    Firstly, focus on what gets called ‘pragmatic AI’: practical AI with obvious business applications. It’s going to be a long time before we have ‘strong AI’, so look for solutions built by examining problems that businesses deal with every day and then applying artificial intelligence to solve them. It’s great that your probabilistic Bayesian system thinks about the world differently, or that a company feels it has found a shortcut around some of the things that make deep learning systems slow to train, but what does that mean for the end user of the artificial intelligence? When you’re looking for a practical solution, look for companies that are always trying to improve their user experience and where a PhD in machine learning isn’t needed to write the code.

    Internal valuations

    Similarly, change the way you are considering bringing an AI solution into your company. AI works best when the company isn’t trying to do a science fair project. It works best when it is trying to solve a real business problem. Before evaluating vendors in any particular AI solution or going out to see how RPA solutions really work, talk to users around your business. Listen to the problems they have and think about what kind of solutions would make a huge difference. By making sure that the first AI solution you bring into your organization aligns to business goals, you are much more likely to succeed in getting widespread adoption and a green light to try additional new technologies when it comes time to review budgets.

    And no matter how technology-forward your organization is, AI adoption works best when everyone can understand the results. Pick a KPI focused problem like conversion, customer service, or NPS where the results can be understood without thinking about technology. This helps get others outside of the science project mentality to open their minds on how AI can be used throughout the business.

    Finally, don’t forget that AI can help in a wide variety of ways. Automation is a great place to use AI within an organization, but remember that in many use cases, humans and computers do more together than separately and great uses for AI technology help your company’s employees do their job better or focus on the right pieces of data. These solutions often provide as much value as pure automation!

    Source: Insidebigdata

  • Microsoft launches new big data analytics platform

    Microsoft presents a new big data solution based on the programming language R and the technology it acquired through its takeover of Revolution Analytics.

    Technology giant Microsoft has unveiled a new advanced analytics platform based on the programming language R: Microsoft R Server. Microsoft gained the know-how last year through the acquisition of Revolution Analytics, a provider of software and services for the R language.

    Microsoft R Server works with Windows, Linux, Hadoop and Teradata software, and is based on Revolution R Enterprise (RRE), a Revolution Analytics product. The Microsoft R Open package, which provides an “enhanced distribution of R from Microsoft”, is a prerequisite for using the R Server platform, according to Microsoft.

    “Microsoft R Server is a fast and cost-effective, enterprise-class big data advanced analytics platform supporting a range of big data statistics, predictive modeling and machine learning capabilities,” reads the software’s description on the Microsoft Developer Network, where it is available as a download for subscribers. The platform can also be found on DreamSpark, Microsoft’s academic program.

  • Research details developments in the business intelligence (BI) market that is estimated to grow at 10% CAGR to 2020

    According to an analyst behind the global business intelligence market report, social media has played a critical role for SMEs and mid-sized organizations in the past few years. Many SMEs are increasingly embracing this trend and integrating their BI software with social media platforms.

    Market outlook of the business intelligence market: the analyst predicts the global business intelligence market will grow at a CAGR of around 10% during the forecast period. The growing adoption of data analytics by organizations worldwide is a key driver of this market’s growth.

    The majority of corporate data sources include data generated from enterprise applications along with newly generated cloud-based and social network data. Business intelligence tools are useful in the retrieval and analysis of this vast and growing volume of discrete data.

    They also help optimize business decisions, discover significant weak signals, and develop indicator patterns to identify opportunities and threats for businesses.

    The increased acceptance of cloud BI solutions by SMEs is also boosting the growth of this market. The adoption of cloud services allows end-users to concentrate on core activities rather than managing their IT environment.

    Cloud BI solutions enable applications to be scaled quickly, integrate easily with third-party applications, and provide security at all levels of the enterprise IT architecture, so that these applications can be accessed remotely.

    Market segmentation by technology of the business intelligence market:

    • Traditional BI
    • Mobile BI
    • Cloud BI
    • Social BI

    The mobile BI segment accounts for approximately 20% of the global BI market. It enables the mobile workforce to get business insights by data analysis, using applications optimized for mobile and smart devices.

    The growing smartphone adoption is likely to emerge as a key growth driver for this segment during the forecast period.

    Market segmentation by deployment of the business intelligence market

    • Cloud BI
    • On-premises BI

    The on-premises segment accounted for 86% of the market share during 2015. However, the report anticipates that this segment’s share will decline by the end of the forecast period.

    In this segment, the software is purchased and installed on the server of an enterprise. It requires more maintenance but is highly secure and easy to manage.

    Geographical segmentation of the BI market

    • Americas
    • APAC
    • EMEA

    The Americas dominated the market during 2015, with a market share of around 56%. The high adoption of cloud BI solutions in this region is the major growth contributor for this market.

    The US is the market leader in this region, as most of the key vendors are based there.

    Competitive landscape and key vendors

    Microsoft is one of the largest BI vendors and offers Power BI, which helps to deliver business-user-oriented, self-service data preparation and analysis needs through Excel 2013 and Office 365. The competitive environment in this market is expected to intensify during the forecast period due to an increase in R&D innovations and mergers.

    The market is also expected to witness a growing trend of acquisitions by the leading players. The key players in the market are expected to diversify their geographical presence during the forecast period.

    The key vendors of the market are -

    • IBM
    • Microsoft
    • Oracle
    • SAP
    • SAS Institute

Other prominent vendors in the market include Actuate, Alteryx, Board International, Birst, Datawatch, GoodData, Infor, Information Builders, Logi Analytics, MicroStrategy, Panorama Software, Pentaho, Prognoz, Pyramid Analytics, Qlik, Salient Management Company, Tableau, Targit, Tibco Software, and Yellowfin.

    Key questions answered in the report

    • What will the market size and the growth rate be in 2020?
    • What are the key factors driving the BI market?
    • What are the key market trends impacting the growth of the BI market?
    • What are the challenges to market growth?
    • Who are the key vendors in the global BI market?
    • What are the market opportunities and threats faced by the vendors in the BI market?
• What are the trending factors influencing the market shares of the Americas, APAC, and EMEA?
    • What are the key outcomes of the five forces analysis of the BI market?

    Source: WhaTech

  • Tech companies and the issue of diversity

Diversity in the workplace is something that all tech companies should strive for. When appropriately embraced in the tech sector, diversity has been shown to increase financial performance, improve employee retention, foster innovation, and help teams develop better products. Data marketing teams with gender-equitable hiring practices are one example of these benefits in action.

    While the benefits of a diverse workplace can help any company thrive, figuring out how exactly to increase diversity in tech workplaces can be a challenge. However, employing a diverse team is not impossible, and the rewards make diversification efforts well worth it.

    Diversity is less common than you might think

Though the tech industry is far more diverse today than it has been in the past, diversity still remains an issue across the sector. Even if those heading tech companies don’t engage in outright racism by fostering a hostile work environment towards people of color or discouraging the hiring of diverse groups, many tech companies still find themselves with teams that look and think alike. Homogeneity creates complacency, insulates a workforce from outside perspectives, and ultimately prevents real innovation and creativity from taking place.

Tech companies can be complicit in racism through hiring practices, through segregation of the people of color they already employ (whether purposeful or not), or by preventing employees from diverse backgrounds from conversing in their native tongue or practicing their cultural traditions.

    Diversity in the tech field isn’t just about improving any one company’s performance, it’s crucial to the continued success of the entire tech sector. By 2020, there will be 1.4 million open computer science-related positions with only around 400,000 qualified candidates to fill them. This makes diversity and inclusion a pressing emergency that needs to be addressed, and quickly.

The gap in diversity is most noticeable in the tech field when looking at gender. Despite the fact that more women graduate with STEM (Science, Technology, Engineering, Mathematics) bachelor’s degrees than men, the bulk of the major companies in Silicon Valley hire far more men than women. With so many qualified women able to drive success, ingenuity, and creativity, it simply makes sense to increase diversity in the workplace by hiring more women.

    Using internship programs, diversity initiatives, and scholarships

    One effective way of combating the lack of diversity in tech workplaces is to implement diversity initiatives with the express purpose of finding qualified, diverse candidates who may have self-selected away from the tech field.

    Tech companies that wish to increase diversity can articulate a company-wide diversity commitment from the highest levels of management, letting all employees know that diversity is a top priority. Then, companies can consider implementing mentor programs to bring in diverse applicants to become proteges, all but ensuring that diverse talent is retained.

    Aside from creating a company culture that prioritizes diversity, tech companies can take things even further by becoming more proactive in getting underrepresented communities into the tech sector. Companies can develop afterschool STEM programs to foster the idea within these communities that the tech field is not only right for them but welcoming to them. Developing afterschool STEM programs directly impacts communities, with over 70% of students participating in afterschool STEM programs showing increased interest in STEM careers. Additionally, companies can offer up STEM-related college scholarships and internship programs to individuals who might not otherwise have the opportunity to enter the tech sector due to economic disadvantages.

    Data scientists are in high demand, and companies that use their influence and financial abilities to help elevate underrepresented groups into those positions will send a powerful signal to future generations of tech workers. While any tech company can make promises to improve diversity in the workplace, those that actually go above and beyond to ensure that minorities have the opportunity and drive to join the tech field are actually walking the walk.

    Diverse recruitment strategy

While encouraging a diverse workplace is all well and good, companies need to actually implement a recruitment strategy with the purpose of increasing diversity. Diversity strategies in the recruitment process need to make sure that diversity goals and business goals align so that business growth accompanies diverse hiring.

    One problem that a lot of tech companies are facing is that diverse hires don’t necessarily see themselves within the company, which often simply extends from the fact that they aren’t represented culturally within the company. With nearly 70% of those seeking employment saying that a company’s diversity portfolio influences their decision about whether or not to work there, the lack of diversity within tech companies can hamstring their recruitment efforts. Getting over this hurdle can be difficult. One way to do it is by rebranding the company to appeal to marginalized groups.

    Implementing big data analysis into your recruiting can also help to naturally foster a more diverse team. Big data can quickly find the most qualified candidates for any particular position completely free of bias, meaning that every applicant is given the same chance at landing an interview regardless of creed, gender, sexuality, or race. Though this approach doesn’t affect who is applying for a tech position, it certainly levels the playing field for those who do apply.

    Even if a company doesn’t use big data in their hiring process, incorporating bias awareness into the recruiting and interview processes can help a company’s HR department improve the diversity within the company. HR can further assist in increasing diversity within the tech field by fostering a culture of inclusion within a company, helping employees to feel empowered and valued, which will allow them to do their best possible work.

    In terms of building trust, big data can also be used to protect employee information. Because machine learning can detect unusual behavior, it can detect when employee data or personally identifiable information may be at risk. In fields where underrepresented people can already feel vulnerable, it’s essential that security measures are in place to allow them to work without fear of threats to their privacy or financial well-being. A good track record in this regard can drastically improve trust (and therefore representation) in the workplace.

    Though diversity in tech workplaces is sorely lacking, there are many ways for companies to take positive steps to improve their diversity. Expanding the search for more diverse hires, developing community programs like after-school STEM programs and scholarships, and making sure that their HR department is approaching the hiring process with as little bias as possible are all great places to start. Tech companies with a diverse workforce recognize the inherent benefits of diversity, and those that don’t are sleeping on what could be the difference between success and failure.

    Source: Datafloq

  • The Data Digest: Sports, Spectators, And Screen Time

    Sports fans around the world are having a heyday: From the Copa America soccer tournament in the US to the European Champions Cup across the pond, and from live Wimbledon matches to the imminent Summer Olympic Games, there is no lack of global sports entertainment at this moment.

    Sports teams have always brought people together as much as divided them — and in today’s age, technology amplifies the drama of fandom. Personal devices play a critical role in how people come together around sports, when fans watch the action unfold, and how they discuss the results.

For example, Forrester’s latest Consumer Technographics® survey data reveals that consumers worldwide have recently accessed sports applications on their mobile phones and tablets.

    Our previous research shows that consumers often prefer tablets for longer, more engaging, media-rich experiences — and in fact, Forrester’s mobile behavioral data indicates that consumers spend more time on sports apps when using their tablet rather than their mobile phone. However, technology doesn’t only enable sports enthusiasts to get into the game — oftentimes, it also allows more casual fans to multitask.

    In an earlier research report, my colleagues Nicole Dvorak and Kris Arcand note that as personal devices become integral to the sports spectator experience, marketers should design mobile experiences to engage audiences in new ways. They say that “smartphones [and tablets] are the perfect devices to provide newer, more creative extensions.” Marketers can leverage these screens to “appeal to ad fans [and] game fans, engage [the multitaskers], and tempt the party crowds with delicious offers.”

Source: Forrester.com

     

  • The future of car ownership that no one is talking about

It has never been clearer that transformational change to one of the world’s largest industries is just around the corner. Car ownership is poised to change, and when it does, it is predicted to trigger one of the most monumental displacements of wealth the world economy has ever seen.

    But there are a lot of conflicting opinions and information out there. Industry experts and headlines are telling us that Tesla will rule the world, car manufacturers are becoming mobility companies and Uber and Lyft mean the end of traditional car ownership as we know it. They’re also saying that car dealerships will die at the hands of online car-buying sites such as Beepi, Carvana, Vroom and Shift. On the contrary, headlines are also touting that vehicle sales are at all-time highs, trucks are the leading category of new car sales and millennials are buying cars more than ever.

    There’s also the looming question of autonomous driving technology. Uber and Lyft want self-driving cars to replace their drivers ASAP, and companies like Apple and GM are spending aggressively to position themselves for success in a theoretical, autonomous mobility landscape. But even Google, which has been developing this technology since 2009, is unsure if fully autonomous cars can become viable within the next 30 years.

    What is unequivocal is that there’s a lot of noise around the future of transportation. Like anyone who lives in Silicon Valley, I believe change is coming and that current car ownership models are ripe for disruption. But with personal car sales at an all-time high, the question is when — and how? The assumed agents of change are the likes of Tesla, Uber, Google, Apple or Ford, but lost in all of these predictions is the linchpin for the entire auto industry. That’s the unsexy, yet enormous world of auto finance — the huge market that makes it all work.

    Financing props up the entire auto industry.

    To put it in perspective, auto loan balances in the U.S. total more than $1.06 trillion right now. That number doesn’t even include the enormous leasing market. The largest auto lenders in the U.S. are also household names: banks like Ally, Wells Fargo, Chase and Capital One, and finance arms of car manufacturers like Toyota, GM and Ford. Auto loans, as an asset class, trail only mortgages and student loan balances.

    It’s clearly a massive market, but why does that matter? It matters because U.S. car sales are inextricably linked to a robust auto finance market. Indeed, 86 percent of new car sales in the U.S. are financed. Without individual financing products, car sales don’t happen. Financing props up the entire auto industry. In fact, the U.S. is far more reliant on auto finance than other countries. In China, where current new car sales outpace the U.S., only 26 percent of new car sales are financed.

    What does that mean for the future?

New types of cars like the Tesla Model 3 or the Chevy Bolt will continue to emerge, and their features will be heralded as ushering in the future of transportation. For the most part, however, these innovative vehicles are going to be consumed in the traditional manner: They will be personally financed.

    That can’t be the future of car ownership. With the rise of companies like Uber and Lyft, it’s clear that we will need to see advances in new ownership models to support tomorrow’s transportation landscape. In fact, Uber recently received a $1 billion credit facility led by Goldman Sachs to fund new car leases. Uber (and Wall Street) are also recognizing the need for more flexibility with this deal — especially at a time when Americans are making larger monthly payments than ever on their cars and taking out record-size auto loans.

    Other flexible vehicle access products like Ford’s Credit Link (shared vehicle leasing program) and GM’s Maven (on-demand car rentals) give us a glimpse into what the future of car ownership may look like. These types of emerging products will only be successful with a robust technology infrastructure — a stark departure from the differentiators in the auto finance market today.

    If we see a shift away from personal vehicle ownership and households no longer own, on average, 2.06 cars, then this will require an entirely new auto finance infrastructure. Ultimately, Google’s self-driving cars may become ubiquitous — but we’ll have to displace a lot of big names in today’s $1 trillion auto finance market to get there.

    Source: Techcrunch.com

     

  • The risk of undervaluing creativity

Agencies’ creative perspective, the very currency of the business, is at risk; its value can only be preserved by shifting billions from tech to fund creative differentiation.

    “The value of agency creativity is at risk of disappearing”

    The marketing industry is woefully out of balance, from agency/client relationships to new business requirements and compensation. The healthy tension of creativity that once balanced the needs of the brand with the needs of its customers, the commercial effectiveness of the work versus its cultural impact, and the needs of agency economics versus the client’s growth is all eroding. These are now one-sided issues. The tension is no longer healthy. Nowhere is this more evident than in agency economics. Agencies today barely grow at the current rate of inflation. Insourcing, margin compression, cost-cutting, new competitors, and tech priorities threaten the existence of agencies and undermine their value.

    “Customer experience has stagnated”

Strong evidence that creativity is languishing is already visible. Customer experience has stagnated. Forrester’s Customer Experience Index (CX Index™), a study of 100,000 consumers and 300 brands that has been run for more than a decade and acts as a barometer for CX performance, is flat for the fourth consecutive year. Most brands are stuck in the middle, struggling to improve over competitors. Zero brands are rated as having an excellent experience. Forrester determined that there are four types of CX performance: the Languishers, Lapsers, Locksteppers, and Laggards. No brand is performing well. Worse still, for every 1-point drop in CX Index score, companies lose 2% of their returns. It’s only a matter of time before companies’ growth is impacted.

    “We’ve commoditized the brand and homogenized experiences”

The issue is that the work looks, feels, and behaves too much alike. The industry’s obsession with meeting every customer need and want for ease and convenience by using technology has left little room for creative differentiation. That has come at a cost. The front door to your brand is a web or app experience that is virtually indistinguishable from the next. Fashion experiences look the same. Quick-service restaurant and coffee apps allow you to order ahead and skip the line. All airline apps allow travelers to check in, manage travel, and use a mobile device as their boarding pass. What can make one brand different from another when the experience is built from the same common technology platform, designed to solve the same user or category need, and programmed for the same two devices? Creativity.

    “We’ve overfunded technology and underfunded creativity”

    Unfortunately, just when creativity is needed the most, business leaders are investing in it the least. Forrester forecasts that spending for adtech, martech, data, and analytics will grow between 9% and 11% through 2022. Agency spending will only grow a mere 2.4%. And client budgeting and priorities are only part of the problem. Agencies are underfunding creativity, too. As of 2014, agencies had spent $12 billion-plus for data and technology resources and acquisitions. While the agency data platforms do power media and audience activation, all but one integrates with the creative process. And creative departments remain skeptical and dismissive.

    “It’s time to fund creative differentiation”

Forrester developed an ROI model for creative agency investment, which determined that moving a portion of the marketing budget out of technology and into agency creativity will bring a higher return on investment compared to currently projected spending levels. This serves as a six-year growth plan for CMOs that ultimately helps achieve 20% growth for the entire industry. These are not new dollars but rather a reallocation of currently projected spending that maintains significant adtech and martech investments.

    “It’s time to reinvent creativity”

To deliver clients the growth they need and customers the experiences they demand, agencies must innovate their structures, capabilities, workforce, and process. Structurally, data, technology, media, and creative should all come together and put creative problem-solving at the center. This means the newly acquired data, tech, and operating agencies should also come together. And especially, it means agency leaders will need to make consolidation and coordination a priority. Tough decisions must be made in the name of agency brand coherence and a model that is easier for clients to engage. Training today’s workforce to be tomorrow’s data-, technology-, and creative-literate talent is critical. And creative departments must embrace data- and tech-driven creativity.

    We’re living during one of the most interesting times in the history of the industry, with the opportunity to shape and define it. A whole new era of amazing marketing is only possible if we fund the balance of creativity and technology. Take up the mantle to modernize the industry. Reinvent the creative process.

    Author: Jay Pattisall

    Source: Forrester

  • The role of Machine Learning in the development of text to speech technology

    Machine learning is drastically advancing the development of text to speech technology. Here's how, and why it's so important.

    Machine learning has played a very important role in the development of technology that has a large impact on our everyday lives. However, machine learning is also influencing the direction of technology that is not as commonplace. Text to speech technology is a prime example.

    Text to speech technology predates machine learning by over a century. However, machine learning has made the technology more reliable than ever.

    The progression of text to speech technology in the Machine Learning era

We live in an era where audiobooks are gaining more appreciation than traditional printed literature. Thus, it comes as no surprise that Text-to-Speech (TTS) technology is also rapidly becoming popular. It caters to those who need it most, including children who struggle with reading and those who live with a disability. Big data is very useful in assisting these people.

There are other elements of speech synthesis technology that rely on machine learning. It is now so sophisticated that it can even mimic someone else’s voice.

Text to Speech (commonly known as TTS) is a piece of assistive technology (that is, any technology that helps individuals overcome their challenges) that reads text out loud, and it is available on almost every device in our hands today. It has taken years for the technology to develop to the point it is at today. Machine learning is changing the direction of this radical technology. However, its journey is one that started in the late eighteenth century.

    The early days of text to speech 

TTS is a complicated technology that has developed over a long period of time. It all began with the construction of acoustic resonators, which could produce only the sounds of the vowels. These resonators were developed in 1779, thanks to the dedicated work of Christian Kratzenstein. With the advent of semiconductor technology and improvements in signal processing, computer-based TTS devices started hitting the shelves in the 20th century. There was a lot of fascination surrounding the technology during its infancy. This was primarily why Bell Labs’ vocoder demonstration found its way into the climactic scene of one of the greatest sci-fi flicks of all time: 2001: A Space Odyssey.

    The Machine Learning technology that drives TTS

A couple of years ago, Medium contributor Utkarsh Saxena penned a great article on speech synthesis technology with machine learning, discussing two very important machine learning approaches: Parametric TTS and Concatenative TTS. Both help with the development of new speech synthesizing techniques.

    At the heart of it, a TTS engine has a front-end and a back-end component. Modern TTS engines are heavily dependent on machine learning algorithms. The front-end deals with converting the text to phonetics and meaningful sentences. The back-end uses this information to convert symbolic linguistic representation to sound. Good synthesizer technology is key to a good TTS system, which requires sophisticated deep learning neural analysis tools. The audio should be both intelligible and natural, to be able to mimic everyday conversation. Researchers are trying out various techniques to achieve this.
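
To make this division of labor concrete, here is a minimal sketch in Python of the front-end/back-end split described above. The lexicon, function names, and placeholder "audio" output are illustrative assumptions rather than any particular engine's API; a real back-end would run a trained vocoder or neural synthesizer.

    # Minimal sketch of the front-end / back-end split in a TTS engine.
    # All names are hypothetical; a real engine would use trained models.
    LEXICON = {
        "hello": ["HH", "AH", "L", "OW"],   # toy grapheme-to-phoneme lexicon
        "world": ["W", "ER", "L", "D"],
    }

    def front_end(text):
        """Convert raw text into a symbolic linguistic representation."""
        phonemes = []
        for word in text.lower().split():
            phonemes.extend(LEXICON.get(word, ["<UNK>"]))  # fallback for unknown words
            phonemes.append("<PAUSE>")  # crude word-boundary marker
        return phonemes

    def back_end(phonemes):
        """Turn the symbolic representation into audio. Here we emit one
        placeholder byte per phoneme; a real back-end synthesizes sound."""
        return bytes(len(p) for p in phonemes)

    audio = back_end(front_end("hello world"))
    print(len(audio), "placeholder audio frames")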

Concatenative synthesis relies on piecing together multiple segments of recorded speech to form coherent sentences. This approach usually yields the most natural-sounding speech. However, it loses out on intelligibility, with audible glitches resulting from poor segmentation. Formant synthesis is used when intelligibility takes precedence over naturalness. This technology does not use human speech samples, and hence sounds evidently ‘robotic’. The lack of a speech-sample database means that it is relatively lightweight and best suited for embedded system applications, where power and memory resources are scarce. Various other technologies also exist, but the most recent and notable one is the use of machine learning, in which recorded speech data is used to train deep neural networks. Today’s digital assistants use these extensively.

    The challenges

    Contextual understanding of the text on the screen is one of the main challenges for TTS systems. More often than not, human readers are able to understand certain abbreviations without second thoughts. However, these are very confusing to computer models. A simple example would be to consider two phrases, “Henry VIII” and “Chapter VIII”. Clearly, the former should be read as Henry the Eighth and the latter should be read as Chapter eight. What seems trivial to us is anything but, for front-end developers working at TTS companies like Notevibes.
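
To illustrate the “Henry VIII” versus “Chapter VIII” problem, the rule-based normalizer sketched below picks an ordinal or cardinal reading based on the preceding word. The cue lists and lookup tables are illustrative assumptions; production front-ends use far richer context models.

    # Toy text normalizer for Roman numerals; all tables are illustrative.
    ROMAN = {"I": 1, "II": 2, "III": 3, "IV": 4, "V": 5,
             "VI": 6, "VII": 7, "VIII": 8, "IX": 9, "X": 10}
    ORDINAL = {1: "the First", 8: "the Eighth"}
    CARDINAL = {1: "one", 8: "eight"}

    REGNAL_CUES = {"henry", "louis", "george", "elizabeth"}   # read as ordinal
    SECTION_CUES = {"chapter", "section", "part", "act"}      # read as cardinal

    def normalize(text):
        """Expand Roman numerals according to the preceding word's context."""
        tokens = text.split()
        out = []
        for i, tok in enumerate(tokens):
            value = ROMAN.get(tok.strip(".,"))
            prev = tokens[i - 1].lower() if i > 0 else ""
            if value and prev in REGNAL_CUES:
                out.append(ORDINAL.get(value, "the %dth" % value))
            elif value and prev in SECTION_CUES:
                out.append(CARDINAL.get(value, str(value)))
            else:
                out.append(tok)
        return " ".join(out)

    print(normalize("Henry VIII"))    # -> Henry the Eighth
    print(normalize("Chapter VIII"))  # -> Chapter eight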

    They use various predictive models to enhance the user experience. But there is a lack of standard evaluation criteria to judge the accuracy of a TTS system. A lot of variables go into the quality of a particular recording, and these variables are hard to control. This is due to the involvement of both analog and digital processing. However, an increasing number of researchers have begun to evaluate a TTS system based on a fixed set of speech samples.

    That, in a nutshell (a rather big one at that), is an overview of Text to Speech systems. With increased emphasis on AI, ML, DL, etc., it is only a matter of time before we are able to synthesize true-to-life speech for use in our ever-evolving network of things.

Machine Learning is the core of text to speech technology

Machine learning is integral to the development of text to speech technology. New speech synthesis tools rely on deep neural algorithms to provide the highest quality outputs as this technology evolves.

    Author: Matt James

    Source: Smart Data Collective

• TNO: ‘Amsterdam stays accessible thanks to big data’

Innovation organization TNO sees opportunities for big data and Internet of Things (IoT) technology to improve the accessibility of the Amsterdam metropolitan region. “With big data we can link multiple solutions together to make optimal use of a city’s infrastructure,” says Leo Kusters, Managing Director Urbanisation at TNO.

Within a few decades, 70 percent of the world’s population will live in large cities or heavily urbanized regions. The economic and cultural success of regions such as the Randstad attracts many people, putting ever more strain on these cities’ infrastructure. Infrastructure and mobility are therefore decisive factors in the success of metropolitan regions.

Smart mobility

Kusters points to the Praktijkproef Amsterdam (PPA) project, in which TNO is working with ARS Traffic & Transport Technology to reduce congestion in the Amsterdam region. “This project involves 15,000 motorists,” says Kusters. By informing road users better about the traffic situation in the city, TNO expects the number of traffic jams in the Amsterdam region to decrease.

Participants have access to an app that informs them at an individual level about the best travel choices they can make. Users can also report incidents and delays on the road themselves via the app. As a result, motorists reach their destinations faster and can count on reliable travel times.

What makes this project special, according to Kusters, is that the app also gives advice based on traffic lights that are currently red. The system then takes these traffic lights into account to prevent congestion on the road.

TNO is carrying out a comparable project with freight traffic in Helmond. Kusters: “A motorway carrying many trucks runs through the city of Helmond, which puts a heavy burden on the environment and air quality in the city.” In this project TNO is experimenting with data analysis to optimize traffic flow for the trucks involved. Drivers continuously receive speed advice to improve flow through the city, so they have to stop less often and their trucks consume less fuel.

Two birds with one stone

According to Kusters, a major opportunity for big data and IoT technology lies in combining multiple solutions to make optimal use of existing infrastructure. Big data can also contribute to savings on infrastructure maintenance, on which the Netherlands spends €6 billion a year.

TNO focuses, for example, on extending the lifespan of bridges. “An essential part of the infrastructure,” says Kusters. “If bridges don’t work, everything comes to a standstill.” TNO uses sensors to measure hairline cracks in bridges. “That way we know exactly when a bridge needs maintenance or replacement. This makes it possible to extend a bridge’s lifespan in a tailored way: precisely on time, with a minimum of disruption to traffic.”

The lifespan of infrastructure components is usually determined on the basis of theoretical models. Kusters: “Because reality is always different, TNO is developing new measurement methods together with Rijkswaterstaat. In practice, the use of infrastructure can be more intensive or less intensive than theoretical models estimate, and so can the damage. By deploying big data, we can make accurate predictions for bridge maintenance and thereby save costs.”
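
As a rough illustration of the idea (not TNO’s actual method), the sketch below fits a linear degradation trend to synthetic crack-width sensor readings and extrapolates when an assumed maintenance threshold will be crossed.

    import numpy as np

    # Synthetic monthly crack-width readings (mm); data and threshold are
    # illustrative assumptions, not real bridge measurements.
    months = np.arange(24)
    crack_mm = 0.8 + 0.015 * months + np.random.normal(0, 0.01, size=24)

    THRESHOLD_MM = 1.5  # assumed width at which maintenance is required

    # Fit a linear degradation trend and extrapolate the crossing point.
    slope, intercept = np.polyfit(months, crack_mm, 1)
    months_to_threshold = (THRESHOLD_MM - intercept) / slope
    print("Estimated maintenance due around month %.0f" % months_to_threshold)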

The cooperative car

In these projects, the involvement of different parties is very important, Kusters believes. “Mobility has long ceased to be the sole preserve of government. Government is taking on a different role in making infrastructure and mobility more sustainable. Technology companies are also becoming increasingly important. These are companies like TomTom and Google, but also a party like chip supplier NXP, which can contribute to developing technology that lets vehicles communicate with each other.”

The TNO director speaks of the ‘cooperative car’. “That means that all the services and modalities you want to use as a motorist are linked together. The system then, as it were, starts thinking along with you.”

The cooperative car uses IoT technology to communicate directly with other vehicles or with the infrastructure. This allows the car to continuously take into account the current traffic situation and the vehicles driving in the same area. Kusters: “That is a major breakthrough: an efficient, partly self-driving car that always pays attention and is always awake. This lets us increase road capacity considerably and solve a large part of the congestion problem.”

Vision of the future

The Managing Director Urbanisation sees IoT applications for mobility growing at a rapid pace. “The autonomous self-driving car in the city may be less far away than we think,” says Kusters. “We already have cars that can park themselves. In the future, this means the parking problems in major cities will come to an end.”

Besides the IoT application for cooperative cars, Kusters also sees opportunities for improvements to the infrastructure. “The connectedness of people and devices will also become visible in the streetscape, such as wifi in the street, wifi for cars, and smart LED lighting. That does not mean, by the way, that all that information will travel over one and the same network. Time-critical information that affects road safety, for example, will use a separate network. We will see this in practice in cities and on motorways within a few years.”

Looking ahead, TNO’s director for the living environment also sees more changes in the appearance of the inner city. “In the city we will increasingly drive electric. We already see that in recent public transport tenders.” Cyclist numbers will also continue to grow, according to Kusters. “In a city like Amsterdam, more space will then be needed for bicycles,” says Kusters. “Cycling is the only form of mobility that is growing in Amsterdam, so more space for cyclists is important. That will come at the expense of car parking spaces, but it need not automatically come at the expense of accessibility.”

    Source: DuurzaamBedrijfsleven

  • What Can Retailers Do To Elude Extinction?

Here's what you didn't learn in school about the disruption affecting retail today. A recent article by consultant Chris H. Petersen, "Seven disruptive trends that will kill the 'dinosaurs of retail'", discussed the fate of "25 retail dinosaurs that vanished in the last 25 years", which was the subject of an Entrepreneur article. Those retailers included giants such as Circuit City, CompUSA, Blockbuster, Borders, and Tower Records, companies which literally dominated their category or channel. Others named in the article were retail innovators in their own right until new disruptors outgunned them. The point is that neither longevity, size, nor specialization guarantees retail survival today. So how can today's retailers avoid being extinguished by current disruptive innovations?

    Disruptive innovation refers to any enhanced or completely new technology that replaces and disrupts an existing technology, rendering it obsolete. (Picture how we went from the Model T to the KIA; from giant mainframes to personal computers; or from fixed-line telephones to cellphones/smartphones).

    Disruptive innovation is described by Harvard Business professor Clayton Christensen as a process by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves up market, eventually displacing established competitors.

    Today's major disruptive retail trends have led to the rise of the consumer, the rise of technology to help retailers best serve the consumer while wrestling with competitive forces, and the demise of "the old way of doing business."

    I. The Consumer.

    Evolving, innovative, disruptive technology has led to consumer-dominated behavior that reaches across many channels. As we know, today's consumer now shops any time and everywhere using a variety of helping tools.

    The consumer is capable of having a personal, seamless experience across their entire shopping journey to explore, evaluate and purchase, tempered by how retailers DO business, provide service, deal with their competition, etc.

    * The consumer journey starts online, although stores remain a destination for experience.

What can retailers do? The successful retailer of the future needs to master not only online and offline, but also how to connect with the consumer across many touch points, especially social media.

* Mobile juggernaut. The latest stats show that there are now more cell phones in use than people on the planet, with smartphones exceeding 4.5 billion. Mobile is already the majority and will be the preferred screen for shopping.

    What can retailers do? Retail survivors must optimize for mobile engagement, and also broadcast offers and connect with consumers wherever they are. The store of the future will not only have beacons to connect, but to track traffic via mobile as well.

    * Stock availability / Virtual aisle / Endless shelf. More than 50 percent of consumers expect to shop online and see if stock is available in store.

Omnichannel consumers now fully realize that stores can't begin to stock every model, style and color. While consumers can see hundreds if not thousands of products in store, they know that there are millions online.

    What can retailers do? The survivors are literally creating a seamless experience between online, store and mobile apps so the consumer can "have it their way" anywhere, anytime.

* Consumer experience still rules. Consumer experience still comes down to the senses: tactile, visual, and psychological.

    What can retailers do? Virtual dressing rooms, better in-store experiences, and adoption of new disruptive technology to address and satisfy these issues.

    * Personalization of products and services.

    What can retailers do? New survivors are emerging with "mass personalization" opportunities to custom tailor your clothes or curate your personal wardrobe assortment and send it to you.

    * Social Connections and the influence of the opinions of others. Social has become a primary source of research and validation on what to buy. Today's consumers are 14 times more likely to believe the advice of a friend than an ad.

    What can retailers do? Today's major brands are giving much more attention to and spending more dollars on social media than traditional media.

    II. Technology

    Disruptors share the common purpose to create businesses, products and services that are better -- usually less expensive and always more creative, useful, impactful -- and scalable.

    What can retailers do? Put into use as soon as possible disruptive technology solutions such as price and assortment intelligence, behavioral economics, customer experience analytics, predictive analytics, and more to help understand, meet, and outgun the competition and service the customer.

    A Note on Predictive Analytics.

Dr. Christensen describes predictive analytics as "the ability to look at data from the past in order to succeed in new ways in the future." Predictive analytics solutions, which provide the capability to forecast consumer purchase trends in order to sell the most products at the best prices at any given time, are coming on strong.
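
As a minimal, hypothetical example of the kind of forecasting involved (the sales figures below are made up), a retailer might project next week's demand from recent history:

    # Naive moving-average forecast of next week's unit sales.
    weekly_sales = [120, 135, 128, 150, 162, 158, 171, 180]  # made-up history

    WINDOW = 4  # average over the last four weeks
    forecast = sum(weekly_sales[-WINDOW:]) / WINDOW
    print("Naive forecast for next week: %.0f units" % forecast)

Real predictive analytics platforms use far richer models, but the principle is the same: past data drives the estimate of future demand.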

    Bottom Line For Your Bottom Line

There's never been a time of more disruptive change in retail. Retailers who are the most adaptable to change -- and not the strongest nor the most intelligent of the species -- will be the ones to survive.

    It's a case of keeping yourself on top of the tsunami of change through the mastery of today's and tomorrow's new disruptive technologies.

    *Thanks to Chris H. Petersen, PhD, CEO of Integrated Marketing Solutions, a strategic consultant who specializes in retail, leadership, marketing, and measurement.

    Source: upstreamcommerce.com, February 8, 2015

  • Why You Should Be Securing Big Data In The Cloud?

Combining big data and the cloud is the perfect solution for a company's computing needs. A company's data often requires a computing environment which can quickly and effectively grow and flex, automatically accommodating large amounts of data. The cloud computing environment does just that. Yet one question continually arises when discussing cloud computing.

    How secure is the cloud?

Securing data, especially big data, is a major concern. Companies expect that any data stored in the cloud will be secured and that the security measures will be flexible enough to keep up with a changing threat environment. There are four ways to keep your big data secure in the cloud. Each will keep your data safe and yet provide the flexibility that is inherent to using the cloud.

    1. Encrypt Your Sensitive Data

Encrypting your data provides another level of security within your cloud infrastructure. Each security solution must be customized to the project and the data. There is no single type of encryption that will work for every situation. Certain types of on-premises gateway encryption solutions do not work well with cloud big data scenarios. Other solutions, including encryption provided by the cloud provider, ask the end user to rely on someone else to encrypt their data. This is often a risky proposition and most companies will not agree to it.

    Encryption solutions such as split-key encryption were developed specifically for data storage in the cloud. These technologies keep the cloud data safe, providing encryption keys which the customer holds and uses.

    Split-key encryption is the safest and most effective means of encrypting cloud-based data.
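
To illustrate the core idea (a conceptual sketch only, not any vendor's implementation), split-key schemes can be thought of as secret sharing: the key is divided into shares such that each share alone reveals nothing, and only their combination reconstructs the key.

    import secrets

    def split_key(key):
        """Split a key into two shares; each alone reveals nothing,
        and XOR-ing the shares together reconstructs the key."""
        share_a = secrets.token_bytes(len(key))
        share_b = bytes(k ^ a for k, a in zip(key, share_a))
        return share_a, share_b

    def join_key(share_a, share_b):
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    master_key = secrets.token_bytes(32)  # e.g. an AES-256 data key
    provider_share, customer_share = split_key(master_key)

    # The provider stores one share; the customer holds and uses the other.
    assert join_key(provider_share, customer_share) == master_key

Production split-key products layer key management, rotation, and hardened protocols on top of this basic principle.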

    2. Use cloud security technologies which can be scaled to meet changing requirements.

    When it comes to big data, solutions must quickly scale to meet the demand. This is the same for security technologies for cloud data. Ensure any cloud security technology you choose is available and relevant across any and all cloud locations. Additionally, to be effective, any cloud security solution must be able to quickly scale to meet demands and changing requirements.

    Because of the inability to quickly scale and grow, hardware solutions are not a viable option for securing cloud big data. It is not possible to adapt a hardware security module (HSM) quickly enough to meet continuously changing data security requirements.

Only a cloud-based solution will provide the ease and efficiency to scale quickly in response to demand. These solutions are just as effective as, if not more effective than, hardware-based technologies. Additionally, cloud-based solutions such as CDNs provide security while also allowing sites and tools to work faster.

    3. Automate where possible

    Many companies are not happy with traditional cloud security solutions because they will not scale quickly to meet demand. Standard encryption technologies typically use an HSM element in their design. Since hardware cannot be automated, these security solutions are limited in their effectiveness within the cloud.

    The best cloud security solutions use virtual appliances instead of hardware within their systems. It is also important to ensure an effective RESTful API is part of any cloud security solution.

    A cloud security solution which includes a RESTful API and a virtual appliance will provide the automation and flexibility required to secure cloud big data.
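
As a sketch of what such automation might look like (the endpoint, token, and payload shape are assumptions for illustration, not a real product's API), an orchestration script could provision keys through a REST call as new storage nodes spin up:

    import requests

    # Hypothetical key-management endpoint exposed by a virtual appliance.
    API_BASE = "https://keymgmt.example.com/v1"
    TOKEN = "REPLACE_WITH_API_TOKEN"

    def provision_key(project):
        """Ask the (hypothetical) appliance to create a new encryption key."""
        resp = requests.post(
            API_BASE + "/keys",
            headers={"Authorization": "Bearer " + TOKEN},
            json={"project": project, "algorithm": "AES-256"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["key_id"]

    print("Provisioned key:", provision_key("analytics-cluster"))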

    4. Never compromise on data security

Cloud big data security is often a complicated business. As a result, we often see systems which are not quite as comprehensive as they should be. Some cloud security system designers will take shortcuts to get around the complexities involved in securing big data.

For example, some systems use freeware encryption tools to secure the data, and may keep the encryption keys in a physical location or on a disc, which creates an opportunity for the keys to be lost or stolen. Such shortcuts can certainly be easier, but they do not provide a viable security solution for cloud data storage.

    Companies must protect their data by mapping and reviewing the sensitivity of their data and then designing a cloud security solution to ensure it is protected. The right security solution can provide great results. Remember, not all cloud data storage is secure. If you have very sensitive or regulated data, you may need to search for another security solution to keep your data protected.

    Source: SmartDataCollective
