32 items tagged "Technology"

  • 5 Best practices to attract (and retain) talent for your organization

    By applying these best practices, you can bring on talent that keeps pace with innovation, shifting customer needs, and new technologies.

    It’s no secret that there is currently a massive technology talent shortage. As this Wall Street Journal article notes, tech leaders and recruiters alike increasingly feel the pressure to stay competitive, some even going so far as to offer perks like six-figure bonuses and the ability to work from anywhere they want. Hiring tech talent is a massive pain point across many organizations, and it’s at the top of most IT leaders’ to-do lists.

    Aside from outsized perks, what can organizations do to address the talent shortage? The key lies in looking for talent in new places and uncovering ways to connect with and inspire candidates before, during and after the interview process. Here are five ways to identify, hire, and retain the right team.

    1. Partner with schools

    If you’re not already doing so, build relationships with schools and make it a priority to partner with them to fuel student interest in your company. Current students are the future of your business, so working with universities early and often can both grow and keep your talent pool specialized.

    Many schools have begun implementing programs to directly address the shortage of technology talent, including degree programs in industries like cybersecurity and cloud. MIT, for example, among many others, now offers programs that match specific business needs like 'Ethics of AI' and 'Modeling and Optimization for Machine Learning'. Some cloud providers are teaming up with schools to offer programs and specialized degrees, and we have also seen great success in partnering with universities to sponsor research in engineering departments.

    2. Look to untapped pools of talent

    Beyond looking to recent graduates, consider untapped pools of talent to diversify your workforce. While often overlooked because of 'lack of relevant technical experience', veterans offer skills that could greatly impact your existing teams, including strong leadership, productivity and decision-making capabilities. We can look to companies like Salesforce for inspiration: Its veteran program Vetforce connects the military community with open IT positions.

Another pool of talent often left behind is made up of those who have taken time off and want to restart their careers, including parents of new children or people who had to care for a loved one in a time of need. Returnship programs are one way to reach them. These programs help professionals with five or more years of work experience, who have been out of the paid workforce for a minimum of two years, bridge their transition back into the workforce. We have found excellent, talented employees through this channel.

    3. Ask the right questions

    Once you have a candidate in mind, ask the right interview questions to determine their potential fit on your team. My favorite interview question is 'What isn’t on your resume that you’d like to share?' A resume tells 'what' you did. But it doesn’t tell 'how' you did it. These stories often provide the most critical insight into a candidate. I want to hear how a candidate has overcome adversity and what they learned from their challenges. I prize candidates’ perseverance and determination rather than a list of accomplishments or schools they went to. Tell me what you did with what you had. With the technology industry changing at a rapid pace, we need candidates who are comfortable being uncomfortable in the name of positive change.

    4. Think beyond money with the job offer

If you think you can entice today’s talent pool with compensation alone, think again. Career growth opportunities now rank as the most important factor when looking for a new job. Offering plenty of opportunities for employee training and growth will not only entice potential candidates, but also keep current employees on board. We offer an array of training and certification programs so our employees can build marketable skills in enterprise cloud technology. These programs should be all about choice, enabling employees to design the mix of in-person, online, or video training that meets them wherever they work today. Large, high-growth companies can also offer candidates the ability to move easily between different teams at the company, learn from new groups, and cross-pollinate ideas.

    5. Simplify redundant tasks

    Identifying areas where a company can simplify to boost productivity can be an equally important step to the above. For example, automating existing, repetitive IT tasks can help free up time to focus on more innovative, creative projects. At our company, we’re using the power of machine learning (ML) and natural language processing to augment our IT helpdesk and customer support services. Using ML technologies, more than 30% of all service requests are automatically resolved, freeing up both time and budget for value-creating activities.
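As a highly simplified sketch of this kind of automation (the request categories and replies below are invented for illustration, and a production system would use a trained ML/NLP classifier rather than keyword rules), an auto-triage step might look like:

```python
# Invented mapping from request keywords to automated resolutions.
RESOLUTIONS = {
    "password reset": "Sent automated password-reset link.",
    "vpn access": "Provisioned VPN profile automatically.",
    "disk space": "Triggered automated disk-cleanup job.",
}

def triage(request: str) -> str:
    """Auto-resolve a service request if all keywords match, else escalate."""
    text = request.lower()
    for keywords, action in RESOLUTIONS.items():
        if all(word in text for word in keywords.split()):
            return action
    return "Escalated to helpdesk agent."

print(triage("I forgot my password, please reset it"))
# Sent automated password-reset link.
```

Every request resolved this way is one fewer ticket for a human agent, which is where the time and budget savings come from.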

    When it comes to hiring and retaining the best talent, it can feel like you’re in a losing race against a continually changing technology environment. But by keeping these best practices in mind, you can bring on talent that keeps pace with innovation, shifting customer needs, and new technologies.

    Author: David Sangster

    Source: Informationweek

• BC (Business & Competitive) Intelligence

Business Intelligence is one of those terms that hardly lends itself to literal translation into Dutch. 'Bedrijfsintelligentie' might come close but, like other forms of intelligence, is difficult to pin down precisely. Business insight or business understanding come closer still. Other authors add business or environmental scanning as alternative translations.

To understand Business and Competitive Intelligence properly, we use an analytical framework here (Table 1.1). It makes it possible to distinguish the various manifestations of BI and thus to apply the right variant to the right problem. Keep in mind that these are stereotypes! In practice, hybrid forms occur.

The starting point is that BI is viewed as an information process in which data is used to produce knowledge or insight.

     

                       | Data on internal operations | Data on the business environment
Business approach      | A                           | B
Technological approach | C                           | D

Table 1.1

The table distinguishes a business approach from a technological one. BC Intelligence treats BI from the perspective of the business processes that need to be supported. There is also a technological perspective on BI, whose starting point is rather to exploit the possibilities that information technology offers for gaining business insight. The other axis of the framework distinguishes data about internal operations (internal data) from data about the business environment (external data). We deliberately speak of distinguishing rather than separating categories: in practice, the categories can be distinguished but hardly separated. They cannot do without each other and are often supporting or complementary.

    Business Intelligence

     

                       | Data on internal operations | Data on the business environment
Business approach      | A                           | B
Technological approach | C                           | D

Although the distinction is arbitrary, and the term BI could just as well be reserved for the entire quadrant (with CI as a subset), many BI projects concern cells A and C.

BI is thus often about optimizing business processes, with the emphasis on acquiring business insight from data about the company itself. This data typically generates knowledge about the company’s current situation: knowledge that is indispensable for strategy formation and the optimization of business results (think of Performance Management).

The technological component of BI is represented by cell C. Unfortunately, this angle dominates among many service providers. The emphasis there lies on building a technological infrastructure that makes adequate knowledge about the company and its performance possible. Cell C therefore covers ETL tools, data warehouses, and analytical applications.

Editorial, BI-kring:

In cell A we have hardly defined a category yet. In my view, Performance Management belongs there, so I would add that term. Possible key words: Key Performance Indicators, Performance Process Management, Organizational Performance, PDCA (Plan Do Check Act) Cycle, Performance Planning.

As for cell C, we can refer to the text above: data warehousing and OLAP are the central elements there. Key words: databases, ETL (Extraction, Transformation and Load), architecture, data dictionary, metadata, data marts.

With regard to OLAP, the key words are: analytical applications, reporting, queries, multidimensional schemas, spreadsheets, cubes, data mining.
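As a minimal illustration of the OLAP roll-up behind these key words, the sketch below aggregates a tiny fact table along one dimension; the table and field names are invented for this example.

```python
from collections import defaultdict

# Hypothetical fact table: each row is one cell of a sales cube.
facts = [
    {"region": "North", "year": 2015, "revenue": 120},
    {"region": "North", "year": 2016, "revenue": 150},
    {"region": "South", "year": 2015, "revenue": 90},
]

def rollup(rows, dimension, measure):
    """Aggregate a measure along one dimension (an OLAP 'roll-up')."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dimension]] += row[measure]
    return dict(totals)

print(rollup(facts, "region", "revenue"))  # {'North': 270.0, 'South': 90.0}
```

A reporting or dashboarding layer exposes this same operation over many dimensions at once, which is exactly what multidimensional schemas and cubes are optimized for.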

    Competitive Intelligence

     

                       | Data on internal operations | Data on the business environment
Business approach      | A                           | B
Technological approach | C                           | D

CI is the process in which data about the company's environment is transformed, through an information process, into 'strategic business insight'. Although the terms Competitor Intelligence and Competitive Intelligence have been in use since the 1980s, the approach already drew attention in the 1970s under the label 'environmental scanning'.

CI plays an important role in strategic as well as other business processes. A company's performance, competitive position, possible future positions, and capacity for innovation can only be determined with knowledge about the business environment.

Editorial, BI-kring:

Competitive Intelligence thus concerns all information provisioning that is organized to determine, assess, and change the competitive position of companies. It touches directly on strategy, strategic intelligence, competitor analysis, competitive position, and all intelligence needed to properly judge the company's position in its environment.

Organizing CI is still severely neglected in organizations. It proves difficult to structure the necessary information processes, and just as difficult to execute them. Setting up a 'system' that should realize this process is the focus of attention, but for many organizations still a bridge too far. A sound development approach, however, considerably increases the chances of success.

Data about the business environment is often unstructured in nature and already present within the organization. The art is to make this data available for decision-making. When the data is not available within the company, the techniques and instruments required to unlock it differ from those used in BI. They range from document management systems to information agents that autonomously scour the internet for interesting building blocks (data!). Text mining is used to structure and analyze the unstructured documents (for the web: web content mining).
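In its simplest form, the text-mining step mentioned above boils down to extracting term frequencies from unstructured documents. The sketch below is illustrative only (the stop-word list and documents are invented); real tools add crawling agents, entity extraction, and web content mining.

```python
from collections import Counter
import re

STOP_WORDS = {"the", "a", "of", "and", "to", "in", "is", "for", "from"}

def term_frequencies(documents):
    """Count non-stop-word terms across a set of unstructured documents."""
    counts = Counter()
    for doc in documents:
        for token in re.findall(r"[a-z]+", doc.lower()):
            if token not in STOP_WORDS:
                counts[token] += 1
    return counts

docs = [
    "Competitor A launches a cloud product",
    "Cloud pricing pressure from competitor B",
]
counts = term_frequencies(docs)
print(counts["cloud"], counts["competitor"])  # 2 2
```

Even a crude frequency count like this can surface which competitors and topics dominate a stream of gathered documents, which is the starting point for the analysis step in CI.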

Editorial, BI-kring:

To adequately support Competitive Intelligence, and in particular to make primary data available to the process, collaboration tools are popular. These are knowledge-management-like systems and shareware applications that facilitate the sharing of data, information, and knowledge. Key words: knowledge management, shareware, SharePoint.

Overview of BI-kring data categories

Cell A format

• Performance Management

Key words: Key Performance Indicators, Performance Process Management, Organizational Performance, PDCA (Plan Do Check Act) Cycle, Performance Planning

Cell C format

• Datawarehousing

Key words: databases, ETL (Extraction, Transformation and Load), architecture, data dictionary, metadata, data marts, Big Data

• Online Analytical Processing

Key words: analytical applications, reporting, queries, multidimensional schemas, spreadsheets, cubes, data mining, dashboarding

Cell B format

• Competitive Intelligence

Key words: strategy, strategic intelligence, competitor analysis, competitive position, competitor intelligence, technological intelligence, environmental scanning, environmental intelligence

• Content (Competitive Intelligence as a product)

Key words:

Cell D format

• Collaboration

Key words: knowledge management, shareware, SharePoint

• Search methodologies

Key words: document management systems, spider technology, unstructured information, information agents, text mining, content mining, search technologies

Applies to the entire framework (intelligent organizations implement the whole model integrally)

• Intelligent organization

Key words: management information, intelligent organization, learning organization, organizational learning, intelligence change management

     

Source: Egbert Philips

     

  • Being successful as a data-driven company through good IT management

    Data-driven businesses are far more successful than companies that don’t utilize data to their advantage. Unfortunately, they often find that managing their data effectively can be a challenge.

Companies that rely on big data need a reliable IT department. You have to make sure that your IT infrastructure is adequately equipped to handle the volume of data your company will be processing and that it is properly secured.

    Companies that Rely on Big Data Must Have a Functional IT Department

Due to modern advances in big data technology, the IT sector is becoming more competitive with each passing day. Companies leverage modern technologies to streamline business operations and gain an edge in a crowded marketplace. According to EasyVista, more than 70% of companies worldwide will have invested in digital technologies by 2022, and that figure will only grow as big data becomes even more important.

These numbers indicate that embracing IT organizational best practices is the key to staying competitive, streamlining business operations, and outperforming competitors.

While businesses are constantly looking for ways to grow, they sometimes invest in technology that generates no results. In fact, according to Gartner, business owners are expected to spend $750 million on ineffective features of IT tools, up from $600 million in 2019. It is therefore wise to implement a few essential ITSM best practices to combat overspending.

    You have to make sure that you invest in the right technology to make the most of big data. Keep reading to learn how to do this.

    What Is the Role Of IT Management In a Data-Centric Organization?

    IT management plays a crucial role in every organization. Companies that rely on big data have to use it even more. IT managers are responsible for planning, coordinating, communicating, and leading computer-based activities in an organization. Moreover, they are also responsible for researching new technologies and understanding how they can help the business grow.

    In short, they help determine the department’s needs and implement the best practices to fulfill the organization’s system requirements. Besides researching new technologies and streamlining the tech operations, they also seek to mitigate IT risks, such as threats to cybersecurity and misalignment between IT professionals and business requirements.

The IT department in an organization focuses on meeting several core objectives.

IT management is also responsible for keeping IT business operations organized. They implement best practices and regularly measure performance to understand which areas need improvement. This will help you make the most of your data resources.

    4 Best Practices For IT Management in Companies that Rely on Big Data

    Companies that depend on big data will need to follow these practices to create a functional IT department.

    1. Build A Clear Strategy

Initiate your journey by defining your business goals and vision. Develop a roadmap for your IT service management goals, as well as the types of data you intend to store. Then create a systematic approach to measure your efforts, define KPIs, and track the state of every stage of the implementation process.

    A roadmap should cover all three essential domains, including:

    • Front-end IT: It includes service design, service operations, and service transition. The perfect example of front-end IT is Service Desk applications.
    • Middle IT: It includes business frameworks and automation efforts.
    • Back-end IT: It includes leveraging modern technologies such as AI, IoT, robotics, etc., to help executives make informed business decisions.

    You have to get this part right as a company that depends on big data.

    2. Invest In The Right Technology

A business that invests in the right technology invests in its overall growth. The right tools also make employees more efficient at work, streamlining business operations. However, choosing the right tool can be challenging, especially for startups.

The reason is that startups often haven’t pinned down their business needs and are working with tight budgets. That doesn’t mean you should buy cheap tools and regret it later; instead, monitor your business needs and invest accordingly.

Digital transformation has taken the corporate world by storm. Most businesses have embraced digital transformation technologies to automate tasks and improve the employee experience. Artificial intelligence, for instance, performs human-like tasks such as problem-solving, speech recognition, and text recognition.

Moreover, it can accomplish specific tasks by analyzing vast amounts of data and recognizing recurrent patterns in that data. According to Transparency Market Research, the global market for AI is estimated to grow at a 36.1% CAGR between 2016 and 2024, reaching a soaring $3,061.35 billion by 2024. Investing in such technologies can therefore benefit your business a lot in the long run.

    3. Seek Help from Top Management

Your decisions can make or break your business. To ensure business sustainability and revamp business operations, involve senior management in the process. These experienced executives know what is, and isn’t, good for the business.

    Without the involvement of senior management in the business decision process, revamping the IT department for business success may become an enormous challenge.

    4. Remote Work Expansion

It would be almost impossible to discuss the recent changes in the corporate world without mentioning the importance of remote working. The pandemic introduced the remote-work, or work-from-home, model to companies worldwide.

    Today, companies must adapt to modern corporate world changes to boost employees’ productivity and overall business growth. However, if you believe that returning employees to the office will enhance productivity, you are mistaken. According to Flexjobs, more than 94% of companies report that their productivity has been the same (67%) or higher (27%) since employees started working remotely.

    Another benefit of embracing the remote work model is that you are no longer limited to hiring local IT talent. You can boost your business efficiency by hiring talented folks across boundaries.

    IT Management is Crucial for Companies that Depend on Big Data

    Adapting to modern changes is challenging, even for companies that utilize big data effectively. You can’t create a successful data-driven company without a dependable IT department. You need to be clear about your goals and accordingly plan, communicate, and implement the right strategies that contribute to your business growth.

    Author: Angela Steinman

    Source: Smart Data Collective

  • Blockchain-based banking backend Vault OS from ex-Googler emerges from stealth mode

Despite holding the vast majority of the world’s wealth (or perhaps because of that), banks aren’t exactly hotbeds of cutting-edge tech, often relying on decades-old systems for everyday tasks. ThoughtMachine, a company led by ex-Google engineer Paul Taylor, is looking to change that with a modern, fully integrated, blockchain-based banking operating system called Vault OS.

    The bombastic press release announcing the system’s emergence from two years of stealth development makes a lot of promises: the company “has solved the greatest challenge in fintech;” Vault OS is “100% future-proof,” “hugely flexible,” and “fixes broken banking forever.”

    Whether Vault OS is able to live up to its own hype is a question that will have to wait (legacy banking systems aren’t replaced overnight) — but it’s hard to deny that the problem is real and the solution, or at least what the company reveals of it, is compelling.

     

    ThoughtMachine’s Paul Taylor

    The main job of Vault OS is to perform the core function of a bank: essentially, maintaining a huge ledger. That’s something that a blockchain is uniquely suited to doing, of course, a fact that clearly did not escape Taylor, whose previous work led to the speech recognition software used by Google today.

    Each instance of the OS will run its own private blockchain and cryptographic ledger, hosted as a service by ThoughtMachine. Of course, whether banks will be willing to essentially permanently outsource their most fundamental operations is yet another big question.

    The benefits may be worth it: blockchains are secure, scalable, and versatile, and could conceivably replace legacy systems that limit or delay ordinary operations. Transactions would occur in real time, and are safely and centrally stored, allowing for deep data dives by both bankers and consumers. There’s even an API.
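The core ledger mechanism a system like Vault OS builds on can be sketched generically as a hash-chained log, where each entry commits to its predecessor's hash so past transactions cannot be silently altered. This is an illustration of the general blockchain idea only, not ThoughtMachine's actual design.

```python
import hashlib
import json

def append_entry(chain, transaction):
    """Append a transaction, linking it to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"tx": transaction, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "tx": transaction,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })

def verify(chain):
    """Recompute every hash; any tampering with history breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        payload = json.dumps({"tx": entry["tx"], "prev": prev}, sort_keys=True)
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

ledger = []
append_entry(ledger, {"from": "A", "to": "B", "amount": 100})
append_entry(ledger, {"from": "B", "to": "C", "amount": 40})
print(verify(ledger))             # True
ledger[0]["tx"]["amount"] = 999   # tamper with history
print(verify(ledger))             # False
```

A real core-banking ledger would add cryptographic signatures, consensus, and regulatory audit trails on top of this basic structure, but the tamper-evidence property is what makes the blockchain approach attractive for a bank's central ledger.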

    Naturally there are a ton of questions that must be answered, and assurances made, and regulations complied with, before any bank will touch this with a ten-foot pole. I’ve contacted ThoughtMachine with several — will they release code or a whitepaper for inspection? How is data migration handled? What’s the timescale for rollout? — and will update this post if they get back to me.

    Source: Techcrunch

  • Business Intelligence and beyond: predictions for 2016

It’s been an interesting year for BI – and 2016 looks set to be no different

    Here are some predictions on what we believe next year has in store, in particular for the data and analytics industry.

    1. Cannibalisation of the channel
Next year will see many vendors looking to take back control rather than invest in their channel partners. The danger for the channel is that vendors will keep good deals for themselves or redirect services projects back in-house. Platforms such as Amazon Web Services and Microsoft Azure have grown exponentially this year, and another risk is the continued trend of vendors developing hosted solutions via such platforms, cutting out their channel partners. In response, the channel needs to look for vendors with a transparent indirect strategy in place and form mutually beneficial relationships.

• Business Intelligence still hot…

Business Intelligence outdated? Nothing could be further from the truth, as proved by the Heliview congress ‘Decision making by smart technologies’, organized last Tuesday in the Brabanthallen in Den Bosch.

200 client organizations listened to presentations by, among others, Rick van der Lans, Peter Jager, Frank de Nijs, and Arent van ‘t Spijker. Alongside the familiar message, there was plenty of news to hear in Den Bosch.

New technologies make much more possible. Social media and modern big data technology enable organizations to extract far more value from data. How organizations should do so is often still a challenge: applying the technology is not a goal in itself. The point is to produce added value for organizations, whether by optimizing processes or by serving customers better through product development. In the extreme, data can even be the engine behind new business concepts or models. A precondition, however, is a clear business vision (whether or not itself produced through intelligent use of data and information), which is important to avoid burning millions on new technology without direction.

For those present, the message was sometimes familiar, but sometimes also a confrontation with themselves. One thing is certain: the role of data and information in doing business intelligently is far from played out. Business Intelligence is alive.

30 JANUARY 2015

• Is the era of Excel really over now?

29 May 2015

The amount of data a marketer or sales manager has to work with has grown enormously in recent years, and that growth is exponential. With the rise of e-commerce, the introduction of customer loyalty programs, and the advent of beacons, the amount of data we receive from consumers only keeps increasing.

In practice, however, many marketers still manage their campaigns, business cases, and planning in Excel. If you want to build a personalized campaign for each of your customers, that is no longer tenable: Excel cannot be connected to online tools, version control is a major problem, and the content is by definition outdated the moment the file is opened. We predict that in 2015 marketers and sales managers will switch to online tools to manage their process better and to use the resulting data directly to become more personal and relevant.

Want to know more about these tools and the areas of expertise needed to implement them properly? Follow BI-kring over the coming months.

BI-kring editorial team

  • Diminishing stress related to technology for the people most important to your organization

    The excitement to adopt the smartest interface is pushing businesses to lose their focus on two important things: employees and customers.

    Your ability to become a successful company of the future depends on developing a cultural mindset that is focused on creating value for the people inside and outside the organization. The myth that technology drives digital transformation has been an ongoing fairy tale because while technology is an important factor, there is another element to the equation that creates a strong dependency on the first — the people. 

Technology in most ways has a positive effect on business operations, especially in the automation of the admin processes that come with customer communication. While artificial intelligence gets to know your customers by analyzing their wants and needs, it also collects huge amounts of customer insight. Other digital tools, such as chatbots, are being used to interact with customers instead of a customer service agent, and in some exceptional cases this works just fine. The belief that these digital tools can work miracles for customer engagement is an illusion, however, even though they offer an effective and efficient way to speak to large audiences. The reality is that these technologies are designed to replicate a sort of friendliness but are simply unable to offer the level of human connection customers need.

Covid-19 has been a wake-up call for businesses and has aggressively fast-forwarded digital adoption in working practices that most companies were not ready to take on. They were forced to send home employees who then had to deal with the implementation and use of new digital tools, something that should have happened progressively years ago. According to research carried out by McKinsey, it would have taken businesses more than a year to implement the level of remote working that was enforced as a result of the crisis. Despite the advantages of these digital technologies, the sudden change led to huge gaps in acceptance among the workforce, because employees have different needs, challenges, and levels of technical proficiency.

    Employees and customers are often slow to adapt to new ways of doing things so now it’s time to ask yourself: What can I do to reduce the tech stress of my customers and employees and make their lives easier? I suggest two main things to consider: 

    1. Focus on the people

Digitalization has many positive aspects, but the more digital your business becomes, the less human touch it can provide. Customers today expect more human interaction and fewer automated interactions. New tech should make the lives of your employees easier and complement their tasks, so that they can focus on the emotional side of customer relationships.

    The rush for easily monetizable consumer automated interactions makes it clear for customers that a brand is not authentically engaging with them. A Harvard Business Review study shows that companies are becoming increasingly impersonal by automating as many customer touchpoints as possible. In a highly digitalised world the human factor in customer experience gives your business a distinct competitive edge. The latest technology gets prioritised too often over authentic customer engagement.

    First and foremost you should create an authentic and trusted customer relationship and then with the consent of your customers, use the technology available (predictive analytics and machine learning) to personalise the interactions with them. Not the other way around. 

    2. Reassess your digital initiatives

    As you’ve been experimenting with a huge number of virtual operations and interactions since before the pandemic started, you now have the opportunity to assess which technologies are extremely needed and which are not. The world of business is changing, some things will go back to previous ways, while others will remain changed forever.

    You might think that using many different business tools to automate and improve processes will skyrocket productivity. However, switching between too many apps has some side effects such as lower productivity, higher costs and lack of collaboration in teams, to name a few. Developing a digital state of mind requires you to engage, educate and provide continuous support to your employees. Low employee stress levels and making sure their experience remains positive throughout are as important as deciding which new technology to adopt. Digital transformation should have your people at the core because your people will be those who will make a successful transformation happen. 

    As I’m writing my book on customer-centricity, I find it imperative that companies find the right balance between the use of technology and human interactions. The challenge of the future is not whether artificial intelligence will replace people’s jobs but rather how to create a business culture in which technology and employees are able to walk hand-in-hand to provide human-driven customer experiences.

    Author: Ilenia Vidili

    Source: Entrepreneur

  • Drawing value from data in manufacturing companies

    Drawing value from data in manufacturing companies

    The modern manufacturing world is a delicate dance, filled with interconnected pieces that all need to work perfectly in order to produce the goods that keep the world running. In this article, we explore the unique data and analytics challenges manufacturing companies face every day.

    The world of data in modern manufacturing

    With the datasphere growing exponentially to an expected volume of 175 zettabytes by 2025, it stands to reason that manufacturing is experiencing the radical impact of this growth just as much as other business areas. Manufacturing companies that adopted computerization years ago are already taking the next step as they transform into smart, data-driven organizations.

    It’s easy to see why. Manufacturing constantly seeks ways to increase efficiency, reduce costs, and unlock productivity and profitability. Data is a critical tool for identifying where and how that can be done in any manufacturing process. Whatever your department, whether you’re concerned with production, inventory, warehousing, or transportation and other logistics, knowing precisely how your operation is running and where it can be improved is essential to improving your bottom line.

    From a practical perspective, the computerization and automation of manufacturing hugely increase the data that companies acquire. And cloud data warehouses and data lakes give companies the capability to store these vast quantities of data. However, it’s only useful if you can accurately analyze it and get the insights you need to enhance your business.

    Modern factories are full of machines, sensors, and devices that make up the Internet of Things. All of them generate a trail of performance-tracking data. The challenge for manufacturers is to capture all this data in real-time and use it effectively. To achieve this, they need a BI and analytics platform that can transform the data into actionable insights for their business. And a substantial proportion of this data can be gathered and organized by analytics embedded at the network’s edge, within the manufacturing equipment itself. As a result, you can get insights faster, at the very place that they’re generated, and without the need for IT teams to gather, analyze and generate reports, which is time-consuming and uses resources that could be better applied elsewhere.
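    As a minimal, hypothetical sketch of what such edge-side processing can look like (plain Python with invented names, not any vendor's API), the snippet below keeps a rolling window of sensor readings on the device and flags values that drift from the recent mean, so a basic check runs right where the data is generated:

```python
from collections import deque

class EdgeSensorMonitor:
    """Rolling-window check that could run on edge hardware.

    Only the last `window` readings are kept in memory; a reading is
    flagged when it deviates from the rolling mean by more than
    `tolerance` (expressed as a fraction of that mean).
    """

    def __init__(self, window=50, tolerance=0.2):
        self.readings = deque(maxlen=window)
        self.tolerance = tolerance

    def ingest(self, value):
        """Record a reading; return True if it looks anomalous."""
        if self.readings:
            mean = sum(self.readings) / len(self.readings)
            anomalous = abs(value - mean) > self.tolerance * abs(mean)
        else:
            anomalous = False  # nothing to compare against yet
        self.readings.append(value)
        return anomalous

# A sensor stream that drifts sharply on the last reading:
monitor = EdgeSensorMonitor(window=10, tolerance=0.1)
flags = [monitor.ingest(v) for v in [100, 101, 99, 100, 130]]
```

    Only the flag (and perhaps the offending reading) would need to leave the device; the raw stream can stay local until someone asks for it.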

    Here are three key areas where data adds value to the manufacturing process to give companies a competitive edge.

    How data enhances product development

    Every part of a business generates big data. Analyzing data from disparate sources to identify relationships between processes, causes, and effects is part of what helps a business hone its product development strategy, manufacturing processes, the marketing and sales of those products, and the logistics of supply chain and delivery.

    We asked Christine Quan, Sisense BI Engineer in sales, how she thinks data helps product development, and she said: 

    'Surveying market data enables you to have a better understanding of customer needs and can also be a way to gather feedback for initial product ideas'.

    Indeed, data enables a company to understand its customers better. With this information, it can develop new products or improve existing products to meet customers’ needs. At the same time, data can inform a company about potential markets so it can judge how much risk an innovation carries. Consequently, this risk can be mitigated in the product development process, because the more a manufacturer knows before production, the less of a gamble it is. Furthermore, actionable insights derived from data both before and during production can be used to plan and hone the manufacturing process, and enhance many operational aspects.

    Take BraunAbility, for example. The company manufactures and installs adaptations to personal and commercial vehicles that make them wheelchair accessible. Using a BI and analytics platform, BraunAbility has improved its understanding of customer preferences in different markets. Data has given the company the insights to drive production of the most in-demand products, make informed decisions about what it keeps in stock and even what product discounts should be offered to impact the sales rate positively. With this new information, BraunAbility sees better profit margins across the board.

    Data improves and streamlines production quality control

    The analysis of big data sets generated in the manufacturing process can minimize production defects and keep quality standards high, while at the same time increasing efficiency, wasting less time, and saving more money.

    Embedded analytics are particularly valuable in terms of quality control and optimizing manufacturing efficiency. Computerized and automated monitoring systems, far more sensitive and accurate than the human eye, capture discrepancies more accurately and more cheaply, around the clock. This continuous, smart, machine-based scrutiny significantly decreases the number of tests essential to maintain quality parameters. Data can also be used to calculate the probabilities of delays, to identify, develop and implement backup plans.
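    One long-established form of such machine-based scrutiny is statistical process control. The sketch below is a simplified illustration rather than any specific vendor's system: it derives Shewhart-style control limits from in-control baseline measurements and flags anything that falls outside them.

```python
import statistics

def control_limits(baseline):
    """Shewhart-style limits: mean +/- 3 standard deviations of in-control data."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(measurements, limits):
    """Return the measurements falling outside the control limits."""
    lo, hi = limits
    return [m for m in measurements if m < lo or m > hi]

# Hypothetical part diameters (mm): a stable baseline run, then a live batch.
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
limits = control_limits(baseline)
defects = out_of_control([10.0, 10.1, 11.5, 9.9], limits)
```

    Running continuously against a live sensor feed, a check like this replaces a large share of manual spot tests while catching excursions the moment they occur.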

    Embedded analytics are also faster and more autonomous than more traditional data analysis. With embedded analytics, it’s no longer necessary for data analysts to feed the data lake to the stand-alone cloud data warehouse, then mash up the data and verify the results. Analytic technology embedded within machinery can do the job at the point at which the data is generated. So, less intervention from data analysts is necessary, decisions can be influenced directly by data and processes are accelerated, using fewer resources.

    Effectively, Big Data enables manufacturers to improve and streamline their processes across production and quality control. As Christine Quan explains: 

    'Setting up a comprehensive data feedback loop enables you to get real-time information about all aspects of your manufacturing processes, so you can rapidly calibrate them to boost production efficiency'.

    This is particularly pertinent to asset-heavy industries such as pharmaceuticals, electronics, and aerospace parts manufacturing, in which superior asset management is critical for efficient and profitable operation.

    Certain processing environments like pharmaceuticals, chemicals, and mining are prone to considerable swings in variability. Coupled with the number and complexity of elements in the production processes in these industries, such companies can find it challenging to maintain the stability and uniformity of processes. They can benefit most from advanced analytics because it provides a highly granular approach to diagnosing and correcting process flaws.

    McKinsey & Company gives the example of the biopharmaceutical industry, which includes the manufacture of vaccines, hormones, and blood components. These are made using live, genetically engineered cells, and production often involves monitoring hundreds of variables to ensure the purity of the ingredients and the substances being made. Two batches of a particular substance, produced using the same process, can still vary considerably in yield without explanation. This can detrimentally affect capacity and product quality and can attract intensified attention from regulators.

    Advanced data analytics can overcome this issue without incurring huge costs. By segmenting the manufacturing process into clusters of related production activities, gathering data about these, and analyzing the data to show interdependencies, it’s possible to identify stages in the process that influence the variability in yield. Address those and the yield can increase by a value of millions of dollars per product.
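    As a deliberately tiny sketch of that idea (all variables and numbers invented), per-batch records of process variables can be ranked by how strongly each correlates with yield, pointing analysts at the stages most worth investigating:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_yield_drivers(batches, yield_key="yield"):
    """Rank process variables by absolute correlation with batch yield."""
    ys = [b[yield_key] for b in batches]
    variables = [k for k in batches[0] if k != yield_key]
    scores = {v: pearson([b[v] for b in batches], ys) for v in variables}
    return sorted(scores, key=lambda v: abs(scores[v]), reverse=True)

# Invented batch records: temperature tracks yield more closely than pressure.
batches = [
    {"temp": 70, "pressure": 5, "yield": 90},
    {"temp": 72, "pressure": 4, "yield": 93},
    {"temp": 68, "pressure": 5, "yield": 88},
    {"temp": 74, "pressure": 4, "yield": 95},
]
drivers = rank_yield_drivers(batches)
```

    Correlation only nominates suspects; a real yield study would follow up with designed experiments or causal analysis before changing the process.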

    Improving the supply chain and mitigating its risk

    Major manufacturing processes require a lot of raw materials and components that together form a complex supply chain. Inevitably, the larger and more complex the supply chain, the riskier and more prone to problems it is. Many supply chains struggle to gather and make sense of the huge volume of data that they generate. However, Christine points out that: 

    'Having the right data can help you de-risk decisions by providing a more holistic view of your supply chain'.

    That’s because big data analytics and cognitive technologies like machine learning bring visibility to supply chains, and help manufacturers manage them, mitigate the risks, offer a better customer experience, and therefore give them a competitive edge.

    Analyzing data can identify where and how problems are occurring and can even predict where delays and other issues might occur. So, robust analytics allows manufacturers to develop and implement contingency plans that enable them to harmonize the supply chain with manufacturing requirements, sustain the pace of production and maintain maximal efficiency, essential for the ongoing performance of your business.
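    A minimal sketch of how such prediction might begin (all supplier names and figures invented): turn historical delivery records into per-supplier delay rates and flag the ones a contingency plan should cover.

```python
def supplier_delay_risk(history, threshold=0.2):
    """Estimate each supplier's late-delivery rate and flag the risky ones.

    `history` maps supplier name -> list of booleans (True = delivered late).
    """
    rates = {s: sum(late) / len(late) for s, late in history.items()}
    risky = sorted(s for s, r in rates.items() if r > threshold)
    return rates, risky

# Hypothetical delivery histories for two suppliers:
history = {
    "Acme Metals": [False, False, True, False, False],
    "Globex Parts": [True, False, True, True, False],
}
rates, risky = supplier_delay_risk(history)
```

    In practice the rate would be conditioned on lead time, season, and order size, but even this crude baseline makes the risk visible rather than anecdotal.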

    Making the changes work for manufacturing

    Of course, manufacturing predates the advent of smart data analytics and in some cases, it takes time for it to catch up with emerging trends. Nevertheless, manufacturers know that to stay ahead they need to adopt new processes and technologies involving data, analytics, AI and machine learning.

    These technologies can drive improvements in modern manufacturing environments that face the challenges of process complexity, variability, capacity, and speed. By applying smart data techniques to the manufacturing process, companies can meet and exceed demand and the requirements of the market, anticipate and avoid possible risks, minimize waste and reduce problems, and maintain high quality standards. Harnessing the power of big data and implementing the right analytics technology will ensure that manufacturers achieve their business goals more efficiently and cost-effectively than ever before. 

    Source: Sisense

    Author: Adam Murray

  • Forrester: Insights to help prepare your organization for 5G

    Forrester: Insights to help prepare your organization for 5G

    5G Presents immense innovation potential

    5G promises to usher in not just new use cases in every industry but also new business models.

    Some of the most relevant use cases across industries, such as those enabled by AR/VR and massive IoT, fit right into improving customer experience and digital transformation. As a change agent, 5G is among the most important technological enablers in this decade and the next. Therefore, investing and taking a deep look at 5G is critical at this time.

    5G Will develop rapidly through 2020 but is still developing nonetheless

    The 5G wireless arms race is fueled by this immense potential, so technology development is intense. Almost all current 5G announcements are regional, siloed pilots and enhancements upon 4G LTE rather than actual 'at-scale' standalone 5G deployments. Manufacturers and operators have been aggressively pushing their 5G strategies. However, many challenges and uncertainties remain open: the cost of the network, monetization of use cases, regulatory challenges and, most importantly, the lack of mature standards.

    2018-19 Was a major leap in 5G standards, but beware the hype

    Through the 3GPP standards body, the industry had agreed to complete the non-standalone (NSA) implementation of 5G New Radio by December 2017, and this facilitated large-scale trials based on the specifications.

    Various sources cite numerous estimates about 5G. According to the International Telecommunications Union (ITU), commercial 5G networks are expected to start deployment after 2020. By 2025, the GSM Association (GSMA) expects 5G connections to reach 1.1 billion, which is estimated to be about 12 percent of total mobile connections. One Ericsson study estimates that 5G-enabled industry digitalization revenues for ICT players will be US$1.3 trillion in 2026. Still, current 5G reality is far from the profound expectations established by its proponents.

    Structuring your 5G thinking

    At Forrester, we have a deep bench of experts who are closely monitoring the developments and hype around 5G.

    Here is a simple framework:

    1. First, understand the background, technology, and the physical and business challenges behind practical implementations of 5G, so you can cut through the hype. 
    2. There is a lot of talk about coverage in rural areas. In fact, bridging the digital divide is often touted as a big plus of 5G. However, every early investment and the motivation behind it suggest that, at least until 5G achieves deployment scale, the digital divide may get worse. 
    3. Further, thoroughly assess your own 5G needs. Many current use cases probably do not need 5G, so clearly understanding and nailing down your use cases is essential. 
    4. Understand how 5G will transform your network operations and impact your apps and customer experience. 
    5. Finally, ask your service provider the right questions about 5G timelines, cost, strategy, coverage, and implementation to understand what you can expect and to plan your investments in the coming months.

    Author: Abhijit Sunil

    Source: Forrester

  • Gartner: US government agencies falling behind digital businesses in other industries

    Gartner: US government agencies falling behind digital businesses in other industries

    A Gartner survey of more than 500 government CIOs shows that government agencies are falling behind other industries when it comes to planned investments in digital business initiatives. Just 17% of government CIOs say they’ll be increasing their investments, compared to 34% of CIOs in other industries.

    What’s holding government agencies back? While Gartner notes that their CIOs demonstrate a clear vision for the potential of digital government and emerging technologies, almost half of those surveyed (45%) say they lack the IT and business resources required to execute. Other common barriers include lack of funding (39%), as well as a challenge organizations across all industries struggle with: culture and resistance to change (37%).

    Another key challenge is the ability to scale digital initiatives, where government agencies lag by 5% against all other industries. To catch up, government CIOs see automation as a potential tool. This aligns with respondents’ views on 'game-changing' technologies for government. The top five in order are:

    • Artificial intelligence (AI) and machine learning (27%)
    • Data analytics, including predictive analytics (22%)
    • Cloud (19%)
    • Internet of Things (7%)
    • Mobile, including 5G (6%)

    Of the more than 500 government respondents in Gartner’s survey, 10% have already deployed an AI solution, 39% say they plan to deploy one within the next one to two years, and 36% intend to use AI to enable automation, scale of digital initiatives, and reallocation of human resources within the next two to three years.

    Investing today for tomorrow's success

    When it comes to increased investment this year (2019), BI and data analytics (43%), cyber and information security (43%), and cloud services and solutions (39%) top the tech funding list.

    As previous and current digital government initiatives start to take hold, CIOs are seeing moderate improvements in their ability to meet the increasing demands and expectations of citizens. 65% of CIOs say that their current digital government investments are already paying off. A great example of this is the U.S. Department of Housing and Urban Development’s use of BI and data analytics to modernize its Grants Dashboard.

    Despite the budget and cultural-change challenges typically associated with digital government initiatives, make no mistake: many agencies are making great strides and are now competitive with, or ahead of, organizations in other industries.

    There’s never been a better time to invest in game-changing technologies to both quickly catch up and potentially take the lead.

    Author: Rick Nelson

    Source: Microstrategy

  • How Big Data Is Changing Disruptive Innovation

    Much fanfare has been paid to the term “disruptive innovation” over the past few years. Professor Clayton M. Christensen has even re-entered the fold clarifying what he means when he uses the term. Despite the many differences in application, most people agree on the following. Disruptive innovations are:

    Cheaper (from a customer perspective)

    More accessible (from a usability or distribution perspective)

    And use a business model with structural cost advantages (relative to existing solutions)

    The reason these characteristics of disruption are important is that when all three are present, it’s difficult for an existing business to respond to competition. Whether a company is saddled with fixed infrastructure, highly trained specialist employees, or an outmoded distribution system, quickly adapting to new environments is challenging when one or all of those things becomes obsolete. Firing hundreds of employees, upsetting your core business’ distribution partners, writing off billions of dollars of investment — these things are difficult for managers to even contemplate, and with good reason.

    Historically, the place we’ve looked for hints of oncoming disruptions has been in the low end of the market. Because disruptive products were cheaper, more accessible, and built on new technology architectures, they tended to be crummier than the existing highest-end solutions. Their cost advantage allowed them to reach customers who’d been priced out of an existing market; Apple originally made a computer that was cheap enough for students to learn on, a population that wouldn’t have dreamt of purchasing a DEC minicomputer. Sony famously made the transistor-based television popular based on its “portability.” No one knew that you could reasonably do that prior to the transistor. New technologies, combined with business model innovation, provide the structural cost advantage necessary to take large chunks of the market over time.

    But if you return to the definition above, the fact that low-end entry was typical of a disruptive approach was never core to the phenomenon. Instead, it was a byproduct. Why? Because any new entrant is hard pressed to deliver superior value to a mature market, where products have been refined over decades.

    But although the low-end approach was pretty common, it wasn’t what was holding incumbent firms captive. It was their own cost structures and their focus on driving marginal profit increases that kept those companies headed down the wrong paths. As long as making the right decision on a short-term basis (trying to drive more value out of outdated infrastructure) is the wrong decision on a long-term basis (failing to adopt new technology platforms), CEOs are destined to struggle.

    Unfortunately, the focus on the low-end approach of disruption is actually clouding our ability to spot the things that are: cheaper, more accessible, and built on an advantaged cost structure. Specifically, it appears that data-enabled disruptors often confound industry pundits. To get a sense for the point, just look to a few highly contested examples.

    Is Uber disruptive? The wrong answer would be to say, “No, because their first product started in the high end of the market.” The right answer would be to acknowledge that the platform they ultimately launched allowed them to add lower cost drivers (in the form of UberX) and offer cheaper, more accessible, transportation options with a structural cost advantage to both taxi services and potentially even car ownership. The convenience of the app is only the most obvious, and easiest to copy, factor.

    Were Google’s Android phones disruptive to Nokia? The wrong answer would be to say “No, because the initial smartphones they launched were superior in feature quality to Nokia’s own phones that dominated the global landscape.” The right answer would be to acknowledge that the approach of creating an ecosystem of application development atop its platform allowed them to build far more comprehensive solutions, that were (on the whole) cheaper, more accessible, and structurally cost advantaged over Nokia.

    Is 23andMe potentially disruptive to pharmaceutical companies? The wrong answer would be to say, “No, because they compete in completely different verticals.” One in ancestry and the other in drug development. The right answer would be to acknowledge that 23andMe has a vast amount of data that could enable them to start developing drugs in a cheaper, more accessible, and structurally advantaged model.

    In every one of these examples, the ultimate end is disruption. In every one of these examples, incumbent managers have a short term incentive to ignore the challenge — making best use of their existing infrastructure. Taxi companies tried to leverage regulation to preserve the value of their medallions and drivers. Nokia tried frivolously to protect its closed ecosystem and preserve employment for their thousands of Symbian focused staff members. And you can be certain that Merck, Pfizer, and Roche have strong incentives to make the best use of their high-end R&D functions before embracing the radically different path that 23andMe might take.

    And over the long term, each of these short-term decisions could lead to failure.

    The conversation misses that something new is going on in the world of innovation. With information at the core of most modern disruptions, there are new opportunities to attack industries from different angles. Uber built a platform in a fragmented limo market that let it come into transportation and logistics more broadly. Netflix captured your eyeballs through streaming video and used the data it had to blow up the content production process. Google mapped the world, and then took its understanding of traffic patterns and street layouts to build autonomous cars.

    There is no doubt that disruption is underway here. These players create products that are cheaper and more accessible than their peers’. But it’s not necessarily starting at the low end of the market; it’s coming from orthogonal industries with strong information synergy. It’s starting where the source of data is, then building the information-enabled system to attack an incumbent industry.

    It’s time for executives, entrepreneurs, and innovators to stop quibbling over whether something satisfies the traditional path of disruption. Data-enabled disruption may represent an anomaly to the existing theory, but it’s here — and it’s here to stay. The waste Uber laid to the taxi industry is evidence that the new solution had extraordinary cost advantages and that incumbents couldn’t respond. The new questions should be:

    • “How can you adapt in the face of this new type of competition?”
    • “How do you evaluate new threats?”
    • “What capabilities do you need and where do you get them, when data is a critical piece of any new disruption?”


    To succeed in this new environment, threatened businesses need a thoughtful approach to identifying potential threats combined with the will to make the right long-term investments — despite short-term profit incentives.

    Source: Harvard Business review

  • How To Build A Game-Changing Team For Your Business

    Getting a successful business up and running is a key skill for entrepreneurs. Building a team that can take it to the next level is another; and some might say, one of the most difficult to master. The people they need to bring on board must be at ease with autonomy, entrepreneurial, driven, and able to apply their skills to a wide range of tasks. They also need to share the founder’s vision. Above all, to be part of a game-changing growth strategy, they have to be great team players. But where do you find these people and how do you get them on board?

    Harness the team-building power of technology

    Does it matter if your next key player is based in London, New York, Tokyo or Rome? Not if you have access to the technology that can empower key hires to elevate the team and help the business achieve its goals from anywhere in the world. In-app advertising platform Tapdaq has just closed a $6.5 million Series A funding round, which will be used to further expand the company by hiring the best talent from across the globe.

    “We’re not into letting geography determine who we hire – we want to find the perfect person for the role,” says CEO and cofounder Ted Nash, a serial entrepreneur who has been creating online companies since he was 12 and was the world’s first teenager to achieve 1 million App Store downloads.

    He adds: “There are processes you need to put in place to make sure everyone’s doing the job that’s being asked of them and to maintain a strong company culture, but having a global approach to your workforce allows you to tap into top talent from across the world, freeing your company from geographical boundaries.”

    Target the power players

    Business intelligence company DueDil is growing rapidly, doubling in size to 80 people in 2015, which has involved building core functions, such as the sales team, from scratch. Getting the right people in place to lead those key areas is crucial to success, and the London-based firm recently appointed Nick Noyer as VP of marketing. Noyer was previously director of EMEA marketing and international strategy at Box , where he led market entry into Europe.

    DueDil cofounder and CEO Damian Kimmelman said: “We’ve found smart leaders who bring new skillsets to the company, which is important. But for me, it’s critical to look for someone who can show they are hungry to succeed, as I want people alongside me who have something to prove and are motivated by big challenges. If they have that single attribute, they tend to rise to the obstacles we face as a company as we scale.”

    Source: Forbes

  • Investing In Artificial Intelligence

    Artificial intelligence is one of the most exciting and transformative opportunities of our time. From my vantage point as a venture investor at Playfair Capital, where I focus on investing and building community around AI, I see this as a great time for investors to help build companies in this space. There are three key reasons.

    First, with 40 percent of the world’s population now online, and more than 2 billion smartphones being used with increasing addiction every day (KPCB), we’re creating data assets, the raw material for AI, that describe our behaviors, interests, knowledge, connections and activities at a level of granularity that has never existed.

    Second, the costs of compute and storage are both plummeting by orders of magnitude, while the computational capacity of today’s processors is growing, making AI applications possible and affordable.

    Third, we’ve seen significant improvements recently in the design of learning systems, architectures and software infrastructure that, together, promise to further accelerate the speed of innovation. Indeed, we don’t fully appreciate what tomorrow will look and feel like.

    We also must realize that AI-driven products are already out in the wild, improving the performance of search engines, recommender systems (e.g., e-commerce, music), ad serving and financial trading (amongst others).

    Companies with the resources to invest in AI are already creating an impetus for others to follow suit — or risk not having a competitive seat at the table. Together, therefore, the community has a better understanding and is equipped with more capable tools with which to build learning systems for a wide range of increasingly complex tasks.

    How Might You Apply AI Technologies?

    With such a powerful and generally applicable technology, AI companies can enter the market in different ways. Here are six to consider, along with example businesses that have chosen these routes:

    • There are vast amounts of enterprise and open data available in various data silos, whether web or on-premise. Making connections between these enables a holistic view of a complex problem, from which new insights can be identified and used to make predictions (e.g., DueDil*, Premise and Enigma).
    • Leverage the domain expertise of your team and address a focused, high-value, recurring problem using a set of AI techniques that overcome the shortfalls of human performance (e.g., Sift Science or Ravelin* for online fraud detection).
    • Productize existing or new AI frameworks for feature engineering, hyperparameter optimization, data processing, algorithms, model training and deployment (amongst others) for a wide variety of commercial problems (e.g., H2O.ai, Seldon* and SigOpt).
    • Automate the repetitive, structured, error-prone and slow processes conducted by knowledge workers on a daily basis using contextual decision making (e.g., Gluru, x.ai and SwiftKey).
    • Endow robots and autonomous agents with the ability to sense, learn and make decisions within a physical environment (e.g., Tesla, Matternet and SkyCatch).
    • Take the long view and focus on research and development (R&D) to take risks that would otherwise be relegated to academia — but that, due to strict budgets, often aren’t anymore (e.g., DNN Research, DeepMind and Vicarious).

    There’s more on this discussion here. A key consideration, however, is that the open sourcing of technologies by large incumbents (Google, Microsoft, Intel, IBM) and the range of companies productizing technologies for cheap means that technical barriers are eroding fast. What ends up moving the needle are proprietary data access/creation, experienced talent and addictive products.

    Which Challenges Are Faced By Operators And Closely Considered By Investors?

    I see a range of operational, commercial and financial challenges that operators and investors closely consider when working in the AI space. Here are the main points to keep top of mind:

    Operational

    • How to balance the longer-term R&D route with monetization in the short term? While more libraries and frameworks are being released, there’s still significant upfront investment to be made before product performance is acceptable. Users will often be benchmarking against a result produced by a human, so that’s what you’re competing against.
    • The talent pool is shallow: few have the right blend of skills and experience. How will you source and retain talent?
    • Think about balancing engineering with product research and design early on. Working on aesthetics and experience as an afterthought is tantamount to slapping lipstick onto a pig. It’ll still be a pig.
    • Most AI systems need data to be useful. How do you bootstrap your system without much data in the early days?

    Commercial

    • AI products are still relatively new in the market. As such, buyers are likely to be non-technical (or not have enough domain knowledge to understand the guts of what you do). They might also be new buyers of the product you sell. Hence, you must closely appreciate the steps/hurdles in the sales cycle.
    • How to deliver the product? SaaS, API, open source?
    • Include chargeable consulting, set up, or support services?
    • Will you be able to use high-level learnings from client data for others?

    Financial

    • Which type of investors are in the best position to appraise your business?
    • What progress is deemed investable? MVP, publications, open source community of users or recurring revenue?
    • Should you focus on core product development or work closely on bespoke projects with clients along the way?
    • Consider buffers when raising capital to ensure that you’re not going out to market again before you’ve reached a significant milestone. 

    Build With The User In The Loop

    There are two big factors that make involving the user in an AI-driven product paramount. One, machines don’t yet recapitulate human cognition. To pick up where software falls short, we need to call on the user for help. And two, buyers/users of software products have more choice today than ever. As such, they’re often fickle (the average 90-day retention for apps is 35 percent).

    Returning expected value out of the box is key to building habits (hyperparameter optimization can help). Here are some great examples of products that prove that involving the user in the loop improves performance:

    • Search: Google uses autocomplete as a way of understanding and disambiguating language/query intent.
    • Vision: Google Translate or Mapillary traffic sign detection enable the user to correct results.
    • Translation: Unbabel community translators perfect machine transcripts.
    • Email Spam Filters: Google, again, to the rescue.
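    The feedback pattern behind these examples can be sketched in a few lines. The class below is a toy illustration (not any vendor's system): a keyword-weight spam scorer that updates itself only when the user corrects a wrong prediction, perceptron-style.

```python
# Toy human-in-the-loop spam filter: keyword weights are nudged
# only when the user corrects a wrong prediction (perceptron-style).
# All names and numbers here are illustrative.

class FeedbackSpamFilter:
    def __init__(self, learning_rate=1.0):
        self.weights = {}  # word -> score contribution
        self.learning_rate = learning_rate

    def score(self, message):
        return sum(self.weights.get(w, 0.0) for w in message.lower().split())

    def predict(self, message):
        return self.score(message) > 0  # True means "spam"

    def correct(self, message, is_spam):
        """User feedback: only learn from mistakes."""
        if self.predict(message) == is_spam:
            return
        delta = self.learning_rate if is_spam else -self.learning_rate
        for w in message.lower().split():
            self.weights[w] = self.weights.get(w, 0.0) + delta

f = FeedbackSpamFilter()
f.correct("win free money now", True)       # user flags a missed spam
f.correct("lunch at noon tomorrow", False)  # already right: no update
print(f.predict("free money"))              # True
```

    Each correction is cheap for the user and compounds for the system, which is why products that close this loop tend to improve fastest.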

    We can even go a step further, I think, by explaining how machine-generated results are obtained. For example, IBM Watson surfaces relevant literature when supporting a patient diagnosis in the oncology clinic. Doing so improves user satisfaction and helps build confidence in the system to encourage longer-term use and investment. Remember, it’s generally hard for us to trust something we don’t truly understand.

    What’s The AI Investment Climate Like These Days?

    To put this discussion into context, let’s first look at the global VC market: Q1-Q3 2015 saw $47.2 billion invested, a volume higher than each of the full year totals for 17 of the last 20 years (NVCA).

    We’re likely to breach $55 billion by year’s end. There are roughly 900 companies working in the AI field, most of which tackle problems in business intelligence, finance and security. Q4 2014 saw a flurry of deals into AI companies started by well-respected and accomplished academics: Vicarious, Scaled Inference, MetaMind and Sentient Technologies.

    So far, we’ve seen about 300 deals into AI companies (defined as businesses whose description includes such keywords as artificial intelligence, machine learning, computer vision, NLP, data science, neural network, deep learning) from January 1, 2015 through December 1, 2015 (CB Insights).

    In the U.K., companies like Ravelin*, Signal and Gluru* raised seed rounds. Approximately $2 billion was invested, albeit bloated by large venture debt or credit lines for consumer/business loan providers Avant ($339 million debt+credit), ZestFinance ($150 million debt), LiftForward ($250 million credit) and Argon Credit ($75 million credit). Importantly, 80 percent of deals were < $5 million in size, and 90 percent of the cash was invested into U.S. companies versus 13 percent in Europe. Seventy-five percent of rounds were in the U.S.

    The exit market has seen 33 M&A transactions and 1 IPO. Six events were for European companies, 1 in Asia and the rest were accounted for by American companies. The largest transactions were TellApart/Twitter ($532 million; $17 million raised), Elastica/Blue Coat Systems ($280 million; $45 million raised) and SupersonicAds/IronSource ($150 million; $21 million raised), which returned solid multiples of invested capital. The remaining transactions were mostly for talent, given that the median team size at the time of acquisition was seven people.

    Altogether, AI investments will have accounted for roughly 5 percent of total VC investments for 2015. That’s higher than the 2 percent claimed in 2013, but still tracking far behind competing categories like adtech, mobile and BI software.

    The key takeaway points are a) the financing and exit markets for AI companies are still nascent, as exemplified by the small rounds and low deal volumes, and b) the vast majority of activity takes place in the U.S. Businesses must therefore have exposure to this market.

    Which Problems Remain To Be Solved?

    Healthcare

    I spent a number of summers in university and three years in grad school researching the genetic factors governing the spread of cancer around the body. A key takeaway I left with is the following: therapeutic development is very challenging, expensive, lengthy and regulated, and ultimately offers a transient solution to treating disease.

    Instead, I truly believe that what we need to improve healthcare outcomes is granular and longitudinal monitoring of physiology and lifestyle. This should enable early detection of health conditions in near real time, driving down the cost of care over a patient’s lifetime while improving outcomes.

    Consider the digitally connected lifestyles we lead today. The devices some of us interact with on a daily basis are able to track our movements, vital signs, exercise, sleep and even reproductive health. We’re disconnected for fewer hours of the day than we’re online, and I think we’re less apprehensive about storing various data types in the cloud (where they can be accessed, with consent, by third parties). Sure, the news might paint a different story, but the fact is that we’re still using the web and its wealth of products.

    On a population level, therefore, we have the chance to interrogate data sets that have never before existed. From these, we could glean insights into how nature and nurture influence the genesis and development of disease. That’s huge.

    Look at today’s clinical model. A patient presents at the hospital when they feel something is wrong. The doctor must conduct a battery of tests to derive a diagnosis. These tests address a single (often late-stage) time point, at which moment little can be done to reverse damage (e.g., in the case of cancer).

    Now imagine the future. In a world of continuous, non-invasive monitoring of physiology and lifestyle, we could predict disease onset and outcome, understand which condition a patient likely suffers from and how they’ll respond to various therapeutic modalities. There are loads of applications for artificial intelligence here: intelligent sensors, signal processing, anomaly detection, multivariate classifiers, deep learning on molecular interactions...
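    As a toy illustration of the anomaly-detection piece, a rolling z-score over a stream of vital-sign readings flags values that deviate sharply from the recent window. This is a minimal sketch with made-up numbers, not a clinical method:

```python
# Toy continuous-monitoring sketch: flag a reading when it sits far
# outside the rolling window of recent values (simple z-score test).

from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Return (index, value) pairs that deviate strongly from the recent window."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies

# A resting heart-rate trace with one sudden spike (made-up data)
hr = [62, 64, 63, 61, 62, 63, 62, 140, 64, 63]
print(detect_anomalies(hr))  # [(7, 140)]
```

    Real systems replace the z-score with learned models, but the shape is the same: a stream in, a small set of flagged events out.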

    Some companies are already hacking away at this problem:

    • Sano: Continuously monitor biomarkers in blood using sensors and software.
    • Enlitic/MetaMind/Zebra Medical: Vision systems for decision support (MRI/CT).
    • Deep Genomics/Atomwise: Learn, model and predict how genetic variation influences health/disease and how drugs can be repurposed for new conditions.
    • Flatiron Health: Common technology infrastructure for clinics and hospitals to process oncology data generated from research.
    • Google: Filed a patent covering an invention for drawing blood without a needle. This is a small step toward wearable sampling devices.
    • A point worth noting is that the U.K. has a slight leg up on the data access front. Initiatives like the U.K. Biobank (500,000 patient records), Genomics England (100,000 genomes sequenced), HipSci (stem cells) and the NHS care.data program are leading the way in creating centralized data repositories for public health and therapeutic research.

    Enterprise Automation

    Could businesses ever conceivably run themselves? AI-enabled automation of knowledge work could cut employment costs by $9 trillion by 2020 (BAML). Coupled with the efficiency gains worth $1.9 trillion driven by robots, I reckon there’s a chance for near-complete automation of core, repetitive business functions in the future.

    Think of all the productized SaaS tools that are available off the shelf for CRM, marketing, billing/payments, logistics, web development, customer interactions, finance, hiring and BI. Then consider tools like Zapier or Tray.io, which help connect applications and program business logic. These could be further expanded by leveraging contextual data points that inform decision making.
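    The trigger-action style of tools like Zapier or Tray.io can be sketched as a tiny rule engine. The event names, conditions and actions below are invented for illustration:

```python
# Minimal trigger->action rule engine, in the spirit of Zapier-style
# automation: register rules, then route incoming events through them.

rules = []

def when(event_type, condition=lambda e: True):
    """Register a rule: on a matching event, run the decorated action."""
    def register(action):
        rules.append((event_type, condition, action))
        return action
    return register

def emit(event):
    """Dispatch an event to every matching rule; return the actions taken."""
    fired = []
    for event_type, condition, action in rules:
        if event["type"] == event_type and condition(event):
            fired.append(action(event))
    return fired

@when("invoice.paid", condition=lambda e: e["amount"] >= 1000)
def thank_big_customer(event):
    return f"Send thank-you email to {event['customer']}"

@when("invoice.paid")
def record_payment(event):
    return f"Log {event['amount']} in the ledger"

print(emit({"type": "invoice.paid", "customer": "Acme", "amount": 2500}))
```

    The "contextual data points" mentioned above would enter as richer event payloads and smarter conditions; the plumbing stays this simple.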

    Perhaps we could eventually re-imagine the new eBay, where you’d have fully automated inventory procurement, pricing, listing generation, translation, recommendations, transaction processing, customer interaction, packaging, fulfillment and shipping. Of course, this is probably a ways off.

    I’m bullish on the value to be created with artificial intelligence across our personal and professional lives. I think there’s currently low VC risk tolerance for this sector, especially given shortening investment horizons for value to be created. More support is needed for companies driving long-term innovation, especially considering that far less is occurring within universities. VC was born to fund moonshots.

    We must remember that access to technology will, over time, become commoditized. It’s therefore key to understand your use case, your user, the value you bring and how it’s experienced and assessed. This gets to the point of finding a strategy to build a sustainable advantage such that others find it hard to replicate your offering.

    Aspects of this strategy may in fact be non-AI and non-technical in nature (e.g., the user experience layer). As such, there’s renewed focus on core principles: build a solution to an unsolved/poorly served high-value, persistent problem for consumers or businesses.

    Finally, you must have exposure to the U.S. market, where the lion’s share of value is created and realized. We have an opportunity to catalyze the growth of the AI sector in Europe, but not without keeping close tabs on what works/doesn’t work across the pond.

    Source: TechCrunch

  • Machine intelligence will break through this year

    Sogeti’s Research Institute for New Technology (VINT) kicks off the new year with the development of four research reports on ‘machine intelligence’.

    According to VINT, this umbrella term for artificial intelligence describes the situation in which computers are able to match or even surpass human intelligence. After the mobile revolution, big data and the Internet of Things, VINT expects ‘machine intelligence’ to break through this year. The first note contains an ‘executive introduction’ describing the possibilities and market size of ‘machine intelligence’, as well as its overall impact on society. This first report is part of a four-part series to be published over the course of 2016.

    “Cars that recognize pedestrians and respond appropriately, computers that make a medical diagnosis or provide legal advice based on existing case law. These are just a few examples of the new generation of ‘machine intelligence’ computers, in which the balance between human and machine shifts completely. This intelligence will soon be available to everyone. Every company in every sector will also have to realize that artificial intelligence can be deployed in a very different way than traditional IT,” says Menno van Doorn (photo), author and director of VINT at Sogeti.

    VINT offers two explanations for the imminent breakthrough of artificial intelligence. First, there is a breakthrough in hardware: our phones are being made hyper-intelligent with brain-inspired neuromorphic chips, helped by the extremely high computing power of this new generation of chips. The second explanation has to do with the explosive growth of data. For decades there was a lack of sufficient data to make progress with machine learning. Only now, in the current big data era in which large volumes of data are available, can self-learning algorithms get to work. This self-learning capability of machines is crucial to this new form of intelligence.

    Four reports

    After the ‘executive introduction’ on artificial intelligence, the subsequent reports cover three topics. The second report addresses the rise of the Personal Digital Assistant, marking a shift from apps to cogs (cognitive systems). Persuasive Technologies is central to the third report, which concerns influencing behavior through machines. The theme of the final report is the influence of fear of artificial intelligence on organizations that want to become smarter. Sander Duivestein, Jaap Bloem, Erik van Ommeren and Menno van Doorn are the authors of the first ‘executive introduction’. The second report is expected to appear in April 2016. The first trend report can be downloaded via this link.

    Source: Executive-People

  • Making AI actionable within your organization

    Making AI actionable within your organization

    It can be really frustrating to run a successful pilot or implement an AI system only to see it fail to gain widespread adoption across your organization. Operationalizing AI is a common problem. It may seem that everyone else is using AI to make a huge difference in their business while you’re struggling to figure out how to operationalize the results you’ve gotten from trying a few AI systems.

    There has been so much advancement in AI, so how can you make this great technology actually translate into actionable business results?

    This problem touches enterprises of all kinds, from the biggest companies to mid-sized businesses.

    Here are a few quick pointers on how to turn your explorations in AI into AI practices leading to real results from investments.

    Pragmatic AI

    Firstly, focus on what gets called 'pragmatic AI': practical AI with obvious business applications. It’s going to be a long time before we have 'strong AI', so look for solutions built by examining the problems businesses deal with every day and then applying artificial intelligence to solve them. It’s great that your probabilistic Bayesian system thinks about the world differently, or that a company feels it has found a shortcut around the things that make deep learning systems slow to train, but what does that mean for the end user? When you’re looking for a practical solution, look for companies that are always trying to improve their user experience and where a PhD in machine learning isn’t needed to write the code.

    Internal valuations

    Similarly, change the way you are considering bringing an AI solution into your company. AI works best when the company isn’t trying to do a science fair project. It works best when it is trying to solve a real business problem. Before evaluating vendors in any particular AI solution or going out to see how RPA solutions really work, talk to users around your business. Listen to the problems they have and think about what kind of solutions would make a huge difference. By making sure that the first AI solution you bring into your organization aligns to business goals, you are much more likely to succeed in getting widespread adoption and a green light to try additional new technologies when it comes time to review budgets.

    And no matter how technology-forward your organization is, AI adoption works best when everyone can understand the results. Pick a KPI-focused problem like conversion, customer service, or NPS where the results can be understood without thinking about technology. This helps get others out of the science-project mentality and opens their minds to how AI can be used throughout the business.

    Finally, don’t forget that AI can help in a wide variety of ways. Automation is a great place to use AI within an organization, but remember that in many use cases, humans and computers do more together than separately and great uses for AI technology help your company’s employees do their job better or focus on the right pieces of data. These solutions often provide as much value as pure automation!

    Source: Insidebigdata

  • Measuring and managing sustainability for IT leaders  

    Measuring and managing sustainability for IT leaders

    How can IT leaders know if they’re tracking greenhouse gas emissions comprehensively? The introduction of AI and machine learning are painting a clearer picture.

    As companies attempt to take sustainability to the next level and gain a more complete view of their greenhouse gas emissions, there’s a growing need to quantify results and track progress.

    “If you can’t measure it, you can’t manage it,” says Autumn Stanish, associate principal analyst at Gartner, Inc. “In order to take initiatives to the next level -- particularly as organizations look to expand beyond Scope 1 and Scope 2 tracking -- there’s a need for more advanced and granular measurement tools.”

    It’s no small problem. Boston Consulting Group (BCG) reports that while 85% of companies are interested in reducing their emissions, only 9% of companies measure their total emissions comprehensively. Worse, only 11% have reduced their emissions in line with their goals over the last five years.

    How can companies get a better handle on their carbon footprint? How can CIOs and other IT leaders ensure that tools are in place for tracking emissions comprehensively? Although developing a framework remains a challenge, the introduction of AI and machine learning are changing the picture. “Tracking tools are becoming more refined and more useful,” Stanish says.

    Emerging Tech for Measuring Emissions

    Gaining insight into sustainability is becoming easier. Tools for measuring Scope 1 emissions (produced by company facilities or vehicles) and Scope 2 categories (purchased energy) have advanced considerably over the last few years. Yet, most organizations still lack an extended view of external emissions, referred to as Scope 3. These emissions extend out to the value chain and include products that have been sold.

    This lack of visibility is making it difficult for organizations to assemble a strategic framework and road map. BCG found that 57% of companies that measure all three types of Scope emissions see a significant decrease in emissions versus 31% that only partially measure emissions. Adding to the challenge: A measurement system must be accurate to pay dividends. Remarkably, firms BCG surveyed admitted a 30% to 40% error rate on their measurements.

    “It’s difficult to obtain a comprehensive view of a company’s footprint,” says Mike Lyons, a managing director at BCG. “It’s very easy to get the carbon accounting or a boundary wrong, especially as organizations attempt to get a handle on Scope 3 emissions and understand product and technology lifecycles at a granular level.” In addition, a lack of expertise within organizations, even among environmental, social, and governance (ESG) teams, serves as an impediment.

    Most of today’s tools generate numbers based on widely used carbon accounting methodologies while allowing users to view their results against specific goals and targets. For example, software tools and platforms such as Salesforce Sustainability Cloud, Spherics, Envizi, Source Intelligence and Carbon Analytics provide dashboards that extend out to Scope 3 emission categories.

    Cloud providers, including AWS, Azure and Google Cloud, also offer tools that provide insights into compute cycles, energy consumption, and carbon output. For example, Google has several tools that allow organizations to track carbon emissions, including Carbon Footprint, which highlights gross carbon emissions data in reports and disclosures, visualizes carbon insights via dashboards and charts, and offers tools designed to reduce gross emissions from cloud applications and infrastructure.

    Tools tracking Scope 1 and Scope 2 emissions typically plug in power and fuel consumption, using power bills, meter readings and other sources. Many rely on aggregate and average figures collected from reports, documents, audits, and user inputs. Highly distributed businesses and organizations gauging Scope 3 emissions face steeper challenges. “Things can get difficult if you are a retailer and have thousands of stores, all with different bills at different rates, and you start peering into the supply chain,” says Casey Herman, ESG Leader at PwC US. “The question becomes, how do you accumulate all the data and convert everything into carbon output?”
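    The core arithmetic these tools automate is activity data multiplied by an emission factor (kg CO2e per unit consumed). The sketch below illustrates that calculation; the factors are illustrative placeholders, not published values:

```python
# Sketch of basic Scope 1/2 carbon accounting: activity data
# (kWh, litres) times an emission factor in kg CO2e per unit.
# All factors here are made up for illustration.

EMISSION_FACTORS = {  # kg CO2e per unit (illustrative)
    "grid_electricity_kwh": 0.4,   # Scope 2: purchased energy
    "natural_gas_kwh": 0.18,       # Scope 1: on-site combustion
    "diesel_litre": 2.7,           # Scope 1: company vehicles
}

def carbon_footprint(activity_data):
    """Return total kg CO2e plus a per-source breakdown."""
    breakdown = {
        source: amount * EMISSION_FACTORS[source]
        for source, amount in activity_data.items()
    }
    return sum(breakdown.values()), breakdown

total, detail = carbon_footprint({
    "grid_electricity_kwh": 10_000,
    "diesel_litre": 500,
})
print(total)  # total kg CO2e for the period
```

    The hard part in practice is not this multiplication but gathering accurate activity data and choosing the right factors, which is exactly where the error rates cited above creep in.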

    It's critical to understand how equipment, data centers, systems, and devices generate greenhouse gas emissions on a more granular level, Herman points out. “Many tools use conversion factors that may or may not be accurate.” Although major equipment manufacturers often share data about their products, assembling all the pieces into a complete picture can prove daunting. “Many business and IT leaders realize that they are missing lots of data or they have the carbon accounting wrong,” Lyons says. For now, “They have no way to understand what is really taking place.”

    Dialing Down Emissions

    BCG found that 86% of organizations continue to use spreadsheets to track carbon emissions. Overall, 53% of business and IT leaders say that they have trouble making and tracking decisions. An incomplete picture of assets and consumption is partly to blame but business leaders also complained that measurements take place too infrequently, and a lack of automation is a problem.

    More advanced platforms that incorporate AI and machine learning are emerging. BCG, for example, has introduced an artificial intelligence-based software platform called CO2 AI that strives for a more complete and accurate view across the supply chain. Its software connects to ERP systems and pulls operational data about materials that go into products; the physical movements of planes, trains, and trucks; e-waste streams, and much more. It essentially creates a digital twin of the enterprise.

    Meanwhile, Tata Consultancy Services (TCS) has developed a suite of solutions, including a product called TCS Clever Energy, that tap the IoT, AI, machine learning, and the cloud to help organizations decipher intricate energy performance factors, including heating and cooling, process energy optimization, demand response, intelligent tariff management, emission management and sustainability compliance, with integration to sensors, meters, and assets across the organization. It runs on the Azure cloud platform.

    The goal, Lyons says, is to gain a deeper understanding of how various options, trade-offs, and decisions impact the carbon reduction process. As organizations delve deeper into the space, there’s also an opportunity to run simulations and identify cost savings and potential funding issues. “It’s possible to view what-if scenarios and understand their impact in 2030 or 2050. An organization can spot gaps, including funding, and identify steps to address them,” he says.

    Of course, as firms venture into the realm of Scope 3, success typically revolves around other companies sharing data, which can present obstacles. As Lyons puts it: “Right now, there’s no expectation of sharing data among companies and, in some cases, a business may do so at its peril.” He says that in order for businesses to further advance initiatives, there’s a need to develop ecosystems that allow organizations to share data securely and sometimes anonymously across partners and supply chains.

    Herman says that organizations should focus on a strategy that incorporates tools and calculators but also presses vendors to provide more detailed information about the carbon footprint of their products. While there’s a need to gather, verify and vet various methods and data to ensure that everyone and everything is in sync, the approach helps build a framework for greenhouse gas emissions reduction. Along with training and an ongoing focus on integrating data into environmental, social, and governance programs, it’s possible to adopt a framework of continual improvement and progress.

    Concludes Stanish: “We’re setting moonshot goals for greenhouse gas reduction. Organizations must adopt better tools and processes to gauge progress and deliver meaningful and actionable insights.”

    Author: Samuel Greengard

    Source: InformationWeek

  • Microsoft launches new big data analytics platform

    Microsoft presents a new big data solution based on the R language and the technology it acquired with the purchase of Revolution Analytics.

    Technology giant Microsoft has unveiled a new advanced analytics platform based on the programming language R: Microsoft R Server. Microsoft gained the know-how last year through its acquisition of Revolution Analytics, a provider of software and services for the R language.

    Microsoft R Server works with Windows, Linux, Hadoop and Teradata software. The software is based on Revolution R Enterprise (RRE), a Revolution Analytics product. The Microsoft R Open package, which provides an “enhanced distribution of R from Microsoft”, is a prerequisite for using the R Server platform, according to Microsoft.

    “Microsoft R Server is a fast and cost-effective, enterprise-grade big data advanced analytics platform that supports a range of big data statistics, predictive modeling and machine learning capabilities,” reads the software’s description on the Microsoft Developer Network, where it is available as a download for subscribers. The platform can also be found on DreamSpark, Microsoft’s academic program.

  • Research details developments in the business intelligence (BI) market that is estimated to grow at 10% CAGR to 2020

    According to an analyst behind the global business intelligence market report, social media has played a critical role for SMEs and mid-sized organizations over the past few years. Many SMEs are increasingly embracing this trend and integrating their BI software with social media platforms.

    Market outlook of the business intelligence market: the market research analyst predicts the global business intelligence market to grow at a CAGR of around 10% during the forecast period. The growing adoption of data analytics by organizations worldwide is a key driver for the growth of this market.

    The majority of corporate data sources include data generated from enterprise applications along with newly generated cloud-based and social network data. Business intelligence tools are useful in the retrieval and analysis of this vast and growing volume of discrete data.

    They also help optimize business decisions, discover significant weak signals, and develop indicator patterns to identify opportunities and threats for businesses.

    The increased acceptance of cloud BI solutions by SMEs is also boosting the growth of this market. The adoption of cloud services allows end-users to concentrate on core activities rather than managing their IT environment.

    Cloud BI solutions enable applications to be scaled quickly, integrate easily with third-party applications, and provide security at all levels of the enterprise IT architecture, so that these applications can be accessed remotely.

    Market segmentation by technology of the business intelligence market:

    • Traditional BI
    • Mobile BI
    • Cloud BI
    • Social BI

    The mobile BI segment accounts for approximately 20% of the global BI market. It enables the mobile workforce to get business insights by data analysis, using applications optimized for mobile and smart devices.

    The growing smartphone adoption is likely to emerge as a key growth driver for this segment during the forecast period.

    Market segmentation by deployment of the business intelligence market

    • Cloud BI
    • On-premises BI

    The on-premises segment accounted for 86% of the market share in 2015. However, the report anticipates this segment will see its share decline by the end of the forecast period.

    In this segment, the software is purchased and installed on the server of an enterprise. It requires more maintenance but is highly secure and easy to manage.

    Geographical segmentation of the BI market

    • Americas
    • APAC
    • EMEA

    The Americas dominated the market during 2015, with a market share of around 56%. The high adoption of cloud BI solutions in this region is the major growth contributor for this market.

    The US is the market leader in this region, as most of the key vendors are based there.

    Competitive landscape and key vendors

    Microsoft is one of the largest BI vendors and offers Power BI, which helps to deliver business-user-oriented, self-service data preparation and analysis needs through Excel 2013 and Office 365. The competitive environment in this market is expected to intensify during the forecast period due to an increase in R&D innovations and mergers.

    The market is also expected to witness a growing trend of acquisitions by the leading players. The key players in the market are expected to diversify their geographical presence during the forecast period.

    The key vendors of the market are -

    • IBM
    • Microsoft
    • Oracle
    • SAP
    • SAS Institute

    Other prominent vendors in the market include Actuate, Alteryx, Board International, Brist, Datawatch, GoodData, Infor, Information Builders, Logi Analytics, MicroStrategy, Panorama Software, Pentaho, Prognoz, Pyramid Analytics, Qlik, Salient Management Company, Tableau, Targit, Tibco Software, and Yellowfin.

    Key questions answered in the report

    • What will the market size and the growth rate be in 2020?
    • What are the key factors driving the BI market?
    • What are the key market trends impacting the growth of the BI market?
    • What are the challenges to market growth?
    • Who are the key vendors in the global BI market?
    • What are the market opportunities and threats faced by the vendors in the BI market?
    • What are the trending factors influencing the market shares of the Americas, APAC, and EMEA?
    • What are the key outcomes of the five forces analysis of the BI market?

    Source: WhaTech

  • Running a tech company? Use data science and AI to improve

    Running a tech company? Use data science and AI to improve

    Artificial intelligence offers startups a number of benefits that can't be overlooked.

    As a tech company, you will always be looking for ways to develop. Using data science and artificial intelligence can be useful for this type of growth. While they share some similarities, there are also some differences between the two. You may be surprised to hear about the amazing benefits that AI offers for startups, especially those in the tech sector.

    Artificial Intelligence

    You may have heard artificial intelligence referred to as AI in countless movies and TV shows. In real life, it’s used to create improvements rather than to turn on humanity, as it is so often shown doing on screen, even though that makes for interesting viewing.

    AI has many uses, such as helping with translations, analyzing complex information and decision-making. It also has the ability to learn and therefore improve and adapt.

    Rodrigo Liang is CEO of SambaNova, which provides both hardware and software to businesses for the purpose of analyzing data. While this can be classed as data science, one difference is that data science tends to use a predictive model for its analysis, while AI can analyze based on learned knowledge and facts. This information may not have been programmed, which is why AI can be more precise and take into account factors that weren’t previously considered.

    Data science

    Data science covers a broad range of techniques, including statistics, design and development. It can be used to achieve quick mathematical calculations and find hidden patterns and trends in the data it analyzes, but it needs an element of human intervention. One difference is that using AI can remove the need for human input as it learns and develops.

    The programming for data science relies on already having statistics and predictive trends to work with. This information can then be used to find patterns and other details that might not be immediately obvious without hours, days or even weeks of human analysis.
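The pattern-finding described above can be sketched in a few lines. As a minimal illustration (the revenue figures below are hypothetical, not from any real dataset), a least-squares fit over evenly spaced observations reveals the direction and rate of a trend without hours of manual analysis:

```python
from statistics import mean

def linear_trend(values):
    """Least-squares slope and intercept for evenly spaced observations."""
    xs = range(len(values))
    x_bar, y_bar = mean(xs), mean(values)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values)) / \
            sum((x - x_bar) ** 2 for x in xs)
    return slope, y_bar - slope * x_bar

# Hypothetical monthly revenue figures (in thousands)
revenue = [10.2, 10.8, 11.1, 11.9, 12.4, 13.0]
slope, intercept = linear_trend(revenue)
print(f"Revenue is trending {'up' if slope > 0 else 'down'} by {slope:.2f} per month")
```

A real data science workflow would layer validation, seasonality handling, and richer models on top, but the core idea is the same: statistics surface trends that would otherwise take a person much longer to spot.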

    AI and data science can be used interchangeably depending on what is required, and they complement each other.

    The benefits to your tech company

    One way AI can benefit your tech company is by carrying out risk analysis, which is otherwise an expensive task, particularly when human error creeps in. It also saves time, as AI can process and analyze large amounts of information much more quickly than a person can. So although the initial outlay might be high, the savings to your business will more than compensate for it. One example is fraud detection: fraud that isn't caught or prevented in time can, in some cases, be enough to force a business to close.
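As a rough illustration of automated risk screening (a simple statistical check, not any specific vendor's product, and with made-up transaction amounts), outlier transactions can be flagged for human review:

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Flag transactions whose amount deviates from the mean by more
    than `threshold` standard deviations -- a crude fraud screen."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

# Hypothetical transaction amounts; the 9,800 entry stands out
transactions = [120, 95, 130, 110, 105, 9800, 125, 98]
print(flag_anomalies(transactions))  # → [9800]
```

Production fraud systems use far more sophisticated models, but even this sketch shows how a machine can scan volumes of transactions far faster than a person reviewing them by hand.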

    AI can also help with translating different languages. Most businesses rely on trading with customers and other businesses around the world, but the language barrier can make that more difficult. If you need to meet with or send emails to clients, or create content for speakers of other languages, hiring a translator can be expensive. It’s also risky if you’re dealing with sensitive information. That’s why AI is so popular for translating. It not only saves money but also inspires trust in your company, as the information is kept secure.

    Data science can be used to spot trends and patterns in your business. This is useful if you need to cut costs in areas that are losing money for your business, or if you need to focus your attention on more successful aspects to boost these further. No successful tech company will want to continue spending money on the parts of it that aren’t cost-efficient. AI can work well with data science here, by thinking logically to find a viable solution and make improvements.

    Although AI can read human facial expressions, tone of voice, and body language to interpret emotion, it is its ability to find solutions using logic that can be an advantage to your tech company. While you and your employees may try to operate fairly and make the right decisions, it's difficult to be completely impartial. We have our own thoughts and opinions, and these can shape our decisions, whether or not we want them to.

    When it comes to repetition in the workplace, nobody wants to be stuck doing the same thing repeatedly. It’s no good for the morale of your employees, and they will eventually leave if they don’t feel like they’re getting job satisfaction. It may feel like using data science and AI in your tech company will kill off jobs for humans. However, it’s just as likely that they can be better placed doing other less menial tasks within your company. If using these technologies results in fewer financial losses and more gains, then there is no reason why employees can’t be relocated elsewhere in the company on more appealing tasks.

    AI offers great advantages for tech startups

    AI and data science can be great for your tech company, removing or lowering risk, increasing profits, and generally helping you run your company with fewer problems, and with fewer job losses than you might think. Any initial costs will usually be recouped.

    Author: Matt James

    Source: Smart Data Collective

  • Supporting Sustainability with Business Intelligence and Data Analytics

    Supporting Sustainability with Business Intelligence and Data Analytics

    Digital tools and technologies are helping businesses rewire, reformulate, and repackage to support sustainability. But of course, IT needs to support those efforts.

    There’s mounting pressure on organizations of all shapes and sizes to take sustainability efforts to the next level. But while much of the focus is on IT executives to wring out inefficiencies in data centers, servers and on various devices, there’s another aspect that often winds up overlooked: the role of IT in supporting more sustainable products and services.

    Various digital tools and technologies -- including analytics, artificial intelligence and digital twins, computer-aided design, machine learning, and deep learning -- can help businesses rewire, reformulate, and repackage products and services to meet the needs of different business groups, including R&D, operations, and logistics.

    For a consumer goods company, this may translate into a bottle that’s derived from plant-based materials. For an airline, it might mean moving to synthetic hydrocarbon fuels that cost less and dramatically reduce the waste stream. For a clothing retailer, it’s likely about using recycled fabrics and more sustainable materials. For just about everyone, there’s a need to reduce packaging materials.

    Make no mistake, as businesses look to improve environmental, social, and governance (ESG) metrics, reduce carbon emissions and minimize environmental impacts, IT input is crucial. Organizations require the right IT foundation -- increasingly an agile cloud-first framework -- to support ESG initiatives and unleash innovation at scale.

    “Just as digital transformation required every company to become a technology company, with technology at its heart, now every business needs to become sustainable -- and technology is again taking centerstage,” explains Sanjay Podder, managing director and technology sustainability lead at Accenture.

    Unlocking Value

    There are more than altruistic reasons to weave sustainability into the fabric of an organization. Nearly two-thirds of consumers (66%) plan to make more sustainable or ethical purchases, according to a recent Accenture and World Economic Forum report. Companies with ESG programs in the top quartile realized financial returns about 21% better than peers for a seven-year period ending in 2020. They also achieved 2.6 times higher total shareholder returns.

    Seeding technology innovation across an enterprise requires broader and deeper communication and collaboration than in the past, says Aapo Markkanen, an analyst in the technology and service providers research unit at Gartner. “There’s a need to innovate and iterate faster, and in a more dynamic way. Technology must enable processes such as improved materials science and informatics and simulations.”

    Digital twins are typically at the center of the equation, says Mark Borao, a partner at PwC. Various groups, such as R&D and operations, must have systems in place that allow teams to analyze diverse raw materials, manufacturing processes, and recycling and disposal options --and understand how different factors are likely to play out over time -- and before an organization “commits time, money and other resources to a project,” he says.

    These systems “bring together data and intelligence at a massive scale to create virtual mirrored worlds of products and processes,” Podder adds. In fact, they deliver visibility beyond Scope 1 and Scope 2 emissions, and into Scope 3 emissions. “It’s vital to understand the impact of a change both short-term and long-term, and the ripple effect resulting from various decisions and trade-offs,” Markkanen explains.

    For example, a more sustainable agricultural product used for packaging may eliminate plastic along with fuel and natural resources. Yet plant-based materials can introduce new challenges. This includes product freshness and shelf life, and different demands on the environment and various resources. It can also lead to new problems, such as developing a separate waste stream system to dispose of the bottles.

    This data framework is also crucial for identifying issues and factors that can easily fly under the radar, such as how an industry-wide shift toward a more sustainable source material -- say bamboo or used cooking oil -- impacts sourcing, pricing, transportation and shipping, and environmental concerns.

    Sustainability By the Numbers

    There’s good news. Tools and technologies now exist to support next-generation sustainability efforts and business executives have gotten the memo. Accenture found that 73% of CEOs identified “becoming a truly sustainable and responsible business” as a top priority for their organization over the next three years.

    Cloud and software providers, including AWS, Microsoft Azure, and Google, offer digital twin solutions -- as well as other tools to facilitate projects. It's possible to plug in sensors and other components that gather data and run sophisticated simulations. Other technologies such as blockchain, machine learning and deep learning, and various specialized design and engineering tools are also valuable and can ratchet up initiatives further.

    For example, “Blockchain provides a way to improve transparency and traceability in global supply chains and is increasingly being used to help consumers verify companies’ claims about being resource positive and environmentally friendly,” Podder points out. This information, particularly when used alongside sustainability software that tracks carbon emissions, can find its way onto corporate websites, annual ESG reports and other public-facing systems.

    When companies get the equation right, remarkable outcomes follow. For example, Coca-Cola is moving away from petroleum-based packaging. In October 2021, it unveiled a beverage bottle made from 100% plant-based material. United Airlines is transitioning to sustainable aviation fuel -- made from renewable waste such as old cooking oil and animal fat -- which reduces carbon emissions by 80%. Old Navy is producing flip-flops from renewable materials, such as sugarcane, and denim made from recycled cotton.

    Technology Unleashes Innovation

    The news isn’t all rosy, however. Only 7% of respondents to Accenture’s sustainable tech survey reported that they have fully integrated their business, technology and sustainability strategies. Addressing these gaps and achieving a “sustainable DNA” involves a three-step process that better links sustainability and profitability. Accenture describes this as “Diagnose, Define and Develop.”

    Not surprisingly, it all starts at the C-suite. “CIOs must have a seat at the table on sustainability decisions. But most do not. Only 49% of CIOs are empowered to help set sustainability goals, and only 45% are assessed on achieving them,” Podder says. Yet, it’s also vital for IT leaders to help educate other groups about various digital tools and what role they can play within an enterprise sustainability strategy.

    Make no mistake, as climate change accelerates, consumer demand for sustainable products and services increases, and the financial incentives grow, new and reformulated products and services will become the new normal. Businesses of all shapes and sizes will be required to make changes. Says Markkanen: “The tools and technology exist to innovate and iterate faster than ever.”

    Author: Samuel Greengard

    Source: InformationWeek

  • Taking advantage of automation technology in competitive intelligence

    Taking advantage of automation technology in competitive intelligence

    If you’re a market or product researcher, or an intelligence specialist, you’re probably already aware of the extent to which technologies like artificial intelligence (AI) and machine learning have altered the business landscape over the past decade. But many professionals still view AI with suspicion, even mistrust, after being over-sold on its capabilities and underinformed about its limitations.

    If you’re one of those people, it’s time to give AI another (cautious) chance.

    Hybrid solutions, sometimes called smart workflows, combine automation technology with human analysis to improve the efficiency and accuracy of a business process. Smart workflows automate repetitive, tedious, and time-consuming tasks, freeing up time for humans to handle the more complex, strategic tasks that machines still struggle to execute. If you’re reluctant to trust a computer to conduct important research, smart workflows allow you to build in human checks and balances wherever you see fit.

    Here are three ways automation technology can save you time at different stages of your competitive intelligence process.

    Data collection 

    A competitive intelligence process is only as thorough as its data collection method. If you’re missing information during the initial intelligence-gathering stage, none of your hard work afterwards can correct that deficit; you’ll always be left with an incomplete set of facts. That’s why automated intelligence gathering is growing in popularity, even among small-to-midsized businesses. By allowing a machine to do the first-line data collection, you remove the potential for human error in overlooking relevant company names, keywords, or phrases. With ongoing maintenance, an automated data collection system can drastically reduce the number of hours spent searching for information.
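In its simplest form, first-line automated collection is a watchlist applied to incoming feeds. This sketch (the watchlist terms and feed items are hypothetical) keeps every item that mentions a tracked company or keyword, so nothing slips past a tired human reader:

```python
import re

# Hypothetical watchlist of companies and keywords to track
WATCHLIST = ["Acme Corp", "acquisition", "Series B", "patent filing"]
pattern = re.compile("|".join(re.escape(term) for term in WATCHLIST), re.IGNORECASE)

def collect(items):
    """Keep only feed items that mention a watched company or keyword."""
    return [item for item in items if pattern.search(item["text"])]

feed = [
    {"source": "newswire", "text": "Acme Corp announces a new patent filing"},
    {"source": "blog", "text": "Ten tips for better meetings"},
]
print(collect(feed))  # only the Acme Corp item survives
```

The "ongoing maintenance" the article mentions is mostly keeping that watchlist current as competitors, products, and terminology change.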

    Classification

    When you’re dealing with a high volume of information, an automated classification system can give structure to the raw intelligence data, breaking it down into more manageable chunks for researchers and analysts to work with. Even if the information is particularly complicated, a human curator can clean up pre-sorted data much more quickly than a raw, unorganized feed. The combination of machine power and human intelligence saves time and reduces employee burnout.
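A first-pass classifier doesn't have to be elaborate to be useful. This sketch (with hypothetical categories and keywords) buckets each raw item by keyword match, leaving a human curator to tidy the pre-sorted piles rather than wade through an unorganized feed:

```python
# Hypothetical category keyword map for sorting raw intelligence items
CATEGORIES = {
    "product": {"launch", "feature", "release"},
    "finance": {"funding", "revenue", "acquisition"},
    "people":  {"hire", "ceo", "appointed"},
}

def classify(text):
    """Assign an item to the first category whose keywords appear in it;
    everything else lands in 'unsorted' for a human to triage."""
    words = set(text.lower().split())
    for category, keywords in CATEGORIES.items():
        if words & keywords:
            return category
    return "unsorted"

print(classify("Rival announces new feature release"))  # → product
print(classify("Competitor closes funding round"))      # → finance
```

Production systems typically replace the keyword sets with a trained text classifier, but the division of labor is the same: the machine pre-sorts, the human verifies.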

    Distribution

    Managing your competitive intelligence distribution process can be a time-consuming job in its own right, especially if you’re doing it all manually. Instead, many businesses are switching to an automated report model that takes your pre-classified intelligence data and distributes it to the intelligence users who need to see it, and have the right data skills to use it. Users can generally control what type of news they receive and when they receive it, without burdening the intelligence team with dozens, or even hundreds of unique schedules and content requests.

    Source: CI Radar

  • TDWI interview: the future of data and analytics

    TDWI interview: the future of data and analytics

    From migration struggles to vendor lock-in, we look at some of the most challenging enterprise problems and changes ahead with Raj Verma of MemSQL.

    Raj Verma is co-CEO at MemSQL. He brings more than 25 years of global experience in enterprise software and operating at scale. He was instrumental in growing TIBCO software and has served as CMO, EVP global sales, and COO.

    We asked Mr. Verma to tell us what lies ahead for enterprises in everything from data strategies to data architectures and management.

    What technology or methodology must be part of an enterprise's data or analytics strategy if it wants to be competitive today? Why?

    Raj Verma: Enterprises need to have the information to make decisions in the window of opportunity provided. Data is increasing by the second, and the useful life for data is diminishing. Any company that has postponed digitization is regretting it today. Having a strategy that is real time is what the current business environment requires. Some would say real time is not enough because you have to be predictive in understanding and determining how your customers or your resources are going to react to certain events that are likely to happen.

    Data strategies that do not have vendor lock-in will do well. The multi-billion-dollar empires in enterprise software are built on holding enterprises for ransom. To stay competitive, enterprises must avoid being locked in and subservient to one vendor. There needs to be an exit button enabling enterprises to leave and take their data out at any time. Having a good technology that is also philosophically aligned with their own organization is a best practice I recommend.

    What one emerging technology are you most excited about and think has the greatest potential? What's so special about this technology?

    Amino therapy is going to be game-changing. It will be phenomenal for physical and mental diseases. Anything we can do to hasten those research projects will be extraordinary.

    From a data perspective, I'm impressed with technology that has been built on first principles. First principles thinking means taking a fresh look at a problem and, with the latest research and technologies at hand, thinking through a better way to solve it, from the ground up.

    First-principles thinking simplifies seemingly complex problems and unleashes a wide range of creative possibilities. Technologies that were built on first principles allow companies to avoid lock-in and give a real-time view of what is happening within their organizations. They are also inherently flexible, so they are easier to deploy in new environments, such as hybrid cloud and multi-cloud.

    For databases and analytics, first principles thinking means determining what you want your database to do. Today, people want to work with the newest data, not just historical data, and they need to answer queries generated by AI programs and machine learning models quickly.

    To meet these new requirements, you need a conceptually simple, inherently scalable architecture. That's why we call MemSQL the Database of Now! Our technology is being used by large telecommunications providers to develop heat maps for regions with large COVID-19 infection rates to see where people are congregating and point out areas to avoid. It's helping to get medical supplies to the hospitals and frontline workers that need these products most. This requires technology that is based on first principles.

    What is the single biggest challenge enterprises face today? How do most enterprises respond (and is it working)?

    The biggest challenge that large enterprises of today face is how to migrate off their legacy platforms in an efficient, economical, and agile manner and align to new business realities. Competition for larger enterprises is only a click away. No one is safe in today's economy.

    The companies that do the hard and heavy lifting to respond to that will survive. Enterprises should be paranoid about how bulletproof their data architecture is. To standardize on one technology or one company is probably the worst decision you can make today. It is time for best-of-breed solutions that deliver the best of every world.

    Is there a new technology in data or analytics that is creating more challenges than most people realize? How should enterprises adjust their approach to it?

    The best way to kill an organization is to have bad data architecture and use that for decision making. The organizations that are rushing to buy analytics technology should pause to make sure their data strategy is right and confirm that they can make informed decisions with the data they have.

    Enterprises need to look at today's challenging landscape, including the heavy lifting of understanding various data sets, curating them in the right manner, governing the data sets with the right attributes, and identifying the dependability of one source of data over another. A good analytics or BI tool will bubble up both the good and the bad information. Unless you have strong convictions that your data architecture is right, don't go into AI blindly right now.

    What initiative is your organization spending the most time/resources on today? In other words, what internal project(s) is your enterprise focused on so that your company (not your customers) benefit from your own data or business analytics?

    The number one initiative for MemSQL is to have a thriving hybrid platform that is easy to consume both in a self-deployed manner and in the public cloud while also offering companies the utmost ease of use.

    Internally, we've put our own customer data to work, tying support tickets to adoption and customer success. With our managed services, we've increased the number of users and queries people can run on our cloud trial. Throughout all of this, we have always kept one initiative in mind: how do we help our customers get the most out of our own technology? We're drinking our own data. We've found it extremely useful for making investment decisions, intervening at customer sites to make them more successful, and reaching the outcomes customers partnered with us for.

    Where do you see analytics and data management headed in 2020 and beyond? What's just over the horizon that we haven't heard much about yet?

    Data management will have to get a lot easier to perform and adopt. How do you make such a complex science easy to understand and consume? Analytics has already exploited the ease-of-use side. Analytics will have to get a lot wider in its ability to suck in data and provide a more holistic view of the organization. It must be nimbler to attract more sources of data inputs. Many AI tools are still extremely pricey. We can see a lot of those costs coming down as the user base expands.

    Describe your product/solution and the problem it solves for enterprises.

    The problem that MemSQL solves is how to marry real-time transactional data with historical data to make the best decision in the window of opportunity provided. MemSQL does it at scale, on commodity hardware, and with unprecedented speed and ease of use through a hybrid, multicloud environment.

    Author: James E. Powell

    Source: TDWI

  • Tech companies and the issue of diversity

    Tech companies and the issue of diversity

    Diversity in the workplace is something that all tech companies should strive for. When appropriately embraced in the tech sector, diversity has been shown to increase financial performance, improve employee retention, foster innovation, and help teams develop better products. Data marketing teams with equitable hiring practices in regard to gender exemplify this.

    While the benefits of a diverse workplace can help any company thrive, figuring out how exactly to increase diversity in tech workplaces can be a challenge. However, employing a diverse team is not impossible, and the rewards make diversification efforts well worth it.

    Diversity is less common than you might think

    Though the tech industry is far more diverse today than it has been in the past, diversity still remains an issue across the sector. Even if those heading tech companies don’t engage in outright racism by fostering a hostile work environment towards people of color or discouraging the hiring of diverse groups, many tech companies still find themselves with teams that look and think alike. Homogeny creates complacency, insulates a workforce from outside perspectives, and ultimately prevents real innovation and creativity from taking place.

    Tech companies can be complicit in racism through hiring practices, through segregation of employees of color (whether purposeful or not), or by preventing employees from diverse backgrounds from conversing in their native tongue or practicing their cultural traditions.

    Diversity in the tech field isn’t just about improving any one company’s performance, it’s crucial to the continued success of the entire tech sector. By 2020, there will be 1.4 million open computer science-related positions with only around 400,000 qualified candidates to fill them. This makes diversity and inclusion a pressing emergency that needs to be addressed, and quickly.

    The gap in diversity is most noticeable in the tech field when looking at gender. Despite the fact that more women than men are graduating with STEM (Science, Technology, Engineering, Mathematics) bachelor’s degrees, the bulk of the major companies in Silicon Valley hire far more men than women. With so many qualified women able to drive success, ingenuity, and creativity, it just makes sense to increase diversity in the workplace by hiring more women.

    Using internship programs, diversity initiatives, and scholarships

    One effective way of combating the lack of diversity in tech workplaces is to implement diversity initiatives with the express purpose of finding qualified, diverse candidates who may have self-selected away from the tech field.

    Tech companies that wish to increase diversity can articulate a company-wide diversity commitment from the highest levels of management, letting all employees know that diversity is a top priority. Then, companies can consider implementing mentor programs to bring in diverse applicants to become proteges, all but ensuring that diverse talent is retained.

    Aside from creating a company culture that prioritizes diversity, tech companies can take things even further by becoming more proactive in getting underrepresented communities into the tech sector. Companies can develop afterschool STEM programs to foster the idea within these communities that the tech field is not only right for them but welcoming to them. Developing afterschool STEM programs directly impacts communities, with over 70% of students participating in afterschool STEM programs showing increased interest in STEM careers. Additionally, companies can offer up STEM-related college scholarships and internship programs to individuals who might not otherwise have the opportunity to enter the tech sector due to economic disadvantages.

    Data scientists are in high demand, and companies that use their influence and financial abilities to help elevate underrepresented groups into those positions will send a powerful signal to future generations of tech workers. While any tech company can make promises to improve diversity in the workplace, those that actually go above and beyond to ensure that minorities have the opportunity and drive to join the tech field are actually walking the walk.

    Diverse recruitment strategy

    While encouraging a diverse workplace is all well and good, companies need to actually implement a recruitment strategy with the purpose of increasing diversity. Diversity strategies in the recruitment process need to make sure that diversity goals and business goals align so that business growth accompanies diverse hiring.

    One problem that a lot of tech companies are facing is that diverse hires don’t necessarily see themselves within the company, which often simply extends from the fact that they aren’t represented culturally within the company. With nearly 70% of those seeking employment saying that a company’s diversity portfolio influences their decision about whether or not to work there, the lack of diversity within tech companies can hamstring their recruitment efforts. Getting over this hurdle can be difficult. One way to do it is by rebranding the company to appeal to marginalized groups.

    Implementing big data analysis into your recruiting can also help to naturally foster a more diverse team. Big data can quickly surface the most qualified candidates for any particular position with far less human bias, meaning that every applicant is given the same chance at landing an interview regardless of creed, gender, sexuality, or race. Though this approach doesn’t affect who is applying for a tech position, it certainly levels the playing field for those who do apply.
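The core idea behind bias-reduced screening is to score candidates only on job-relevant fields and ignore everything else in the record. A minimal sketch, with hypothetical role requirements and applicants (note that real systems must also guard against proxy variables that correlate with demographics):

```python
# Hypothetical skill requirements for the role being filled
REQUIRED_SKILLS = {"python", "sql", "statistics"}

def screen(candidate):
    """Score a candidate purely on skill overlap and experience,
    ignoring any demographic fields present in the record."""
    skills = {s.lower() for s in candidate["skills"]}
    return len(skills & REQUIRED_SKILLS) + min(candidate["years_experience"], 10) / 10

applicants = [
    {"name": "A", "skills": ["Python", "SQL"], "years_experience": 4, "gender": "F"},
    {"name": "B", "skills": ["Python"], "years_experience": 12, "gender": "M"},
]
ranked = sorted(applicants, key=screen, reverse=True)
print([a["name"] for a in ranked])  # → ['A', 'B']
```

The design choice worth noting is that the scoring function never reads the demographic fields at all, so they cannot influence who advances to an interview.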

    Even if a company doesn’t use big data in their hiring process, incorporating bias awareness into the recruiting and interview processes can help a company’s HR department improve the diversity within the company. HR can further assist in increasing diversity within the tech field by fostering a culture of inclusion within a company, helping employees to feel empowered and valued, which will allow them to do their best possible work.

    In terms of building trust, big data can also be used to protect employee information. Because machine learning can detect unusual behavior, it can detect when employee data or personally identifiable information may be at risk. In fields where underrepresented people can already feel vulnerable, it’s essential that security measures are in place to allow them to work without fear of threats to their privacy or financial well-being. A good track record in this regard can drastically improve trust (and therefore representation) in the workplace.
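The "unusual behavior" detection described above often amounts to comparing each user's activity against their own baseline. A minimal sketch, with hypothetical access counts for a made-up account:

```python
from statistics import mean, stdev

# Hypothetical daily record-access counts for one employee account
baseline = {"jsmith": [40, 35, 42, 38, 41, 37, 39]}

def unusual_access(user, today_count, z=3.0):
    """Flag a user whose access count today far exceeds their own
    historical baseline -- a possible sign of compromised credentials."""
    history = baseline[user]
    return today_count > mean(history) + z * stdev(history)

print(unusual_access("jsmith", 400))  # → True
print(unusual_access("jsmith", 41))   # → False
```

Real security tooling uses richer features (time of day, resource type, location), but per-user baselining is the common thread.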

    Though diversity in tech workplaces is sorely lacking, there are many ways for companies to take positive steps to improve their diversity. Expanding the search for more diverse hires, developing community programs like after-school STEM programs and scholarships, and making sure that their HR department is approaching the hiring process with as little bias as possible are all great places to start. Tech companies with a diverse workforce recognize the inherent benefits of diversity, and those that don’t are sleeping on what could be the difference between success and failure.

    Source: Datafloq

  • The benefits of AI in CRM

    The benefits of AI in CRM

    AI has the power to liberate organizations from CRM-related manual processes and improve customer engagement, sales insights, and social networking, for starters.

    Customer relationship management systems have become indispensable tools in running business operations. In an always-connected and increasingly digital business world, manually-updated CRMs can quickly become unwieldy drags on office productivity and efficiency. This makes CRM platforms prime use cases of how companies across industries can leverage solutions using artificial intelligence technology. 

    There are still several misconceptions about AI. Fundamentally, AI is about understanding the importance of data analysis and how we can train machines to synthesize, analyze, and act on data as a human being could. At its core, AI technology is about liberating the human mind from mundane tasks.

    The most common misconception about AI is that it is synonymous with automation, but in fact it’s so much more. The only elements AI and automation share are a reliance on data and the goal of streamlining work. To liken AI to automation beyond those commonalities is to ignore the greater elegance of an AI system. Whereas automated systems must be manually configured to execute monotonous, repetitive tasks, AI systems are independently adaptive once they have data to process, meaning that they learn as they go without continuous monitoring.

    Thus AI has the power to liberate organizations from manual processes connected to the CRM system. Having built businesses in the AI space for years, most recently with AI solutions at Introhive, I can attest to a number of key advantages AI brings to CRM systems. I’ve outlined three big ones below.

    Better customer engagement

    Because AI solutions reduce the burden of data entry and other time-consuming administrative functions, teams have greater bandwidth to help customers and prospective customers. AI technology greatly assists with customer service and support functions as well. Chat and email bots ensure that customers get quick answers and direction. New AI technologies analyze voice and text in real time to help agents most effectively serve prospective and existing clients.

    An often overlooked consequence of AI-assisted customer engagement is improved employee engagement. If an employee can be freed from the more time-consuming administrative responsibilities of their role, they can focus on the more meaningful aspects of their jobs: creative thinking, problem solving, and client support. Improved employee satisfaction and retention tends to follow.

    Keener sales insights

    Having a centralized record of clean customer sales data and communications means better sales forecasting and clearer insights into how deals were won or lost. AI integration ensures that your CRM captures customer data in the desired format using document, image, and speech recognition technologies. Just as better fuel quality leads to stronger engine performance on the road, capturing high-quality data is crucial for accurate and insightful AI predictions, which in turn strengthen business operations. 

    High-quality customer data and accurate, AI-driven predictions also have important implications for how teams are organized to manage various accounts. Utilizing an AI application is effectively like giving each staff member a data scientist to make sense of disparate business data that would otherwise be lost or delayed in analysis.

    Smarter social networking

    This last point, smart networking, is of particular interest to me as I’ve spent the past decade thinking about the intersections of technology and human relationships. While new technologies are often criticized for making social interactions less authentic, I’ve long championed the power that technology has to strengthen connections and relationships, especially in business.

    Introhive specializes in AI-driven relationship intelligence to better understand one’s client base and branching networks of prospective clients and referrals. AI integrations mine CRM platforms for transactional data, social contacts, and communication history. This information is then collected, grouped, analyzed, and distributed automatically so teams have insights about best engagement practices with customers, vendors, and other stakeholders.
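
    As a minimal sketch of the kind of relationship analysis described above, the snippet below scores contacts by interaction frequency and recency from a communication log. The contact names, the 30-day decay constant, and the scoring rule are illustrative assumptions, not Introhive's actual method.

```python
from collections import defaultdict
from datetime import date

def relationship_scores(interactions, as_of):
    """Score each contact: one point per interaction, decayed by its age.

    `interactions` is a list of (contact, date) pairs, e.g. drawn from a
    CRM's communication history. Recent touches count for more than old ones.
    """
    scores = defaultdict(float)
    for contact, when in interactions:
        age_days = (as_of - when).days
        # 30-day decay constant is an arbitrary illustrative choice
        scores[contact] += 1.0 / (1.0 + age_days / 30.0)
    return dict(scores)

# Example: two recent emails from "alice" outweigh one old email from "bob".
log = [("alice", date(2021, 6, 1)),
       ("alice", date(2021, 6, 25)),
       ("bob", date(2021, 1, 5))]
scores = relationship_scores(log, as_of=date(2021, 7, 1))
```

    A real system would weigh many more signals (meetings, referrals, reply rates), but the same frequency-and-recency idea underpins most relationship-strength metrics.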

    As companies emerge from the pandemic and try to further capitalize on the gains made in virtual networking over the past year and a half, I foresee greater adoption of AI integrations across businesses, especially within the CRM market. 

    Author: Jody Glidden

    Source: InfoWorld

  • The Data Digest: Sports, Spectators, And Screen Time

    Sports fans around the world are having a heyday: From the Copa America soccer tournament in the US to the European Champions Cup across the pond, and from live Wimbledon matches to the imminent Summer Olympic Games, there is no lack of global sports entertainment at this moment.

    Sports teams have always brought people together as much as divided them — and in today’s age, technology amplifies the drama of fandom. Personal devices play a critical role in how people come together around sports, when fans watch the action unfold, and how they discuss the results.

    For example, Forrester’s latest Consumer Technographics® survey data reveals that consumers worldwide have recently accessed sports applications on their mobile phones and tablets: 

    [Chart: share of consumers accessing sports apps on mobile phones and tablets, June 30, 2016]

    Our previous research shows that consumers often prefer tablets for longer, more engaging, media-rich experiences — and in fact, Forrester’s mobile behavioral data indicates that consumers spend more time on sports apps when using their tablet rather than their mobile phone. However, technology doesn’t only enable sports enthusiasts to get into the game — oftentimes, it also allows more casual fans to multitask.

    In an earlier research report, my colleagues Nicole Dvorak and Kris Arcand note that as personal devices become integral to the sports spectator experience, marketers should design mobile experiences to engage audiences in new ways. They say that “smartphones [and tablets] are the perfect devices to provide newer, more creative extensions.” Marketers can leverage these screens to “appeal to ad fans [and] game fans, engage [the multitaskers], and tempt the party crowds with delicious offers.”

    Source: Forrester.com

  • The future of car ownership that no one is talking about

    It has never been clearer that transformational change to one of the world's largest industries is just around the corner. Car ownership is poised to change, and when it does, it is predicted to be one of the most monumental displacements of wealth the world economy has ever seen.

    But there are a lot of conflicting opinions and information out there. Industry experts and headlines are telling us that Tesla will rule the world, car manufacturers are becoming mobility companies and Uber and Lyft mean the end of traditional car ownership as we know it. They’re also saying that car dealerships will die at the hands of online car-buying sites such as Beepi, Carvana, Vroom and Shift. On the contrary, headlines are also touting that vehicle sales are at all-time highs, trucks are the leading category of new car sales and millennials are buying cars more than ever.

    There’s also the looming question of autonomous driving technology. Uber and Lyft want self-driving cars to replace their drivers ASAP, and companies like Apple and GM are spending aggressively to position themselves for success in a theoretical, autonomous mobility landscape. But even Google, which has been developing this technology since 2009, is unsure if fully autonomous cars can become viable within the next 30 years.

    What is unequivocal is that there’s a lot of noise around the future of transportation. Like anyone who lives in Silicon Valley, I believe change is coming and that current car ownership models are ripe for disruption. But with personal car sales at an all-time high, the question is when — and how? The assumed agents of change are the likes of Tesla, Uber, Google, Apple or Ford, but lost in all of these predictions is the linchpin for the entire auto industry. That’s the unsexy, yet enormous world of auto finance — the huge market that makes it all work.

    Financing props up the entire auto industry.

    To put it in perspective, auto loan balances in the U.S. total more than $1.06 trillion right now. That number doesn’t even include the enormous leasing market. The largest auto lenders in the U.S. are also household names: banks like Ally, Wells Fargo, Chase and Capital One, and finance arms of car manufacturers like Toyota, GM and Ford. Auto loans, as an asset class, trail only mortgages and student loan balances.

    It’s clearly a massive market, but why does that matter? It matters because U.S. car sales are inextricably linked to a robust auto finance market. Indeed, 86 percent of new car sales in the U.S. are financed. Without individual financing products, car sales don’t happen. Financing props up the entire auto industry. In fact, the U.S. is far more reliant on auto finance than other countries. In China, where current new car sales outpace the U.S., only 26 percent of new car sales are financed.
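
    The arithmetic behind those financed sales is the standard amortized-loan payment formula, M = P * r * (1+r)^n / ((1+r)^n - 1), where r is the monthly rate and n the number of payments. A quick sketch (the $30,000 principal, 4.5% rate, and 72-month term below are illustrative, not figures from the article):

```python
def monthly_payment(principal, annual_rate, months):
    """Standard amortized loan payment: P * r * (1+r)**n / ((1+r)**n - 1)."""
    r = annual_rate / 12  # monthly interest rate
    if r == 0:
        return principal / months  # interest-free edge case
    factor = (1 + r) ** months
    return principal * r * factor / (factor - 1)

# Illustrative: a $30,000 car financed at 4.5% APR over 72 months
payment = monthly_payment(30_000, 0.045, 72)
```

    Longer terms shrink the monthly payment but raise total interest paid, which is one reason record-size, long-duration auto loans have become so common.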

    What does that mean for the future?

    New types of cars like the Tesla 3 or the Chevy Bolt will continue to emerge, and their features will be heralded as ushering in the future of transportation. For the most part, however, these innovative vehicles are going to be consumed in the traditional manner: They will be personally financed.

    That can’t be the future of car ownership. With the rise of companies like Uber and Lyft, it’s clear that we will need to see advances in new ownership models to support tomorrow’s transportation landscape. In fact, Uber recently received a $1 billion credit facility led by Goldman Sachs to fund new car leases. Uber (and Wall Street) are also recognizing the need for more flexibility with this deal — especially at a time when Americans are making larger monthly payments than ever on their cars and taking out record-size auto loans.

    Other flexible vehicle access products like Ford’s Credit Link (shared vehicle leasing program) and GM’s Maven (on-demand car rentals) give us a glimpse into what the future of car ownership may look like. These types of emerging products will only be successful with a robust technology infrastructure — a stark departure from the differentiators in the auto finance market today.

    If we see a shift away from personal vehicle ownership and households no longer own, on average, 2.06 cars, then this will require an entirely new auto finance infrastructure. Ultimately, Google’s self-driving cars may become ubiquitous — but we’ll have to displace a lot of big names in today’s $1 trillion auto finance market to get there.

    Source: Techcrunch.com

  • The risk of undervaluing creativity

    The risk of undervaluing creativity

    The value of agencies’ creative perspective, the very currency of the business, is at risk and can be realized only by shifting billions from tech to fund creative differentiation.

    “The value of agency creativity is at risk of disappearing”

    The marketing industry is woefully out of balance, from agency/client relationships to new business requirements and compensation. The healthy tension of creativity that once balanced the needs of the brand with the needs of its customers, the commercial effectiveness of the work versus its cultural impact, and the needs of agency economics versus the client’s growth is all eroding. These are now one-sided issues. The tension is no longer healthy. Nowhere is this more evident than in agency economics. Agencies today barely grow at the current rate of inflation. Insourcing, margin compression, cost-cutting, new competitors, and tech priorities threaten the existence of agencies and undermine their value.

    “Customer experience has stagnated”

    Strong evidence that creativity is languishing is already here. Customer experience has stagnated. Forrester’s Customer Experience Index (CX Index™), a study of 100,000 consumers and 300 brands that has run for more than a decade and acts as a barometer of CX performance, is flat for the fourth consecutive year. Most brands are stuck in the middle, struggling to pull ahead of competitors. Zero brands are rated as delivering an excellent experience. Forrester identified four types of CX performance: the Languishers, Lapsers, Locksteppers, and Laggards. No brand is performing well. Worse still, for every 1-point drop in CX Index score, companies lose 2% of their returns. It’s only a matter of time before companies’ growth is affected.

    “We’ve commoditized the brand and homogenized experiences”

    The issue is that the work looks, feels, and behaves too similarly. The industry’s obsession with meeting every customer need for ease and convenience through technology has left little room for creative differentiation, and that has come at a cost. The front door to your brand is a web or app experience that is virtually indistinguishable from its competitors’. Fashion experiences look the same. Quick-service restaurant and coffee apps allow you to order ahead and skip the line. All airline apps allow travelers to check in, manage travel, and use a mobile device as their boarding pass. What can make one brand different from another when the experience is built on the same common technology platform, designed to solve the same user or category need, and programmed for the same two devices? Creativity.

    “We’ve overfunded technology and underfunded creativity”

    Unfortunately, just when creativity is needed most, business leaders are investing in it the least. Forrester forecasts that spending on adtech, martech, data, and analytics will grow between 9% and 11% through 2022, while agency spending will grow a mere 2.4%. And client budgeting and priorities are only part of the problem: agencies are underfunding creativity, too. As of 2014, agencies had spent more than $12 billion on data and technology resources and acquisitions. Yet while those agency data platforms power media and audience activation, only one integrates with the creative process, and creative departments remain skeptical and dismissive.

    “It’s time to fund creative differentiation”

    Forrester developed an ROI for creative agency investment that determined that moving a portion of the marketing budget out of technology and into agency creativity will bring a higher return on investment compared to currently projected spending levels. This serves as a six-year growth plan for CMOs that ultimately helps achieve 20% growth for the entire industry. These are not new dollars but rather a reallocation of currently projected spending that maintains significant adtech and martech investments.

    “It’s time to reinvent creativity”

    To deliver clients the growth they need and customers the experiences they demand, agencies must innovate their structures, capabilities, workforce, and processes. Structurally, data, technology, media, and creative should all come together and put creative problem-solving at the center. This means the newly acquired data, tech, and operating agencies should also come together. Above all, it means agency leaders will need to make consolidation and coordination a priority. Tough decisions must be made in the name of agency brand coherence and a model that is easier for clients to engage. Training today’s workforce to become tomorrow’s data-, technology-, and creative-literate talent is critical. And creative departments must embrace data- and tech-driven creativity.

    We’re living during one of the most interesting times in the history of the industry, with the opportunity to shape and define it. A whole new era of amazing marketing is only possible if we fund the balance of creativity and technology. Take up the mantle to modernize the industry. Reinvent the creative process.

    Author: Jay Pattisall

    Source: Forrester

  • The role of Machine Learning in the development of text to speech technology

    The role of Machine Learning in the development of text to speech technology

    Machine learning is drastically advancing the development of text to speech technology. Here's how, and why it's so important.

    Machine learning has played a very important role in the development of technology that has a large impact on our everyday lives. However, machine learning is also influencing the direction of technology that is not as commonplace. Text to speech technology is a prime example.

    Text to speech technology predates machine learning by over a century. However, machine learning has made the technology more reliable than ever.

    The progression of text to speech technology in the Machine Learning era

    We live in an era in which audiobooks are winning as much appreciation as traditional print. It comes as no surprise, then, that text-to-speech (TTS) technology is also rapidly gaining popularity. It caters to those who need it most, including children who struggle with reading and people with disabilities, and big data plays a central role in serving them well.

    There are other elements of speech synthetization technology that rely on machine learning. It is now so sophisticated that it can even mimic someone else’s voice.

    Text to Speech (commonly known as TTS) is a piece of assistive technology (that is, any technology that helps individuals overcome their challenges) that reads text out loud, and it is available on almost every gadget in our hands today. It has taken years for the technology to develop to its current point, and machine learning is now changing its direction. Yet its journey is one that started in the late eighteenth century.

    The early days of text to speech 

    TTS is a complicated technology that has developed over a long period of time. It all began with the construction of acoustic resonators, which could produce only vowel sounds. These resonators were built in 1779 through the dedicated work of Christian Kratzenstein. With the advent of semiconductor technology and improvements in signal processing, computer-based TTS devices started hitting the shelves in the 20th century. There was a lot of fascination surrounding the technology during its infancy, which is primarily why Bell Labs’ vocoder demonstration found its way into the climactic scene of one of the greatest sci-fi flicks of all time: 2001: A Space Odyssey.

    The Machine Learning technology that drives TTS

    A couple of years ago, Medium contributor Utkarsh Saxena penned a great article on speech synthesis technology with machine learning. They talked about two very important machine learning approaches: Parametric TTS and Concatenative TTS. They both help with the development of new speech synthesizing techniques.

    At its heart, a TTS engine has a front-end and a back-end component, and modern engines are heavily dependent on machine learning algorithms. The front-end converts the text into phonetics and meaningful sentence structure; the back-end uses this information to turn the symbolic linguistic representation into sound. Good synthesizer technology, which today relies on sophisticated deep neural networks, is key to a good TTS system. The audio should be both intelligible and natural in order to mimic everyday conversation, and researchers are trying out various techniques to achieve this.
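
    To make the front-end/back-end split concrete, here is a toy front-end: it normalizes text and maps words to phoneme sequences for the back-end to voice. The two-word lexicon and the letter-by-letter fallback are stand-ins for the trained grapheme-to-phoneme models real engines use.

```python
# Toy TTS front-end: normalize text and look up phonemes per word.
# The lexicon is purely illustrative; production engines use trained
# grapheme-to-phoneme models covering the whole vocabulary.
LEXICON = {
    "hello": ["HH", "AH", "L", "OW"],
    "world": ["W", "ER", "L", "D"],
}

def front_end(text):
    """Convert raw text into the phoneme sequence the back-end would voice."""
    phonemes = []
    for word in text.lower().split():
        word = word.strip(".,!?")  # crude text normalization
        # Unknown words fall back to spelling out letters, a deliberate
        # simplification of real grapheme-to-phoneme prediction.
        phonemes.extend(LEXICON.get(word, list(word.upper())))
    return phonemes
```

    A back-end would then map each phoneme to audio, whether by concatenating recorded units or by generating waveforms with a neural model.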

    Concatenative synthesis relies on piecing together multiple segments of recorded speech to form coherent sentences. This approach usually yields the most natural-sounding speech, but it loses out on intelligibility, with audible glitches resulting from poor segmentation. Formant synthesis is used when intelligibility takes precedence over naturalness. It does not use human speech samples and hence sounds noticeably ‘robotic’. The lack of a speech-sample database means that it is relatively lightweight and best suited to embedded systems, where power and memory are scarce. Various other techniques also exist, but the most recent and notable is the use of machine learning, in which recorded speech data is used to train deep neural networks. Today’s digital assistants use these extensively.

    The challenges

    Contextual understanding of the text on the screen is one of the main challenges for TTS systems. More often than not, human readers are able to understand certain abbreviations without second thoughts. However, these are very confusing to computer models. A simple example would be to consider two phrases, “Henry VIII” and “Chapter VIII”. Clearly, the former should be read as Henry the Eighth and the latter should be read as Chapter eight. What seems trivial to us is anything but, for front-end developers working at TTS companies like Notevibes.
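
    The disambiguation described above can be sketched as a simple context rule: after a title word like "Chapter", read the numeral as a cardinal; after a name, as an ordinal. The rule set and the truncated word tables below are illustrative only; real front-ends use statistical models trained on labeled text.

```python
def roman_to_int(numeral):
    """Parse a Roman numeral, including subtractive notation (e.g. IV, XIV)."""
    values = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for i, ch in enumerate(numeral):
        v = values[ch]
        # Subtract when a smaller symbol precedes a larger one (e.g. the I in IV).
        total += -v if i + 1 < len(numeral) and values[numeral[i + 1]] > v else v
    return total

CARDINAL = {7: "seven", 8: "eight"}   # truncated word tables, for illustration
ORDINAL = {7: "seventh", 8: "eighth"}

def expand_numeral(preceding_word, numeral):
    """Read 'Chapter VIII' as 'eight' but 'Henry VIII' as 'the eighth'."""
    n = roman_to_int(numeral)
    if preceding_word.lower() in {"chapter", "section", "part"}:
        return CARDINAL.get(n, str(n))
    return "the " + ORDINAL.get(n, str(n))
```

    Even this toy rule shows why the problem is hard: the correct reading depends on context that sits outside the numeral itself.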

    They use various predictive models to enhance the user experience. But there is a lack of standard evaluation criteria to judge the accuracy of a TTS system. A lot of variables go into the quality of a particular recording, and these variables are hard to control. This is due to the involvement of both analog and digital processing. However, an increasing number of researchers have begun to evaluate a TTS system based on a fixed set of speech samples.

    That, in a nutshell (a rather big one at that), is an overview of Text to Speech systems. With increased emphasis on AI, ML, DL, etc., it is only a matter of time before we are able to synthesize true-to-life speech for use in our ever-evolving network of things.

    Machine Learning is the core of text to speech technology

    Machine learning is integral to the development of text to speech technology. As the technology evolves, new speech synthesis tools rely on deep neural networks to deliver the highest-quality output.

    Author: Matt James

    Source: Smart Data Collective

  • TNO: ‘Amsterdam stays accessible thanks to big data’

    Innovation organization TNO sees opportunities for big data and Internet of Things (IoT) technology to improve the accessibility of the Amsterdam metropolitan region. “With big data we can link multiple solutions together to make optimal use of a city’s infrastructure,” says Leo Kusters, Managing Director Urbanisation at TNO.

    Within a few decades, 70 percent of the world’s population will live in large cities or heavily urbanized regions. The economic and cultural success of regions like the Randstad attracts many people, placing an ever greater load on these cities’ infrastructure. Infrastructure and mobility are therefore decisive factors in the success of metropolitan regions.

    Smart mobility

    Kusters points to the Praktijkproef Amsterdam (PPA) project, in which TNO works with ARS Traffic & Transport Technology to reduce congestion in the Amsterdam region. “Fifteen thousand motorists are connected to this project,” says Kusters. By better informing road users about the traffic situation in the city, TNO expects congestion in the Amsterdam region to decrease.

    Participants have access to an app that informs them individually about the best travel choices they can make. Users can also report incidents and delays on the road themselves via the app. As a result, motorists reach their destinations faster and can count on a reliable travel time.

    What makes this project special, according to Kusters, is that the app also gives advice based on traffic lights that are currently red. The system then takes these traffic lights into account to prevent congestion on the road.

    TNO is running a comparable project with freight traffic in Helmond. Kusters: “A motorway with heavy truck traffic runs through the city of Helmond, putting a lot of pressure on the city’s environment and air quality.” In this project TNO is experimenting with data analysis to optimize traffic flow for the trucks involved. Drivers continuously receive speed advice to improve flow through the city, so they have to stop less often and their trucks consume less fuel.

    Two birds with one stone

    According to Kusters, a major opportunity for big data and IoT technology lies in combining multiple solutions to make optimal use of existing infrastructure. Big data can also contribute to savings in infrastructure maintenance, on which the Netherlands spends €6 billion annually.

    TNO focuses, for example, on extending the lifespan of bridges. “An essential part of the infrastructure,” says Kusters. “If bridges don’t work, everything comes to a standstill.” TNO uses sensors to measure hairline cracks in bridges. “That way we can know exactly when a bridge needs maintenance or must be replaced. This makes it possible to extend a bridge’s lifespan in a tailored way: precisely on time, with a minimum of disruption to traffic.”

    The lifespan of infrastructure components is usually determined on the basis of theoretical models. Kusters: “Because reality is always different, TNO is developing new measurement methods together with Rijkswaterstaat. In practice, the use of infrastructure can be more or less intensive than theoretical models estimate, and so can the wear. By deploying big data, we can make accurate predictions for bridge maintenance and thereby save costs.”

    The cooperative car

    In these projects the involvement of different parties is of great importance, Kusters believes. “Mobility has long ceased to be the exclusive domain of government. Government is taking on a different role in making infrastructure and mobility more sustainable. Technology companies are also becoming increasingly important: companies like TomTom and Google, but also a party like chip supplier NXP, which can contribute to the development of technology that lets vehicles communicate with each other.”

    The TNO director speaks of the ‘cooperative car’. “That means that all the services and modalities you want to use as a motorist are linked together. The system then effectively thinks along with you.”

    The cooperative car uses IoT technology to communicate directly with other vehicles or with the infrastructure. The car thus continuously takes into account the current traffic situation and the vehicles driving in the same area. Kusters: “That is a major breakthrough: an efficient, partly self-driving car that is always paying attention and always awake. This lets us substantially increase road capacity and solve a large part of the congestion problem.”

    Vision of the future

    The Managing Director Urbanisation sees IoT applications for mobility growing rapidly. “The autonomous self-driving car in the city may be less far away than we think,” says Kusters. “We already have cars that can park themselves. In the future this means that parking problems in large cities will come to an end.”

    Besides IoT applications for cooperative cars, Kusters also sees opportunities for improvements to infrastructure. “The connectedness of people and devices will also be visible in the streetscape: wifi on the street, wifi for cars, and smart LED lighting. That does not mean, by the way, that all this information will travel over one and the same network. Time-critical information that affects traffic safety, for example, will use a separate network. We will see this in practice in cities and on motorways within a few years.”

    Looking further ahead, TNO’s director for the living environment also sees more changes to the face of the inner city. “In the city we will drive electric more and more; we already see that in recent public transport tenders.” Cyclist numbers will also continue to grow, according to Kusters. “In a city like Amsterdam, more space will then be needed for bicycles,” says Kusters. “This is the only form of mobility that is growing in Amsterdam, so more space for cyclists is important. That will come at the expense of car parking spaces, but it need not come at the expense of accessibility.”

    Source: DuurzaamBedrijfsleven

  • What Can Retailers Do To Elude Extinction?

    Here's what you didn't learn in school about the disruption affecting retail today. A recent article by consultant Chris H. Petersen, "Seven disruptive trends that will kill the 'dinosaurs of retail'", discussed the fate of "25 retail dinosaurs that vanished in the last 25 years", which was the subject of an Entrepreneur article. Those retailers included giants such as Circuit City, CompUSA, Blockbuster, Borders, and Tower Records, companies which literally dominated their category or channel. Others named in the article were retail innovators in their own right until new disruptors outgunned them. The point is that neither longevity, size, nor specialization guarantees retail survival today. So how can today's retailers avoid being extinguished by current disruptive innovations?

    Disruptive innovation refers to any enhanced or completely new technology that replaces and disrupts an existing technology, rendering it obsolete. (Picture how we went from the Model T to the KIA; from giant mainframes to personal computers; or from fixed-line telephones to cellphones/smartphones).

    Disruptive innovation is described by Harvard Business School professor Clayton Christensen as a process by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves upmarket, eventually displacing established competitors.

    Today's major disruptive retail trends have led to the rise of the consumer, the rise of technology to help retailers best serve the consumer while wrestling with competitive forces, and the demise of "the old way of doing business."

    I. The Consumer.

    Evolving, innovative, disruptive technology has led to consumer-dominated behavior that reaches across many channels. As we know, today's consumer now shops any time and everywhere using a variety of helping tools.

    The consumer is capable of having a personal, seamless experience across their entire shopping journey to explore, evaluate and purchase, tempered by how retailers DO business, provide service, deal with their competition, etc.

    * The consumer journey starts online, although stores remain a destination for experience.

    What can retailers do? The successful retailer of the future needs to not only master online and offline, but how to connect with the consumer across many touch points, especially social media.

    * Mobile juggernaut. The latest stats show that there are now more cell phones in use than people on this planet, and smartphones now exceed 4.5 billion. Mobile is already the majority of usage and will be the preferred screen for shopping.

    What can retailers do? Retail survivors must optimize for mobile engagement, and also broadcast offers and connect with consumers wherever they are. The store of the future will not only have beacons to connect, but to track traffic via mobile as well.

    * Stock availability / Virtual aisle / Endless shelf. More than 50 percent of consumers expect to shop online and see if stock is available in store.

    Omnichannel consumers now fully realize that stores can't begin to stock every model, style, and color. While consumers can see hundreds if not thousands of products in store, they know that there are millions online.

    What can retailers do? The survivors are literally creating a seamless experience between online, store and mobile apps so the consumer can "have it their way" anywhere, anytime.

    * Consumer experience still rules. Consumer experience still comes down to the senses: tactile, visual, and psychological.

    What can retailers do? Virtual dressing rooms, better in-store experiences, and adoption of new disruptive technology to address and satisfy these issues.

    * Personalization of products and services.

    What can retailers do? New survivors are emerging with "mass personalization" opportunities to custom tailor your clothes or curate your personal wardrobe assortment and send it to you.

    * Social Connections and the influence of the opinions of others. Social has become a primary source of research and validation on what to buy. Today's consumers are 14 times more likely to believe the advice of a friend than an ad.

    What can retailers do? Today's major brands are giving much more attention to and spending more dollars on social media than traditional media.

    II. Technology

    Disruptors share the common purpose to create businesses, products and services that are better -- usually less expensive and always more creative, useful, impactful -- and scalable.

    What can retailers do? Put disruptive technology solutions to use as soon as possible, such as price and assortment intelligence, behavioral economics, customer experience analytics, predictive analytics, and more, to help understand, meet, and outgun the competition and serve the customer.

    A Note on Predictive Analytics.

    Dr. Christensen describes predictive analytics as "the ability to look at data from the past in order to succeed in new ways in the future." Predictive analytics solutions, which give retailers the capability to forecast consumer purchase trends in order to sell the most products at the best prices at any given time, are coming on strong.
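
    In its simplest form, "looking at data from the past" can be as basic as a moving-average forecast of next period's demand. A minimal sketch (the sales figures are invented for illustration; real retail forecasting layers in seasonality, pricing, and promotions):

```python
def forecast_next(sales, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    recent = sales[-window:]
    return sum(recent) / len(recent)

# Illustrative: weekly unit sales; forecast week 6 from weeks 3-5
history = [120, 130, 125, 140, 150]
prediction = forecast_next(history)
```

    The window size trades responsiveness for stability: a short window reacts quickly to trends, a long one smooths out noise.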

    Bottom Line For Your Bottom Line

    There's never been a time of more disruptive change in retail. Retailers who are the most adaptable to change, not the strongest nor the most intelligent of the species, will be the ones to survive.

    It's a case of keeping yourself on top of the tsunami of change through the mastery of today's and tomorrow's new disruptive technologies.

    *Thanks to Chris H. Petersen, PhD, CEO of Integrated Marketing Solutions, a strategic consultant who specializes in retail, leadership, marketing, and measurement.

    Source: upstreamcommerce.com, February 8, 2015

  • Why You Should Be Securing Big Data In The Cloud

    Combining big data and the cloud is the perfect solution for a company's computing needs. A company's data often requires a computing environment which can quickly and effectively grow and flex, automatically accommodating large amounts of data. The cloud computing environment does just that. Yet one question continually arises when discussing cloud computing.

    How secure is the cloud?

    Securing data, especially big data, is a major concern. Companies expect that any data stored in the cloud will be secured and that the security measures will be flexible enough to keep up with a changing threat environment. There are four ways to keep your big data secure in the cloud. Each will keep your data safe while providing the flexibility that is inherent to using the cloud.

    1. Encrypt Your Sensitive Data

    Encrypting your data provides another level of security within your cloud infrastructure. Each security solution must be customized to the project and the data; there is no single type of encryption that will work for every situation. Certain types of on-premises gateway encryption solutions do not work well with cloud big data scenarios. Other solutions, including encryption provided by the cloud provider, ask the end user to rely on someone else to encrypt their data. This is often a risky proposition, and one most companies will not accept.

    Encryption solutions such as split-key encryption were developed specifically for data storage in the cloud. These technologies keep the cloud data safe, providing encryption keys which the customer holds and uses.

    Split-key encryption is the safest and most effective means of encrypting cloud-based data.
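    To make the idea concrete, here is a minimal sketch of the principle behind split-key encryption: a data-encryption key is split into two shares, one held by the customer and one stored in the cloud, and neither share alone reveals the key. This XOR-based two-share scheme is illustrative only; production systems use vetted key-management products, not hand-rolled splitting.

```python
import os

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both are required to reconstruct it."""
    share_a = os.urandom(len(key))  # random share, e.g. held by the customer
    share_b = bytes(a ^ k for a, k in zip(share_a, key))  # share stored in the cloud
    return share_a, share_b

def join_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two shares to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

master_key = os.urandom(32)  # e.g. an AES-256 data-encryption key
customer_share, cloud_share = split_key(master_key)

# Either share alone is statistically independent of the key;
# only the combination recovers it.
assert join_key(customer_share, cloud_share) == master_key
```

    Because the customer's share never leaves their control, a compromise of the cloud-held share alone exposes nothing.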

    2. Use cloud security technologies which can be scaled to meet changing requirements.

    When it comes to big data, solutions must quickly scale to meet the demand. This is the same for security technologies for cloud data. Ensure any cloud security technology you choose is available and relevant across any and all cloud locations. Additionally, to be effective, any cloud security solution must be able to quickly scale to meet demands and changing requirements.

    Because of the inability to quickly scale and grow, hardware solutions are not a viable option for securing cloud big data. It is not possible to adapt a hardware security module (HSM) quickly enough to meet continuously changing data security requirements.

    Only a cloud-based solution provides the ease and efficiency to scale quickly in response to demand. These solutions are just as effective as, if not more effective than, hardware-based technologies. Additionally, cloud-based solutions such as CDNs provide security while also allowing sites and tools to run faster.

    3. Automate where possible

    Many companies are not happy with traditional cloud security solutions because they will not scale quickly to meet demand. Standard encryption technologies typically use an HSM element in their design. Since hardware cannot be automated, these security solutions are limited in their effectiveness within the cloud.

    The best cloud security solutions use virtual appliances instead of hardware within their systems. It is also important to ensure an effective RESTful API is part of any cloud security solution.

    A cloud security solution which includes a RESTful API and a virtual appliance will provide the automation and flexibility required to secure cloud big data.
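    As a rough illustration of what that automation looks like, the sketch below builds a request to a hypothetical RESTful key-rotation endpoint; the URL, path, and payload are invented for this example, and a real appliance's API will differ. A scheduler or CI pipeline would send such requests on a fixed cadence, with no hardware in the loop.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- placeholders, not a real appliance API.
APPLIANCE_URL = "https://appliance.example.com/v1/keys/rotate"
API_TOKEN = "replace-with-real-token"

def build_rotate_request(key_id: str) -> urllib.request.Request:
    """Build (but do not send) a POST request asking the appliance to rotate a key."""
    payload = json.dumps({"key_id": key_id}).encode("utf-8")
    return urllib.request.Request(
        APPLIANCE_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_rotate_request("db-backup-key-01")
# urllib.request.urlopen(req)  # invoked on a schedule by cron or a CI pipeline
```

    The point is not the specific endpoint but that every security operation is scriptable, so scaling and rotation need no manual, hardware-bound steps.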

    4. Never compromise on data security

    Securing big data in the cloud is often a complicated business. As a result, we often see systems which are not quite as comprehensive as they should be. Some cloud security system designers will take shortcuts to get around the complexities involved in securing big data.

    For example, some systems use freeware and its encryption tools to secure the data. They may keep the encryption keys in a physical location or on a disk, which creates an opportunity for them to be lost or stolen. These shortcuts may be easier, but they do not provide a viable security solution for cloud data storage.

    Companies must protect their data by mapping and reviewing the sensitivity of their data and then designing a cloud security solution to ensure it is protected. The right security solution can provide great results. Remember, not all cloud data storage is secure. If you have very sensitive or regulated data, you may need to search for another security solution to keep your data protected.
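    Mapping data sensitivity to controls can start as simply as a lookup table. The field names, tiers, and control descriptions below are invented for illustration; a real classification exercise would be driven by your own data inventory and regulatory obligations.

```python
# Hypothetical sensitivity map: fields and tiers are illustrative only.
SENSITIVITY = {
    "email":       "pii",
    "card_number": "regulated",
    "page_views":  "public",
}

# Hypothetical control matrix, stricter controls for more sensitive tiers.
CONTROLS = {
    "public":    "no encryption required",
    "pii":       "encrypt at rest, customer-held keys",
    "regulated": "split-key encryption plus audit logging",
}

def required_control(field: str) -> str:
    """Look up the minimum control for a field; unmapped fields default to PII."""
    tier = SENSITIVITY.get(field, "pii")  # fail toward the stricter tier
    return CONTROLS[tier]

assert required_control("card_number") == "split-key encryption plus audit logging"
```

    Defaulting unmapped fields to a stricter tier means an incomplete inventory errs toward over-protection rather than exposure.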

    Source: SmartDataCollective
