15 items tagged "Cloud"

  • ‘Privacy as an integral part of security’

    Experts make predictions for 2016

    A lot is set to happen in the field of privacy and data protection in 2016. Although the ‘security versus privacy’ debate already began this year, it will truly erupt in 2016. Computable experts discuss various IT security predictions for the coming year, touching among other things on cloud security and the integration of security solutions.

    Privacy
    Richard van Lent, managing partner at MITE Systems:
    2015 was dominated by topics such as ‘Cloud First’, ‘Mobile First’ and the internet of things (IoT). In addition, the Dutch Data Breach Notification Act (Wet Meldplicht Datalekken), an addition to the existing Personal Data Protection Act (Wbp), has kicked up quite a bit of dust in recent months within departments such as HR, legal/risk compliance, security and IT. For most companies, data analytics solutions will become a precondition for safeguarding privacy, compliance and security in day-to-day operations. An important trend for 2016, currently developing at a rapid pace, is therefore the use of data analytics solutions to make things like security, predictability and behaviour visible in near real time.

    Gerard Stroeve, manager Security & Continuity Services at Centric:
    Another important security trend for 2016 is privacy. Much is set to happen in this area in the coming year. Take the data breach notification obligation: it takes effect on 1 January 2016 and requires organisations (companies as well as public authorities) to report serious data breaches immediately. The Dutch Data Protection Authority (College Bescherming Persoonsgegevens, CBP), which continues from 1 January as the Autoriteit Persoonsgegevens, also gains the power to impose fines of up to 820,000 euros. In addition, the European General Data Protection Regulation is an important international development in the field of privacy. It too is expected to come into force in early 2016, with a transition period of about eighteen months. In the coming year, organisations really have to get to work preparing for that new legislation.

    It is important not to approach privacy as a stand-alone subject. Privacy has its own specific laws and regulations, but it is an integral part of information management and information security as a whole.

    Lex Borger, principal consultant at I-to-I:
    The political debate on ‘security versus privacy’ has already begun, but it will truly erupt during the election year in the United States. How much privacy must we give up to fight terrorism? May governments impose requirements in order to gain access to information that is transmitted or stored in encrypted form? Can terrorism be fought that way? Can we trust governments with those capabilities? Can we prevent others (governments, organised crime) from abusing them? Is the citizen still allowed to have something to hide? Plenty of room for debate; I am curious.

    Cloud, integration and IoT
    Gerard Stroeve, manager Security & Continuity Services, Centric:
    The third major trend for 2016 is cloud security. Many organisations indicate that they still have too little grip on the cloud solutions they use. How, for example, do you align with large public cloud providers such as Google, Microsoft and Amazon? Another major challenge is so-called ‘shadow IT’, a phenomenon nearly every organisation has to deal with these days. When the right functionality is not available, or not available quickly enough, employees increasingly find their own solution in the cloud. Slowly but surely, this creates a proliferation of cloud solutions over which the organisation has no grip. That calls for clear policy and sound guidelines on safe cloud use, without getting in the way of productivity. The 4C approach offers a practical tool here.

    John Veldhuis, senior system consultant at Sophos:
    Just as we have seen formerly stand-alone gateway appliances for email/web proxying, VPN, packet filtering, layer 7 filtering, load balancing and so on integrated into a UTM, we are going to see a further development in which anti-malware, UTM and encryption solutions talk to each other. For example: suspicious traffic, say ransomware-related, is detected. This can be done by software on the infected machine, but also by the UTM. The following can then happen automatically:

    • the machine is stripped of its keys, so that confidential data cannot be leaked and can no longer be modified (encrypted by ransomware) either (no key = no access);
    • the machine is placed in a quarantine or mitigation network; a search is started on the remaining machines for the file that caused the traffic; the origin of the file is traced and added to a reputation filter; and so on (a stub sketch of this kind of orchestration follows below).
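    Purely to illustrate the orchestration such integration makes possible, here is a minimal Python stub. Every endpoint/UTM call is a placeholder and every function name is invented; real products expose their own APIs for these steps.

        def revoke_keys(machine):
            # Endpoint step: remove file-encryption keys so data on the
            # machine can neither be leaked nor re-encrypted by ransomware.
            print(f"[endpoint] removing keys from {machine}")

        def quarantine(machine):
            # Network step: the UTM moves the machine to a mitigation VLAN.
            print(f"[utm] quarantining {machine}")

        def hunt_and_blacklist(file_hash):
            # Fleet step: search other machines and update the reputation filter.
            print(f"[fleet] hunting {file_hash}, adding it to the reputation filter")

        def on_suspicious_traffic(machine, file_hash):
            # Triggered by either the endpoint agent or the UTM.
            revoke_keys(machine)
            quarantine(machine)
            hunt_and_blacklist(file_hash)

        on_suspicious_traffic("ws-0042", "3a7bd3e2360a3d29eea436fcfb7e44c735d117c4")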

    Harm de Haan, manager consultancy at Telindus:
    Due to the increasing complexity of IT infrastructures and the degree to which their components are integrated with one another, security is becoming ever harder to guarantee. Moreover, the consequences of a breach are greater because of the new Data Breach Notification Act. This makes security an important theme for many organisations in 2016. Endpoint solutions cannot sufficiently guarantee security, precisely because of the complexity of modern infrastructures. Organisations would do better to look into ‘security by design’: an approach in which security is built in during the design phase of an infrastructure, as part of the individual components such as compute, networking and storage.

    Lex Borger, principal consultant at I-to-I:
    No more excuses: TLS is now possible everywhere, free of charge. TTPs will have to change their business model; Let's Encrypt delivers free TLS certificates with automatic provisioning. Your site can always offer HTTPS. No expensive certificates to manage, no difficult application process, though you do have to demonstrate that you control your website. For many, this will be secure enough. The paid certificate management process is complex enough as it is; it is absurd that some still argue for keeping SHA-1 valid longer because we cannot track down and replace our certificates quickly enough.
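    Tracking down certificates before they expire (or before a deprecated algorithm bites) is easy to automate. As a minimal sketch using only Python's standard library, the check below connects to a host (the hostname is a placeholder) and reports how many days its certificate remains valid:

        import socket
        import ssl
        from datetime import datetime, timezone

        def cert_days_remaining(hostname, port=443):
            # Set up a verifying TLS connection against the system CA store.
            ctx = ssl.create_default_context()
            with socket.create_connection((hostname, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
                    cert = tls.getpeercert()
            # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'.
            expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
            return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

        print(cert_days_remaining("example.com"))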

    The internet of things will also grow explosively in 2016, with sensors and small automated devices that have to work simply and be easy to connect. This gives hackers a great many openings to do who knows what (I am not creative enough to imagine everything that is possible). We are going to suffer considerably from the fact that our network infrastructures are inherently insecure. It is still easy to impersonate another device on the internet, and that fact underlies many attacks, such as DDoS and identity theft.

    Finally: a solid foundational process
    Apart from these three themes, information security above all demands an integrated approach, says Gerard Stroeve. ‘Information security is a broad field with various areas of attention. Besides the themes mentioned, there is, for example, cybersecurity. The number of DDoS or ransomware attacks will not decrease in the coming year. We also expect attention to availability and continuity management to increase next year.’

    To deal effectively with all these diverse threats, a solid foundational process is crucial, in his view. ‘A good governance model puts you in a position to respond to new and changing threats. Classification and risk analysis are central here. It is also important that information security gets firm management backing.’

    Source: Computable

  • ‘Progress in BI, but keep an eye on ROI’

    Business intelligence (BI) was already named the top priority for the CIO in 2016 by Gartner. The Computable experts likewise predict that many big steps will be taken within BI. At the same time, managers must also look back and think about their business model when deploying big data: how do you justify the investments in big data?

    Kurt de Koning, founder of Dutch Offshore ICT Management
    Gartner has put business intelligence/analytics at number one on the CIO priority list for 2016. In 2016, users will increasingly base their decisions on management information drawn from multiple sources, and some of those sources will consist of unstructured data. BI tools will therefore have to do more than present information attractively and offer a good user interface. When it comes to unlocking the data, the tools that stand out will be those able to create order and overview out of the many forms in which data appears.

    Laurent Koelink, senior interim BI professional at Insight BI
    Big data solutions alongside traditional BI
    Due to the growth in the number of smart devices, organisations have ever more data to process. Because insight (in the broadest sense) is going to be one of the most important success factors of the future for many organisations that want to respond flexibly to market demand, they will have to be able to analyse all these new forms of information as well. I see big data not as a replacement for traditional BI solutions, but rather as a complement where the analytical processing of large volumes of (mostly unstructured) data is concerned.

    In-memory solutions
    Organisations increasingly run into the performance limits of traditional database systems when large volumes of data have to be analysed ad hoc. Specific hybrid database/hardware solutions such as those from IBM, SAP and Teradata have always offered answers here. They are now increasingly joined by in-memory solutions, partly because these are becoming ever more affordable and therefore more accessible, and partly because they are becoming available in the cloud, which keeps their costs well under control.

    Virtual data integration
    Where data is currently still often consolidated physically in separate databases (data warehouses), this will, where possible, be replaced by smart metadata solutions that (with or without temporary physical, sometimes in-memory, storage) make time-consuming data extraction and integration processes redundant.

    Agile BI development
    Organisations are increasingly forced to move flexibly within and along with the chain in which they operate. This means that the insights used to steer the business (the BI solutions) have to move flexibly with them, which demands a different way of working from BI development teams. More and more, you therefore see methods such as Scrum being applied to BI development as well.

    BI for everyone
    Where BI has always been primarily the domain of organisations, you now see consumers making ever more frequent use of BI solutions too. Well-known examples are insight into personal finances and energy consumption: the analysis of income and expenses in your bank's web portal or app, and the analysis of data from smart energy meters. This will only increase, and become further integrated, in the years ahead.

    Rein Mertens, head of analytical platform at SAS
    An important trend I see maturing in 2016 is ‘streaming analytics’. Today, big data is an inescapable part of daily practice, and the amount of data generated per second keeps growing, in the personal sphere as well as in business. Just look at your daily use of the internet, e-mail, tweets, blog posts and other social networks. And on the business side: customer interactions, purchases, customer service calls, promotion via SMS and social networks, and so on.

    That amounts to growth in volume, variety and velocity of five exabytes every two days worldwide, a figure that even excludes data from sensors and other IoT devices. There is bound to be interesting information hidden in all this data, but how do you analyse it? One way is to make the data accessible and store it in a cost-effective big data platform. A technology like Hadoop inevitably comes into play; you then set to work with data visualisation and advanced analytics to extract patterns and insights from that mountain of data. In effect, you send the complex logic to the data, without having to pull all the data out of the Hadoop cluster.
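    As a minimal sketch of ‘sending the logic to the data’, assuming a PySpark environment and an invented HDFS path and schema: the aggregation below runs on the cluster, next to the data, and only the small result set travels back to the client.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("logic-to-data").getOrCreate()

        # Read the transactions where they already live (placeholder path).
        tx = spark.read.parquet("hdfs:///data/transactions")

        # The filter and aggregation execute on the cluster nodes ...
        daily_totals = (
            tx.filter(F.col("amount") > 0)
              .groupBy("customer_id", F.to_date("ts").alias("day"))
              .agg(F.sum("amount").alias("total"), F.count("*").alias("n_tx"))
        )

        # ... and only the aggregated result is pulled back to the client.
        print(daily_totals.limit(20).toPandas())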

    But what if you want to make smart decisions in real time on the basis of these large volumes of data? Then there is no time to store the data first and analyse it afterwards. Instead, you want to be able to assess, aggregate, track and analyse the data directly in-stream, for instance detecting unusual transaction patterns or analysing sentiment in text and acting on it immediately. In effect, you send the data past the logic! Logic that sits in-memory and was built to do this very quickly and very intelligently, and that stores only the final results. Throughputs of more than a hundred thousand transactions are no exception here; per second, that is. Stream it, score it, store it. That is streaming analytics!
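    The ‘stream it, score it, store it’ pattern fits in a few lines. Below is a toy Python illustration (the threshold rule stands in for a real scoring model): each transaction is scored in memory as it arrives, and only the flagged results are kept.

        import statistics
        from collections import deque

        history = deque(maxlen=1000)   # recent amounts kept in memory
        flagged = []                   # "store it": only results are persisted

        def score(amount):
            # Flag a transaction far outside the recent distribution.
            if len(history) < 30:      # need a minimal baseline first
                return False
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history) or 1.0
            return abs(amount - mean) > 4 * stdev

        def consume(stream):
            for amount in stream:      # "stream it"
                if score(amount):      # "score it", in-stream and in-memory
                    flagged.append(amount)
                history.append(amount)

        consume([100.0 + (i % 5) for i in range(1000)] + [9500.0])
        print(flagged)                 # only the one outlier was stored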

    Minne Sluis, founder of Sluis Results
    From IoT (internet of things) to IoE (internet of everything)
    Everything is becoming digital and connected, more so than we could imagine even a short while ago. The application of big data methods and techniques will therefore take an even greater flight.

    The call for adequate data governance will grow
    Although the new world revolves around letting go, giving trust and freedom, and co-creation, the call for manageability will nevertheless grow. Provided it is approached primarily from a facilitating role, ensuring greater consistency and reliability, that is by no means a bad thing.

    The business impact of big data & data science keeps growing
    The impact of big data & data science in reinventing business processes, services and products, digitalising them extensively (and making them more intelligent), or in some cases eliminating them altogether, will continue.

    The consumerisation of analytics continues
    Greatly improved and truly intuitive visualisations, underpinned by good meta-models and thus by data governance, are driving this development. Democratisation and independence from third parties (other than services deliberately taken from the cloud) are thereby increasingly becoming reality.

    Big data & data science will break through fully in the non-profit sector
    The subtle objectives of the non-profit sector, such as improving quality, (patient/client/citizen) safety, punctuality and accessibility, call for big data applications. That subtlety, after all, requires more good information, and therefore data, delivered faster, in more detail and with more nuance than what typically comes out of the more traditional BI environments today. If the non-profit sector manages to translate the profit sector's much-needed focus on ‘profit’ and ‘revenue improvement’ to its own situation, successful big data initiatives are just around the corner. Mind you, this prediction naturally applies in full to healthcare as well.

    Hans Geurtsen, business intelligence architect data solutions at Info Support
    From big data to polyglot persistence
    In 2016 we will no longer talk about big, but simply about data: data of all kinds and in all volumes, demanding different kinds of storage. That is polyglot persistence. Programmers have long known the term polyglot; an application anno 2015 is often already written in several languages. But on the storage side of an application, too, relational is no longer the only game in town. We will apply ever more other kinds of databases in our data solutions, such as graph databases, document databases, and so on. Alongside specialists who know everything about one kind of database, you will then also need generalists who know exactly which database is suited to what.

    The breakthrough of the modern data warehouse
    ‘A polyglot is someone with a high degree of proficiency in several languages,’ according to Wikipedia. That refers to spoken languages, but you come across the term ever more often in IT as well: an application coded in several programming languages that stores its data in several kinds of databases. On the business intelligence side, too, one language and one environment no longer suffice. The days of the traditional data warehouse with a small ETL pipeline, one central data warehouse and one or two BI tools are numbered. We will see new kinds of data platforms in which all sorts of data from all sorts of sources become accessible to information workers and data scientists using all sorts of tools.

    Business intelligence in the cloud
    Where Dutch companies in particular are still hesitant about the cloud, you can see the move toward the cloud slowly but surely getting under way. More and more companies realise that security, in particular, is often better arranged in the cloud than they could arrange it themselves. Cloud providers are also doing ever more to attract European companies to their cloud. Microsoft's new data centres in Germany, where not Microsoft but Deutsche Telekom controls access to customer data, are one example. 2016 may well become the year in which the cloud truly breaks through and in which we will see ever more complete BI solutions in the cloud in the Netherlands as well.

    Huub Hillege, principal data(base) management consultant at Info-Shunt
    Big data
    The big data hype will certainly continue into 2016, but success at companies is by no means guaranteed in advance. Companies and recent graduates keep whipping each other into a frenzy about its application. It is incomprehensible that everyone wants to unlock data from Facebook, Twitter and the like while the data in those systems is highly unreliable. At every conference I ask where the business case is, including costs and benefits, that justifies all the investments around big data. Even BI managers at companies encourage people to simply get started. In effect: look backwards at the data you have or can get hold of, and investigate whether you find something you might be able to use. To me this is the biggest pitfall, just as it was at the start of data warehouses in 1992. In the current climate, companies have limited money. Frugality is called for.

    The analysis of big data must be aimed at the future, starting from a clear business strategy and a cost/benefit analysis: which data do I need to support the future? Determine:

    • Where do I want to go?
    • Which customer segments do I want to add?
    • Are we going to do more cross-selling (more products) to our current customers?
    • Are we going to take steps to retain our customers (churn)?

    Once these questions have been laid down, with priorities, an analysis must follow:

    • Which data/sources do we need for this?
    • Do we have the data ourselves, are there ‘gaps’, or do we have to buy external data?

    Database management systems
    More and more database management system (DBMS) vendors are adding support for big data solutions, for example the Oracle/Sun Big Data Appliance and Teradata/Teradata Aster with support for Hadoop. In the long run, the DBMS solutions will dominate the field; big data software solutions without a DBMS will ultimately lose out.

    Fewer and fewer people, including today's DBAs, still understand how things work deep down inside a database/DBMS. Increasingly, physical databases are generated from logical data modelling tools, and the formal physical database design steps and reports are skipped. Developers working with ETL tools such as Informatica, Ab Initio, InfoSphere, Pentaho and so on likewise end up generating SQL scripts that move data from sources to operational data stores and/or the data warehouse.

    BI tools such as MicroStrategy, Business Objects, Tableau and so on also generate SQL statements.
    Such tools are usually developed initially for a particular DBMS, and people soon assume that they are equally applicable to every DBMS. Too little use is then made of the specific physical characteristics of each DBMS.

    The absence of real expertise then causes performance problems that are discovered at too late a stage. In recent years, by changing database designs and indexes and by restructuring complex or generated SQL scripts, I have brought ETL processes down from six to eight hours to one minute, and queries that ran for 45 to 48 hours down to 35 to 40 minutes.
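    The mechanism behind gains like these is easy to demonstrate at miniature scale. A self-contained sketch (table and column names invented) using Python's built-in sqlite3: the same query switches from a full table scan to an index search once the filter column is indexed.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL)")
        conn.executemany(
            "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
            [(i % 1000, float(i)) for i in range(100000)],
        )

        query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

        # Without an index the plan is a full table scan.
        print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

        # With an index on the filter column it becomes an index search.
        conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
        print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())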

    Advice
    The data you need will only keep growing. Forget buying all kinds of hyped software packages. Make sure you bring in very strong technical database/DBMS expertise to get the foundations right from the bottom up, building on the strengths of the DBMS you already have. That frees up time and money (you can get by with smaller systems because the foundations are sound) to select the right tools after a proper business case and proof of concepts.

  • 8 out of 10 companies store sensitive data in the cloud

    85% of companies store sensitive data in the cloud, a sharp rise from the 54% that reported doing so last year. 70% of companies are concerned about the security of this data.

    This emerges from research by 451 Research commissioned by Vormetric, a provider of data security for physical, big data, public, private and hybrid cloud environments. Sensitive data does not, of course, live only in the cloud: 50% of companies report having sensitive data in big data systems (versus 31% last year), and 33% have such data stored in internet of things (IoT) environments.

    Concerns about the cloud
    451 Research also asked respondents about their concerns regarding the security of their sensitive data in the cloud. The main concerns are:

    • Cyberattacks and break-ins at a service provider (70%)
    • The vulnerability of a shared infrastructure (66%)
    • A lack of control over the location where data is stored (66%)
    • The lack of a data privacy policy or privacy SLA (65%)

    Respondents were also asked which changes would increase their willingness to put data in the cloud. The changes respondents most want to see are:

    • Encryption of data, with the encryption key managed on the company's own infrastructure (48%); a sketch of this pattern follows after this list
    • Detailed information about physical and IT security (36%)
    • Being able to choose to encrypt data stored on a service provider's infrastructure (35%)
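    To make the first wish concrete: a minimal illustration of customer-managed keys, assuming the Python cryptography package (the record and variable names are invented; a real deployment would keep the key in an internal KMS or HSM rather than in memory).

        from cryptography.fernet import Fernet

        # The key is generated and kept on the company's own infrastructure.
        key = Fernet.generate_key()
        fernet = Fernet(key)

        record = b"customer: Jansen; iban: NL00BANK0123456789"

        # Only ciphertext ever goes to the cloud provider ...
        ciphertext = fernet.encrypt(record)
        stored_in_cloud = ciphertext  # stand-in for the provider's storage API

        # ... so the provider cannot read it without the on-premises key.
        assert fernet.decrypt(stored_in_cloud) == record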

    Concerns about big data systems
    The storage of sensitive data in big data systems also worries respondents. The main concerns are:

    • The security of reports created with big data systems, since these may contain sensitive data (42%)
    • The fact that data may end up in any location within such an environment (41%)
    • Privacy violations involving data originating from different countries (40%)
    • Access by users with ‘super-user’ rights to protected data (37%)
    • The lack of a security framework and management capabilities within the environment (33%)

    451 Research also notes that big data systems often run in the cloud. Concerns about storing sensitive data in the cloud therefore apply equally to data stored in big data environments.

    Data in IoT environments also raises concerns
    Finally, 451 Research looks at the concerns companies have about storing data in IoT environments. The main concerns in this area are:

    • Protecting the data created by IoT (35%)
    • Privacy violations (30%)
    • Identifying which data is sensitive (29%)
    • Access by users with ‘super-user’ rights to IoT data and devices (28%)
    • Attacks on IoT devices that could impact critical business operations (27%)


    Source: Executive People

  • Better analytics must address cloud computing's remaining challenges

     

    Without proper analytics in place, many cloud services customers are wasting resources, struggling with compliance and suffering from outages and unexpected costs, according to a new study from Forrester Research.

    The study, sponsored by enterprise cloud hosting provider iLand, shows that all of the 275 IT decision makers and senior business executives surveyed in the United States, United Kingdom and Singapore said they’ve experienced at least one negative financial or operational impact due to missing or hidden metadata. These negative business impacts include outages, wasted resources, unexpected costs and challenges reporting to management.

    “Companies aren’t just using the cloud—they depend on it,” the report says. “Nevertheless, cloud providers fail to keep cloud users happy. As companies expand their use of cloud services, they need to be confident that their cloud providers aren’t holding anything back, and are committed to their success.”

    Other findings of the report are that overall, cloud customer satisfaction is low, with a strong sentiment that providers don’t respond to customers’ needs. More than half of the respondents said their provider does not understand their company’s needs or care about their success.

    Forrester recommends that companies considering cloud services evaluate the native tools delivered by the cloud platform to ensure they deliver visibility, alerting and analytics; demand clarity about compliance data, on-call experts and straightforward processes from their cloud provider; and look for a cloud with onboarding and support teams staffed by experts.

    Author: Bob Violino

    Source: Information Management

  • Big Data on the cloud makes economic sense

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness

    For almost 10 years, only the biggest of technology firms such as Alphabet Inc.'s Google and Amazon.com Inc. used data analytics on a scale that justified the idea of ‘big’ in Big Data. Now more and more firms are warming up to the concept. Photo: Bloomberg

    On 27 September, enterprise software company SAP SE completed the acquisition of Altiscale Inc.—a provider of Big Data as-a-Service (BDaaS). The news came close on the heels of data management and analytics company Cloudera Inc. and data and communication services provider CenturyLink Inc. jointly announcing BDaaS services. Another BDaaS vendor, Qubole Inc., said it would offer a big data service solution for the Oracle Cloud Platform.

    These are cases in point of the growing trend to offer big data analytics using a cloud model. Cloud computing allows enterprises to pay for software modules or services used over a network, typically the Internet, on a monthly or periodical basis. It helps firms save relatively larger upfront costs for licences and infrastructure. Big Data analytics solutions enable companies to analyse multiple data sources, especially large data sets, to take more informed decisions.

    According to research firm International Data Corporation (IDC), the global big data technology and services market is expected to grow at a compound annual growth rate (CAGR) of 23.1% over 2014-2019, and annual spending is estimated to reach $48.6 billion in 2019.

    With Big Data analytics solutions increasingly being made available to enterprises in the cloud, more and more companies will be able to afford and use them for agility, efficiency and competitiveness.

    MarketsandMarkets, a research firm, estimates the BDaaS segment will grow from $1.8 billion in 2015 to $7 billion in 2020. There are other, even more optimistic estimates: research firm Technavio, for instance, forecasts this segment to grow at a CAGR of 60% from 2016 to 2020.

    Where does this optimism stem from?

    For almost 10 years, it was only the biggest of technology firms, such as Alphabet Inc.'s Google and Amazon.com Inc., that used data analytics on a scale that justified the idea of ‘big’ in Big Data. In industry parlance, three key attributes are often used to understand the concept of Big Data. These are volume, velocity and variety of data—collectively called the 3Vs.

    Increasingly, not just Google and its rivals, but a much wider swathe of enterprises are storing, accessing and analysing a mountain of structured and unstructured data. The trend is necessitated by growing connectivity, falling cost of storage, proliferation of smartphones and huge popularity of social media platforms—enabling data-intensive interactions not only among ‘social friends’ but also among employers and employees, manufacturers and suppliers, retailers and consumers—virtually all sorts of connected communities of people.

    A November 2015 IDC report predicts that by 2020, organisations that are able to analyse all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their less analytically oriented peers.

    The nascent nature of BDaaS, however, is causing some confusion in the market. In a 6 September article on Nextplatform.com, Prat Moghe, founder and chief executive of Cazena—a services vendor—wrote that there is confusion regarding the availability of “canned analytics or reports”. According to him, vendors (solution providers) should be carefully evaluated, and aspects such as moving data sets between different cloud and on-premises systems, ease of configuration of the platform, etc., need to be kept in mind before making a purchase decision.

    “Some BDaaS providers make it easy to move datasets between different engines; others require building your own integrations. Some BDaaS vendors have their own analytics interfaces; others support industry-standard visualization tools (Tableau, Spotfire, etc.) or programming languages like R and Python. BDaaS vendors have different approaches, which should be carefully evaluated,” he wrote.

    Nevertheless, the teething troubles are likely to be far outweighed by the benefits that BDaaS brings to the table. The key drivers, according to the IDC report cited above, include digital transformation initiatives being undertaken by a lot of enterprises; the merging of real life with digital identity as all forms of personal data becomes available in the cloud; availability of multiple payment and usage options for BDaaS; and the ability of BDaaS to put more analytics power in the hands of business users.

    Another factor that will ensure growth of BDaaS is the scarcity of skills in cloud as well as analytics technologies. Compared to individual enterprises, cloud service providers such as Google, Microsoft Corp., Amazon Web Services and International Business Machines Corp. (IBM) can attract and retain talent more easily and for longer durations.

    Manish Mittal, managing principal and head of global delivery at Axtria, a medium-sized Big Data analytics solutions provider, says the adoption of BDaaS in India is often driven by business users. While the need is felt by both chief information officers and business leaders, he believes that the latter often drive adoption as they feel more empowered in the organisation.

    The potential for BDaaS in India can be gauged from Axtria’s year-on-year business growth of 60% for the past few years—and there are several niche big data analytics vendors currently operating in the country (besides large software companies).

    Mittal says that the growth of BDaaS adoption will depend on how quickly companies tackle the issue of improving data quality.

    Source: livemint.com, October 10, 2016
     

     

  • Big Data Predictions for 2016

    A roundup of big data and analytics predictions and pontifications from several industry prognosticators.

    At the end of each year, PR folks from different companies in the analytics industry send me predictions from their executives on what the next year holds. This year, I received a total of 60 predictions from a record 17 companies. I can't laundry-list them all, but I can and did put them in a spreadsheet (irony acknowledged) to determine the broad categories many of them fall in. And the bigger of those categories provide a nice structure to discuss many of the predictions in the batch.

    Predictions streaming in
    MapR CEO John Schroeder, whose company just added its own MapR Streams component to its Hadoop distribution, says "Converged Approaches [will] Become Mainstream" in 2016. By "converged," Schroeder is alluding to the simultaneous use of operational and analytical technologies. He explains that "this convergence speeds the 'data to action' cycle for organizations and removes the time lag between analytics and business impact."

    The so-called "Lambda Architecture" focuses on this same combination of transactional and analytical processing, though MapR would likely point out that a "converged" architecture co-locates the technologies and avoids Lambda's approach of tying the separate technologies together.
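    For readers unfamiliar with the pattern, here is a toy Python sketch of the Lambda idea (the page-view counts are invented for illustration): a batch view recomputed periodically over the master dataset, a speed layer covering events since the last batch run, and a query that merges the two.

        from collections import Counter, defaultdict

        master_dataset = [("page_a", 1), ("page_b", 1), ("page_a", 1)]  # immutable event log
        batch_view = Counter()
        speed_view = defaultdict(int)

        def run_batch():
            # Batch layer: slow, thorough recomputation over all events.
            global batch_view
            batch_view = Counter(k for k, _ in master_dataset)
            speed_view.clear()  # those events are now covered by the batch view

        def ingest(event):
            # Speed layer: cheap incremental update for fresh events.
            master_dataset.append(event)
            speed_view[event[0]] += event[1]

        def query(key):
            # Serving layer: merge the batch and speed views.
            return batch_view[key] + speed_view[key]

        run_batch()
        ingest(("page_a", 1))
        print(query("page_a"))  # 3 = 2 from the batch view + 1 from the speed layer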

    Whether integrated or converged, Phu Hoang, the CEO of DataTorrent predicts 2016 will bring an ROI focus to streaming technologies, which he summarizes as "greater enterprise adoption of streaming analytics with quantified results." Hoang explains that "while lots of companies have already accepted that real-time streaming is valuable, we'll see users looking to take it one step further to quantify their streaming use cases."

    Which industries will take charge here? Hoang says "FinTech, AdTech and Telco lead the way in streaming analytics." That makes sense, but I think heavy industry is, and will be, in a leadership position here as well.

    In fact, some in the industry believe that just about everyone will formulate a streaming data strategy next year. One of those is Anand Venugopal of Impetus Technologies, who I spoke with earlier this month. Venugopal, in fact, feels that we are within two years of streaming data being looked upon as just another data source.

    Internet of predicted things
    It probably won't shock you that the Internet of Things (IoT) was a big theme in this year's round of predictions. Quentin Gallivan, Pentaho's CEO, frames the thoughts nicely with this observation: "Internet of Things is getting real!" Adam Wray, CEO at Basho, quips that "organizations will be seeking database solutions that are optimized for the different types of IoT data." That might sound a bit self-serving, but Wray justifies this by reasoning that this will be driven by the need to "make managing the mix of data types less operationally complex." That sounds fair to me.

    Snehal Antani, CTO at Splunk, predicts that "Industrial IoT will fundamentally disrupt the asset intelligence industry." Suresh Vasudevan, the CEO of Nimble Storage, proclaims "in 2016 the IoT invades the datacenter." That may be, but IoT technologies are far from standardized, and that's a barrier to entry for the datacenter. Maybe that's why the folks at DataArt say "the IoT industry will [see] a year of competition, as platforms strive for supremacy." Maybe the data center invasion will come in 2017, then.

    Otto Berkes, CTO at CA Technologies, asserts that "Bitcoin-born Blockchain shows it can be the storage of choice for sensors and IoT." I hardly fancy myself an expert on blockchain technology, so I asked CA for a little more explanation around this one. A gracious reply came back, explaining that "IoT devices using this approach can transact directly and securely with each other...such a peer-to-peer configuration can eliminate potential bottlenecks and vulnerabilities." That helped a bit, and it incidentally shines a light on just how early-stage IoT technology still is, with respect to security and distributed processing efficiencies.

    Growing up
    Though admittedly broad, the category with the most predictions centered on the theme of value and maturity in Big Data products supplanting the fascination with new features and products. Essentially, value and maturity are proxies for the enterprise-readiness of Big Data platforms.

    Pentaho's Gallivan says that "the cool stuff is getting ready for prime time." MapR's Schroeder predicts "Shiny Object Syndrome Gives Way to Increased Focus on Fundamental Value," and qualifies that by saying "...companies will increasingly recognize the attraction of software that results in business impact, rather than focusing on raw big data technologies." In a related item, Schroeder predicts "Markets Experience a Flight to Quality," further stating that "...investors and organizations will turn away from volatile companies that have frequently pivoted in their business models."

    Sean Ma, Trifacta's Director of Product Management, looking at the manageability and tooling side of maturity, predicts that "Increasing the amount of deployments will force vendors to focus their efforts on building and marketing management tools." He adds: "Much of the capabilities in these tools...will need to replicate functionality in analogous tools from the enterprise data warehouse space, specifically in the metadata management and workflow orchestration." That's a pretty bold prediction, and Ma's confidence in it may indicate that Trifacta has something planned in this space. But even if not, he's absolutely right that this functionality is needed in the Big Data world. In terms of manageability, Big Data tooling needs to achieve not just parity with data warehousing and BI tools, but needs to surpass that level.

    The folks at Signals say "Technology is Rising to the Occasion" and explain that "advances in artificial intelligence and an understanding [of] how people work with data is easing the collaboration between humans and machines necessary to find meaning in big data." I'm not sure if that is a prediction, or just wishful thinking, but it certainly is the way things ought to be. With all the advances we've made in analyzing data using machine learning and intelligence, we've left the process of sifting through the output a largely manual process.

    Finally, Mike Maciag, the COO at AltiScale, asserts this forward-looking headline: "Industry standards for Hadoop solidify." Maciag backs up his assertion by pointing to the Open Data Platform initiative (ODPi) and its work to standardize Hadoop distributions across vendors. ODPi was originally anchored by Hortonworks, with numerous other companies, including AltiScale, IBM and Pivotal, jumping on board. The organization is now managed under the auspices of the Linux Foundation.

    Artificial flavor
    Artificial Intelligence (AI) and Machine Learning (ML) figured prominently in this year's predictions as well. Splunk's Antani reasons that "Machine learning will drastically reduce the time spent analyzing and escalating events among organizations." But Lukas Biewald, Founder and CEO of Crowdflower insists that "machines will automate parts of jobs -- not entire jobs." These two predictions are not actually contradictory. I offer both of them, though, to point out that AI can be a tool without being a threat.

    Be that as it may, Biewald also asserts that "AI will significantly change the business models of companies today." He expands on this by saying "legacy companies that aren't very profitable and possess large data sets may become more valuable and attractive acquisition targets than ever." In other words, if companies found gold in their patent portfolios previously, they may find more in their data sets, as other companies acquire them to further their efforts in AI, ML and predictive modeling.

    And more
    These four categories were the biggest among all the predictions but not the only ones, to be sure. Predictions around cloud, self-service, flash storage and the increasing prominence of the Chief Data Officer were in the mix as well. A number of predictions that stood on their own were there too, speaking to issues as far-reaching as salaries for Hadoop admins to open source, open data and container technology.

    What's clear from almost all the predictions, though, is that the market is starting to take basic big data technology as a given, and is looking towards next-generation integration, functionality, intelligence, manageability and stability. This implies that customers will demand certain baseline data and analytics functionality to be part of most technology solutions going forward. And that's a great sign for everyone involved in Big Data.

    Source: ZDNet

     

  • Business Intelligence and beyond: predictions for 2016

    It’s been an interesting year for BI – and 2016 looks set to be no different

    Here are some predictions on what we believe next year has in store, in particular for the data and analytics industry.

    1. Cannibalisation of the channel
    Next year will see many vendors looking to take back control, rather than invest in their channel partners. The danger for the channel is that this will result in vendors keeping good deals for themselves or redirecting services projects back to themselves. Platforms such as Amazon Web Services and Microsoft Azure have grown exponentially this year. Another risk is the continued trend of vendors developing hosted solutions via platforms such as these, cutting out their channel partners. In response to this, the channel needs to look for vendors with a transparent indirect strategy in place and form mutually beneficial relationships.

  • Data, Analytics & Fuel Innovation at Celgene

    CIO Richard Williams leads a global IT organization that’s harnessing digital, data, and analytics to support R&D innovation, drive operational excellence, and help Celgene achieve first-mover advantage in the shift to value-based, personalized health care intended to help patients live longer and healthier lives.
     
     
    An explosion of electronic health information is rocking the entire health care ecosystem, threatening to transform or disrupt every aspect of the industry. In the biopharmaceutical sector, that includes everything from the way breakthrough scientific innovations and insights occur to clinical development, regulatory approvals, and reimbursement for innovations. Celgene, the $11 billion integrated global biopharmaceutical company, is no exception.
     
    Indeed, Celgene, whose mission is to discover, develop, and commercialize innovative therapies for the treatment of cancer, immune-inflammatory, and other diseases, is aggressively working to leverage the information being generated across the health care system, applying advanced analytics to derive insights that power its core business and the functions that surround and support it. Long known for its commitment to external scientific collaboration as a source of innovation, Celgene is investing to harness not only the data it generates across the enterprise, but also the real-world health care data generated by its expanding network of partners. Combined, this network of networks is powering tremendous value.
     
    CIO Richard Williams sees his mission—and that of the IT organization he leads—as providing the platforms, data management, and analytics capabilities to support Celgene through the broader industry transition to value-based, personalized health care. At Celgene, this transformation is enabled by a focus on the seamless integration of information and technology. A cloud-first platform strategy, coupled with enterprise information management, serves as the foundation for leveraging the data generated and the corresponding insights from internal and external health care data.
     
    Williams recently shared his perspective on the changes wrought by enormous data volumes in health care, the role of IT at Celgene, and the ways IT supports life sciences innovation.
     
    Can you describe the environment in which Celgene is currently operating?
     
    Williams: We are living in an exciting era of scientific breakthroughs coupled with technology convergence. This creates both disruption and opportunity. The explosion and availability of data, the cloud, analytics, mobility, artificial intelligence, cognitive computing, and other technologies are accelerating data collection and insight generation, opening new pathways for collaboration and innovation. At Celgene, we’re able to apply technology as never before—in protein homeostasis, epigenetics, immuno-oncology, immuno-inflammation, informatics, and other fields of study—to better understand disease and develop targeted therapies and treatments for people who desperately need them.
     
    How does IT support scientific and business innovation at Celgene?
     
    At its core, Celgene IT is business aligned and value focused. Rather than looking at technology for technology’s sake, we view information and technology as essential to achieving our mission and business objectives. As an integrated function, we have end-to-end visibility across the value chain. This enables us to identify opportunities to leverage technology investments to connect processes and platforms across all functions. As a result, we’re able to support improvements in R&D productivity, product launch effectiveness, and overall operational excellence.
     
    This joint emphasis on business alignment and business value, which informs everything we do, is manifest in three important ways:
     
    First is our emphasis on a core set of enterprise platforms, which enable us to provide end-to-end visibility rather than a narrower functional view. We established a dual information- and cloud-first strategy to provide more comprehensive platforms of capabilities that can be shared across Celgene’s businesses. The cloud—especially with recent advances in security and analytics—provides tremendous scale, agility, and value because it allows us to standardize and create both consistency and agility across the entire organization regardless of device or access method. It’s our first choice for applications, compute power, and storage.
     
    Second is our focus on digital and the proliferation of patient, consumer, and scientific data it is creating. Health care data is growing exponentially—from something like 500 petabytes (PB) of data in 2013 to 25,000 PB by 2020, according to one study.
     
    To address this opportunity, we’ve initiated an enterprise information management (EIM) strategy through which we are targeting important data domains across our business and applying definitions, standards, taxonomies, and governance to data we capture internally and from our external partners. Establishing that consistency is critically important. It drives not only innovation, but also insight into our science, operations, and, ultimately, patient outcomes. Celgene is at the forefront in leveraging technologies that offer on-demand compute and analytic services. By establishing data consistency and influencing and setting standards, we will support our own objectives while also benefiting the broader industry.
     
    Third is our support for collaboration—the network of networks—and the appropriate sharing of information across organizational boundaries. We want to harness the capabilities and data assets of our partners to generate insights that improve our science and our ability to get better therapies to patients faster. Celgene is well-known in the industry for external innovation—how we partner scientifically—and we are now extending this approach to data and technology collaboration. One recent example is our alliance with Medidata Solutions, whose Clinical Cloud will serve as our enterprise technology and data platform for Celgene clinical trials worldwide. Celgene is also a founding commercial member of the Oncology Research Information Exchange Network, a collaboration of cancer centers spearheaded by M2Gen, a health informatics solution company. And we have teamed with ConvergeHEALTH by Deloitte and several other organizations for advanced analytics around real-world evidence and knowledge management, which will also be integrated into our data platform.
     
    You’re building this network-enabled, data-rich environment. But are your users prepared to take advantage of it?
     
    That’s an important aspect of the transformation and disruption taking place across multiple industries. Sure, IT can make information, technology, and insights available for improved decision-making, but the growing complexity of the data—whether it’s molecular structures, genomics, electronic medical records, or payment information—demands different skill sets.
     
    Data scientists are in high demand. We need to embed individuals with those specialized skills in functions from R&D to supply chain and commercial. At the same time, many more roles will require analytics acumen as part of the basic job description.
     
    As you build out your platform and data strategies, are you likely to extend those to your external alliances and partners?
     
    External collaboration enabled by shared data and analytics platforms is absolutely part of our collaboration strategy. If our informatics platforms can help our academic or commercial biotech collaborators advance the pace of their scientific evaluations, clinical studies, and commercialization, or they can help us with ours, that’s a win-win situation—and a differentiator for Celgene. We are already collaborating with Sage Bionetworks, leveraging Apple ResearchKit to develop an app that engages patients directly in innovation aimed at improving treatments for their diseases. We’re also working with IBM Watson to increase patient safety using cognitive computing to improve drug monitoring. As the power of collaborative innovation continues, collaboration will become more commonplace and lead to some amazing results.
     
    As you look out 12 to 18 months, what technologies might you want to bolt onto this platform or embed in your EIM strategy?
     
    The importance of cognitive computing, including machine learning and artificial intelligence, will continue to grow, helping us to make sense of the increasing volumes of data. The continued convergence of these technologies with the internet of things and analytics is another area to watch. It will result in operational insights as well as new, more intelligent ways to improve treatments for disease.
     
    What advice do you have for CIOs in health care or other industries who may not be as far along in their cloud, data, and analytics journeys?
    A digital enterprise is a knowledge- and information-driven enterprise, so CIOs should first focus on providing technologies and platforms that support seamless information sharing. In the process, CIOs should constantly be looking at information flows through an enterprise lens—real value is created when information is connected across all functions. Next, it’s increasingly important for CIOs to help build a technology ecosystem that allows the seamless exchange of information internally and externally because transformation and insight will occur in both places. Last, CIOs need to recognize that every job description will include data and information skills. This is an especially exciting time to be in IT because the digital capabilities we provide increasingly affect every function and role. We need to help people develop the skills they need to take advantage of what we can offer now and in the future.
    Source: deloitte.wsj.com, November 14, 2016
  • Four Drivers of Successful Business Intelligence

    Companies across industries face some very common scenarios when it comes to getting the most value out of data. The life science industry is no exception. Sometimes a company sets out to improve business intelligence (BI) for a brand, division or functional area. It spends many months or years and millions of dollars to aggregate all of the data it thinks it needs to better measure performance and make smart business decisions only to yield more data. In another familiar scenario, a team identifies critical questions the BI system can't answer. Again, months and millions go into development. But by the time the system goes live, market and/or company conditions have changed so much that the questions are no longer relevant.

    Building Better Business Intelligence Systems
    Today's challenges cannot be met by throwing more dollars into the marketing budget or by building more, or bigger, data warehouses. Ultimately, navigating today's complexities and generating greater value from data isn't about more, it's about better. The good news is that other industries have demonstrated the power and practicality of analytics at scale. Technology has evolved to overcome fragmented data and systems. We are now observing a real push in life sciences for a BI capability that's smarter and simpler.

    So how do we build better business intelligence platforms? In working with life sciences companies around the globe, IMS Health has observed a recurring journey with three horizons of business intelligence maturity: alignment of existing KPIs, generation of superior insights and customer-centric execution (see Figure 1).

    What does it take to advance in business intelligence maturity?
    No matter where a company currently stands, there are four fundamental steps that drive BI success: aligning business and information management strategy, improving information management systems integration and workflow, engineering BI systems to derive more value and insights from data, and making the most of new cloud computing technologies and Software-as-a-Service (SaaS) models for delivery.

    Step 1: Align Business and Information Management Strategy
    Many IT and business leaders recognize that the traditional "build it and they will come" mentality can no longer sustain future growth in agile and cost-efficient ways. To be successful, companies need to focus upfront on developing an information management strategy that begins with the business in mind. Through a top-down and upfront focus on critical business goals, drivers and pain points, companies can ensure that key insights are captured to drive development of commercial information management strategies that align with prioritized business needs. Leading organizations have achieved success via pilot-and-prove approaches that focus on business value at each step of the journey. To be successful, the approach must be considered in the context of the business and operational strategies.

    Step 2: Improving Information Management Systems Integration and Workflow
    Although technology systems and applications have proliferated within many organizations, they often remain siloed and sub-optimized. Interoperability is now a key priority and a vehicle for optimizing commercial organizations: improving workflow speed, eliminating conflicting views of the truth across departments and paring down vendor teams managing manual data handoffs. Information and master data management systems must be integrated to deliver an integrated view of the customer. When optimized, these systems can enable advanced BI capabilities ranging from improved account management and evolved customer interactions (i.e., account-based selling and management, insights on healthcare networks and relationships with influencers and KOLs) to harnessing the power of big data and demonstrating value to all healthcare stakeholders.

    Step 3: Engineering BI Systems to Derive More Value and Insights from Data
    Life sciences companies compete on the quality of their BI systems and their ability to take action in the marketplace. Yet existing analytics systems often fail to deliver value to end users. Confusing visualizations, poorly designed data queries and gaps in underlying data are major contributors in a BI solution's inability to deliver needed insights.

    By effectively redesigning BI applications, organizations can gain new insights and build deeper relationships with customers while maximizing performance. Effective BI tools can also help to optimize interventions and the use of healthcare resources. They can drive post-marketing research by unearthing early signals of value for investigation, help companies better engage and deliver value to their customers, and contribute to improved patient outcomes. This information can advance the understanding of how medicine is practiced in the real world, from disease prevention through diagnosis, treatment and monitoring.

    Step 4: Making the Most of New Cloud Computing Technologies and Software-as-a-Service (SaaS) Models for Delivery
    Chief information officers (CIOs) are increasingly looking to adopt cloud technologies in order to bring the promise of technology to commercialization and business intelligence activities. They see the potential value of storing large, complex data sets, including electronic medical records and other real-world data, in the cloud. What's more, cloud companies have taken greater responsibility for maintaining government-compliant environments for health information.

    New cloud-based BI applications are fueling opportunities for life sciences companies to improve delivery of commercial applications, including performance management, advanced analytics, sales force automation, master data management and the handling of large unstructured data streams. As companies continue their journey toward BI maturity, getting the most from new technologies will remain a high priority. Leveraging cloud-based information management and business intelligence platforms will bring tremendous benefits to companies as approaches are revised amidst changing customer demands and an urgent need for efficiency.

    The Way Forward
    While each organization's journey will be unique, advancing in business intelligence maturity, and getting more value from data, can be achieved by all with these four steps. It's time for BI that's smarter and simpler and that realizes greater value from data. With focus and precision, and the support of business and technology experts, companies can home in on the key indicators and critical questions that measure, predict and enhance performance.

    Source: ExecutiveInsight

  • Google buys Anvato to strengthen its cloud platform

    Google has acquired Anvato, a company that handles the encoding, editing, publishing and distribution of a wide range of videos across multiple platforms. The search giant plans to add Anvato to its cloud platform and implement the technology in its own services. How Google intends to do this is not yet known.

    What is known is that American television networks such as NBCUniversal, Fox Sports and MSNBC use Anvato's services to produce and deliver online video. It is a service that Google's own cloud platform does not yet offer, which is presumably the reason for the acquisition.

    "Our teams will work together to deliver cloud solutions that help companies in the media and entertainment industry scale their video infrastructure and deliver high-quality live and on-demand video to consumers on any device, whether that is a smartphone, tablet or smart TV," said Google senior product manager Belwadi Srikanth in a statement.

    It is not known how much Google paid for the company. At its founding in 2007, Anvato raised about 2.5 million dollars in an investment round, but the company's value will have grown considerably since then.

    Source: Techzine nieuws

  • IBM earns more on lower revenue

    Currency headwinds, the divestment of business units and lower margins in its cloud growth segment caused IBM's revenue to fall on almost all fronts. But IBM did retain more from its business activities.

    IBM reported fourth-quarter revenue of 24.1 billion dollars, almost 12 percent less than in the fourth quarter of 2013. Full-year revenue also fell: at 92.8 billion dollars, 2014 revenue came in 5.7 percent below 2013.

    The revenue decline looks more dramatic than it actually is. In 2013, several IBM divisions still booked revenue from the System x division, which was sold to Lenovo last year. Differences also arise from the outsourcing of customer service and from the rising dollar. Corrected for those factors, the fourth-quarter decline was limited to 2 percent. IBM does not specify the impact of these factors on full-year revenue.

    Cloud takes its toll
    That IBM is also contending with structural changes in the market is evident from the fortunes of its software division. For years a revenue decline there was unthinkable, yet in the fourth quarter software revenue fell by almost 7 percent. Corrected for currency fluctuations, a 3 percent decline remains. Over the full year, software revenue fell by 2 percent. That negative trend reflects the rise of the cloud. IBM's cloud services brought in 7 billion dollars in 2014, 60 percent more than in 2013, but by its own account the scale of those services is still insufficient to compensate for the lower margins.

    Higher gross margin, lower net profit
    Nevertheless, IBM managed to nudge its operating result as a percentage of revenue slightly higher. The gross profit margin for 2014 was 50 percent, against 49.5 percent for 2013. In the fourth quarter the gap with 2013 was somewhat larger: 53.3 versus 52.4 percent.

    That increased efficiency did not translate into higher net profit. After taxes, depreciation and one-off charges, IBM booked a fourth-quarter net profit of 5.5 billion dollars, 11 percent less than in the fourth quarter of 2013. Over the full year, net profit fell by as much as 27 percent, to 12 billion dollars. Besides currency fluctuations, several one-off cost items played a role, such as the cost of downsizing the microelectronics division and a 580-million-dollar provision for workforce reductions.

    Source: Automatiseringsgids, 21 January 2015

  • Insights from Dresner Advisory Services’ 2016 The Internet of Things and Business Intelligence Market Study

    • Sales and strategic planning teams see IoT as the most valuable.
    • IoT advocates are 3X as likely to consider big data critical to the success of their initiatives & programs.
    • Amazon and Cloudera are the highest ranked big data distributions, followed by Hortonworks and MapR.
    • Apache Spark MLlib is the best-known technology in the nascent machine learning landscape today.

    These and many other excellent insights are from Dresner Advisory Services’ 2016 The Internet of Things and Business Intelligence Market Study published last month. What makes this study noteworthy is the depth of analysis and insights the Dresner analyst team delivers regarding the intersection of big data and the Internet of Things (IoT), big data adoption, analytics, and big data distributions. The report also provides an analysis of Cloud Business Intelligence (BI) feature requirements, architecture, and security insights. IoT adoption is thoroughly covered in the study, with a key finding being that large organizations or enterprises are the strongest catalyst of IoT adoption and use. Mature BI programs are also strong advocates or adopters of IoT and as a result experience greater BI success. IoT advocates are defined as those respondents that rated IoT as either critical or very important to their initiatives and strategies.

    Key takeaways of the study include the following:

    • Sales and strategic planning see IoT as the most valuable today. The combined rankings of IoT as critical and very important are highest for sales, strategic planning and the Business Intelligence (BI) Competency Centers. Sales ranking IoT so highly is indicative of how a wide spectrum of companies, from start-ups to large-scale enterprises, is attempting to launch business models and derive revenue from IoT. Strategic planning's prioritization of IoT is also driven by a long-term focus on how to capitalize on the technology's inherent strengths in providing greater contextual intelligence, insight, and potential data-as-a-service business models.

    [Figure: IoT importance by function]

    • Biotechnology, consulting, and advertising are the industries that consider IoT most important. Adoption of IoT across a wide variety of industries is happening today, with significant results being delivered in manufacturing, distribution (including asset management), logistics, supply chain management, and marketing. The study found that most industries do not yet see IoT as important, with the exception of biotechnology.

    [Figure: IoT importance by industry]

    • Location intelligence, mobile device support, in-memory analysis, and integration with operational systems are the four areas that most differentiate IoT advocates' interests and focus. Compared to the overall sample of respondents, IoT advocates have significantly more in-depth areas of focus than the broader respondent base. These four areas show that advocates have a practical, pragmatic mindset regarding how IoT can deliver greater process efficiency and revenue and integrate effectively with existing systems.

    [Figure: Differentiating areas of focus for IoT advocates]

    • An organization’s ability to manage big data analytics is critically important to their success or failure with IoT. IoT advocates are 3X as likely to consider big data critical, and 2X as likely to consider big data very important. The study also found that IoT advocates see IoT as a core justification for investing in and implementing big data analytics and architectures.

    [Figure: Importance of big data]

    • Data warehouse optimization, customer/social analysis, and IoT are the top three big data use cases organizations are pursuing today, according to the study. Data warehouse optimization is considered critical or very important by 50% of respondents, making it the most dominant use case in the study. Large-scale organizations are adopting big data to better aggregate, analyze and act on the massive amounts of data they generate daily, driving better decisions. One of the foundational findings of the study is that large-scale enterprises are driving the adoption of IoT, which is consistent with the use case analysis in the graphic below.

    [Figure: Big data use cases]

    • IoT advocates are significantly above average in their use of advanced and predictive analytics today. The IoT advocates identified in the survey are also 50% more likely to be current users of advanced and predictive analytics apps. The study also found that advanced analytics users tend to be the most sophisticated and confident BI audience in an organization and see IoT data as ideal for interpretation using advanced analytics apps and techniques.

    [Figure: Use of advanced and predictive analytics]

    • Business intelligence experts, business analysts and statisticians/data scientists are the greatest early adopters of advanced and predictive analytics. More than 60% of each of these three groups of professionals use analytics often, which can be interpreted as more than 50% of their working time.

    [Figure: Users of advanced and predictive analytics]

    • Relational database support, open client connectors (ODBC, JDBC) and automatic upgrades are the three most important architectural features for cloud BI apps today. Connectors and integration options for on-premises applications and data (ERP, CRM, and SCM) are considered more important than cloud application and database connection options. Multitenancy is considered unimportant by the majority of respondents; one factor contributing to this is the assumption that multitenancy is managed as part of the enterprise cloud platform. A minimal connector sketch follows the figure below.

    [Figure: Cloud BI architectural requirements]
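
    As a rough illustration of the "open client connector" pattern respondents rank highest, this Python sketch reads a small aggregate from an on-premises relational source over ODBC. The DSN, credentials, table and columns are all hypothetical placeholders, not a real deployment.

        import pyodbc  # generic ODBC client connector

        # Connect through a pre-configured DSN; all connection details are placeholders.
        conn = pyodbc.connect("DSN=onprem_erp;UID=bi_reader;PWD=secret")
        cursor = conn.cursor()

        # Pull a small aggregate from the operational system for the BI layer.
        cursor.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
        for region, total in cursor.fetchall():
            print(region, total)

        conn.close()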

    • MapReduce and Spark are the two best-known and most important big data infrastructure technologies according to respondents today: 48% believe that MapReduce is important and 42% believe Spark is. The study also found that all other categories of big data infrastructure are considered less important, as the graphic below illustrates; a minimal Spark sketch follows the figure.

    [Figure: Big data infrastructure]
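
    For readers unfamiliar with this layer of the stack, here is a minimal Python sketch of the kind of Spark job such infrastructure exists to run; the input path and column name are assumptions for illustration only.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("bi-demo").getOrCreate()

        # Read a (hypothetical) event dump and compute a simple aggregate
        # that a BI dashboard could consume.
        events = spark.read.json("hdfs:///data/events")
        events.groupBy("device_type").count().show()

        spark.stop()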

    Source: Forbes, 4 October 2016

  • Research details developments in the business intelligence (BI) market, which is estimated to grow at a 10% CAGR to 2020

    According to an analyst behind the global business intelligence market report, social media has played a critical role in SMEs and mid-sized organizations in the past few years. Many SMEs are increasingly embracing this trend and integrating their BI software with social media platforms.

    Market outlook of the business intelligence market: the market research analyst predicts the global business intelligence market will grow at a CAGR of around 10% during the forecast period. The growing adoption of data analytics by organizations worldwide is a key driver of this market's growth.

    The majority of corporate data sources include data generated from enterprise applications along with newly generated cloud-based and social network data. Business intelligence tools are useful in the retrieval and analysis of this vast and growing volume of discrete data.

    They also help optimize business decisions, discover significant weak signals, and develop indicator patterns to identify opportunities and threats for businesses.

    The increased acceptance of cloud BI solutions by SMEs is also boosting the growth of this market. The adoption of cloud services allows end-users to concentrate on core activities rather than managing their IT environment.

    Cloud BI solutions enable applications to be scaled quickly, can be easily integrated with third-party applications, and provide security at all levels of the enterprise IT architecture so that these applications can be accessed remotely.

    Market segmentation by technology of the business intelligence market:

    • Traditional BI
    • Mobile BI
    • Cloud BI
    • Social BI

    The mobile BI segment accounts for approximately 20% of the global BI market. It enables the mobile workforce to gain business insights through data analysis, using applications optimized for mobile and smart devices.

    The growing smartphone adoption is likely to emerge as a key growth driver for this segment during the forecast period.

    Market segmentation by deployment of the business intelligence market

    • Cloud BI
    • On-premises BI

    The on-premises segment accounted for 86% of the market share during 2015. However, the report anticipates that this segment's share will decline by the end of the forecast period.

    In this segment, the software is purchased and installed on the server of an enterprise. It requires more maintenance but is highly secure and easy to manage.

    Geographical segmentation of the BI market

    • Americas
    • APAC
    • EMEA

    The Americas dominated the market during 2015, with a market share of around 56%. The high adoption of cloud BI solutions in this region is the major growth contributor for this market.

    The US is the market leader in this region, as most of the key vendors are based there.

    Competitive landscape and key vendors

    Microsoft is one of the largest BI vendors and offers Power BI, which addresses business users' self-service data preparation and analysis needs through Excel 2013 and Office 365. The competitive environment in this market is expected to intensify during the forecast period due to an increase in R&D innovations and mergers.

    The market is also expected to witness a growing trend of acquisitions by the leading players. The key players in the market are expected to diversify their geographical presence during the forecast period.

    The key vendors of the market are -

    • IBM
    • Microsoft
    • Oracle
    • SAP
    • SAS Institute

    Other prominent vendors in the market include Actuate, Alteryx, Board International, Birst, Datawatch, GoodData, Infor, Information Builders, Logi Analytics, MicroStrategy, Panorama Software, Pentaho, Prognoz, Pyramid Analytics, Qlik, Salient Management Company, Tableau, Targit, Tibco Software, and Yellowfin.

    Key questions answered in the report

    • What will the market size and the growth rate be in 2020?
    • What are the key factors driving the BI market?
    • What are the key market trends impacting the growth of the BI market?
    • What are the challenges to market growth?
    • Who are the key vendors in the global BI market?
    • What are the market opportunities and threats faced by the vendors in the BI market?
    • What are the trending factors influencing the market shares of the Americas, APAC, and EMEA?
    • What are the key outcomes of the five forces analysis of the BI market?

    Source: WhaTech

  • Security Concerns Grow As Big Data Moves to Cloud

    Even as data storage in the cloud grows exponentially, along with databases and the emerging Internet of Things (IoT), IT security executives remain worried about security breaches as well as vulnerabilities introduced via shared infrastructure.

    A cloud security survey released Wednesday (Feb. 24) by enterprise data security vendor Vormetric and 451 Research found that 85 percent of respondents use sensitive data stored in the cloud, up from 54 percent last year. Meanwhile, half of those surveyed said they are using sensitive data within big data deployments, up from 31 percent last year. One-third of respondents said they are accessing sensitive data via IoT deployments.

    The upshot is that well over half of the IT executives surveyed are worried about data security as cloud usage grows, citing the possibility of attacks on service providers, exposure to vulnerabilities in shared public cloud infrastructure, and a lack of control over where data is stored.

    Those fears are well founded, the security survey notes: “To a large extent both security vendors and enterprises are like generals fighting the last war. While the storm of data breaches continues to crest, many remain focused on traditional defenses like network and endpoint security that are clearly no longer sufficient on their own to respond to new security challenges.”

    Control and management of encryption keys is widely seen as critical to securing data stored in the cloud, the survey found. IT executives were divided on the question of managing encryption keys, with roughly half previously saying that keys should be managed by cloud service providers. That view has shifted in the past year, the survey found, with 65 percent now favoring on-premise management of encryption keys.
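
    As a rough illustration of what on-premises key control means in practice, the Python sketch below (using the cryptography library's Fernet recipe) encrypts a record locally before it ever leaves for the cloud, so the key stays with the customer. The record contents and the upload step are hypothetical.

        from cryptography.fernet import Fernet

        # Generate and hold the key on-premises; only ciphertext is uploaded.
        key = Fernet.generate_key()          # stored in the local key manager
        f = Fernet(key)

        ciphertext = f.encrypt(b"customer-id,account,balance")  # hypothetical record
        # upload_to_cloud(ciphertext)        # hypothetical transport step

        # Reading the data back requires the locally held key.
        assert f.decrypt(ciphertext) == b"customer-id,account,balance"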

    In response to security concerns, public cloud vendors like Amazon Web Services, Google, Microsoft and Salesforce have moved to tighten data security through internal development, partnerships and acquisitions in an attempt to reduce vulnerabilities. Big data vendors have lagged behind, but the survey noted that acquisitions by Cloudera and Hortonworks represent concrete steps toward securing big data.

    Cloudera acquired encryption and key management developer Gazzang in 2014 to boost Hadoop security. Among Hortonworks’ recent acquisitions is XA Secure, a developer of security tools for Hadoop.

    Still, the survey warned, IoT security remains problematic.

    When asked which data resources were most at risk, 54 percent of respondents to the Vormetric survey cited databases while 41 percent said file servers. Indeed, when linked to the open Internet, these machines can be exposed to vulnerabilities similar to recent "man-in-the-middle" attacks on an open source library.

    (Security specialist SentinelOne released an endpoint platform this week designed to protect enterprise datacenters and cloud providers from emerging threats that target Linux servers.)

    Meanwhile, the top security concerns for big data implementations were: the security of reports that include sensitive information; sensitive data spread across big data deployments; and privacy violations related to data originating in multiple countries. Privacy worries have been complicated by delays in replacing the 15-year-old "safe harbor" agreement governing trans-Atlantic data transfers, which was struck down last year. A proposed E.U.-U.S. Privacy Shield deal has yet to be implemented.

    Despite these uncertainties and continuing security worries, respondents said they would continue shifting more sensitive data to the cloud, databases and IoT implementations as they move computing resources closer to data. For example, half of all survey respondents said they would store sensitive information in big data environments.

    Source: Datanami

  • Why You Should Be Securing Big Data In The Cloud

    Combining big data and the cloud is the perfect solution for a company's computing needs. A company's data often requires a computing environment which can quickly and effectively grow and flex, automatically accommodating large amounts of data. The cloud computing environment does just that. There is one question which continually arises when discussing cloud computing.

    How secure is the cloud?

    Securing data, especially big data, is a major concern. Companies expect that any data stored in the cloud will be secured and that the security measures will be flexible enough to keep up with a changing threat environment. There are four ways to keep your big data secure in the cloud. Each will keep your data safe while providing the flexibility that is inherent to using the cloud.

    1. Encrypt Your Sensitive Data

    Encrypting your data provides another level of security within your cloud infrastructure. Each security solution must be customized to the project and the data; there is no single type of encryption that will work for every situation. Certain types of on-premises gateway encryption solutions do not work well with cloud big data deployments. Other solutions, including encryption provided by the cloud provider, require the end user to rely on someone else to encrypt their data. This is a risky proposition that most companies will not accept.

    Encryption solutions such as split-key encryption were developed specifically for data storage in the cloud. These technologies keep the cloud data safe, providing encryption keys which the customer holds and uses.

    Split-key encryption is the safest and most effective means of encrypting cloud-based data.
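
    To make the idea concrete, here is a toy Python sketch of the split-key principle: the data-encryption key is divided into two XOR shares, so neither the customer's share nor the provider's share reveals anything on its own, and the key only exists momentarily when the shares are recombined for use. This illustrates the concept under simplified assumptions, not any vendor's implementation.

        import base64
        import os
        from cryptography.fernet import Fernet

        master = os.urandom(32)           # the real data-encryption key
        share_customer = os.urandom(32)   # random share held by the customer
        share_provider = bytes(a ^ b for a, b in zip(master, share_customer))

        # Recombining the shares reproduces the key; either share alone is noise.
        recombined = bytes(a ^ b for a, b in zip(share_customer, share_provider))
        assert recombined == master

        f = Fernet(base64.urlsafe_b64encode(recombined))
        token = f.encrypt(b"sensitive record")
        assert f.decrypt(token) == b"sensitive record"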

    2. Use cloud security technologies which can be scaled to meet changing requirements.

    When it comes to big data, solutions must scale quickly to meet demand, and the same holds for the technologies that secure cloud data. Ensure that any cloud security technology you choose is available and relevant across all of your cloud locations, and that it can scale quickly as demands and requirements change.

    Because of the inability to quickly scale and grow, hardware solutions are not a viable option for securing cloud big data. It is not possible to adapt a hardware security module (HSM) quickly enough to meet continuously changing data security requirements.

    Only a cloud-based solution will provide the ease and efficiency to scale quickly in response to demand. These solutions are just as effective as, if not more effective than, hardware-based technologies. Additionally, cloud-based solutions such as CDNs provide security and also allow sites and tools to work faster.

    3. Automate where possible

    Many companies are not happy with traditional cloud security solutions because they do not scale quickly enough to meet demand. Standard encryption technologies typically include an HSM element in their design, and since hardware cannot be automated, these solutions are limited in their effectiveness within the cloud.

    The best cloud security solutions use virtual appliances instead of hardware within their systems. It is also important to ensure an effective RESTful API is part of any cloud security solution.

    A cloud security solution which includes a RESTful API and a virtual appliance will provide the automation and flexibility required to secure cloud big data.
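
    As a sketch of the automation such an API enables, the Python snippet below provisions a key for a newly scaled-out node through a hypothetical REST endpoint; the URL, payload fields, token and response schema are all placeholders for illustration, not a real product's API.

        import requests

        API = "https://keymgr.example.com/v1/keys"   # hypothetical endpoint

        # Request a key for a node that an autoscaler has just brought up;
        # no manual HSM step is involved.
        resp = requests.post(
            API,
            json={"scope": "analytics-node-42", "algorithm": "AES-256"},
            headers={"Authorization": "Bearer <token>"},
            timeout=10,
        )
        resp.raise_for_status()
        print(resp.json()["key_id"])   # key reference handed to the new node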

    4. Never compromise on data security

    Securing big data in the cloud is often a complicated business, and as a result we often see systems that are not as comprehensive as they should be. Some cloud security system designers take shortcuts to get around the complexities involved in securing big data.

    For example, some systems use freeware encryption tools to secure the data and keep the encryption keys in a physical location or on a disc, which creates an opportunity for the keys to be lost or stolen. Such shortcuts may be easier, but they do not provide a viable security solution for cloud data storage.

    Companies must map and review the sensitivity of their data and then design a cloud security solution that protects it accordingly. The right security solution can provide great results. Remember, not all cloud data storage is secure; if you have very sensitive or regulated data, you may need to look for another security solution to keep it protected.

    Source: SmartDataCollective
