50 items tagged "intelligent organization"

  • 5 Best practices to attract (and retain) talent for your organization

    5 Best practices to attract (and retain) talent for your organization

    By applying these best practices, you can bring on talent that keeps pace with innovation, shifting customer needs, and new technologies.

    It’s no secret that there is currently a massive technology talent shortage. As this Wall Street Journal article notes, tech leaders and recruiters alike increasingly feel the pressure to stay competitive, some even going so far as to offer perks like six-figure bonuses and the ability to work from anywhere they want. Hiring tech talent is a massive pain point across many organizations, and it’s at the top of most IT leaders’ to-do lists.

    Aside from outsized perks, what can organizations do to address the talent shortage? The key lies in looking for talent in new places and uncovering ways to connect with and inspire candidates before, during and after the interview process. Here are five ways to identify, hire, and retain the right team.

    1. Partner with schools

    If you’re not already doing so, build relationships with schools and make it a priority to partner with them to fuel student interest in your company. Current students are the future of your business, so working with universities early and often can both grow your talent pool and keep it specialized.

    Many schools have begun implementing programs to directly address the shortage of technology talent, including degree programs in fields like cybersecurity and cloud computing. MIT, for example, among many others, now offers programs that match specific business needs, like 'Ethics of AI' and 'Modeling and Optimization for Machine Learning'. Some cloud providers are teaming up with schools to offer programs and specialized degrees, and we have also seen great success in partnering with universities to sponsor research in engineering departments.

    2. Look to untapped pools of talent

    Beyond looking to recent graduates, consider untapped pools of talent to diversify your workforce. While often overlooked because of 'lack of relevant technical experience', veterans offer skills that could greatly impact your existing teams, including strong leadership, productivity and decision-making capabilities. We can look to companies like Salesforce for inspiration: Its veteran program Vetforce connects the military community with open IT positions.

    Another pool of talent often left behind is those who have taken time off and want to restart their careers, including parents with new children or those who had to care for a loved one in a time of need. Returnship programs are one example: they help professionals with five or more years of work experience, who have been out of the paid workforce for a minimum of two years, bridge their transition back into the workforce. We have found excellent, talented employees through this channel.

    3. Ask the right questions

    Once you have a candidate in mind, ask the right interview questions to determine their potential fit on your team. My favorite interview question is 'What isn’t on your resume that you’d like to share?' A resume tells 'what' you did, but it doesn’t tell 'how' you did it. These stories often provide the most critical insight into a candidate. I want to hear how a candidate has overcome adversity and what they learned from their challenges. I prize candidates’ perseverance and determination over a list of accomplishments or the schools they attended. Tell me what you did with what you had. With the technology industry changing at a rapid pace, we need candidates who are comfortable being uncomfortable in the name of positive change.

    4. Think beyond money with the job offer

    If you think you can entice today’s talent pool just with compensation, think again. Career growth opportunities now rank as the most important factor when looking for a new job. Offering plenty of opportunities for employee training and growth will not only entice potential candidates, but it will also keep current employees on board. We offer an array of training and certification programs so our employees can build marketable skills in enterprise cloud technology. These programs should be all about choice, enabling employees to design the mix of in-person, online, or video training that meets them wherever they work today. Large, high-growth companies can also offer candidates the ability to easily move between different teams at the company, learn from new groups and cross-pollinate ideas.

    5. Simplify redundant tasks

    Identifying areas where a company can simplify to boost productivity can be an equally important step to the above. For example, automating existing, repetitive IT tasks can help free up time to focus on more innovative, creative projects. At our company, we’re using the power of machine learning (ML) and natural language processing to augment our IT helpdesk and customer support services. Using ML technologies, more than 30% of all service requests are automatically resolved, freeing up both time and budget for value-creating activities.
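    To make that concrete, here is a minimal, hypothetical sketch of how such ticket auto-resolution could work: a classifier trained on historical requests routes new tickets, auto-resolving only when it is confident. It assumes scikit-learn; the ticket texts, labels, and the 0.7 confidence threshold are illustrative, not a description of any production system.

    ```python
    # A minimal sketch of ticket auto-resolution via intent classification,
    # assuming scikit-learn and a small set of labeled historical tickets.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical labeled history: request text -> known resolution category
    tickets = [
        ("forgot my password and cannot log in", "password_reset"),
        ("need access to the shared finance drive", "grant_access"),
        ("laptop will not boot after update", "escalate_to_human"),
        ("please reset my password", "password_reset"),
        ("request permission for the sales dashboard", "grant_access"),
    ]
    texts, labels = zip(*tickets)

    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(texts, labels)

    # Route a new request: auto-resolve only when the classifier is confident,
    # otherwise hand off to a human agent.
    new_request = "can you reset the password on my account?"
    probs = model.predict_proba([new_request])[0]
    best = probs.argmax()
    if probs[best] >= 0.7:  # illustrative threshold
        print("auto-resolve as:", model.classes_[best])
    else:
        print("route to human agent")
    ```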

    When it comes to hiring and retaining the best talent, it can feel like you’re in a losing race against a continually changing technology environment. But by keeping these best practices in mind, you can bring on talent that keeps pace with innovation, shifting customer needs, and new technologies.

    Author: David Sangster

    Source: Informationweek

  • 5 Tips to maintain your business amid the challenges of 2021

    5 Tips to maintain your business amid the challenges of 2021

    The pandemic and everything related to it caught many business owners by surprise and, even worse, caused some businesses to fold.

    The pandemic and everything related to it caught many business owners by surprise and, even worse, caused some businesses to fold. Yelp reported that permanent business closures had reached 97,966, meaning 60% of closed businesses won’t be reopening. Imagine investing so much into an entity, and because of some virus, things start to slow down until they eventually go under.

    It’s the new year, and nobody knows what it holds. But as a business owner, you have to plan and fortify your business against such problems.

    Every day, innovations are created that simplify the buying process for your customers. If your business isn’t doing anything new and making life easier for your customers, you risk losing them to people who will. Steve Jobs said, “Innovation distinguishes between a leader and a follower.” In other words, if you want to become a leader in your industry or niche, you have to innovate. This cuts across marketing, product delivery, customer retention, and more.

    2021 is a different year and will most likely bring new challenges, so here are five survival tips for your business.

    1. Stay on your customers’ minds

    Marketing is crucial for any business. The businesses that survived 2020 were the ones that remained in their customers’ minds. If there are still lockdowns in 2021, your business may not be able to operate normally, resulting in lower revenues. Even then, you shouldn’t stop marketing; instead, consider reducing your marketing budget.

    Marketing is how you understand your customers’ needs, educate them, attract new ones, and get them to keep doing business with you. Not every marketing effort will lead to outright sales, but sales will happen eventually if you’re consistent enough.

    Your social media platforms should be your best friends in the coming years. Studies show that “nearly 50% of the world's population uses social media”. That's over 3 billion users worldwide.

    Part of staying on your customers’ minds is being around where they can see you, and social media marketing is a marketing strategy that gives you that leverage. Whether you choose to use influencers or paid social, you have to put in the effort to remain in your customers’ minds. 

    2. Get the best hands for the job

    Great employees mean great business. If you want to give your business that competitive advantage, then you need to get the best hands for the job. You won’t give your customers the best if you don’t have the best staff.

    It goes beyond hiring the right people: you also need to train your employees and keep them happy. This cuts across being understanding, ensuring that you create a healthy work environment, and giving them a purpose that contributes to their job fulfillment.

    How do you create that positive work environment? Everyone likes to feel heard. So, it’s your job as the business owner to create an environment that allows everyone to share their opinions and ideas about certain things. This can happen at weekly meetings. Implementing this can significantly increase their productivity, which in turn helps your business.

    3. Asset protection

    As a business owner, many things can harm your assets and cause you to lose money. It won’t be pleasant to lose your business assets in a legal case. Many entrepreneurs overlook this, but it is crucial to secure your assets so that you are protected even if things don’t go your way in court.

    Nobody starts a business considering the possibility of ending up in a court case; this is why many people get into trouble. 78% of lawsuit defendants in the US never thought it would happen to them. Sony lost $200 million in assets, a loss that could have been avoided had the company protected its assets. Interestingly, you can get your assets protected online and offline no matter your enterprise’s size.

    4. Financing and budgeting

    Your planning should also touch on finances, because every business needs money to keep its operations running. Budgeting is like a roadmap for your business; you put your business at risk when you don’t have one.

    Riley Panko, in a report on budgeting, said, “Businesses of all sizes should create a budget if they don’t want to risk the financial health of their organization.” She said, “Businesses may create more challenges for themselves by skipping a budget. This is because budgeting helps small businesses focus.”

    Budgeting determines whether your marketing, asset acquisition, employee remuneration, and other parts of your business can run smoothly. You should anticipate all your long- and short-term financial needs. This may require hiring an accountant who will keep records of all the money going in and out of the business.

    5. Be ready to adapt to anything

    How adaptable are you? Patrick J. Rottinghaus defines adaptability as “The capacity to cope with and capitalize on change, and the ability to recover when unforeseen events alter life plans.”

    No one saw 2020 coming, but it came, and it was as if everyone had to “adapt or die.” The usual nature of work we were all used to halted for apparent reasons. Everyone had to adapt to Zoom meetings, online education, and social distancing. This alone affected many brick-and-mortar businesses whose primary service delivery was in-person.

    2021 is uncertain, but what should be certain is your ability to adapt to anything. Whether it’s in your service/product delivery, marketing, or engagement with your customers, you have to be ready to dance to the tune of whatever the market throws at you and attack it creatively.

    Author: James Jorner

    Source: Entrepreneur

  • Adopting Data Science in a Business Environment

    Adopting Data Science in a Business Environment

    While most organizations understand the importance of data, far fewer have figured out how to successfully become a data-driven company. It’s enticing to focus on the “bells and whistles” of machine learning and artificial intelligence algorithms that can take raw data and create actionable insights. However, before you can take advantage of advanced analytics tools, there are other stops along the way, from operational reporting to intelligent learning.

    Digital transformation is dependent on adoption. But adoption and proficiency of new technologies can be disruptive to an organization. Mapping a data journey provides awareness and understanding of where your organization is to ultimately get where you want to go, with enablement and adoption of the technology throughout. Without the clarity provided by a data journey, your organization won’t be positioned to successfully deploy the latest technology.

    Here are the four elements of an effective data journey. 

    Determine Your Roadmap

    As with any trip, your data journey requires a roadmap to get you from where you are to where you want to go. Before you can get to your destination, the first step is to assess where you are.

    Most organizations begin with a focus on operational reports and dashboards, which can help you glean business insights from what happened, including how many products were sold, how often and where. They can also identify where problems exist, and deliver alerts about what actions are needed.

    Ultimately, most want to get to the point where analytics tools can help with statistical analysis, forecasting, predictive analytics, and optimization. Armed with machine learning, manufacturers want to understand why something is happening, what happens if trends continue, what’s going to happen next, and what’s the best that can be done.
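    As a hedged illustration of that progression, the sketch below fits a simple linear trend to hypothetical monthly unit sales and extrapolates the next quarter. Real forecasting would use richer models, but the "what happens if trends continue" question has this basic shape; it assumes only numpy, and the numbers are invented.

    ```python
    # A minimal sketch of moving from "what happened" to "what happens if the
    # trend continues": fit a linear trend to hypothetical monthly unit sales
    # and extrapolate the next quarter. Assumes only numpy.
    import numpy as np

    units_sold = np.array([120, 132, 128, 141, 150, 149, 158, 165])  # hypothetical
    months = np.arange(len(units_sold))

    slope, intercept = np.polyfit(months, units_sold, deg=1)  # least-squares line

    for ahead in range(1, 4):  # forecast the next three months
        m = len(units_sold) - 1 + ahead
        print(f"month +{ahead}: ~{slope * m + intercept:.0f} units")
    ```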

    Capture Data and Build Processes and Procedures

    Once you know where you want to go, it’s important to capture the data that is essential in helping you achieve your business goals. Manufacturers capture tremendous amounts of data, but if the data you collect doesn’t solve a business need, it’s not vital to your data processing priorities.

    This phase of your data journey isn’t just about what data you collect, it’s also about your data strategy: how you collect the data, pre-process it, protect it and safely store it. You need to have processes and procedures in place to handle data assets efficiently and safely. Questions such as how you can leverage the cloud to gain access to data management tools, data quality and data infrastructure need to be answered. 

    Make Data Accessible to Business Users

    Today, data – and business insights about that data – need to be accessible to business users. This democratization of data makes it possible for business users from procurement to sales and marketing to access the data that’s imperative for them to do their jobs more effectively.

    In the past, data was the domain of specialists, which often caused bottlenecks in operations while they analyzed the data. In this phase of the data journey, it’s important to consider data management tools that can consolidate and automate data collection and analysis.

    Once data is removed from silos, it can be analyzed by more advanced analytics and data science tools to glean business insights that can propel your success. 

    Change Company Culture for Full Adoption

    A data culture gap is a common barrier to the adoption of advanced data analytics tools for many companies. When employees who are expected to use the data and insights don’t understand the benefits data can bring to decision-making, it creates a roadblock. Your company won’t be data-driven until your team embraces a data-driven culture and starts to use the data intelligently.

    If you want to get the most out of the advanced data analytics tools that are available today and use data intelligently in your organization, you must first develop a solid foundation.

    First, you must be clear where you are in your organization’s data journey with a roadmap. Then create effective data processes, procedures, and collection methods, as well as identify what data management and analytics tools can support your initiatives. Finally, your team is key to adopting advanced data analytics tools, so be sure they are trained and understand how these tools can empower them. Once you have a solid analytics foundation, you’re ready to put machine learning to work to drive your collective success.

    Author: Michael Simms

    Source: Insidebigdata

  • Aligning your business with your data team

    Aligning your business with your data team

    It’s important for everyone at a company to have the data they need to make decisions. However, if they just work with their data team to retrieve specific metrics, they are missing out. Data teams can provide a lot more insights at a faster rate, but you will need to know how to work with them to make sure that everyone is set up for success. 

    Data teams can be thought of as experts at finding answers in data, but it’s important to understand how they do that. In order to get the most value out of your collaboration, you need to help them understand the questions that matter to you and your team and why those questions need to be answered. There are a lot of assumptions that get built into any analysis, so the more the data team knows about what you are looking for, the more knowledge they may find as they explore data to produce their analysis. Here are four tips to make more productive requests from members of your data team: 

    Approach data with an open mind

    It’s important to treat the data request process as an open-ended investigation, not a way to find data that proves a point. A lot of unexpected insights can be found along the way. Make your goal to ask questions and let your data team search for the answers. This approach will allow you to get the best insights, the type of unknowns that could change your decision for the better. If you put limitations on what you’re asking the data, you’ll end up putting limitations on the insights you can get out of your inquiry. 

    To really dig into this, think about how questions are answered scientifically. Scientists treat any bias as a risk that will compromise the insight. For example, let’s say you are looking to improve customer satisfaction with your product. Requesting a list of customers with the highest and lowest NPS scores will give you a list of people who are happiest or most frustrated, but it is not going to let you know how to improve customer satisfaction. This request puts too much attention on the outliers in your customer base rather than identifying the key pain points. That’s part of the picture, but not all of it. If you’re trying to create a program that targets your goal, let your data team know the goal, give them a few optional starting points, and see what they come back with. They might surprise you with some sophisticated analysis that provides more insight and helps you launch a fantastic program. 

    Start with a conversation, not a checklist

    The single biggest mistake a line-of-business professional can make when requesting data is to present a data expert with a list of KPIs and tell the data team to just fill in the blanks. This approach misses so much of the value a data team can provide. Modern data teams have technology and abilities that allow them to go much further than just calculating numbers. They can guide analytical exploration through flexible, powerful tools to make sure you’re getting the most valuable insights out of your data.

    Instead of a list of metrics, think about starting your data request as a meeting. You can provide the business context needed and a list of questions that you want answered. You can even present some initial hypotheses about what those numbers may look like and why they might move in one direction or another. This is a great way to kick off the conversation with your data counterpart. From there, you can benefit from their experience with data to start generating new and more informed questions from their initial inquiries. The data team’s job is to get you information that helps you be more informed, so give them as much context as possible and let them work as a problem solver to find data-driven recommendations.

    Data should recommend actions, not just build KPI reports

    A lot of standard business KPIs measure the results of company efforts: revenue, lead conversion, user count, NPS, etc. These are important statistics to measure, but the people tracking them should be very clear that these numbers track how the company is moving, not why it is moving that way. To make these data points actionable, you need to take analysis further. Knowing that your NPS is going up or down is useless if it doesn’t inform a customer team about the next step to take. 

    A good data team will map important KPIs to other data and find connections. They’ll comb through information to find the levers that are impacting those important KPIs the most, then make recommendations about how to achieve your goals. When you get a list of levers, make sure to understand the assumptions behind the recommendations and then take the right actions. You can always go back to those KPI reports to test if the levers are having the intended effect.
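    As a rough sketch of that lever-hunting step, the snippet below ranks which operational metrics move most strongly with NPS in an invented weekly dataset. Correlation only suggests levers to test, not proven causes; it assumes pandas, and every column name and value here is hypothetical.

    ```python
    # A minimal sketch of mapping a KPI to candidate "levers": rank which
    # operational metrics move with NPS in hypothetical weekly data.
    # Correlation suggests levers to test, not proven causes.
    import pandas as pd

    weekly = pd.DataFrame({
        "nps":                [31, 35, 28, 40, 42, 38, 45, 44],
        "avg_response_hours": [26, 22, 30, 14, 12, 18,  9, 10],
        "bugs_open":          [80, 75, 90, 60, 55, 70, 40, 45],
        "feature_releases":   [ 1,  2,  0,  2,  3,  1,  3,  2],
    })

    levers = weekly.corr()["nps"].drop("nps").sort_values(key=abs, ascending=False)
    print(levers)  # strongest (absolute) correlates first, e.g. response time
    ```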

    Data requests are iterative: give the data person feedback

    Communication about data should not end when the data has been delivered to you. It’s important to dig into the analysis and see what you can learn. Instead of reporting that data or taking action on it right away, check with your dashboard creator to verify that you’re reading all of the data properly and that the next steps are clear. There are a lot of ways to misinterpret data; a good way to prevent mistakes is to keep communicating.

    Even if you’ve gotten the right takeaways from the data, it’s still good to consult with your dashboard creator and go over your interpretation of the information so they know how you read data. You may need a follow-up meeting to restart with the overall question you want to answer, then see what additional data needs to be collected or what modifications are needed to make the report or dashboard work best for your intended use-case.

    Author: Christine Quan

    Source: Sisense

  • An overview of the Chief Data Officer role, past and present

    An overview of the Chief Data Officer role, past and present

    The role of the Chief Data Officer (CDO) is evolving, as CDOs face quite possibly the most important and challenging year in the role’s nearly two-decade existence in meeting their organizations’ needs and building data capabilities. 2020 has become the year that everyone became fully aware of data and its role in our lives.

    As COVID-19 rapidly spread into a pandemic, business models were turned on their heads, supply chains dried up, and consumer behavior drastically altered. CDOs, largely the leaders of their organizations’ overall data strategy, needed to deliver new ways to harness data and deliver insights quickly to help the business adjust.

    In greater numbers than ever before, organizations worldwide are harnessing the data that flows into and out of their systems to accurately forecast sales, customer retention, new products, competitor performance, employee satisfaction, regulatory compliance, fraud reduction, and more. To put it simply: Data is a commodity that must be harnessed by your business, or you will be left behind.

    Thus, the CDO’s role has drastically evolved. No longer is the CDO only responsible for maintaining regulatory compliance and potentially increasing efficiencies – the CDO could be the most important member of the leadership team other than the CEO.

    History

    The first CDO appointment occurred in 2002, when Capital One appointed Cathy Doss to the position. The role was created largely in response to the Sarbanes-Oxley Act of 2002, itself a response to various financial scandals, and this new regulation required far more data governance and retention than ever before. Despite the newfound need for the role, its growth was relatively flat until 2008-2010.

    As recently as 2012, only 12% of large companies had a CDO. However, as many organizations realized the important role data plays in their business, those numbers began to rise sharply. In fact, by 2018, the share of organizations with an appointed CDO had risen to nearly 70%, and Gartner estimates that by 2025, 90% of large organizations will have a CDO.

    Evolution of the role

    Initially, the CDO role was created primarily in response to new federal finance laws, thus serving largely as a defensive role focusing on governance and regulation. However, as technology improvements in the form of hardware and software emerged, coupled with an expansion in data analytics, progressive executives noted the potential for offensive corporate data utilization. Soon these organizations were able to monetize the data they were already collecting, turning it into new efficiencies, productivity gains, and overall growth.

    For example, various departments’ data was previously siloed, meaning that product development data wasn’t necessarily available to customer support or marketing. Under the CDO leadership, this data now exists as a thread that weaves throughout the organization, connecting design engineers all the way through to the customer. The CDO now serves as the tip of the innovation spear and not simply as a data steward.

    Challenges and opportunities

    While the role of the CDO continues to evolve and serve their organizations in new ways, there remain challenges that must be addressed:

    • Who does the CDO report to? Ideally, the CDO will be equal on the executive team, but the organizational fit varies.
    • Stakeholder buy-in to both the usage of data and the role itself also varies greatly in different organizations.
    • Battles with the CIO. CIOs trend towards attempting to save money, while CDOs typically want to invest in new technologies.
    • High turnover. The average tenure of a CDO is two years. This may, however, come from CDOs going to where the grass is greener.
    • Clarity of mission. Only 28% of recent survey respondents say that the role is successful and established.
    • Data silos. Data is still extremely siloed and scattered in most organizations, and it can be difficult to bring it all together and, more importantly, understand it.

    But just as there are questions and challenges for CDOs, there are also opportunities the office of the CDO can now offer their organizations:

    • Revenue growth
    • Innovation
    • Fraud reduction
    • Enhanced data governance
    • Lower costs
    • Data literacy

    Changes for 2020

    Harnessing the power of data in digital transformation will be imperative for most organizations going forward. AI, ML, and data analytics are no longer buzzwords only for tech and finance, and every successful organization will pivot towards viewing data as an asset. COVID-19 has challenged everyone in all walks of life. Companies that have embraced data have been able to analyze their help desk data, VPN information, and other portions of their computing environments to determine which remote work policies are working and which are not.

    In the healthcare industry, the CDO office has provided information on the availability of personal protective equipment (PPE), beds, and staff to ensure adequate treatment is available for COVID-19 patients. Additionally, grocery chains have been taxed as never before, and data models provide valuable information on supply and demand and frontline grocery workers’ health.

    The post-COVID-19 world will offer opportunities due to the lessons learned and data ingested during the pandemic, such as enhanced digitization of workflows, robust disaster recovery plans, investment opportunities, and more. If 2020 has shown us anything, it is the power of responsible data collection and sharing. CDOs that leverage this new emphasis on data and invest in the future will steer their organizations to new heights while building robust plans for future opportunities as well as crises. This is a new era, an era where data is king, and the CDO will be a critical player in determining organizations’ success or failure.

    Author: John Morrell

    Source: Datameer

  • Automating your organization in a couple of months through 4 steps

    Automating your organization in a couple of months through 4 steps

    Make automation a reality in the next 60 to 90 days

    Over the past three years, companies that scaled intelligent automation across their enterprises outperformed their peers in profitability, revenue growth and efficiency. And that performance gap is expected to widen over the next three years.

    This probably doesn’t surprise you. After all, you’re reading this blog. It’s also not news to you that automation is difficult, especially in an enterprise.

    Learning about automation approaches is definitely important, and so is starting in the right place.

    To help make sure that you’re set up for success, we’ve put together four steps you can take in the next 60 to 90 days to make automation a reality in your business.

    Step 1: Determine your goal

    At the end of the day, what are you hoping to achieve with automation? Do you want to provide better customer support? Are you trying to improve business processes so that your workers can focus on higher-quality tasks? Do you want to enable your employees to complete their work in less time so that they can spend more time with their families?

    The more specific you can be, the better. For example, CDG Prévoyance didn’t just want to improve their customer experience; they wanted to “provide the best service to our customers in a shorter amount of time and in the most effective channel.” For ENN Group, it was to drive clean energy options that improve the quality of people’s lives.

    Working with a specific goal in mind will help you keep your team’s efforts focused. It will also set you up for long-term success.

    Step 2: Choose one area of your business

    Often, our first instinct is to choose a project that will help everyone. However, when you’re introducing automation into your business, it’s best to start small.

    Automation could probably improve every part of your business in some way, so you’ll have a lot of options in front of you. Maybe you’ll find that there’s a department that is completely overwhelmed with time-consuming, manual processes. Maybe there’s a team that needs to make big changes due to recent data privacy regulations. Or maybe there’s an area that has faced recent budget cuts and now has fewer people to help get work done.

    Most importantly, you’ll want to look for an area where the people on the team are excited about taking on this new project. You’ll need them to be your partners throughout the process.

    For example, the Administrative Office of the Courts in a southeastern state had an overburdened payment claims team. When they started their discovery process, they saw that more than 70,000 claims were on hold, mostly due to simple errors that had to be corrected through manual processes. For Turkcell, it was that their marketing team had limited time to review 7.9 million contracts and confirm that the contract data matched the information in their CRM system.

    In both of these companies, they knew the teams needed help and the team members were eager to start automating repetitive tasks.

    Step 3: Set yourself up for a quick win

    The unfortunate truth is that some people won’t be on board with your automation project until you show that it’s successful. So, plan with that reality in mind.

    You will have already set yourself up for success by choosing a single project with an engaged team. It’s also great to consider what you can easily measure – those numbers are going to be important as you prove the value of automation. You may also want to consider low-code technologies that are quick and easy to implement.

    For ENN Group, this meant implementing an automated financial assistant that could perform routine tasks like pulling reports or tracking monthly ledgers.

    Step 4: Monitor the progress

    Automation is an iterative process, and teams tend to see results very quickly. Be sure to track metrics from the start so you can see what’s going well and what you’d like to improve.

    For example, ENN Group’s virtual assistant was an immediate hit: it completed between 2,000 and 3,000 tasks every day, which reduced processing time by 60%. At the Administrative Office of the Courts, payment processing times dropped from 45-60 days to less than 10 days.

    As these results come in, you’ll start to have additional opportunities, either to expand the solutions you’ve implemented or to scope out additional ways that automation could help teams. And these could be found in surprising places. At the Administrative Office of the Courts, having an automated payment system facilitated the first raise for the state’s public defenders in a decade.

    Automation tips

    Here are a few additional tips that can help make automation a reality for your company:

    • Develop an internal methodology for automation projects
    • Learn how to avoid false starts
    • Remember that not everything should be automated

    Author: Naomi Sherry

    Source: IBM

  • BI as a driver of collaboration in your company

    BI as a driver of collaboration in your company

    Collaboration is a key driver of success in business and one of the hardest things to achieve in today’s work climate. Businesses are juggling remote workers, hybrid workers, and everything in between since the pandemic’s onset. Business intelligence can be crucial in creating collaboration.

    According to the U.S. Census Bureau and the Bureau of Labor Statistics, remote working between 2005 and 2017 went up by 159%; however, despite this, FlexJobs noted that only 3.4% of the workforce worked from home. Of course, this was all before COVID-19 hit businesses hard, forcing most non-essential workers to work remotely. And perhaps to everyone’s advantage, this trend doesn’t seem to be going anywhere too fast. A recent survey demonstrated that 82% of respondents enjoy working from home, and 66% felt they were more productive working remotely. It seems that the future of work has changed dramatically this year, forcing businesses to put various solutions in place to support the changes that COVID-19 brought about.

    According to Zapier CEO Wade Foster, who has hundreds of employees working remotely,  “companies who don’t have effective systems in place are winging it in a lot of areas right now. They’re going to have a hard time with this sudden transition. They are being thrust into an environment where they have no structure.” He told Computerworld that the “wrong type of management, misaligned culture, and lack of essential tools” could create a negative remote working environment.

    Business Intelligence can be the catalyst

    One of the “essential tools” for creating a successful, collaborative work environment is an automated, centralized, cloud-based Business Intelligence platform that is utilized across the organization’s BI landscape. Today, businesses depend on data to make important decisions. Gone are the days when only business analysts accessed data; today, the entire organization, or anyone who needs to make a decision, whether big or small, needs access to the same data to collaborate with team members immediately. The data exists in silos and needs one centralized solution that provides visibility across the entire landscape. A cloud-based, centralized Business Intelligence tool provides the internal framework where everyone has access to the same semantically defined, usable and trustworthy data right off the bat.

    This has been particularly useful when collaboration is required to solve a business problem for a large organization. So much time was wasted in the past, and projects were stalled by the “black hole” of manually discovering, organizing, preparing, and cleaning data for analysis. Sometimes this manual work would take months. The automation of metadata lets people get on with collaboration by accessing the same data across the entire company. A person working in California will have the same data that a remote worker in Australia will have, and if they are on the same team, they are looking at the same information and able to collaborate on how to proceed to the next step.

    It is now more crucial than ever to have a centralized, automated business intelligence platform. As workers become more scattered across the world, data is becoming more centralized, creating a collaborative environment within the remote workspace. The ability to trace the origin of metadata with a view of the entire data lineage, and to locate data immediately through data discovery, will significantly speed up decision-making by providing the relevant information right away, creating the internal framework of a collaborative workforce.

    Who knew that COVID-19 would change the way businesses operate forever, making us more remote yet more cohesive and collaborative as the nuts and bolts of the company are forced to centralize? Companies worldwide are unifying through centralized data, having the right information no matter where they are.

    Author: Amnon Drori

    Source: Dataconomy

  • Building blocks to help you create the best possible IT team

    Building blocks to help you create the best possible IT team

    There are three key attributes high-functioning IT team members share that are identifiable in your initial interviews: Desire, willingness, and "teachability".

    There’s been a lot written about the importance of teamwork over the years. Entire books (and for some writers, entire careers) have been based upon what it takes to create synergy among disparate team members. But what business leaders often neglect to highlight is the importance of bringing the right people together as a first step to generating great teamwork. For so many disciplines, having the right talent in the room is the critical first factor in determining team success. And this is particularly true with information technology teams. Here’s why:

    IT is notorious for being over time and over budget. The reason almost always boils down to not having the right people on the job, because IT departments often dive straight into assembling teams from a project idea rather than from a project description. These are two very different things.

    Developing a detailed project description at the outset often makes the difference between a project capable of staying on time and on budget, and one that is not. Once you have a project description, it’s easy to identify the specific mix of job descriptions needed. If you dive into assembling the team before going through these two steps, you'll forever be trying to reverse-engineer the ideal IT project team, and that is a recipe for failure every time.

    The amount of time you spend thinking through the nuances of each job description will be rewarded by increased team efficiency in the long run. So, spend a lot of time and energy here, and you will be rewarded with a significantly higher functioning team. Once you have an accurate project description and accompanying job descriptions, it’s time to start recruiting your team. And when it comes to hiring, process beats personality seven days a week.

    What attributes does a strong IT team need?

    There are three key attributes high-functioning IT team members share that are identifiable in your initial interviews: Desire, willingness, and “teachability.”

    The key with desire is that you aren't just looking for people who want to win, but for people who believe they are going to win no matter what. They don't get out of bed to lose. They may not know exactly how, but they’re confident they and their team are going to win. When you can find these people and surround them with others who have the same winning style and energy, you are going to get it done.

    When it comes to willingness, you have to identify individuals who you know have the conviction and strength of character to do whatever it might take to succeed (within ethical boundaries, of course). Look for potential team members who have demonstrated their ability to go beyond the ordinary to get a job done, including things that at first might seem counterintuitive, uncomfortable or even unbelievable. Project teams that include these individuals will exceed expectations.

    A third, less tangible attribute is what I call “teachability.” This may be the most difficult attribute to detect, but the most important to find. Team members must be naturally teachable, willing to change, and open to training and new ideas. If they are not, your team will face unnecessary slowdowns, and overall progress will be hampered.

    The right stuff, combined with collaboration and reward

    Even a team armed with the right skills and attributes can fall victim to dissent and stubbornness. These are the progress killers you’ll need to eliminate, and they can be a challenge to avoid in a group of highly talented individuals. Dissension and lack of collaboration occur when someone must be right, and in IT "right" is not always an option. Teams that function without common goals struggle to overcome challenges, because self-centered motivators lead to dissension. 

    To keep a group of high-functioning individuals centered, you need to energize the team with a shared core goal that unifies everyone and everything. When problems arise, and they will, a unified core goal will provide the single touchstone needed to bring everyone back to center.

    In addition to helping keep everyone centered, establishing a core goal provides measurement and accountability that can be used to determine success and most of all, rewards. We all know that financial incentives, while important, are not the greatest reward in the minds of IT professionals.

    The best way to reward project teams and employees is through something tangible, given with public recognition so others understand the value of the team. I don't care if it is an engraved plaque on the wall, a shirt with a cool logo, or a challenge coin. A unique keepsake reward combined with praise in front of colleagues goes a very long way. When you reward a team in this manner, team members walk away legitimately happy with something they don't want to toss aside or minimize, because it is an ongoing reminder of the recognition they received. 

    On their own, following these guidelines may not ensure you’ll have a winning project team, but neglecting to follow them will absolutely guarantee failure. There is no other way to build a great IT team than by first looking for the attributes required to achieve your goal.

    Author: HK Bain

    Source: InformationWeek

  • Cognitive diversity to strengthen your team

    Cognitive diversity to strengthen your team

    Many miles of copy have been written on the advantages of diverse teams. But all too often this thinking is only skin deep. That is, it focuses on racial, gender, and sexual-orientation diversity.

    There can be a lot more benefit in having team members who actually think differently. This is what is called cognitive diversity. I’ve seen that in both the teams I’ve led and at my clients’ offices. So, when blogger Harry Powell approached me with his latest book review, I was sold.

    Harry is Director of Data Analytics at Jaguar Land Rover. He has blogged previously on the Productivity Puzzle and an Alan Turing lecture, amongst other topics. So, over to Harry to share what he has learnt about the importance of this type of diversity.

    Reading about Rebel Ideas

    I have just finished reading 'Rebel Ideas' by Matthew Syed. It’s not a long book, and hardly highbrow (anecdotes about 9/11 and climbing Everest, you know the kind of thing) but it made me think a lot about my team and my company.

    It’s a book about cognitive diversity in teams. To be clear that’s not the same thing as demographic diversity, which is about making sure that your team is representative of the population from which it is drawn. It’s about how the people in your team think.

    Syed’s basic point is that if you build a team of people who share similar perspectives and approaches the best possible result will be limited by the capability of the brightest person. This is because any diversity of thought that exists will essentially overlap. Everyone will think the same way.

    But if your team comprises people who approach problems differently, there is a good chance that your final result will incorporate the best bits of everyone’s ideas, so the worst possible result will be that of the brightest person, and it will normally end up being a lot better. This is because the ideas will overlap less, and so complement each other (see note below).

    Reflections on why this is a good idea

    In theory, I agree with this idea. Here are a few reflections:

    • The implication is that it might be better to recruit people with diverse perspectives and social skills than to simply look for the best and brightest. Obviously bright, diverse and social is the ideal.
    • Often a lack of diversity will not manifest itself so much in the solutions to the questions posed, but in the selection or framing of the problems themselves.
    • Committees of like-minded people not only water down ideas, they create the illusion of a limited feasible set of problems and solutions, which is likely to reduce the confidence of lateral thinkers to speak up.
    • Strong hierarchies and imperious personalities can be very effective in driving efficient responses to simple situations. But when problems are complex and multi-dimensional, these personalities can force through simplistic solutions with disastrous results.
    • Often innovation is driven not simply by the lone genius who comes up with a whole new idea, but by combining existing technologies in new ways. These new 'recombinant' ideas come together when teams are connected to disparate sets of ideas.

    All this points towards the benefits of having teams made up of people who think differently about the world. But it poses other questions.

    Context guides the diversity you need

    What kinds of diversity are pertinent to a given situation?

    For example, if you are designing consumer goods, say mobile phones, you probably want a cross-section of ages and gender, given that different ages and genders may use those phones differently: My kids want to use games apps, but I just want email. My wife has smaller hands than me, etc.

    But what about other dimensions like race, or sexual preference? Are those dimensions important when designing a phone? You would have thought that the dimension of diversity you need may relate to the problem you are trying to solve.

    On the other hand, it seems that the most important point of cognitive diversity is that it makes the whole team aware of their own bounded perspectives, that there may be questions that remain to be asked, even if the demographic makeup of your team does not necessarily span wide enough to both pose and solve issues (that’s what market research is for).

    So, perhaps it doesn’t strictly matter if your team’s diversity is related to the problem space. Just a mixture of approaches can be valuable in itself.

    How can you identify cognitive diversity?

    Thinking differently is harder to observe than demographic diversity. Is it possible to select for the former without resorting to selecting on the latter?

    Often processes to ensure demographic diversity, such as standardised tests and scorecards in recruitment processes, promote conformity of thought and work against cognitive diversity. And processes to measure cognitive diversity directly (such as aptitude tests) are more contextual than are commonly admitted and may stifle a broader equality agenda.

    In other words, is it possible to advance both cognitive and demographic diversity with the same process?

    Even if you could identify different thinkers, what proportion of cognitive diversity can you tolerate in an organisation that needs to get things done?

    I guess the answer is the proportion of your business that is complex and uncertain, although a key trait of non-diverse businesses is that their self-assessment of their need for new ideas will be limited by their own lack of perspective. And how can you reward divergent thinkers?

    Much of what they do may be seen as disruptive and unproductive. Your most obviously productive people may be your least original, but they get things done.

    What do I do in my team?

    For data scientists, you need to test a number of skills at interview. They need to be able to think about a business problem, they need to understand mathematical methodologies, and they need to be able to code. There’s not a lot of time left for assessing originality or diversity of thought.

    So what I do is make the questions slightly open-ended, maybe a bit unconventional, certainly without an obviously correct answer.

    I expect them to get the questions a bit wrong. And then I see how they respond to interventions. Whether they take those ideas and play with them, see if they can use them to solve the problem. It’s not quite the same as seeking out diversity, but it does identify people who can co-exist with different thinkers: people who are open to new ways of thinking and try to respond positively.

    And then try to keep a quota for oddballs. You can only have a few of them, and they’ll drive you nuts, but you’ll never regret it.

    EndNote: the statistical appeal of Rebel Ideas

    Note: This idea appeals to me because it has a nice machine learning analogue. In a regression you want your information sets to be different, ideally orthogonal. If your data is collinear, you may as well have just one regressor.

    Equally, ensembles of low performing but different models often give better results than a single high-performing model.
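    To illustrate that analogue (my own sketch, not from the book), the snippet below scores three deliberately different weak models and then a soft-voting ensemble of them on a synthetic classification task. When the individual models err in different ways (low error correlation), the team score typically matches or beats the best individual; it assumes scikit-learn, and the dataset is synthetic.

    ```python
    # A minimal sketch of the ensemble analogue: several different weak models,
    # averaged together, can match or beat the single best model.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    diverse = [
        ("lr", LogisticRegression(max_iter=1000)),    # linear view of the data
        ("nb", GaussianNB()),                         # probabilistic view
        ("dt", DecisionTreeClassifier(max_depth=3)),  # rule-based view
    ]

    for name, model in diverse:
        print(name, cross_val_score(model, X, y, cv=5).mean())

    # Soft-vote the "team": the models complement each other where their
    # errors do not overlap.
    team = VotingClassifier(diverse, voting="soft")
    print("team", cross_val_score(team, X, y, cv=5).mean())
    ```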

    Author: Paul Laughlin

    Source: Datafloq

  • Communicating competitive intelligence insights in 5 steps

    Communicating competitive intelligence insights in 5 steps

    You work hard to stay on top of your competitors’ moves. You sift through big and small changes alike and analyze the data to identify trends and strategies. But then what do you do with the data? One of the biggest challenges with making use of competitive intelligence (CI) is distributing that intel to the relevant stakeholders within your organization so that your team can take action on your competitive insights. 

    The teams who are most successful at communicating competitive intelligence have identified relevant communication channels, established a regular cadence for distribution, and, of course, crafted their CI updates to deliver immense value in a compact package.

    Borrowing from the ideas of great CI updates, we’ve created a template for distributing competitive intelligence. Read on for recommendations to create your own.

    1: Tailor Deliverables to Your Stakeholders 

    The first step is to identify your competitive intelligence stakeholders. Each stakeholder within your organization has their own set of priorities. Because each team within your organization is responsible for their own objectives, the information that you will be delivering to each team will differ in type of content, deliverable type, and distribution channel. No single CI report or channel will work for every audience, so think about how you can customize your update to each group.

    Sales wants to know how to win more deals  and position your solution against alternatives. An example of a great deliverable for your sales team is a competitive battlecard, which can be updated in real-time, and lives either in your CRM or competitive intelligence platform. 

    Marketing wants to know how to create differentiated messaging, impactful content, and stand out against competitors in a crowded market. An example of a great deliverable for marketing is a competitive landscape snapshot or a competitive newsletter that highlights your competitor’s key messaging changes, market information, and latest content or campaign information. 

    Product wants to know what your competitors are doing to improve their own offerings and solve your market's problems. An example of a deliverable for your product team is a product sheet (or a one-pager, if you want to give a product overview). This allows your product team to gain insight into features, benefits, product updates, and even feedback from your competitor’s customers, which will then help your product team iterate on their strategy. 

    Executives don’t have a lot of time to dedicate to CI, so the simpler the deliverable, the better. An example of a great deliverable for your executive team is an executive dashboard. This is impactful for your executives to get a snapshot of major market shifts happening in real-time. Think of this as a command center for your executives to keep a pulse on the market.

    While these are just some examples of competitive intelligence deliverables, this gives you insight into how you can tailor your CI deliverables to each stakeholder within your organization.

    2: Make it Digestible

    No matter which deliverable you’re creating or who you’re communicating it to, you want to ensure that you’re making the content digestible. There is a lot of competitive intelligence data out there, and it’s easy to get lost in it. That can lead to your team not having enough time to act on the intel or, even worse, starting to ignore it altogether.

    A great way to make sure your entire organization is getting competitive intelligence delivered to them is by creating a competitive intel digest or newsletter. These newsletters should be digestible to encourage others to consistently review and evaluate the findings. Keep the digests short and focused and leverage formatting as well to keep the content skimmable. These can be sent out at whatever cadence works well for your team, whether that be daily, weekly, or monthly. 

    3: Answer “So What”

    In order for competitive intel to have an impact on your team, you need to help bridge the gap between what happened and why it matters. With every piece of intel you plan to share, ask and answer the question, “So what?” because that’s exactly what others will be thinking. Think of each piece of intel going through the following path: What Happened -> What It Means / Why It Matters -> What We Should Do About It. 

    The worst thing that happens with competitive intelligence information is that it doesn’t get used. This often happens when the “so what” goes unanswered. When you’re creating your competitive intelligence deliverables, you want to be sure that there is a purpose behind every insight you’re including. The more impact an insight has, the more likely your team is to leverage that information. 

    4: Keep up a Regular Cadence of Communication

    Competitive intelligence is like eating healthy or going to the gym - you need to do it consistently over a long period of time to see the impact. That means whatever cadence you choose for CI should be maintained. This allows you to take advantage of both short-term opportunities and long-term trends. If you’re not sure how often to leverage each method of communication, here is a simple list. 

    • Email - Daily, Weekly, or Monthly
    • Meetings - Weekly or Monthly
    • Chat App / Slack - Daily
    • Wiki / Intranet - Weekly or Monthly
    • CRM or Competitive Intelligence Platform - Daily

    5: Look at the Short-Term and Long-Term Objectives

    Competitive intelligence isn’t a one-and-done type of initiative. CI is a long-term, ongoing process. While there are many short-term goals that can be accomplished with CI, there are long-term wins as well. With a particular CI update cadence, you can risk boxing yourself into the intel delivered in that timeframe. Be sure to take a step back and identify longer-term trends to watch, and take a deep dive into the long-term view from time to time.

    If you want to make an impact with your competitive intelligence program, ensure that you’re effectively communicating your data across your organization. Following these suggestions, you’ll be able to tailor CI to your stakeholders, create impactful deliverables, and communicate in a timely manner. Once your entire organization has a pulse on the competition, you’ll be able to level-up your strategy and gain a strong competitive advantage. 

    Source: Crayon

    Author: Emily Dumas


  • Comparing BI to ERP, which analytical tool fits your purpose?

    Comparing BI to ERP, which analytical tool fits your purpose?

    Since its inception in 1997, Business Intelligence (BI) has been compared to a number of analytical tools, from spreadsheets, to Customer Relationship Management (CRM) tools, to Enterprise Resource Planning (ERP) software. For some analytical tools, BI wins by a mile – yes, we're specifically talking about spreadsheets. For others, there is no clear winner.

    In this blog, we’d like to discuss ERP and its connection with BI to help you determine what works for your business.

    What is Enterprise Resource Planning (ERP)?

    Every department within an organisation typically has its own computer system, optimised to suit the way that department operates. ERP essentially combines these multiple systems into a single, integrated software program that runs off a single database. Of course, this single integrated program is then divided into software modules that are replicas of their older standalone counterparts.

    Gartner states: “ERP tools share a common process and data model, covering broad and deep operational end-to-end processes such as those found in finance, HR, distribution, manufacturing, service and the supply chain”.

    What are the key differences between BI and ERP?

    Strategic-level vs. operational-level analytics

    BI tools are typically leveraged by analysts for high-level discussions which involve strategic decisions. A BI tool accesses all of the data in your data warehouse, both strategic (financial: revenue, profit and growth) and operational (daily sales performance). BI tools enable you to conduct in-depth analyses to generate comprehensive information that can deliver high-level insights. Essentially, BI is a step towards a companywide view of information.

    ERP, on the other hand, is an operational system chock full of operational and transactional data. It will give you an exact view of your business from an operational perspective, but it is not built to perform trend analyses or give you high-level overviews. It is a tool centred around delivering operational insights.

    OLAP vs. OLTP system

    BI is built as an Online Analytical Processing system (OLAP), to provide robust analytical capabilities, such as high-speed access to reports, dashboard management and the development of balanced scorecards. BI also comes with advanced analytical features that allow you to view data from different sources on one page, and in the format or perspective you need.

    ERP, on the other hand, is an Online Transaction Processing system (OLTP), used to record transactions as and when they take place. The data architecture of ERP software is designed to provide high-speed transaction recording, while keeping data space utilization at a minimum.
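
    To make the OLAP-versus-OLTP distinction concrete, here is a minimal sketch in Python using an in-memory SQLite database and an invented `sales` table (both purely illustrative): the ERP side records transactions one small write at a time, while the BI side aggregates across the accumulated history.

    ```python
    import sqlite3

    # Toy schema standing in for an ERP's transactional store (illustrative only).
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL, sold_on TEXT)"
    )

    # OLTP-style work: record each transaction as it happens, one small, fast write at a time.
    conn.execute(
        "INSERT INTO sales (region, amount, sold_on) VALUES (?, ?, ?)",
        ("EMEA", 1250.00, "2021-03-01"),
    )
    conn.commit()

    # OLAP-style work: a BI tool sweeps the accumulated history into an aggregated view.
    rows = conn.execute(
        """
        SELECT region, SUM(amount) AS revenue, COUNT(*) AS orders
        FROM sales
        GROUP BY region
        ORDER BY revenue DESC
        """
    ).fetchall()
    print(rows)
    ```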

    Agility vs. efficiency

    Over the past years, there has been a shift of focus in BI - organizations are moving from historical reporting to forecasting and forward planning. Through these future-centric capabilities, BI can help organizations become more agile, allowing them to make strategic-level decisions that take advantage of future conditions.

    ERP software, on the other hand, is built to deliver efficiencies to an organization. These efficiencies come in many forms: better interdepartmental communication, IT cost savings and business process efficiencies. Both Gartner and CIO.com believe that proper ERP implementation can improve an organisation’s overall performance.

    What does this mean for you?

    Before you begin choosing tools, first determine your organisation’s objectives. Once you know what you are trying to achieve, you can identify the right approach to help you achieve it.

    If you already have an in-depth understanding of your operational performance, look to BI to obtain strategic-level insights into your performance. If, on the other hand, you need a better understanding of your operational performance and need to make operational improvements, ERP is the tool you need.

    Ultimately, both tools are geared towards business improvement and can deliver significant results.

    Source: Phocas Software

  • Continuous improvement requires continuous intelligence

    Continuous improvement requires continuous intelligence

    Business leaders must take the initiative to leverage their data using new technologies and approaches to adapt and succeed in the digital world.

    The digital age has presented businesses with a significant challenge: adaptation. Organizations can only hope to survive in this new era if they are able to adapt to the new reality of doing business.

    For the past few years, adaptation efforts have fallen under the umbrella of digital transformation. It is now widely understood that organizations must engage the groundswell of digital data and refine it into a byproduct that can inform decisions or instantaneous actions. However, because digital data flows continuously, the data engagement model should also be continuous, leveraging advances in machine learning, AI, IoT, and analytics. This sort of continuity will catalyze organizations to adapt and thrive in the new digital reality.

    Continuous intelligence

    Because IT has historically focused on batch processing, the concept of continuous processing is fairly new to most organizations. Continuous intelligence waits for nothing. Not data collection periods, not resource availability, not processing time. It is the non-stop generation of insight and actions based on operational data stores as well as streams of data and events generated in the moment. It is the ability to harness an ever-changing environment where the data is constantly flowing and the insights and actions are perishable.

    According to Gartner, success can only be achieved in a world that is constantly changing by implementing a continuous approach. Gartner suggests that continuous intelligence is at the heart of fast-paced digital business and process optimization. However, continuous intelligence is not only about IT architectures. Successful implementation requires a change in managerial approach as well.

    New leadership approach

    Conway’s Law gave us the insight that system designs reflect the communication structures of the organizations that design them. Because designing continuous intelligence requires new architectures, it is critical that the organizations designing them reflect the architectural intent.

    Most organization structures today assume they are performing in a batch-processing world. One team works to complete a task before handing it off to the next team; there is no continuity of visibility or activity. Initiating a continuous intelligence effort with the limitations inherent in a batch-processing management model will produce feet but no wings.

    To fully implement the continuous intelligence approach, business leaders need to adopt agile management methodologies. Just as the DevOps world engages continuous integration across teams, so must the larger IT organization engage in a more active and constant way. The rate of engagement is necessarily radically higher; that is the only way for the broader team to understand what’s going on in the organization. This approach will facilitate initial success and be the foundation for staying ahead in an era of new, dynamic technologies and continuous change.

    The need for speed

    One of the fundamental changes to the IT stack required for continuous intelligence is a new data processing layer designed to perform at extremely low levels of latency. Regardless of whether the data already exists in operational data stores or arrives in event-based streams, the concept of continuity is at odds with latency.

    Our traditional systems of record do not have this design point, nor should we expect them to. They will continue to do their job well while a new, complementary data processing layer is added.

    Innovations in IoT, machine learning, and AI assume both constancy and immediacy. Business value has become inextricably linked with real-time action. New applications require speed and scalability in the underlying data processing to produce responses as well as to 'feed the beast' to inform models. The money is in the microseconds, whether the data is at rest or in motion.
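
    To illustrate the difference in spirit between batch and continuous processing, here is a deliberately tiny Python sketch (the 60-second window and the event values are invented): each incoming event updates a rolling insight the moment it arrives, rather than waiting for a batch window to close.

    ```python
    from collections import deque
    import time

    # Rolling 60-second window of (timestamp, value) events; the window size is invented.
    window = deque()

    def on_event(value, now=None):
        """Update the rolling view the moment an event arrives."""
        now = time.time() if now is None else now
        window.append((now, value))
        while window and window[0][0] < now - 60:  # evict events older than the window
            window.popleft()
        return sum(v for _, v in window) / len(window)

    # Each event yields a fresh, immediately usable insight; no batch cycle to wait for.
    for reading in [10, 12, 11, 40]:
        print(on_event(reading))
    ```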

    Digitization has permanently changed the business landscape. Continuous intelligence is achievable. Business leaders must take the initiative to leverage their data through new technologies and approaches to adapt and succeed in the digital world.

    Author: Kelly Herrell

    Source: TDWI

  • Democratized software: a tool for all parts of a business

    Democratized software: a tool for all parts of a business

    We are witnessing the next wave of software: one that democratizes its use and allows all parts of a business to participate in making it work.

    My background in consulting gave me the opportunity to work with many unique businesses in diverse industries and geographies. Despite their differences, they all wanted to solve the same challenge: how to serve customers and accelerate growth.

    Today I lead one of those companies, and not surprisingly I find myself with the same needs my clients had years ago. I, too, want to serve our customers better and grow our business faster and am looking for the smartest and most innovative ways to do so.

    My time in consulting coincided with the rapid rise of application software and I lived through the transition from on-premise to cloud-based software. While the hosting mechanics changed, the software was similar. In both worlds, CIO shops and large consulting organizations configured application software and designed best-practice business processes to drive adherence to internal process and policy. Process design decisions were controlled centrally and rolled out to business users. A focus on the customer was often secondary to consistency, efficiency and adherence.

    Democratized software, by contrast, allows more people, especially those on the front-line with customers, to own and solve their customers’ needs. Business users can adapt and extend the core business processes without disrupting business and IT integrity. Using low-code (or 'average-joe code') applications, businesspeople can create workflows and business processes without depending on their IT organizations. Problem solving is placed in the hands of the people who identified and live with their customers’ needs and empowers them to act faster.

    We saw this transition happen over the past decade with websites. IT teams grew tired of managing the dynamic nature of websites and put the power to change them in the hands of marketing teams. The technology shift from (for example) hand-coded JavaScript to WordPress accelerated this change. We now see the opportunity to achieve this same dynamic with other business functions. Whether the software is labeled low-code, forms management, workflow automation, robotic process automation (RPA), or any other number of things, business users can essentially build applications to manage their work without negotiating for, and consuming, precious IT resources.

    As a result, enterprise software is changing and becoming more powerful and flexible for its end-users. At the same time, CIOs and their teams are more comfortable empowering business users to solve for themselves while maintaining the controls and governance to protect the enterprise. Luckily, this flatter and more dynamic environment is one embraced (and even insisted upon) by the new generation of employees and leaders.

    There is another aspect of traditional application software that’s being challenged. For as long as I have worked with CIOs and business leaders, there has been a debate between choosing best-of-breed point solutions vs. fully integrated packages. People often want the advanced functionality associated with best-of-breed, but they don’t want the headaches associated with building and maintaining the integration of these solutions.

    In the old days, larger enterprises moved toward integrated ERP packages to avoid these integration challenges. This left them with less flexible software designed primarily to drive process adherence and control with its users. ERP is about managing the transactions and data associated with core business processes. These core processes don’t embrace the constantly changing needs of customers. They aren’t inherently customer centric. It’s all about consistency, efficiency and process adherence. Today, businesses need to adapt to the needs of their customers, and to connect these core business processes more directly to their customers.

    There is a great consumer parallel from the media industry, where customers have subscribed to Comcast or DirecTV for their content needs. These companies assume the challenge of aggregating access and content. The advancement of technology (specifically network wireless access and bandwidth) now allows consumers to curate their own content interests across a broader network. Consumers can subscribe to the content that matters to them, such as ESPN, HBO, Disney+, and Netflix, and often pay less than their current subscription contracts. The same is possible with today’s flexible SaaS software. CIOs can buy multiple packages and allow business owners to solve customer needs in the field. The advancement in technology makes integrations and information security much easier than before while providing this flexibility.

    We are witnessing the next wave of software: one that democratizes its use and allows all parts of a business to participate in making it work. As this happens, we will witness software segment convergence that brings many software categories together to engage customers more completely, and that allows enterprises to adapt and flex to a dynamic customer experience.

    Author: David T. Roberts

    Source: Informationweek

  • Different Roles in Data Science

    Different Roles in Data Science

    In this article, we will have a look at five distinct data careers, and hopefully provide some advice on how to get one's feet wet in this convoluted field.

    The data-related career landscape can be confusing, not only to newcomers, but also to those who have spent time working within the field.

    Get in where you fit in. Focusing on newcomers, however, I find from the requests I receive from those interested in joining the data field in some capacity that there is often (and rightly) a general lack of understanding of what one needs to know in order to decide where it is that they fit in. In this article, we will have a look at five distinct data career archetypes, and hopefully provide some advice on how to get one's feet wet in this vast, convoluted field.

    We will focus solely on industry roles, as opposed to those in research, so as not to add an additional layer of complication. We will also omit executive-level positions such as Chief Data Officer and the like, mostly because if you are at the point in your career where such a role is an option for you, you probably don't need the information in this article.

    So here are 5 data career archetypes, replete with descriptions and information on what makes them distinct from one another.

    [Figure: flowchart for navigating the data career landscape. Source: KDnuggets]

    Data Architect

    The data architect focuses on engineering and managing data stores and the data that reside within them.

    The data architect is concerned with managing data and engineering the infrastructure which stores and supports this data. There is generally little to no data analysis needing to take place in such a role (beyond data store analysis for performance tuning), and the use of languages such as Python and R is likely not necessary. An expert level knowledge of relational and non-relational databases, however, will undoubtedly be necessary for such a role. Selecting data stores for the appropriate types of data being stored, as well as transforming and loading the data, will be necessary. Databases, data warehouses, and data lakes; these are among the storage landscapes that will be in the data architect's wheelhouse. This role is likely the one which will have the greatest understanding of and closest relationship with hardware, primarily that related to storage, and will probably have the best understanding of cloud computing architectures of any role in this article as well.

    SQL and other data query languages — such as Jaql, Hive, Pig, etc. — will be invaluable, and will likely be some of the main tools of the data architect's ongoing daily work after a data infrastructure has been designed and implemented. Verifying the consistency of this data as well as optimizing access to it are also important tasks for this role. A data architect will have the know-how to maintain appropriate data access rights, ensure the infrastructure's stability, and guarantee the availability of the housed data.
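
    As an illustration of the consistency-verification side of the role, here is a hedged sketch in Python; the `warehouse.db` file and the `orders` and `customers` tables are hypothetical stand-ins for whatever stores the architect actually manages.

    ```python
    import sqlite3

    # Hypothetical warehouse file; 'orders' and 'customers' are stand-in tables.
    conn = sqlite3.connect("warehouse.db")

    # Referential-integrity spot check: orders pointing at customers that don't exist.
    orphans = conn.execute(
        """
        SELECT o.order_id
        FROM orders AS o
        LEFT JOIN customers AS c ON o.customer_id = c.customer_id
        WHERE c.customer_id IS NULL
        """
    ).fetchall()

    if orphans:
        print(f"{len(orphans)} orders reference missing customers")
    ```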

    This is differentiated from the data engineer role by focus: while a data engineer is concerned with building and maintaining data pipelines (see below), the data architect is focused on the data itself. There may be overlap between the 2 roles, however: ETL, any task which transforms or moves data (especially from one store to another), and starting data on a journey down a pipeline.

    Like other roles in this article, you might not necessarily see a "data architect" role advertised as such, and might instead see related job titles, such as:

    • Database Administrator
    • Spark Administrator
    • Big Data Administrator
    • Database Engineer
    • Data Manager

    Data Engineer

    The data engineer focuses on engineering and managing the infrastructure which supports the data and data pipelines.

    What is the data infrastructure? It's the collection of software and storage solutions that allow for the retrieval of data from a data store, the processing of data in some specified manner (or series of manners), and the movement of data between tasks (as well as the tasks themselves) as data makes its way to analysis or modeling, along with the tasks which come after that analysis or modeling. It's the pathway that the data takes as it moves along its journey from its home to its ultimate location of usefulness, and beyond. The data engineer is certainly familiar with DataOps and its integration into the data lifecycle.
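
    As a sketch of what "moving data between tasks" can look like at its very simplest, here is a toy extract-transform-load pipeline in Python; the file names, columns, and cleaning rule are all invented for illustration.

    ```python
    import csv

    # File names, columns, and the cleaning rule below are invented for illustration.
    def extract(path):
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def transform(rows):
        for row in rows:
            row["amount"] = float(row["amount"])  # coerce types on the way through
            if row["amount"] > 0:                 # drop obviously bad records
                yield row

    def load(rows, out_path):
        rows = list(rows)
        if not rows:
            return
        with open(out_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)

    # The pipeline: data moves from its home, through tasks, to where it is useful.
    load(transform(extract("raw_sales.csv")), "clean_sales.csv")
    ```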

    From where does the data infrastructure come? Well, it needs to be designed and implemented, and the data engineer does this. If the data architect is the automobile mechanic, keeping the car running optimally, then data engineering can be thought of as designing the roadway and service centers that the automobile requires to both get around and to make the changes needed to continue on the next section of its journey. The pair of these roles are crucial to both the functioning and movement of your automobile, and are of equal importance when you are driving from point A to point B.

    Truth be told, some of the technologies and skills required for data engineering and data management are similar; however, the practitioners of these disciplines use and understand these concepts at different levels. The data engineer may have a foundational knowledge of securing data access in a relational database, while the data architect has expert level knowledge; the data architect may have some understanding of the transformation process that an organization requires its stored data to undergo prior to a data scientist performing modeling with that data, while a data engineer knows this transformation process intimately. These roles speak their own languages, but these languages are more or less mutually intelligible.

    You might find related job titles advertised for such as:

    • Big Data Engineer
    • Data Infrastructure Engineer

    Data Analyst

    The data analyst focuses on the analysis and presentation of data.

    I'm using data analyst in this context to refer to roles related strictly to the descriptive statistical analysis and presentation of data. This includes the preparation of reporting, dashboards, KPIs, business performance metrics, as well as encompassing anything referred to as "business intelligence." The role often requires interaction with (or querying of) databases, both relational and non-relational, as well as with other data frameworks.

    While the previous pair of roles were related to designing the infrastructure to manage and facilitate the movement of the data, as well as managing the data itself, data analysts are chiefly concerned with pulling from the data and working with it as it currently exists. This can be contrasted with the following 2 roles, machine learning engineers and data scientists, both of which focus on eliciting insights from data above and beyond what it already tells us at face value. If we can draw parallels between data scientists and inferential statisticians, then data analysts are descriptive statisticians; here is the current data, here is what it looks like, and here is what we know from it.

    Data analysts require a unique set of skills among the roles presented. Data analysts need to have an understanding of a variety of different technologies, including SQL & relational databases, NoSQL databases, data warehousing, and commercial and open-source reporting and dashboard packages. Along with having an understanding of some of the aforementioned technologies, just as important is an understanding of the limitations of these technologies. Given that a data analyst's reporting can often be ad hoc in nature, knowing what can and cannot be done without spending an inordinate amount of time on a task prior to coming to this determination is important. If an analyst knows how data is stored, and how it can be accessed, they can also know what kinds of requests — often from people with absolutely no understanding of this — are and are not serviceable, and can suggest ways in which data can be pulled in a useful manner. Knowing how to quickly adapt can be key for a data analyst, and can separate the good from the great.
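
    A minimal example of the descriptive, face-value analysis described above might look like the following pandas sketch; the `orders.csv` export and its columns are hypothetical.

    ```python
    import pandas as pd

    # Hypothetical export pulled from the warehouse.
    orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

    # Descriptive statistics: what the data says at face value, month by month.
    monthly = (
        orders.assign(month=orders["order_date"].dt.to_period("M"))
        .groupby("month")["amount"]
        .agg(["count", "sum", "mean"])
    )
    print(monthly)
    ```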

    Related job titles include:

    Machine Learning Engineer

    The machine learning engineer develops and optimizes machine learning algorithms, and implements and manages (near) production level machine learning models.

    Machine learning engineers are those crafting and using the predictive and correlative tools used to leverage data. Machine learning algorithms allow for the application of statistical analysis at high speeds, and those who wield these algorithms are not content with letting the data speak for itself in its current form. Interrogation of the data is the modus operandi of the machine learning engineer, but with enough of a statistical understanding to know when one has pushed too far, and when the answers provided are not to be trusted.

    Statistics and programming are some of the biggest assets to the machine learning researcher and practitioner. Maths such as linear algebra and intermediate calculus are useful for those employing more complex algorithms and techniques, such as neural networks, or working in computer vision, while an understanding of learning theory is also useful. And, of course, a machine learning engineer must have an understanding of the inner workings of an arsenal of machine learning algorithms (the more algorithms the better, and the deeper the understanding the better!).

    Once a machine learning model is good enough for production, a machine learning engineer may also be required to take it to production. Those machine learning engineers looking to do so will need to have knowledge of MLOps, a formalized approach for dealing with the issues arising in productionizing machine learning models.
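
    One small but representative piece of that productionizing work is persisting a trained model so a separate serving process can load it without retraining. The sketch below uses scikit-learn's synthetic data generator and joblib; the `churn_model.joblib` file name is invented.

    ```python
    import joblib
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Train on stand-in data, then persist the fitted model to disk.
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    joblib.dump(model, "churn_model.joblib")  # invented file name

    # Elsewhere (e.g., inside an inference service): load and predict, no retraining.
    loaded = joblib.load("churn_model.joblib")
    print(loaded.predict(X[:5]))
    ```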

    Related job titles:

    • Machine Learning Scientist
    • Machine Learning Practitioner
    • <specific machine learning technology> Engineer, e.g. Natural Language Processing Engineer, Computer Vision Engineer, etc.

    Data Scientist

    The data scientist is concerned primarily with the data, the insights which can be extracted from it, and the stories that it can tell.

    The data architect and data engineer are concerned with the infrastructure which houses and transports the data. The data analyst is concerned with pulling descriptive facts from the data as it exists. The machine learning engineer is concerned with advancing and employing the tools available to leverage data for predictive and correlative capabilities, as well as making the resulting models widely-available. The data scientist is concerned primarily with the data, the insights which can be extracted from it, and the stories that it can tell, regardless of what technologies or tools are needed to carry out that task.

    The data scientist may use any of the technologies listed in any of the roles above, depending on their exact role. And this is one of the biggest problems related to "data science"; the term means nothing specific, but everything in general. This role is the Jack Of All Trades of the data world, knowing (perhaps) how to get a Spark ecosystem up and running; how to execute queries against the data stored within; how to extract data and house it in a non-relational database; how to take that non-relational data and extract it to a flat file; how to wrangle that data in R or Python; how to engineer features after some initial exploratory descriptive analysis; how to select an appropriate machine learning algorithm to perform some predictive analytics on the data; how to statistically analyze the results of said predictive task; how to visualize the results for easy consumption by non-technical folks; and how to tell a compelling story to executives with the end result of the data processing pipeline just described.

    And this is but one possible set of skills a data scientist may possess. Regardless, however, the emphasis in this role is on the data, and what can be gleaned from it. Domain knowledge is often a very large component of such a role as well, which is obviously not something that can be taught here. Key technologies and skills for a data scientist to focus on are statistics (!!!), programming languages (particularly Python, R, and SQL), data visualization, and communication skills — along with everything else noted in the above archetypes.
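
    As a compressed and necessarily simplified illustration of the middle of that journey (wrangling a flat file, engineering a feature, fitting a model, and reporting a result), consider the sketch below; the `accounts.csv` file, its columns, and the churn framing are all hypothetical.

    ```python
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Hypothetical flat file produced earlier in the pipeline.
    df = pd.read_csv("accounts.csv")

    # Light feature engineering after some exploratory analysis.
    df["tickets_per_month"] = df["support_tickets"] / df["tenure_months"]

    features = ["tickets_per_month", "logins_per_week", "seats"]
    X_train, X_test, y_train, y_test = train_test_split(
        df[features], df["churned"], test_size=0.25, random_state=0
    )

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    # The story for a non-technical audience is not the AUC itself, but what it
    # licenses: "these behaviours predict churn well enough to act on."
    print(f"holdout AUC: {auc:.2f}")
    ```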

    There can be a lot of overlap between the data scientist and the machine learning engineer, at least in the realm of data modeling and everything that comes along with that. However, there is often confusion as to what the differences are between these roles as well. For a very solid discussion of the relationship between data engineers and data scientists, a pair of roles which can also have significant overlap, have a look at this great article by Mihail Eric.

    Remember that these are simply archetypes of five major data profession roles, and these can vary between organizations. The flowchart in the image from the beginning of the article can be useful in helping you navigate the landscape and where you might find your role within it. Enjoy the ride to your ideal data profession!

    Author: Matthew Mayo

    Source: KDnuggets

  • Five challenges to overcome in order to improve sales productivity

    Five challenges to overcome in order to improve sales productivity

    Many of the sales leaders we work with say they need to find ways to make their sales team more productive. It’s no wonder, given the constant pressure to deliver results. Sales leaders also recognize that meeting sales goals consistently requires a broad-based contribution from everyone on the team — not just a few star players.

    How do you foster that contribution and help sales teams be truly productive? A starting point is to understand some of the key productivity challenges that today’s sales teams face:

    The complexity of today’s buyer 

    Today’s B2B buyers are more self-directed than ever before, and they’re likely to base much of their decision-making on information they find online. When buyers do interact with sales reps, they expect a continuous experience — which means that to prospect productively, reps need to know where buyers are in the journey and provide the information they need to progress.

    Lack of comfort with virtual selling

    54% of sales reps in Forrester’s latest sales activity study said that losing the ability to meet with clients face-to-face has hurt their ability to meet quota. Though sales teams continue to hone their virtual selling skills, achieving the same level of proficiency as in an in-person environment takes practice. Virtual selling will be the norm in many selling scenarios even after the pandemic, so reps need to build these capabilities to be productive and effective.

    Not using technology to its full potential

    The proliferation of sales technology in recent years can leave sales leaders feeling unsure of where to begin. It’s imperative to work with sales operations to choose the tools that will yield the greatest productivity gains for your organization. Whether it’s automating capture of buyer interactions or leveraging revenue operations platforms that centralize data and analytics, truly understanding what’s available and zeroing in on what will best serve your team can be a game changer.

    Time-draining administrative tasks

    Our latest sales activity study data shows that sales reps spend, on average, more than one-quarter of their working hours on administrative tasks such as internal meetings, order booking, and expense reporting. That’s slightly more than the time they spend in the most productive manner: directly selling to prospects. Finding opportunities to minimize unproductive work is a key to improving team performance.

    Not having the right content

    Our sales activity studies have consistently shown that finding content and information is a significant productivity obstacle for sales teams. Without easy access, reps will miss opportunities to provide information that could help move prospects closer to a sale. Steps such as consolidating content into a centralized repository and categorizing it by buyer journey phase can contribute to greater sales success.

    Working through these sales productivity challenges is essential to enabling reps to perform as effectively as possible!

    Author: Phil Harrell

    Source: Forrester

  • Forrester: Insights to help prepare your organization for 5G

    Forrester: Insights to help prepare your organization for 5G

    5G presents immense innovation potential

    5G promises to usher in not just new use cases in every industry but also new business models.

    Some of the most relevant use cases across industries, such as those enabled by AR/VR and massive IoT, fit right into improving customer experience and digital transformation. As a change agent, 5G is among the most important technological enablers in this decade and the next. Therefore, investing and taking a deep look at 5G is critical at this time.

    5G will develop rapidly through 2020 but is still developing nonetheless

    The 5G wireless arms race is fueled by this immense potential, so technology development is intense. Almost all current 5G announcements are regional siloed pilots and enhancements upon 4G LTE rather than actual 'at-scale' 5G standalone deployments. Manufacturers and operators have been aggressively pushing their 5G strategies. However, many challenges and uncertainties are still open: the cost of the network, monetization of use cases, regulatory challenges and, most importantly, the lack of mature standards.

    2018-19 was a major leap in 5G standards, but beware the hype

    Through the 3GPP standards body, the industry had agreed to complete the non-standalone (NSA) implementation of 5G New Radio by December 2017, and this facilitated large-scale trials based on the specifications.

    Various sources cite numerous estimates about 5G. According to the International Telecommunications Union (ITU), commercial 5G networks are expected to start deployment after 2020. By 2025, the GSM Association (GSMA) expects 5G connections to reach 1.1 billion, which is estimated to be about 12 percent of total mobile connections. One Ericsson study estimates that 5G-enabled industry digitalization revenues for ICT players will be US$1.3 trillion in 2026. Still, current 5G reality is far from the profound expectations established by its proponents.

    Structuring your 5G thinking

    At Forrester, we have a deep bench of experts who are closely monitoring the developments and hype around 5G.

    Here is a simple framework:

    1. First, understand the background, technology, and the physical and business challenges behind practical implementations of 5G to cut through the hype.
    2. There is a lot of talk about coverage in rural areas. In fact, bridging the digital divide is often touted to be a big plus of 5G. However, every early investment and the motivation behind it seem to suggest that at least until 5G achieves deployment scale, the digital divide may get worse.
    3. Further, thoroughly assess your own 5G needs. Many current use cases probably do not need 5G. Hence, clearly understanding and nailing your use cases is an important vision to have.
    4. Understand how 5G will transform your network operations and impact apps and customer experience.
    5. Finally, ask the right questions to your service provider on 5G timelines, cost, strategy, coverage, and implementation to understand what you can expect and to plan your investments in the coming months.

    Author: Abhijit Sunil

    Source: Forrester

  • Getting the most out of your data as a team

    Everyone wants to get more out of their data, but how exactly to do that can leave you scratching your head. Our BI Best Practices demystify the analytics world and empower you with actionable how-to guidance.

    Answering the big questions

    In the right hands, data is the ultimate means to answer important business questions. The problem is that when data is used incorrectly, it still provides answers (just bad ones). The best way to avoid decisions based on that bad information is to improve the relationships between the employees analyzing data and the employees acting on that information.

    Today, data questions often involve someone from a line-of-business team and someone from a data team. Both people bring their own individual expertise to the collaboration, but these projects can easily fall into the trap of unclear expectations and insufficient communication. 

    Before building a dashboard to answer questions, data experts need to sit down with their line-of-business counterparts and have a discussion about the purpose of that dashboard. Here are a few tips for data experts to get the most out of that meeting: 

    Go into every dashboard with an open mind

    It’s crucial to start every new data inquiry with a fresh mind. Every assumption the data experts make is an opportunity for the overall insight to lose value. On the other side, provided assumptions and business context can accelerate the data work for a faster time to value. The goal of every dashboard creator in a data request meeting should be to fully understand the data consumer’s needs and workflow. That means listening to their individual context and letting the investigation go where the data leads. If you drive the conversation or ask leading questions, you’ll end up at the resolution that you want, not necessarily the one that’s most valuable.

    Get to know the individual requesting the data

    Like most collaborative projects, empathy is critical to success. I like to start data request meetings by asking the data customer to walk me through their typical day. In some cases, it even helps me to shadow them for a while. What I’m looking for is a complete picture of the way that individual uses data. There might be pain points or missed opportunities for data to be used that I can help integrate. Asking someone about their typical day is also an easy way to get them communicating openly. It disarms them and puts the focus on the personal connection rather than a business problem.

    Understand the business value behind a data request

    An easy way for a data inquiry to get off track is for the dashboard creator to receive a request and cut straight to building a dashboard. Sometimes a request for a specific metric might miss the bigger question that data can solve. For example, a CS person might say 'give me churn' but what they mean is 'we need to find a way to minimize churn'. If you were to build a chart that simply listed churn over time, you’d miss all the other data points that correlate with churn. It’s often the data team’s job to connect data from different teams and maximize business value. Narrowing in on one team’s KPIs is an easy way to take your eye off the real goal. If you understand what is at the heart of the data request, it’ll be easier to work backward and find the right questions for the data to answer.
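
    To make the churn example concrete, here is a minimal pandas sketch of looking past the single requested metric to the data points that move with it; the `customers.csv` table and its columns are invented.

    ```python
    import pandas as pd

    # Hypothetical per-customer table joining data from several teams:
    # churned (0/1), support_tickets, logins_per_week, tenure_months, ...
    customers = pd.read_csv("customers.csv")

    # Not just "give me churn": which other data points move with it?
    print(f"overall churn rate: {customers['churned'].mean():.1%}")
    correlations = (
        customers.select_dtypes("number")
        .corr()["churned"]
        .drop("churned")
        .sort_values()
    )
    print(correlations)
    ```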

    Work in pairs if possible

    It’s true that two heads are generally better than one, but this concept is less about brainpower than it is about having one set of hands dedicated to note-taking. While a second data expert can definitely help understand the bigger questions in a meeting about data, it is invaluable to the flow of the conversation to have someone designated as a note-taker. Since the conversation moves at the speed of the slowest participant, it is extremely disruptive to have one person focus on both moving the conversation forward and documenting the important information for later. If a paired approach isn’t possible, it’s always an option to record the conversation, but I also recommend sketching out potential charts during these sessions, and that doesn’t translate well through audio.

    Start general, then get specific

    It’s likely that the initial dashboard request will be for a very pointed metric. This isn’t a bad way to start, but it’s the data expert’s job to bring the focus to a bigger, more general question that this data can solve and to uncover complementary data that can be included in that dashboard. Consider the example of churn from earlier. Presumably, the company’s goal is to maximize revenue and the inquiry into churn is being done to that end. It’s probably valuable to first create a variety of charts that illustrate the effect of churn on revenue. From there, it might also be useful to consider churn for individual cohorts, find factors that correlate in some way with churn or locate levers that will improve churn. Organizing the questions you ask from general to specific will also help you organize the charts on your dashboard when it’s time to create that asset.

    Build an unsorted list of questions to answer with data

    At the end of the meeting, the best thing to come away with is a list of questions that can be translated into data queries. An easy way to build this list is to open a document and just list the big questions that come up as you have exploratory conversations and try to understand the bigger business issue. If you can work as a pair, your note taker can do this part too. When the meeting is over (or maybe afterward if you recorded the conversation and want some time to go back and review notes), you can submit the list of questions to the dashboard requester and make sure that you have everything covered. Once that list is approved, you can start mapping each of the questions into a chart that will be organized and placed on your final dashboard.

    Think long instead of short

    Building a dashboard isn’t a straight line: there’s a lot of back and forth, questioning, editing, and iterating between the initial request and the finished product. These tips will help you build a strong foundation for this process and leave you in the best position for creating a dashboard that can really help your users make smarter decisions. Try embracing a collaborative process; you’ll be glad you took the time and asked the hard questions.

    Author: Christine Quan

    Source: Sisense

  • Google: a prime example of an intelligent organization basing its decisions on data

    Google: a prime example of an intelligent organization basing its decisions on data

    Google is a company in which fact-based decision-making is part of the DNA and where Googlers (that is what Google calls its employees) speak the language of data as part of their culture. In Google the aim is that all decisions are based on data, analytics and scientific experimentation.

    About Google

    Google is a multinational Internet and software corporation specialized in Internet search, cloud computing, and advertising technologies, based in Mountain View, California, USA. Google’s mission is to organize the world‘s information and make it universally accessible and useful. And with this mission, Google is very serious about using information to inform their decisions.

    Data to inform decision-making

    In companies data should be collected to provide answers to the most important questions and unless you are clear about the questions you need to answer, data is pretty useless. In Google today, the aim is to start with questions and be very clear about the information needs at the outset. Their executive chairman Eric Schmidt says: “We run the company by questions, not by answers. So in the strategy process we’ve so far formulated 30 questions that we have to answer […] You ask it as a question, rather than a pithy answer, and that stimulates conversation. Out of the conversation comes innovation. Innovation is not something that I just wake up one day and say ‘I want to innovate.’ I think you get a better innovative culture if you ask it as a question.” There are thousands of great examples of how Google applies this thinking but let’s look at a great case example from their HR department.

    Fact-based Decision-Making at Google

    Within their global HR function, Google has created a People Analytics Department that supports the organization in making HR decisions with data. One question Google wanted to have an answer to was: Do managers actually matter? This is a question Google has wrestled with from the outset, when its founders questioned the contribution managers make. At some point they actually got rid of all managers and made everyone an individual contributor, which didn’t really work, and managers were brought back in.

    Project Oxygen

    Within the people analytics department, Google has created a group called the Information Lab, which comprises social scientists focused on longer-term questions, with the aim of conducting innovative research that transforms organizational practice within Google and beyond. This team took on the project of answering the question: Do Managers Matter – codenamed ‘Project Oxygen’. So the objectives and information needs were clearly defined.

    What Data to Use?

    The team first looked at the data sources that already existed: performance reviews (top-down reviews of managers) and an employee survey (bottom-up reviews of managers). The team took this data and plotted it on a graph, which revealed that managers were generally perceived as good. The problem was that the data didn’t really show a lot of variation, so the team decided to split the data into the top and bottom quartiles.

    Analytics

    Using a regression analysis the team was able to show a big difference between these two groups in terms of team productivity, employee happiness, and employee turnover. In summary, the teams with the better managers were performing better and employees were happier and more likely to stay. While this has confirmed that good managers do actually make a difference, it wouldn’t allow Google to act on the data. The next question they needed an answer to was: What makes a good manager at Google? Answering this question would provide much more usable insights.
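
    Google’s own analysis used regression on internal data that is not public. As a stand-in for the "do the two quartiles really differ?" step, here is a minimal sketch using simulated scores and a two-sample t-test instead; all numbers are invented.

    ```python
    import numpy as np
    from scipy import stats

    # Simulated stand-in scores; Google's real data is not public.
    rng = np.random.default_rng(0)
    top_quartile = rng.normal(loc=4.2, scale=0.4, size=80)     # e.g., team outcome scores
    bottom_quartile = rng.normal(loc=3.6, scale=0.5, size=80)

    # Is the gap between the two manager groups bigger than chance would explain?
    t_stat, p_value = stats.ttest_ind(top_quartile, bottom_quartile)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```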

    New Data Collection

    So the team introduced two new data collections. The first was a ‘Great Managers Award’ through which employees could nominate managers they feel were particularly good. As part of the nomination employees had to provide examples of behaviours that they felt showed that the managers were good managers. The second data set came from interviews with the managers in each of the two quartiles (bottom and top) to understand what they were doing (the managers didn’t know which quartile they were in). The data from the interviews and from the Great Manager Award nominations was then coded using text analysis. Based on this the analytics team was able to extract the top 8 behaviours of a high scoring manager as well as the top 3 causes why managers are struggling in their role. 

    Using the Insights

    Google used different ways of sharing these insights with the relevant people, including a new manager communication that outlined the findings and expectations. But sharing the insights alone wasn’t enough; Google saw a need to act on them. There were many concrete actions that followed this analysis; here are some key ones:

    • Google started to measure people against these behaviours. For that purpose it introduced a new twice-yearly feedback survey
    • Google decided to continue with the Great Manager Award
    • Google revised the management training

    An Intelligent Organization

    Google is a great example of how good decision-making should be supported by good data and facts. Google clearly followed the five steps I outline in my book ‘The Intelligent Company: Five steps to success with Evidence-based Management’:

    1. Defining the objectives and information needs: ‘Do managers matter?’ and ‘What makes a good manager within Google?’
    2. Collecting the right data: using existing data from performance reviews and employee surveys and creating new data sets from the award nominations and manager’s interviews.
    3. Analysing the data and turning it into insights: simple plotting of the results, regression analysis and text analysis.
    4. Presenting the Information: new communications to the managers
    5. Making evidence-based decisions: revising the training, measuring performance in line with the findings, introducing new feedback mechanisms.

    Author: Bernard Marr

    Source: Smart Data Collective

  • How business leaders can learn from firefighters

    How business leaders can learn from firefighters

    The Firefighting Dichotomy

    When problems or crises arise in a business context, we often hear leaders refer to how busy their teams are firefighting. This paints a picture of chaos in the workplace, with overworked and stressed-out teams running from problem to problem, not addressing an issue before they are interrupted by the next predicament. Managers constantly juggle where to prioritize overworked people and often make rash decisions to try to put quick fixes in place.

    According to the Cambridge Dictionary, firefighting (in a business context) is “spending time on problems that need to be dealt with quickly, instead of working in a calm, planned way.” But in the real world of fighting fires, firefighting is far from chaotic and unplanned. If you think you’re firefighting, you’re not thinking like a firefighter. In contrast, firefighters are highly trained, dedicated, and coordinated teams who plan, practice, and prepare for a range of different crisis scenarios. They must — their lives and the lives of those in an emergency depend on them being able to respond in an assured, controlled, and well-rehearsed manner no matter the disaster.

    What It Takes To Put Out The Fires

    I recently spoke with my new colleague, an experienced volunteer firefighter, to discuss these conflicting views of firefighting. When he was a teenager in rural Tennessee, the local rural fire service put out a fire at his home, saving him and his family. This motivated him to sign up, train, and gain certification as a volunteer firefighter as soon as he was old enough to do so. We agreed that the business world has much to learn from firefighters to be fully fit for any crisis or future scenario. Leaders must:

    • Build high-performance, focused teams. In a business crisis scenario, there’s often an adrenaline rush, with individuals striving to be heroes. However, there’s no room for individual heroics in the fire service. Firefighters operate as a high-performance, close-knit unit where everyone knows their responsibility and is expected to perform their role. The pump operator and frontline firefighters collaborate in unison to ensure a consistent supply of water at high pressure and with the right pattern to deal with the type of fire. There is no room for lapse of concentration or focus. When a firm faces a major challenge, it’s teamwork and collaboration that will pull you through — not individual heroics. Better to have a team of heroes than an individual superstar.

    • Establish clear communication and delegation channels. As you would expect, clear, consistent, and frequent communication in a fire emergency is critical. The incident commander is the ultimate decision-maker but is dependent on the flow of information and situational awareness from frontline teams. Decision-making is devolved to point of need. If the zone one commander at the front of a building fire says the structure is about to collapse, the zone one team will pull out. Equally, teams in other zones are notified to withdraw from the scene if they also face danger. Similarly, in our newfound hybrid working environment, senior leaders need to focus their attention on supporting frontline leaders. Frontline leaders need a clear channel for bidirectional communication with senior leadership.

    • Train their people for future scenarios. Firefighters must complete a rigorous training, qualification, and assessment probationary period. Volunteers complete the same training as full-time firefighters. Training does not stop with qualification. Regular scenario-based training is crucial for effectively dealing with different emergency situations — from high-rise building, aircraft, or tank fires to special situations involving flammable liquids. Undertaking frequent dry runs for different fire scenarios builds muscle memory so that for each situation, every firefighter is prepared. They know their role, what to expect from their colleagues, what equipment is needed, and what to do. Fire commanders monitor training progress through a cycle of continual, 360-degree feedback. Business and tech leaders must similarly ensure that their employees are prepared for any future scenario by developing a future fit learning culture. Rapid change is driving demand for continuous upskilling, and traditional training is not filling the skills gap. Success is achieved through embedding learning into work, performance goals, and a leadership coaching culture.

    • Ensure teams have the right tools and technology and that both are future fit. Reliable, fully operational equipment is the bedrock of the firefighter’s job. A firefighter will never go into a situation without being 100% ready. If people are worn out or equipment is not in proper working order, they will not respond. Each firefighter puts a lot of focus on testing, cleaning, and caring for their gear. They need to rely on their equipment as much as their colleagues. Standards play a critical role and ensure familiarity, consistency, and reliability. For example, every fire engine is equipped with two 150-foot cross-lay hoses, specified for the same pressure and minimum water flow. Technology executives must standardize and prune their portfolios. They must optimize the technology stack to become future fit and ensure teams have the right technology for future work.

    • Base decisions on facts, not gut reactions. When the firefighters arrived at the scene, they were greeted by a panic-stricken woman screaming “my baby is in there.” The crew at first assumed the worst — a young child was caught in the blaze, demanding immediate action. But training kicked in. The threat was less severe: It was her pet dog she was talking about. Often, technology teams are expected to drop everything and respond to a business owner’s “baby” — their project is critical, a higher priority over and above the rest. But with root cause analysis and proper criteria for prioritization, the facts often flag that is not the case. Additionally, when dealing with risk, we can prepare for the worst, but we should not overdramatize without all the facts. There may be smoke, but that does not mean the fire is flaming.

    • Be prepared to adapt. As much as firefighters prepare for different fire scenarios, they can often face a situation they were not quite prepared for. Rockwell told a story of when he was part of a crew called out to a fire at an abandoned house. This is not unusual. Often, homeless people occupying a building will light a small fire for warmth that then escalates into a house fire. The normal priority is to first check if it is safe to enter the structure and identify if any occupants have succumbed to smoke and are unconscious. However, in this situation, the team entering the building discovered that the owner, a farmer, was using it to store flammable propane in tanks. The farmer neglected to display notices outside, warning of the dangers of explosive gases being stored inside. While the team was trained for a flammable scenario, they entered the building unaware of this hidden danger. This left them with no option but to pull back, inform the wider crew that the situation was not as it seemed, and reequip for the changed situation. In business, we cannot anticipate every scenario but must learn to be adaptive. Adaptiveness is the key to business success. Adaptive enterprises continuously assess the environment, anticipating new customer-obsessed opportunities and demonstrating the ability to reshape their core business models and organizational structure.

    • Finish what was started, and capture learning before moving on. Contrast the chaotic business context of firefighting with that of the fire service. Imagine if a fire crew jumps into its truck ready to dash to another call without fully safety-checking the scene. What if there was an unchecked hot spot that caused the fire to rekindle? The fire crew ensures full closure, confirming that the fire is fully extinguished and that the situation is safe for the subsequent cleanup and overhaul process. After each incident, there is a full debrief to assess what worked well, what could have been better, and if there are any ideas or different equipment that would’ve made the job easier. It was this approach that led to the invention of the “jaws of life” hydraulic cutter to rescue people from crashed vehicles. This device replaced the previously used cylinder saw that was noisy, slower, dangerous, and caused extra distress to the trapped victim. Tech leaders are now learning from safety-critical professionals on how to improve incident management practices. They recognize that incidents, in theory, should drive follow-up analysis and corrective action once the fire is out. No incident should be left simmering.

    Author: Phil Brunkard

    Source: Forrester

  • How CIOs are learning from the COVID-19 pandemic to transform towards agile

    How CIOs are learning from the COVID-19 pandemic to transform towards agile

    If the COVID-19 pandemic taught us anything, it’s that you can never be too prepared for change – fast-moving, come-out-of-nowhere change that transforms the way we run our businesses, manage our people, and secure the stability of our enterprises. 

    Disruptive events in business are nothing new. As leaders, we have long invested in planning and processes that enable our organizations to withstand many types of disruption, be it social upheaval, unfavorable market conditions, supply chain breakdowns, or environmental disaster.

    COVID-19, however, was leagues beyond what even the most risk-aware organizations could fathom, let alone plan ahead for in a specific, actionable way. Now, emerging from the pandemic and moving forward into a work landscape that is perhaps more flexible and adaptable than ever before, the COVID-19 wake-up call must be heeded in a thoughtful, intentional manner that prioritizes digital transformation, so that organizations can be nimble and adaptive to the next major, fast-moving disruption. 

    CIOs Are Leading the Digital Transformation Revolution 

    Many CIOs are already in go-mode, leveraging the lessons of the pandemic to push for more agile organizations and business and technology scalability. In fact, a March 2021 SAPinsider survey reveals that 62% of executives and leaders have identified process efficiency as a top business priority for 2021. 

    With the pandemic forcing businesses to incorporate unprecedented agility into nearly every facet of operations, it’s not surprising that many leaders now see a highly customized and difficult-to-change ERP system as a liability during times of unexpected change. One CIO who has been leading their organization’s digital transformation project for the past year described the challenges of innovating on an antiquated platform as limiting their ability to do new and innovative things with SAP – and that implementing new functionality, or doing anything new, requires significant effort and testing. Those limitations don’t lend themselves to supporting the level of innovation CIOs are looking for, driving them to build new digital platforms with SAP S/4HANA.

    This CIO is certainly not alone in their push for proactive, strategic digital transformation. Right now, CIOs around the world are advocating more forcefully than ever to migrate systems to the cloud and to streamline digital operations by integrating disparate systems and data. The SAPinsider survey backs this up, noting that many CIOs are finding that their pushes toward digital transformation are meeting less resistance these days. With the pandemic proving the irrelevance of location for labor forces and technical infrastructure alike, the realization that the cloud is king has crystallized. The pandemic has ushered in a new era of accord: IT partners, and the technology and innovation they make possible, are not behind-the-scenes vendors performing tactical functions. Instead, strategic technology partnerships are foundational to most organizations’ livelihoods.

    Many leaders are taking note – and taking action. Over half of executives in the SAPinsider study identified SAP S/4HANA as their most strategic investment in the coming year – despite the fact that SAP’s deadline isn’t until 2027 – and many have already committed to the budgets (66%) and headcount (36%) required to make the transformation goal achievable. With proper planning, S/4HANA can be a game-changing break from the complex, disparate systems that built rigidity into the essential fabric of your business. Is the unsorted data you’ve inherited or accumulated serving your goals? Most likely, it’s weighing you down without pulling its weight in terms of delivering trustworthy analytics. Digital transformation on the whole, and S/4HANA more specifically, provide a golden opportunity to assess your current data landscape and develop a proactive plan that will set your organization up for increased resilience and strategic innovation.

    If ever there were a moment to advocate for organizational resilience strategies, it’s today. The pandemic may be easing up in the U.S., but if the past year has taught us anything at all, it’s to expect the unexpected. Preparation and agility are not virtues we can afford to forget as the pandemic becomes a more distant memory. 

    Author: Steele Arbeeny

    Source: Dataversity

  • How deeper employer-employee connections enhance individual and team performance

    How deeper employer-employee connections enhance individual and team performance

    Ever see something for the first time and then realize you see it everywhere? That happened to me over the last two weeks.

    1936 U.S. Olympic Crew Team

    It started with the book “The Boys in the Boat,” by Daniel James Brown. It’s about a group of working-class boys from the University of Washington who go on to win the gold medal at the Berlin Olympics. The part that caught my attention: the boys showed glimpses of greatness through their individual athletic feats, yet they were at risk of never becoming the No. 1 boat because they were consistently beaten by other boats on their own school team. They only broke through once they internalized that excelling individually, out of sync with each other, restricted their overall speed. Once they began rowing not for themselves but for each other, the boat began to glide on top of the water as if there were no resistance. At that point they were unbeatable.

    2008 Boston Celtics

    Then came the Netflix series 'The Playbook'. This series highlights some of the most successful coaches in sports. The first episode features Doc Rivers, who led the Boston Celtics to the 2008 NBA championship. The team had acquired two star players to join one already on the roster. The problem Rivers faced was that he had three players that, on any other team, would expect to dominate the floor and excel with individual stats. Three players attempting to optimize individual performance on the same team would have had a disastrous result, just as the boys in the boat from Washington discovered. Coach Rivers introduced the concept of ubuntu.

    Ubuntu

    Ubuntu is an African philosophy meaning “you are because of the others.” One succeeds not solely because of one’s individual ability. Rather, enabling others, and having relationships and bonds with others is more important than any individual disagreement or division. Doc Rivers used this philosophy to bring the individuals together on his team to create something greater than the sum of its parts, resulting in an NBA championship. On a much grander scale, Archbishop Desmond Tutu and Nelson Mandela preached the philosophy as a way forward to develop future governance and heal a post-apartheid South Africa.

    Janteloven

    I learned of Janteloven through a conversation with a business executive in Norway. Janteloven is a Scandinavian philosophy, a way of life that puts society ahead of the individual. This restricts the temptation to boast about individual accomplishments and teaches against being jealous of others. Interestingly, the topic didn’t come up because of a philosophical discussion. Instead, it emerged as part of designing a business improvement plan, including how much attention should be put on developing and highlighting individual advancement.

    Philosophy Under Attack

    There are those who argue that these philosophies directly contradict the Western view of individualism and that societies embracing them are at a disadvantage. In this duality of individualism versus collectivism, the argument is that philosophies like ubuntu and janteloven destroy the individual spirit. I’d like to think that the 1936 U.S. crew team and the 2008 Boston Celtics proved just the opposite. They were highly talented, highly skilled individuals who hit a ceiling when excelling as individuals. It wasn’t until they embraced the ideal of operating together for the betterment of the group that the ceiling rose exponentially. This is the challenge of leadership: harnessing the talents of individuals to row in unison while in a culture that rewards individuality.

    The New Employment Deal

    In late 2020, the Gartner HR practice launched the idea of 'the new employment deal'. This concept centers around employees demanding a relationship with their employers that differs from the traditional barter system of productivity for payment. This new deal requires a shared purpose, flexibility and deeper connections between employer and employees.

    It’s possible to see this relationship as benefiting employees at the expense of the employer, but this study found both enjoyed positive outcomes. Benefits to the employer included an increase in high-performing employees as well as a greater number of employees willing to promote the company externally. At organizations with a traditional approach, about 54% of employees were considered high performers. At organizations that prioritize building deeper connections, that number jumped to 75%. Also, in organizations with a traditional approach, only 38% of employees recommended the company externally. In those that built deeper connections, well over half of employees were willing to promote the company (56%). These outcomes improve retention and give employers advantaged access to the best talent in the labor market.

    At the end of the day

    As business leaders, we act similarly to coaches of professional sports teams. We can learn from the 2008 Celtics and the 1936 U.S. Olympic crew team. By blending the Western focus on the individual with the philosophies of ubuntu and janteloven, we can build groups of highly talented individuals and elevate their performance as part of a team. We do this not by abandoning the individual, but by fostering a culture of enablement that allows the individual to showcase their talents in sync with the group, thereby raising the overall ceiling of success. Our task as leaders is to identify the roadblocks that prevent that from happening, whether it’s in compensation structure, management coaching or simply fear of corporate intimacy.

    Author: Michael Uskert

    Source: Gartner

  • How to be a digital leader in a time of large-scale technology transformation

    How to be a digital leader in a time of large-scale technology transformation

    Following interviews with hundreds of industry executives, McKinsey recently shared five cornerstones that are enabling organizations across every industry to integrate and capitalize on advanced technologies such as analytics, AI (artificial intelligence), machine learning, and the Internet of Things.

    These cornerstones of large-scale technology transformation include: 

    1. Developing technology road maps that strategically focus investments needed to reinvent legacy businesses and create new digital ones
    2. Training managers to recognize new opportunities and build in-house capabilities to deliver technologies
    3. Establishing a modern technology environment to support rapid development of new solutions
    4. Focusing relentlessly on capturing the strategic value from technology by driving rapid changes in the operating model
    5. Overhauling data strategy and governance to ensure data is reliable, accessible, and continuously enriched to make it more valuable 

    When it comes to the latter, McKinsey says that while every executive understands data will yield a competitive advantage (MicroStrategy’s Global State of Enterprise Analytics Report 2020 shows that 94% believe data and analytics are important to digital transformation and business growth), few have put in place the business practices to capitalize on it. For most, the data is messy and hard to access, and current technologies cannot scale to take advantage of a fast-growing wealth of data sources. 

    Constellation Research founder and Disrupting Digital Business author Ray Wang says this is what's driving the growing divide between digital leaders and laggards, a divide in which digital leaders are positioned to take 40-70% of market share as data increasingly drives digital transformation.

    'Digital leaders are folks that understand the impact of data', says Wang. 'They understand how to ask the right business questions. They understand that integration is important. They understand why data quality is needed. They understand why testing is so important before releasing something, because if you don't properly test, what you end up with is a lot of bad insights and next best actions which reduces the confidence in the data. They understand the human factors behind data and data design'.

    'They're always looking for new data sources and how to get those data sources to work', continues Wang. 'And they're always trying to figure out how to empower people with not just data, but the ability to make better decisions'.

    'Using data effectively in digital transformation is not easy. To make it work, you've got to have your data house in order. Build a foundation to support strong governance, data prep, streaming, and agility'.

    Author: Tricia Morris

    Source: Microstrategy

  • How to manage your next project? Waterfall or agile?

    How to manage your next project? Waterfall or agile?

    What is waterfall?

    Waterfall methodology, also known as the linear sequential lifecycle model, is defined by its linear, structured approach to project management. It is made up of a series of steps that are completed in sequential order within the software development life cycle (SDLC). These steps are typically tracked through Gantt chart visualizations. Dr. Winston W. Royce is credited with developing this approach, which he documented in his 1970 paper, “Managing the Development of Large Software Systems.”

    Since its publication, variations of waterfall have emerged, but there is general consensus around the following steps within the process:

    1. Gathering of requirements: This stage demands upfront documentation between the development team and the client or end user. During this phase, the product features within the project plan are documented in great detail, enabling the team to determine a clear cost and timeline. After both parties align on the requirements, there is limited to no correspondence between the development team and client until the project is completed.
    2. Design: The design phase comprises two steps: logical design and physical design. In logical design, the team brainstorms possible ways to tackle the client problem. When the development team agrees on a solution, these ideas are translated into specific technical tasks, which are then distributed across the team to construct the physical design. 
    3. Implementation: In the next phase, developers start coding based on the specifications that were developed in the prior steps. 
    4. Verification: This stage ensures that the code functions as intended and that the requirements in the scoping document have been met. The development team checks for bugs in the code, and a final validation is conducted by the client to confirm that the functionality meets expectations.  
    5. Maintenance: As users onboard and use the end product, there will be a need for ongoing support as new issues arise.

    Key benefits of the waterfall method

    • Detailed product requirements and documentation enable new programmers to onboard quickly and easily.
    • Documentation provides a clear scope to the project, enabling project managers to communicate budgets, timelines, and key milestones to interested parties.

    Key challenges of the waterfall method

    • Clients can find it difficult to outline all of their requirements at the beginning of the project, leading to gaps in documentation.
    • Minimal customer collaboration during the development process can lead to costly changes if the product does not meet expectations.
    • Testers report issues and bugs late in the process; had these surfaced earlier, they could have informed an alternative program architecture.

    What is agile?

    In contrast to waterfall development, agile is defined by its iterative approach to project management. Instead of drafting lengthy project requirements at the onset, an agile team breaks out the product into specific features, and they tackle each one under a specific time constraint, known as a sprint.

    Agile project management requires a cross-functional, self-organizing team that typically consists of five to nine members. Together, they develop a workable piece of software during each sprint, which combines with other functional code from previous iterations. By the end of the sprint timebox, the team demos their work to stakeholders for feedback, allowing them to be flexible in their approach to software development. Since the team has access to frequent feedback, they can adapt the product roadmap during the development lifecycle to ensure that functionality truly meets user expectations. In a waterfall approach, customer involvement typically coincides with the delivery of the final product, which can be costly when requirements are misinterpreted or documented incorrectly.

    In 2001, 17 practitioners who found the waterfall project management system highly ineffective distilled their ideas about the software development process into a piece of work known as the “Agile Manifesto.” This document highlights specific values and principles to prioritize within software development workstreams, and it has yielded a number of popular agile frameworks, such as Scrum, Kanban, Feature Driven Development (FDD), and Extreme Programming. Since then, agile software development has increased in popularity, especially when compared to the waterfall model. 

    Agile scrum framework

    Inspired by the game of rugby, agile scrum emphasizes teamwork to meet deliverables, similar to the way that forwards need to work together in a scrum to gain possession of a rugby ball. The skillset of the agile scrum team varies, but they usually include the following roles:

    • Product owner: This team member represents the needs of both the customer and the business. By crafting user stories, the team can understand how a feature request can help resolve a specific problem, and these stories formulate the backlog of tasks for the team to tackle. This person also prioritizes the stories by their value to the customer, which should, in theory, translate into value for the business. While the product owner leads the team in this way, they do not set deadlines or instruct the team on how work should be delivered.
    • Scrum master: This team member facilitates the overall agile development process. Similar to a project manager, this person keeps the team on task, ensuring that the team remains focused during the project. They can also act as a neutral party to mediate disagreements among team members. For example, team members may disagree on how much to take on in a given sprint. Product owners, in particular, may pressure teams to commit to more than they can deliver within a given timeframe. In these cases, scrum masters can remind team members of the scope of their roles on the team.

    Other members of an agile team can vary, but they typically come from a variety of disciplines, such as design, analytics, QA, and development. These individuals collaborate to decide how much work to take on and how they will complete it.

    Agile methodologies are also defined by the ways in which the team comes together. There are specific meetings which help facilitate the workflow across the team. Some of them include the following:

    • Sprint planning: During this meeting, the team comes together to determine which stories will be part of the current sprint. The product owner will prioritize the user stories, but the rest of the team will need to agree on how many and which user stories they can complete during that set time period (see the sketch after this list).
    • Daily standup: These brief meetings are also known as daily scrums. During these check-ins, each team member communicates their individual progress, such as completed tasks, upcoming ones, and any blockers or dependencies which may result in delays.
    • Demo: This meeting showcases the working software that the team completed over the course of the sprint, which typically runs in two- to four-week increments. The product owner will determine if a user story has met the definition of “done.” If not, the product backlog may be groomed to account for anything missing. This is also an opportunity for the team to present to stakeholders for feedback.
    • Retrospective: This time is reserved for team introspection, where the team identifies how they could improve upon their workflow to achieve better results in the future.
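
    To make the sprint-planning mechanics concrete, here is a minimal sketch (in Python) of how a team might pull the highest-priority stories that fit its capacity. The story names, point estimates, and capacity figure are illustrative assumptions, not part of any scrum standard.

    ```python
    from dataclasses import dataclass

    @dataclass
    class UserStory:
        title: str
        points: int    # the team's effort estimate
        priority: int  # set by the product owner; lower = more valuable

    def plan_sprint(backlog: list[UserStory], capacity: int) -> list[UserStory]:
        """Select the highest-priority stories that fit within team capacity."""
        selected, used = [], 0
        for story in sorted(backlog, key=lambda s: s.priority):
            if used + story.points <= capacity:
                selected.append(story)
                used += story.points
        return selected

    backlog = [
        UserStory("Password reset flow", points=5, priority=1),
        UserStory("Audit logging", points=8, priority=2),
        UserStory("Export report to CSV", points=3, priority=3),
    ]
    # With 10 points of capacity, the 8-point story is deferred to a later
    # sprint because committing to it would exceed what the team can deliver.
    print([s.title for s in plan_sprint(backlog, capacity=10)])
    ```

    The point of the exercise is the constraint, not the code: the product owner orders the backlog, but the team’s capacity decides how much of it enters the sprint.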

    Key benefits of the agile method

    • Team design facilitates more collaboration.
    • Product development takes an adaptive design approach.
    • Since code is tested with each iteration in the development phase, code defects can inform future design of the software.
    • Tends to yield higher customer satisfaction since frequent feedback leads to increased prioritization of customer needs.
    • Enables continuous integration as each feature is its own workable piece of software.
    • This lean type of software development can lead to lower costs as there is less risk of customer and product misalignment.

    Key challenges of the agile method

    • An agile approach can lack comprehensive documentation. This makes it difficult to onboard new developers, communicate timelines to stakeholders, and provide accurate cost estimates.
    • Can be difficult to scale.

    Manage your project with agile

    While development teams have been successful under either project management approach, there is certainly more momentum around agile processes. It’s not hard to see why when we observe the benefits that it can offer companies today. 

    Author: Eda Kavlakoglu

    Source: IBM

  • How to translate IIoT investments to ROI

    How to translate IIoT investments to ROI

    A digital transformation takes time, sometimes a considerable amount. This means it can be difficult to quantify ROI, at least in the short term. Return on investment for IIoT (Industrial Internet of Things) relies entirely on the data collected with the technology and how it’s applied. The information itself may be incredibly valuable, but that won’t matter if it’s used ineffectively, squandering the leverage it offers.

    Real-time insights have a direct influence on operations, offering incremental wins across a variety of business facets. That said, measuring real ROI is about the big picture and how all those smaller wins come together into a wholly effective strategy.

    It’s difficult to ascertain the ROI of IIoT and gauge whether or not you’re on the right track in the first place.

    Spending on IoT remains high for many industries, but the ROI is still up in the air. About 72% of construction business operators include new tech adoption as part of their strategic plan or vision for the future. Despite that, only 5% see themselves on the cutting edge of adoption. Here are some tips that can help you better plan industrial IoT adoption, while also getting the most out of the new technologies:

    Choose an objective

    Industrial IoT is an incredibly broad field, encompassing nearly every device, machine, and process in use today. But just because the technology can be outfitted to work with every system in a facility doesn’t mean that’s what should happen.

    Before moving forward with any form of implementation, every organization should choose an objective for its IIoT campaign. What is the technology going to achieve? Should it be used to improve manufacturing efficiency? Will it help sync up workers across the plant floor? Is it better suited for fleet management and asset tracking?

    While it would be great to have multiple potential solutions in place, it would be nearly impossible to verify the ROI after doing so. By selecting a single objective and following through, data teams can adopt a more systematic approach that provides more accurate insights. In the end, it allows decision-makers to see firsthand whether IoT is a proper investment and worth pursuing on a larger scale.

    If nothing else, deploying IIoT with the intent to eliminate bottlenecks in existing processes is a great place to start.

    Go process by process

    With all the hype surrounding digitization and modern technologies, it’s easy to get swept up in the tide. Overhauling every aspect of a business to embrace advanced digital solutions may seem like a great idea, initially. The reality is that taking it all on at once is likely to fail. For instance, switching to a paperless operation while simultaneously installing new IoT sensors on the factory floor will cause more confusion than benefit.

    As Harvard Business Review’s Digital Transformation of Business report states, merely spending more on cutting-edge technologies does not guarantee a positive outcome.

    The real winners will be the 'companies that both identify which core business capabilities they need to differentiate and make a commitment to transform these core business capabilities with the right digital technology'.

    Instead, take a look at the processes and systems currently in place and identify what will see the most significant boon from digitization. Choose one or two, and then get to work. Once the ball is rolling, it’s going to take time and resources to implement the proper solutions. New technologies will need to be installed, which means old equipment and tools might need to be phased out or upgraded. Employees will need training, and they may also need their own set of improved tools. Leadership will need to come up with new strategies for working with upgraded systems, communicating with their workers and taking action.

    It’s a long, demanding process. Not something that happens overnight. That’s precisely why it’s best to take it one step at a time and focus on a single process or solution. Once a particular department or task is honed, then it’s time to move on to other digitization projects within the company.

    Choose a reliable vendor

    With new technologies it’s best to work with a vendor or specialist that already has considerable experience. Yes, it’s possible to develop an in-house IoT solution that’s also managed by a proprietary IT crew. It’s also a lot more costly and more likely that problems will arise as a result.

    Third-party vendors have more resources at their disposal merely because it’s what they do, exclusively. They tend to have more robust IT and security solutions, along with the appropriate human resources to keep everything safe. They can handle installation, upgrades and repairs, which takes the responsibility away from the leading organization. They also provide comprehensive support for when problems or questions do arise.

    Implement predictive operations with IIoT

    Predictive maintenance is something relatively new in the industrial field, made possible thanks to IIoT and the real-time insights it can deliver. Data can reveal hidden details about working machinery, output, potential errors and more. Collectively, it provides a detailed report about performance, allowing decision-makers to pinpoint what areas of the operation are lacking. They can take action, sooner rather than later, to correct any issues and replace ailing equipment.

    It’s a process that should be deployed across the entire operation instead of solely for maintenance, and it can be used for a lot more than just predicting when equipment is going to fail. Employing machine learning and analytics applications can reveal when and how supplies are going to thin out, demand trends, and much more. Another term for this is business intelligence. Predictive operations through big data analysis are one facet of business intelligence, albeit an incredibly lucrative one.
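
    As a minimal sketch of the predictive idea, the snippet below (Python, standard library only) flags sensor readings that drift well outside their recent history; the vibration values and thresholds are hypothetical.

    ```python
    from statistics import mean, stdev

    def find_anomalies(readings, window=10, threshold=3.0):
        """Flag readings more than `threshold` standard deviations away
        from the rolling mean of the previous `window` readings."""
        anomalies = []
        for i in range(window, len(readings)):
            history = readings[i - window:i]
            mu, sigma = mean(history), stdev(history)
            if sigma and abs(readings[i] - mu) > threshold * sigma:
                anomalies.append((i, readings[i]))
        return anomalies

    # Hypothetical vibration readings from a motor; the spike at the end is
    # exactly the kind of early-warning signal maintenance teams act on.
    vibration = [1.0, 1.1, 0.9, 1.0, 1.05] * 5 + [2.8]
    print(find_anomalies(vibration))
    ```

    In production, the same pattern runs continuously against streaming IIoT data, with flagged readings feeding work orders rather than a print statement.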

    Invest in IoT for predictive operations and ROI will innately improve.

    Improving ROI even before it can be measured

    These tips offer just a few ways that organizations can improve the ROI of IIoT implementation, even through preplanning. It may be difficult to quantify the real value of the technology upfront. Nonetheless, honoring these processes can help realize the bigger picture, which is something business leaders always demand.

    Author: Megan Nichols

    Source: Datafloq

  • How your organization can establish a results-based data strategy

    How your organization can establish a results-based data strategy

    Money never sleeps and neither does your data. In this article, we look at digital transformation: the ways of turning data into new revenue streams and apps that boost income, increase stickiness, and help your company thrive in the world of Big Data. 

    The first waves of digital transformation started decades ago and the ripples of this trend continue to be felt to this day. However, what exactly a digital transformation looks like varies widely from company to company. One common theme among many transformations, however, is trying to make better use of data, whether to build analytic apps to unlock new revenue streams or to make smarter decisions internally (or both).  

    While these are worthwhile applications, one blind spot that many teams charged with these projects share is that they look at the data they have on-hand before figuring out what kind of problems they wish to solve with it. 

    “I recommend starting your data strategy with a right-to-left approach, focusing on the desired business outcomes first, instead of the data, to support those outcomes,” says Charles Holive, Sisense Managing Director of Data Monetization and Strategy Consulting. “And there are primarily three areas that industries across the world look to improve: the top line, the bottom line, and customer satisfaction.”

    Define your desired outcome before you start building

    Every company knows they need to digitally transform in order to survive and excel in the modern era. However, many organizations fail to define their goals for this process before they start, and predictably encounter obstacles or outright failures instead of paving a path for future success.

    Business goals should be defined at the very beginning of the digital transformation in the “right-to-left strategy” that starts by answering this question: What is the organization specifically looking to solve or improve? Understanding the details is key, otherwise “digital transformation” will be merely a corporate buzzword that causes headaches, heartbreaks, and lost money instead of producing measurable improvements.

    From there, rather than trying to accumulate and house the company’s entire dataset, the digital transformation team should identify the specific actionable insights and associated data needed to solve for (and measure) agreed-upon outcomes.

    “Not every dataset is made equal; some are more valuable than others. So being outcome-focused is a way that you can stack-rank the data sets that are most important. Your team can then begin moving that most-important data into your warehouse.”
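
    A minimal sketch of what that stack-ranking might look like in practice; the dataset names and scores below are invented purely for illustration.

    ```python
    # Hypothetical datasets scored against one desired business outcome
    # (say, reducing customer churn); all names and numbers are invented.
    datasets = [
        {"name": "support_tickets",    "outcome_relevance": 0.9, "quality": 0.7},
        {"name": "web_clickstream",    "outcome_relevance": 0.6, "quality": 0.9},
        {"name": "legacy_erp_extract", "outcome_relevance": 0.3, "quality": 0.4},
    ]

    # Rank by how much each dataset informs the outcome, discounted by how
    # trustworthy it currently is; the top entries go into the warehouse first.
    ranked = sorted(datasets,
                    key=lambda d: d["outcome_relevance"] * d["quality"],
                    reverse=True)
    for d in ranked:
        print(d["name"], round(d["outcome_relevance"] * d["quality"], 2))
    ```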

    Experiment to guide a winning data strategy

    Just as the waterfall method of software development (gathering all the requirements upfront, then building and releasing a complete application) has fallen out of favor relative to agile methods, the same shift should happen when creating an outcome-first data strategy: rather than trying to build a complete data warehouse right from the outset, approach data strategy as an “innovation factory.”

    “Identifying the exact data you need to solve a singular problem results in a perfect candidate to go into your warehouse on the first cycle. This is because you know exactly what the business is going to do with that data set,” Charles explains. “It’s powerful because it’s already informing or measuring a specific business outcome.”

    And when this data is warehoused and accessible to business partners to make key decisions, you already have a chance to quickly prove this outcome-first data strategy. You’ve immediately created an experiment to win.

    Another piece of advice that Charles talks about in his “Hacking the Analytic Apps Economy” video series is where the innovation factory should live. Namely, not in a mature business unit, but in an agile, fast-reacting department that reports to a Chief Innovation Officer or similar highly-placed exec. This team can deliver on new ideas quickly and won’t get bogged down in pre-existing frameworks or goals that don’t work for what the new data experiments are trying to achieve.

    Create an innovation factory at your company

    “Creating an innovation factory for your company results in faster innovation. You can do these smaller experiments more cost-efficiently, saving money over the traditional data strategy. This also should help your team prioritize projects for the data warehouse that deliver the greatest value, as opposed to the department that screams the loudest.” 

    Any experiment can fail, but here are some solid tips to help improve your likelihood of success and to maximize the impact of triumphant experiments: 

    • Start by listening to the frontline employees who use the data to make decisions; this will improve the odds of success for your experiment out of the gate.
    • If your experiment works, find other departments that can benefit from that same data. This is where it is key to have a good semantic layer on top of your data warehouse (courtesy of your data analytics software) so you can repurpose the same dataset for different ends.
    • If your experiment fails, see if you can tweak the dataset or use case to apply elsewhere in the company.

    Regardless, approaching data strategy with a focus on business outcomes will put you on the right course.

    “Everything else in the company is business-centered. It just seems counterintuitive not to approach data strategy in the same way.”

    Author: Jack Cieslak

    Source: Sisense

     

  • Implementing practices to keep up with developments towards becoming data-driven and consumer-centric

    Implementing practices to keep up with developments towards becoming data-driven and consumer-centric

    Corporate leaders tell us that the processes that, for decades, have framed the internal dealings of their firms are no longer tolerable, as they need to become customer-obsessed. The increasing occurrence of titles such as chief customer officer, chief data officer, and chief digital officer is a tacit admission that firms need a higher level of cross-business-unit coordination to provide compelling customer experiences. Meanwhile, rising customer and employee needs, shorter decision-making cycles, faster technology change and innovation, and the need to avoid margin pressure and commoditization force companies to rethink their entire underlying approach to process.

    Establish New Practices To Drive Speed And Innovation

    Many tech execs and their teams have become quite comfortable with agile practices. Adoption remains strong, and teams are getting better at knowing when and where to use agile, DevOps, SecOps, and continuous delivery. But when expanding agile practices to the enterprise level, tech execs are confronted by a multitude of issues. As one tech exec told us, “We really see the benefits of using agile to drive better customer experiences. But many of our business partners still focus on driving internal efficiencies, which kind of works against our efforts to focus on external client benefits.”

    To overcome these internal roadblocks, tech execs will have to move beyond agile and establish new practices that enable cross-functional teams to drive speed and innovation at scale. These practices will:

    • Embrace ecosystems, individuals, and interactions over linear processes. Future fit leaders will act as “servant leaders” to their organizations, embrace a culture of openness and diversity, accelerate human-machine interactions, and adopt new practices that drive greater flexibility, cooperation, and creativity.
    • Measure customer value instead of internal efficiency. Future fit tech executives will work with their executive peers to define and agree on joint success metrics that cut across different business functions and drive value for customers.
    • Leverage common product and program management principles instead of rigid control. Future fit leaders will create a new set of practices beyond agile software delivery that meld together customer journey mapping, lean portfolio management, and integrated value stream management.
    • Continuously respond to change instead of working the plan. Future fit leaders leverage collaborative, continuous planning, using value streams to identify which platforms, products, and services to prioritize.

    Shifting beyond continuous improvement to continuous innovation isn’t easy and requires strong executive commitment and buy-in. If you are a tech executive, make sure you are aware of the steps you can take to drive faster innovation and sustainable growth.

    Author: Pascal Matzke

    Source: Forrester

  • Integrating security, compliance, and session management when deploying AI systems

    Integrating security, compliance, and session management when deploying AI systems

    As enterprises adopt AI (artificial intelligence), they'll need a sound deployment framework that enables security, compliance, and session management.

    As accessible as the various dimensions of AI are to today's enterprise, one simple fact remains: embedding scalable AI systems into core business processes in production depends on a coherent deployment framework. Without it, AI's potential automation and acceleration benefits almost certainly become liabilities, or will never be fully realized.

    This framework functions as a guardrail for protecting and managing AI systems, enabling their interoperability with existing IT resources. It's the means by which AI implementations with intelligent bots interact with one another for mission-critical processes.

    With this method, bots are analogous to railway cars transporting data between sources and systems. The framework is akin to the tracks the cars operate on, helping the bots to function consistently and dependably. It delivers three core functions:

    • Security
    • Compliance and data governance
    • Session management

    With this framework, AI becomes as dependable as any other well-managed IT resource. The three core functions each need to be supported as follows.

    Security

    A coherent AI framework primarily establishes a secure environment for applied AI. AI is a collection of various cognitive computing technologies: machine learning, natural language processing (NLP), etc. Applied AI is the application of those technologies to fundamental business processes and organizational data. It’s therefore imperative for organizations to tailor their AI frameworks to their particular security needs, in accordance with measures such as encryption or tokenization.

    When AI is subjected to these security protocols the same way employees or other systems are, there can be secure communication between the framework and external resources. For example, organizations can access optical character recognition (OCR) algorithms through AWS or cognitive computing options from IBM's Watson while safeguarding their AI systems.

    Compliance (and data governance)

    In much the same way organizations personalize their AI frameworks for security, they can also customize them for the various dimensions of regulatory compliance and data governance. Of cardinal importance is the treatment of confidential, personally identifiable information (PII), particularly with the passage of GDPR and other privacy regulations.

    For example, when leveraging NLP it may be necessary to communicate with external NLP engines. The inclusion of PII in such exchanges is inevitable, especially when dealing with customer data. However, the AI framework can be adjusted so that when PII is detected, it's automatically compressed, mapped, and rendered anonymous so bots deliver this information only according to compliance policies. It also ensures users can access external resources in accordance with governance and security policies.
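
    As a rough sketch of that idea, the snippet below detects a couple of PII patterns, swaps them for opaque tokens, and keeps the mapping inside the compliance boundary. The patterns are deliberately minimal; a real framework would rely on a dedicated PII-detection service covering far more identifier types.

    ```python
    import re

    # Two illustrative patterns only; production systems detect many more.
    PII_PATTERNS = {
        "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
        "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    }

    def anonymize(text):
        """Replace detected PII with tokens; return scrubbed text plus the
        mapping needed to re-identify responses internally."""
        mapping = {}
        for label, pattern in PII_PATTERNS.items():
            for i, match in enumerate(pattern.findall(text)):
                token = f"<{label}_{i}>"
                mapping[token] = match
                text = text.replace(match, token)
        return text, mapping

    scrubbed, mapping = anonymize("Reach Jane at jane@example.com or 555-867-5309.")
    print(scrubbed)  # safe to send to the external NLP engine
    # `mapping` never leaves the organization's compliance boundary.
    ```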

    Session management

    The session management capabilities of coherent AI frameworks are invaluable for preserving the context between bots for stateful relevance of underlying AI systems. The framework ensures communication between bots is pertinent to their specific functions in workflows.

    Similar to how DNA is passed along, bots can contextualize the data they disseminate to each other. For example, a general-inquiry bot may answer users' questions about various aspects of a job. However, once someone applies for the position, that bot must understand the context of the application data and pass it along to an HR bot. The framework provides this session management for the duration of the data's journey within the AI systems.
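
    A toy sketch of that handoff: both bots read and write the same session object, so the HR bot inherits the conversation’s context instead of starting cold. All names here are illustrative.

    ```python
    import uuid

    sessions = {}  # a real framework would persist this and police access to it

    def start_session():
        sid = str(uuid.uuid4())
        sessions[sid] = {"history": [], "context": {}}
        return sid

    def inquiry_bot(sid, message):
        """General-inquiry bot: answers questions, records context."""
        sessions[sid]["history"].append(("user", message))
        if "apply" in message.lower():
            # The conversation became a job application, so hand the
            # session (and all its accumulated context) to the HR bot.
            sessions[sid]["context"]["intent"] = "job_application"
            return hr_bot(sid)
        return "Happy to answer questions about the role."

    def hr_bot(sid):
        """HR bot: receives the same session, so no context is lost."""
        assert sessions[sid]["context"].get("intent") == "job_application"
        return "Routing your application to HR with the full conversation history."

    sid = start_session()
    print(inquiry_bot(sid, "What does the role pay?"))
    print(inquiry_bot(sid, "Great, I'd like to apply."))
    ```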

    Key benefits

    The outputs of the security, compliance, and session management functions respectively enable three valuable benefits:

    No rogue bots: AI systems won't go rogue thanks to the framework's security. The framework ingrains security within AI systems, extending the same benefits for data privacy. This can help you comply with today's strict regulations in countries such as Germany and India about where data is stored, particularly data accessed through the cloud. The framework prevents data from being stored or used in ways contrary to security and governance policies, so AI can safely use the most crucial system resources.

    New services: The compliance function makes it easy to add new services external to the enterprise. Revisiting the train analogy, a new service is like a new car on the track. The framework incorporates it within the existing infrastructure without untimely delays so firms can quickly access the cloud for any necessary services to assist AI systems.

    Critical analytics: Finally, the session management function issues real-time information about system performance, which is important when leveraging multiple AI systems. It enables organizations to define metrics relevant to their use cases, identify anomalies, and increase efficiency via a machine-learning feedback loop with predictions for optimizing workflows.

    Necessary advancements

    Organizations that develop and deploy AI-driven business applications that can think, act, and complete processes autonomously without human intervention will need a sound deployment framework. Delivering a road map for what data is processed as well as how, where, and why, the framework aligns AI with an organization's core values and is vital to scaling these technologies for mission-critical applications. It's the foundation for AI's transformative potential and, more important, its enduring value to the enterprise.

    Author: Ramesh Mahalingam

    Source: TDWI

  • Keeping the data of your organization safe by storing it in the cloud

    Keeping the data of your organization safe by storing it in the cloud

    We now live within the digital domain, and accessing vital information is more important than ever. Up until rather recently, most businesses tended to employ on-site data storage methods such as network servers, SSD hard drives, and direct-attached storage (DAS). However, cloud storage systems have now become commonplace.

    Perhaps the most well-known benefit of cloud storage solutions is that their virtual architecture ensures that all information remains accessible in the event of an on-site system failure. However, we tend to overlook the security advantages cloud storage holds over traditional strategies. Let us examine some key takeaway points.

    Technical Experts at Your Disposal

    A recent survey found that 73% of all organizations felt that they were unprepared in the event of a cyberattack. As this article points out, a staggering 40% suspected that their systems had been breached. It is therefore clear that legacy in-house approaches are failing to provide adequate security solutions.

    One of the main advantages of cloud-based data storage is that these services can provide targeted and customized data security solutions. Furthermore, a team of professionals is always standing by if a fault is suspected. This enables the storage platform to quickly diagnose and rectify the problem before massive amounts of data are lost or otherwise compromised. 

    Restricted Digital Access

    We also need to remember that one of the most profound threats to in-house data storage involves its physical nature. In other words, it is sometimes possible for unauthorized users (employees or even third parties) to gain access to sensitive information. Not only may this result in data theft, but the devices themselves could be purposely sabotaged, resulting in a massive data loss.

    The same cannot be said of cloud storage solutions. The information itself could very well be stored on a server located thousands of miles away from the business in question. This makes an intentional breach much less likely. Other security measures such as biometric access devices, gated entry systems, and CCTV cameras will also help deter any would-be thieves. 

    Fewer (if Any) Vulnerabilities

    The number of cloud-managed services is on the rise, and for good reason. These platforms allow businesses to optimize many factors such as CRM, sales, marketing campaigns, and e-commerce concerns. In the same respect, these bundles offer a much more refined approach to security. 

    This often comes with the ability to thwart what would otherwise remain in-house vulnerabilities. Some ways in which cloud servers can offer more robust storage solutions include:

    • 256-bit AES encryption (see the sketch after this list)
    • Highly advanced firewalls
    • Automatic threat detection systems
    • Multi-factor authentication
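
    To ground the first item on that list: the sketch below shows 256-bit AES in an authenticated mode (GCM) using the third-party `cryptography` package. It is a minimal illustration of the primitive itself, not a description of any particular cloud provider’s implementation.

    ```python
    # pip install cryptography
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # a 256-bit AES key
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)  # GCM nonces must never repeat under the same key
    plaintext = b"quarterly revenue report"
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)

    # Decryption fails loudly if the ciphertext was tampered with, which is
    # why authenticated modes like GCM are preferred over bare AES.
    assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
    ```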

    In-house services may not be equipped with such protocols. As a result, they can be more vulnerable to threats such as phishing, compromised passwords, and distributed denial-of-service (DDoS) attacks. 

    The Notion of Data Redundancy

    The “Achilles’ heel” of on-site data storage has always stemmed from its physical nature. This is even more relevant when referring to unexpected natural disasters. Should a business endure a catastrophic situation, sensitive data could very well be lost permanently. This is once again when cloud storage solutions come into play.

    The virtual nature of these systems ensures that businesses can enjoy a much greater degree of redundancy. As opposed to having an IT team struggle for days or even weeks at a time to recover lost information, cloud servers provide instantaneous access to avoid potentially crippling periods of downtime. 

    Doing Away with Legacy Technology

    Another flaw that is often associated with in-house data storage solutions involves the use of legacy technology. Because the digital landscape is evolving at a frenetic pace, the chances are high that many of these systems are no longer relevant. What could have worked well yesterday may very well be obsolete tomorrow. Cloud solutions do not suffer from this drawback. Their architecture is updated regularly to guarantee that customers are always provided with the latest security protocols. Thus, their vital information will always remain behind closed (digital) doors.

    Brand Reputation

    A final and lesser-known benefit of cloud-based security is that clients are becoming more technically adept than in the past. They are aware of issues such as the growth of big data and GDPR compliance concerns. The reputation of businesses that continue to use outdated storage methods could therefore suffer as a result. Customers who are confident that their data is safe are much more likely to remain loyal over time. 

    Cloud Storage: Smart Solutions for Modern Times

    We can now see that there are several security advantages that cloud storage solutions have to offer. Although on-site methods may have been sufficient in the past, this is certainly no longer the case. Thankfully, there are many cloud providers associated with astounding levels of security. Any business that hopes to remain safe should therefore make this transition sooner rather than later. 

    Author: George Tuohy

    Source: Dataversity

  • Leading your organization to success through Agile Data Governance

    Leading your organization to success through Agile Data Governance

    Laura Madsen wants to challenge your outdated ideas about Data Governance. “I’m pretty sure that we wouldn’t use software that we used 20 years ago, but we’re still using Data Governance and Data Governance methodologies the same way we did 20 years ago.” And although she advocates for Agile, she’s not an Agile coach or a SCRUM master; rather she wants companies to consider agility in a broader sense as well. “Very briefly, when we think about Agile, essentially, we think about reducing process steps.” She paraphrases David Hussman’s belief that there is no inherent value in “process” — process exists in order to prove to other people that “we’re doing something.” To that end, most organizations create an enormous number of process steps she refers to as “flaming hoops,” showing that there was a lot of work put into activities such as status updates, but nothing that provided actual value.

    Madsen is the author of Disrupting Data Governance, Chief Executive Guru at Via Gurus, and Mastermind at the Sisterhood of Technology Professionals (Sistech).

    Resource Use

    Historically, Data Governance has always been resource-intensive, and with Agile Data Governance in particular, she said, the most important resource is the individuals who do the work. The need for a data owner and a data steward for each domain, often with multiple stewards or owners covering the same data domain, etc., emerged as a system designed to serve data warehouses with hundreds of tables, and thousands of rows per table. “That’s a rather parochial idea in 2020, when we have petabytes of data blown through our data warehouses on any given day.”

    One resource-heavy relic from the past is the standing committee, which always starts off with a lot of participation and enthusiasm, but over time people disengage and participation dwindles. Another historical shortcoming in Data Governance is the reliance on one or two individuals who hold the bulk of the institutional knowledge. With the amount of risk attached to Data Governance processes, the people who serve as the governance linchpin are under a lot of pressure to do more with less, so when they leave, the Data Governance program often collapses.

    Instead, she recommends developing responsive and resilient capabilities by creating a dependency on multiple people with similar capabilities instead of one person who knows everything.

    To make best use of time and resources, committees should be self-forming and project-based. Distinct functions must be created for participating resources: “And we should be laser clear about what people are doing.”

    The Kitchen Sink

    Still another legacy from the past is the tendency to take a “kitchen sink” approach, throwing compliance, risk, security, quality, and training all under the aegis of Data Governance, creating a lack of clarity in roles. “When you do everything, then you’re really doing nothing,” she said. Data stewards aren’t given formal roles or capabilities, and as such, they consider their governance duties as something they do on the side, “for fun.”

    Madsen sees this as arising out of the very broad scope of the historical definition of Data Governance. Intersecting with so many different critical areas, Data Governance has become a catch-all. In truth, she said, instead of being wholly responsible for compliance, risk, security, protection, data usage, and quality, Data Governance lives in the small area where all of those domains overlap.

    She considers this narrower focus as essential to survival in modern data environments, especially now, when there are entire departments devoted to these areas. Expecting a Data Governance person to be fully accountable for compliance, privacy, risk, security, protection, data quality, and data usage, “is a recipe for absolute disaster.” Today, she said, there is no excuse for being haphazard about what people are doing in those intersecting areas.

    Four Aspects of Success

    To succeed, companies must move away from the kitchen sink definition of Data Governance and focus on four aspects: increased use of data, data quality, data management, and data protection.

    These categories will not need equal focus in every organization, and it’s expected that priorities will shift over time. Madsen showed a slide with some sample priorities that could be set with management input:

    • Increased data use at 40% importance
    • Quality at 25%
    • Management at 25%
    • Protection at 10%

    From an Agile perspective, every sprint or increment can be measured against those values, creating “an enormous amount of transparency.” And although executives may not care about the specific tasks used to address those priorities, they will care that they are being tackled strategically, she said.

    Increased Use of Data

    If the work of Data Governance isn’t leading to greater use of data, she says, “What the heck are we doing?” Building data warehouses, creating dashboards, and delivering ad hoc analytics are only useful if they enable greater data use. All governance activity should be focused toward that end. The only way to get broad support for Data Governance is to increase the usage of the data.

    Data Quality

    Record counts and data profiling can show what’s in the data and whether or not the data is right, but analysis is not the same as data quality. “What we’re really driving towards here is the context of the data,” Madsen said, which leads to increased data use. The core of Data Quality Management is ensuring it has meaning, and the only way for the data to have meaning is to provide context.
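
    As a minimal illustration of that distinction, the profiling pass below produces the counts, while the meaning has to come from business context. The file and column names are hypothetical.

    ```python
    import csv
    from collections import Counter

    def profile(path, column):
        """Basic profile of one column: record count, null rate, top values."""
        with open(path, newline="") as f:
            values = [row[column] for row in csv.DictReader(f)]
        non_null = [v for v in values if v not in ("", "NULL")]
        return {
            "records": len(values),
            "null_rate": 1 - len(non_null) / len(values) if values else 0.0,
            "top_values": Counter(non_null).most_common(3),
        }

    stats = profile("orders.csv", "ship_country")  # hypothetical file and column
    # The numbers say *what* is in the column; only business context can say
    # whether a blank country is an error or simply a digital-delivery order.
    print(stats)
    ```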

    Data Management

    She talks specifically about the importance of lineage within the context of Data Management. Most end users only interact with their data at the front end when they’re entering something, and at the back end, when they see it on a report or a dashboard. Everything that happens in between those two points is a mystery to them, which creates anxiety or confusion about the accuracy or meaning of the end result. “Without lineage tools, without the capability of looking at and knowing exactly what happened from the source to the target, we lose our ability to support our end users.” For a long time those tools didn’t exist, but now they do, and those questions can be answered very quickly, she said.

    Data Protection

    Although Data Governance has a part in mitigating risk and protecting data, again, these are areas where governance should not be fully responsible. Instead, governance should be creating what Madsen calls “happy alliances” with those departments directly responsible for data protection, and focusing on facilitating increased data usage. This is often reversed in many organizations: If data is locked down to the point where it’s considered “completely safe,” risk may be under control, but no one is using it.

    Moving into the Future/Sweeping Away the Past—Fixing Committees

    Committees, she said, are not responsive, they’re not Agile and they don’t contribute to a resilient Data Governance structure. Theoretically, they do create a communication path of sorts, because a standing meeting at least assumes participants are paying attention for a specific period of time — until they lose interest. 

    What works better, she said, is self-forming Scrum teams or self-forming Agile teams that are on-demand or project-based, using a “backlog” (list of tasks) that becomes the working document for completing the company’s project list. “You come together, you work on the thing, and then you go your own separate ways.”

    A sample self-forming Agile team might consist of a CDO serving as product owner; someone from security, privacy, and IT, who brings the regulatory and IT standards; and executives from business departments like finance, sales, or operations, who might also serve as subject matter experts.

    The backlog serves as a centralized document where data issues are tracked, responsibilities are outlined and milestones on the way to completion are logged.
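
    A sketch of what one backlog entry might look like, with exit criteria serving as the explicit definition of “done”; the field values and role names are illustrative.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class BacklogItem:
        issue: str
        owner: str                # who is accountable for this item
        exit_criteria: list = field(default_factory=list)  # definition of "done"
        completed: list = field(default_factory=list)

        def is_done(self) -> bool:
            return set(self.exit_criteria) <= set(self.completed)

    item = BacklogItem(
        issue="Customer addresses inconsistent between CRM and billing",
        owner="finance data steward",
        exit_criteria=["root cause documented",
                       "dedup rule deployed",
                       "stakeholders signed off"],
    )
    item.completed.append("root cause documented")
    print(item.is_done())  # False until every exit criterion is met
    ```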

    Traditional concepts like data ownership and data stewardship still have a part, but they are in service to a project or initiative rather than a fixed area or department. When the project is completed, the team disbands.

    Named Data Stewards

    Named data stewards serve as a resource for a particular project or area, such as the customer data domain. Named data stewards or owners for each area of responsibility should be published so that anyone can quickly and easily find the data steward for any particular domain.

    On Demand Data Stewards

    “Everyone’s a data steward, just like everyone’s in charge of sales.” Anyone who has a question about the data and wants to know more is, in that moment, a data steward, she said, whether they are trained for it or not. By taking ownership of a question and being willing to find an answer, the “on-demand” steward gains the ability to help the organization do a better job in that particular moment. “Ownership is so integral to successful deployment of any data function in an organization.”

    Ensuring Success

    To sum up, Madsen recommends starting a backlog, using it to consistently document exit criteria (your definition of “done”), and committing to actively managing it. Start thinking like a Data Governance product owner, keep communications open among intersecting areas — those “happy alliances” — and keep the ultimate goal of increased data use in mind. Focus on progress over perfection, she says, “And then just keep swimming, just keep swimming …”

    Author: Amber Lee Dennis

    Source: Dataversity

  • Learning about Company Culture through Data Analytics

    Learning about Company Culture through Data Analytics

    Analytics is invaluable for data-driven businesses trying to build better company cultures by studying the cultures of other successful businesses.

    Many companies refer to themselves as data-driven organizations. Unfortunately, not all of these companies use data analytics strategically enough to thrive.

    In order to become an effective data-driven business, it is necessary to understand what types of data to focus on. One of the most important steps is using big data to study the effective decisions of other companies, which helps organizations figure out which strategies will truly work best. One way to do this is to carefully study the company cultures of other successful businesses.

    Using Data Analytics to Better Understand the Company Culture of Other Successful Businesses

    A business is so much more than products, logistics, and customer service. Office culture and employee attitudes are key to helping companies thrive and succeed through the years. This culture is largely driven by the human element, but data analytics can play a huge role in shaping it. Companies can study various elements of company culture with data analytics and improve on them.

    Creating an ideal company culture doesn’t happen overnight, of course. We spoke with business owners about the factors that create winning cultures, and how to make them last. Data-driven organizations can use analytics technology to study these factors more carefully and cultivate a company culture that reflects them.

    Communication at the Core

    Communication is a very important element of any company culture, and one that analytics technology can help with. Analytics can be used to assess various KPIs pertaining to communication between employees. Big data also underpins the tools that facilitate communication between employees and external stakeholders.

    “The best companies I’ve worked with take communication very seriously, no matter how big or small the issue may be. It goes beyond emails and phone calls. Memos, meetings, one-on-one reviews – these things all add up to a strong culture.” – Annabel Love, Co-Founder and COO of Nori

    “Miscommunication is the first symptom of struggling company culture. I encourage employees and peers to communicate early, often, and on repeat if necessary. That saves so much headache and keeps teams together.” – Ben Thompson, CEO of Hardwood Bargains

    “If your goal is to strengthen the culture of your company, it starts with strong communication from the top-down, and bottom-up. Everyone needs to be on board with the game plan. Make no assumptions and always double down on communication – it works.” – Raul Porto, Owner and President of Porto’s Bakery

    “Feedback is our secret weapon for creating the best culture. No matter what your goal may be, open and honest feedback moves you forward.” – Gina Lau, Head of Team Operations at HelloSign

    “The more you communicate, the less you have to worry about things like employee engagement, ambiguity, or a lack of direction for your company. It just makes everything easier and more cohesive for your organization.” – Bill Glaser, Founder of Outstanding Foods

    Positive and Social Atmosphere

    Data analytics could also help create a better atmosphere by analyzing sources of negativity and rectifying them.

    “We take culture seriously, which means looking beyond the bottom line and making this a pleasant place for people to work each day. It takes more effort and an empathetic approach, but the results are 100% worth it when you look at the big picture.” – Shaun Price, Head of Customer Acquisition at MitoQ

    “We’ve all had those jobs where the atmosphere is nothing short of miserable. It’s usually the result of poor management or just a lack of inspiration. Improve the fundamentals and a better company culture will emerge.” – Omid Semino, CEO of Diamond Mansion

    “If you’re curious about your company culture, as a business owner, you can learn a lot based on employee satisfaction. If a company has a positive culture, employees are happier and stay longer.” – Benjamin Smith, Founder of Disco

    “The best company cultures are rooted in an open-door policy and the desire to receive employee feedback. This increases employee engagement which, in turn, boosts retention rates. As a business owner, recognize immediately that feedback must be anonymous, and offer opportunities for your team to complete anonymous surveys. This way, everyone can share their experiences without the fear of retribution.” – Dylan Fox, Founder and CEO of Assembly AI

    “Culture can’t really be measured – it’s more about the vibe you feel at the office and when interacting with other people in-person or online. Keep the energy and positivity high, then everything else will come together.” – Brittany Dolin, Co-Founder of Pocketbook Agency

    Opportunities for Growth

    Data analytics is also extremely important when it comes to helping the organization grow. You can identify major trends with predictive analytics tools and help pursue new opportunities for organizational growth. Analytics also helps with training and other growth opportunities at the individual level.

    “Present opportunities for training, education, or interesting side projects within the company – your people will be eager to jump at the chance. This shows that you care about employees in the long run and not just the short term.” – Nik Sharma, CEO of Sharma Brands

    “There are so many ways to reward employees beyond the boring perks we all expect. Help them become more valuable to the company and themselves by offering training or coursework to level up. That’s a sign of real, strong company culture.” – Bing Howenstein, Founder of All33

    “I’m a believer in hiring the right people and giving them the opportunity to express themselves. Our company gives employees unbelievable amounts of power and autonomy.” – Blake Mycoskie, Founder of TOMS Shoes

    “Employees want to feel like there’s room for growth and development in their own roles or beyond. Nothing is worse than a dead-end job, we all know that. Speak with employees individually and talk about their goals, because that goes a long way.” – Jeff Goodwin, Vice President of Direct to Consumer and Performance Marketing at Orgain

    Dedicated to the Mission

    “Culture begins with you, the leader of the company, setting a strong example with ethics and a relentlessly positive approach to each challenge. You represent the standard that everyone else will follow – don’t point fingers elsewhere.” – Roy Ferman, Founder and CEO of Seek Capital

    “The best company cultures are those that allow for collaboration, innovation, and creativity. While it’s challenging to implement, open cultures that allow employees to speak their mind and provide insight on how things can be done differently are best for the health of the employees and the organization.” – Darren Litt, Chairman and Co-Founder of MarketerHire

    “Your company’s mission is always No. 1 whether you realize it or not. If you don’t have a clear direction for the business, that will show up in terms of low energy and lack of culture. The strongest cultures always stem from a clear purpose and deep-seated determination to succeed.” – Chris Gadek, VP of Growth at AdQuick 

    With the inside scoop from a wide range of successful businesses, you can see the power of a strong company culture for yourself.

    It’s time to apply these lessons, revamp your company culture, and take your business to new heights.

    Data Analytics is the Key to Thriving Company Cultures

    Data-driven organizations need to find new ways to improve their operational policies, and that includes using analytics to build great company cultures. Fortunately, analytics technology can be a powerful aid in cultivating one.

    Author: Diana Hope

    Source: Smart Data Collective

  • Making your Organization more intelligent with a Cloud Data Strategy

    Making your Organization more intelligent with a Cloud Data Strategy

    At a time when most major companies are showing a long-range commitment to “data-driven culture,” data is considered the most prized asset. An Enterprise Data Strategy, along with aligned technology and business goals, can significantly contribute to the core performance metrics of a business. The underlying principles of an Enterprise Data Strategy comprise a multi-step framework, a well-designed strategy process, and a definitive plan of action. However, in reality, very few businesses today have their Data Strategy aligned with overall business and technology goals.

    Data Management Mistakes Are Costly

    Unless the overall business and technology goals of a business are aligned with a Data Strategy, the business may suffer expensive Data Management failure incidents from time to time. If the Data Strategy is implemented in line with a well-laid-out action plan, one that turns the current state of affairs into strategic Data Management initiatives serving long-term business needs and objectives, then it has a much higher chance of achieving the desired outcomes. 

    Data provides “insights” that businesses use for competitive advantage. When overall business goals and technology goals are left out of the loop of an Enterprise Data Strategy, the data activities are likely to deliver wrong results, and cause huge losses to the business.

    What Can Businesses Do to Remain Data-Driven?

    Businesses that have adopted a data-driven culture, and those expecting to do so, can invest some initial time and effort to explore the underlying relationships between the overall business goals, technology goals, and Data Strategy goals. The best part is that they can use their existing advanced analytics infrastructure to make this assessment before drafting a policy document for developing the Data Strategy.

    This initial investment in time and effort will go a long way toward ensuring that the business’s core functions (technology, business, and Data Science) are aligned and have the same objectives. Without this effort, the Data Strategy can easily become fragmented and resource-heavy—and ineffective.

    According to Anthony Algmin, Principal at Algmin Data Leadership, “Thinking of a Data Strategy as something independent of Business Strategy is a recipe for disaster.”

    Data Governance has recently become a central concern for data-centric organizations, and all future Data Strategies will include Data Governance as a core component. Future Data Strategy initiatives will have to take regulatory compliance seriously to ensure the long-term success of such strategies. The hope is that this year, businesses will employ advanced technologies like big data, graph, and machine learning (ML) to design and implement a strong Data Strategy.

    In today’s digital ecosystem, the Data Strategy means the difference between survival and extinction of a business. Any business that is thinking of using data as a strategic asset for predetermined business outcomes must invest in planning and developing a Data Strategy. The Data Strategy will not only aid the business in achieving the desired objectives, but will also keep the overall Data Management activities on track.

    A Parallel Trend: Rapid Cloud Adoption

    As Data Strategy and Data Governance continue to gain momentum among global businesses, another parallel trend that has surfaced is the rapid shift to cloud infrastructures for business processing.

    As with on-premise Data Management practices, Cloud Data Management practices also revolve around MDM, Metadata Management, and Data Quality. As organizations continue their journey to the cloud, they will need to ensure their Data Management practices conform to all Data Quality and Data Governance standards.

    A nagging concern among business owners and operators who have either shifted to the cloud or are planning a shift is data security and privacy. In fact, many medium-sized or smaller operations have resisted the cloud because they are unsure or uninformed about the data protection technologies available on the cloud. Current business owners expect cloud service providers to offer premium data protection services.

    The issues around Cloud Data Management are many: the ability of cloud resources to handle high-volume data, the security leaks in data transmission pipelines, data storage and replication policies of individual service providers, and the possibilities of data loss from cloud hosts. Cloud customers want uninterrupted data availability, low latency, and instant recovery—all the privileges they have enjoyed so far in an on-premise data center.

    One technology solution often discussed in the context of cloud data protection is JetStream. In a live webinar, Arun Murthy, co-founder and Chief Product Officer of Hortonworks, demonstrated how the cloud needs to be part of the overall Data Strategy to fulfill business needs like data security, Data Governance, and a holistic user experience. The webinar proceedings are discussed in Cloud Computing—an Extension of Your Data Strategy.

    Cloud Now Viewed as Integral Part of Enterprise Data Strategy

    One of the most talked-about claims made by industry experts at the beginning of 2017 was that the year “would be a tipping point for the cloud.” These experts and cloud researchers also suggested that the cloud would bring transformational value to business models through 2022, and would become an inevitable component of those models. According to market-watcher Forrester, “cloud is no longer about cheap servers or storage, (but), the best platform to turn innovative ideas into great software quickly.”

    As cloud enables big data analytics at scale, it is a popular computing platform for larger businesses who want the benefits without having to make huge in-house investments. Cloud holds promises for medium and small businesses, too, with tailor-made solutions for custom computing needs at affordable cost.

    The following points should be kept in mind while developing a strategy plan for the cloud transformation:

    • Consensus Building for Cloud Data Strategy: The core requirement behind building a successful Data Strategy for the cloud is consensus among the central IT team, the cloud architect, and the C-suite executives. This challenge is compounded where businesses mix and match their cloud implementations.
    • Data Architectures on Native Cloud: The news feature titled Six Key Data Strategy Considerations for Your Cloud-Native Transformation throws light on cloud-native infrastructure, which is often ignored during a business transformation. According to this article, though enterprises are busy making investments in a cloud-native environment, they rarely take the time to plan the transformation, thus leaving Data Architecture issues like data access and data movement unattended. 
    • Creating Data Replicas: Data replication on the cloud must avoid legacy approaches, which typically updated replicas only after long intervals.
    • Data Stores across Multiple Clouds: HIT Think: How to Assess Weak Links in a Cloud Data Strategy specifically refers to storage of healthcare data, where data protection and quick data recovery are achieved through the provisioning of multiple cloud vendors. These solutions are not only cost-friendly, but also efficient and secure. 

    Author: Paramita (Guha) Ghosh

    Source: Dataversity

  • Moving Towards Data Science: Hiring Your First Data Scientist

    Moving Towards Data Science: Hiring Your First Data Scientist

    In October 2020 I joined accuRx as the company’s first data scientist. At the time of joining, accuRx was a team of 60-odd employees who had done an incredible job relying on intuition and a stellar team of user researchers to create products that GPs needed and loved. This, combined with the increased need for good tech solutions in healthcare in 2020, resulted in our reach expanding (literally) exponentially. Suddenly, we were in almost every GP practice in the UK.

    We found ourselves in an interesting position: we now had several products that were being used very widely by GPs each day, and another set of nascent product ideas that we were only just bringing to life. We knew that at this point we’d need to start relying more on insight from quantitative data to test out our hypotheses and move our product suite in the right direction.

    At this point, we didn’t need advanced ML solutions or the latest big data processing tools. What we really needed was the ability to verify our assumptions at scale, to understand the needs of a very large and diverse group of users and to foster a culture of decision-making in which relying on quantitative data was second nature. This was why I was brought in, and it’s not been without its challenges. Here are a few things I’ve learnt so far: 

    1. New roles create new conversations

    Adding new members to teams presents a series of inevitable challenges: team dynamics change, the initial cost of onboarding is high and there’s now one more voice in the room when making decisions. The effect of this is substantially amplified when you’re adding not just a new member but a new role to a team.

    Before I joined, data science had not been a core part of the product development process. Suddenly, the team were introduced to a host of new concerns, processes and technical requests that they’d not needed to consider before, and addressing these often required a sizeable shift in the entire team’s ways of working.

    A few examples of this are:

    • Software engineers had to spend more time adding analytics to feature releases and making sure that the pipelines producing those analytics were reliable.
    • Sometimes, AB test results take a while to trickle in. Given that those results (hopefully) inform the direction a product will move in next, product managers, designers and engineers often found themselves facing a fair degree of ambiguity over how best — and how quickly — to iterate on features and ideas.
    • Having an additional set of information to consider often meant that it took us longer to reach a decision about which direction to move a product in. We now had to reconcile our intuitions with what the data was telling us — and also make a call as to how reliable we thought both of those were!

    It’ll take a bit of trial and error, but it’s important to find a way of working that gives product managers, designers and engineers the freedom to ship and iterate quickly without sacrificing your commitment to analytical rigour. In our case, this looked like figuring out which product changes were worth testing, what level of detail was worth tracking and what kinds of analyses are most useful at different stages of the product development process.

    2. Effective communication is more than half the battle

    It doesn’t matter how useful you think your analyses are — if people don’t know about or understand them, they’re not likely to have much long-term impact. In addition, the way in which you communicate your findings will determine how much impact your analysis ultimately has.

    Communicate widely and frequently.

    Importantly, it’s not enough to relay your findings to team leads only — the whole team has invested a lot of time and effort adjusting to new ways of working that support analytics, and they expect to be able to see what impact those adjustments have had. Communicating how those changes have positively impacted decision making will go a long way to creating the kind of positive feedback loop needed to motivate your team to keep relying on the processes and techniques that you’ve introduced.

    Once you’ve got your team on board, the really tough part is in ensuring that the initial excitement around using data to make decisions persists. A mistake I’ve made (more than once!) is assuming that communication around analytics is a ticket that you can mark as done. If you’re looking to drive a culture change, you’ll need to continually remind people why they should care about the thing as much as you do. As people hear more and more about the positive inroads teams have made off the back of insight from data, relying on data to back up product decisions should start to become expected and more automatic.

    Always present data with insight.

    Wherever possible, try to communicate your findings in terms of how this will affect decision-making and what people should do as a result. The less abstract you can make the results of an analysis, the better. One simple way to make your results less abstract is to clearly quantify how much impact you think the change will have.

    For example, if you’ve run an AB test to determine if a new feature increases your conversion rate, instead of saying ‘The change was statistically significant’, rather try ‘If we rolled out this new change to all our users, it’s likely that our conversion rate would increase from 5% to 7%, which translates to an additional 200 active users per week’.
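    To make the arithmetic behind that example explicit, here is a minimal two-proportion z-test in Python. The traffic and conversion counts are hypothetical, chosen to mirror the 5%-to-7% figures above:

    ```python
    from math import sqrt, erf

    # Hypothetical counts chosen to mirror the 5% -> 7% example above.
    control_users, control_conversions = 10_000, 500   # 5.0% conversion
    variant_users, variant_conversions = 10_000, 700   # 7.0% conversion

    p_control = control_conversions / control_users
    p_variant = variant_conversions / variant_users

    # Two-proportion z-test under a pooled null hypothesis.
    p_pool = (control_conversions + variant_conversions) / (control_users + variant_users)
    se = sqrt(p_pool * (1 - p_pool) * (1 / control_users + 1 / variant_users))
    z = (p_variant - p_control) / se

    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

    # Translate the statistics into the number people can act on.
    weekly_users = 10_000  # assumed weekly traffic
    extra_conversions = (p_variant - p_control) * weekly_users

    print(f"z = {z:.2f}, two-sided p = {p_value:.4f}")
    print(f"Estimated lift: {p_control:.1%} -> {p_variant:.1%}, "
          f"roughly {extra_conversions:.0f} extra active users per week")
    ```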

    Similarly, when sharing data visualisations with a team, try to be explicit about what the graph is and isn’t showing. Remember that you’ve spent a lot of time thinking about this visualisation, but someone seeing it with fresh eyes likely doesn’t have as much context as you do. Simple ways to make visualisations clear are to make sure that the exact data you’ve used to define a metric is understood, and that you offer an interpretation of the trend or finding you’ve visualised alongside the graph. If you can, try to explain the implications of the trend you’ve visualised for your team’s goals so that they can take action off the back of the insight you’ve shared.
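    As a small illustration of both points, a chart can carry its exact metric definition in the title and its interpretation in an annotation. A minimal matplotlib sketch, with entirely made-up numbers and labels:

    ```python
    import matplotlib.pyplot as plt

    weeks = [1, 2, 3, 4, 5, 6]
    active_users = [180, 195, 210, 205, 230, 245]  # hypothetical numbers

    fig, ax = plt.subplots(figsize=(8, 4))
    ax.plot(weeks, active_users, marker="o")

    # Say exactly how the metric is defined, not just its name.
    ax.set_title("Weekly active users (unique users sending >= 1 message, Mon-Sun)")
    ax.set_xlabel("Week number")
    ax.set_ylabel("Active users")

    # Offer the interpretation alongside the graph rather than leaving it implicit.
    ax.annotate("Dip in week 4 coincided with a public holiday (hypothetical)",
                xy=(4, 205), xytext=(1.2, 235),
                arrowprops={"arrowstyle": "->"})

    fig.tight_layout()
    plt.show()
    ```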

    Speed is good, but accuracy is better.

    There’s no surer way to ensure that your work has low impact than by making a habit of communicating incorrect or partially-correct results. If you’re the first or only data scientist in your team, you are the authority on what constitutes good or sufficient evidence and so, ironically, you have very little margin for error.

    You’ll often find yourself having to trade off getting results out to teams quickly against making sure that the analyses producing those results are robust, particularly if you’re working with new, suboptimal or unfamiliar tools. In most cases, I’ve found there’s usually a compromise you can reach — but this requires that you’re very clear about the limitations of the data you’ve used to reach a particular conclusion. When in doubt, caveat!

    People will quickly learn whether they can trust you, and once broken, trust is a tricky thing to get back. This is not to say that you won’t make mistakes — but it’s really important that when these happen they’re caught early, acknowledged widely, and that robust processes are put in place to avoid similar mistakes in future.

    3. Good data infrastructure is a prerequisite for good data science

    When it comes to accurate and useful analyses, it goes without saying that they’re enabled by accessible and reliable data. No matter how good your infrastructure, it’s reasonable to expect to spend a significant chunk of your time cleaning data before running your analyses. As such, if your data infrastructure is not optimised for analytics, the additional time spent cleaning and wrangling data into a usable format will quickly become a major barrier. Up until this point, we hadn’t prioritised securing best-in-class analytics tools — getting this right is hard work, and it’s something we’re still working towards.

    Death by a thousand cuts…

    The effect of this is twofold. First, it adds enough friction to your workflow that you are likely to forgo using information that could be valuable, because you’re having to weigh the usefulness of the information against the cost of getting it. When an organisation moves fairly quickly, the time and effort this requires is often prohibitive.

    Secondly, the probability of making mistakes compounds each time you shift and transform data across different platforms. Each relocation or adjustment of your data carries some chance of a mistake — naturally, the more of this you do, the higher the likelihood that your data is less reliable by the time you actually run your analysis. Together, these two barriers strongly disincentivise people in analytics roles from solving problems creatively, and add enough friction that your approach to analysis might become a fair bit more rigid and instrumental — and where’s the fun in that!
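    A quick back-of-the-envelope calculation shows how fast this error risk compounds. The 2% per-hop figure below is an assumption for illustration, not a measured rate:

    ```python
    # If each copy/transform step independently has a small chance of
    # introducing an error, the chance the data survives n hops untouched
    # decays quickly.
    per_step_error = 0.02  # assumed 2% chance of a mistake per hop

    for hops in (1, 3, 5, 10):
        p_clean = (1 - per_step_error) ** hops
        print(f"{hops:>2} hops: {1 - p_clean:.1%} chance of at least one error")
    ```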

    You become the bottleneck.

    Related to this is the issue of accessibility for the wider team. If data scientists are struggling to access data reliably, you can bet your bottom dollar that everyone else is probably worse off! The result of this is that queries for simple information are outsourced to you — and as people become aware that you are able and willing to wade through that particular quagmire, you, ironically, start to become the bottleneck to data-driven decision-making.

    At this point, your role starts to become a lot more reactive — you’ll spend a majority of your time attending to high effort, marginal value tasks and find that you’ve got a lot less time and headspace to devote to thinking about problems proactively.

    To avoid these pitfalls, you’ll need to make the case for the tools you need early on, automate as much of your own workflow as possible, and provide enough value that people can see they’d get a lot more from you if you were able to work more efficiently.

    Author: Tamsyn Naylor
    Source: Towards Data Science
  • Pyramid Analytics: Main lessons learned from the data-driven drilling and production conference

    Pyramid Analytics: Main lessons learned from the data-driven drilling and production conference

    It was great to be at the data-driven drilling and production conference in Houston on June 11 and 12. The conference was well attended by hundreds of oil and gas (O&G) professionals looking to use technology to minimize downtime, enhance safety, and deliver digital transformation throughout their businesses.

    We talked to dozens of attendees looking to educate themselves about modern data collection and ingestion methods, better information management and integration processes, E&P automation and control systems, more efficient change management and drilling optimization techniques, and advanced and predictive analytics.

    As an analytics and BI vendor, we were there to learn more about how practitioners are using advanced analytics, particularly AI and machine learning, to extract more value out of their data.

    Three key themes

    In our conversations with attendees and other vendors, three key themes emerged:

    • The persistence of data silos

      No surprise here: data silos aren’t going anywhere. The upstream organizations we spoke to struggle with data sharing across departments. It’s a common scenario for users to have limited access to distributed data. It is also common for upstream organizations to perform analytics using numerous tools (many of the individuals we spoke to freely admitted to using three or four different BI tools). This perpetuates the cliché: there is no single version of the truth. The result is duplicate data, duplicate efforts for reporting, duplicate logic and business rules, and more. As a result, collaboration and efficiency suffer.
    • AI and ML operationalization remain elusive

      Many of the professionals we talked to lack effective systems for putting advanced analytics into production. Here’s a common scenario. A line-of-business user will throw data scientists a set of data and say, 'here’s the data, do your magic'. The data isn’t always optimized, so data scientists often spend time prepping the data before they can even analyze it. Then they analyze the data using standalone ML software applications before outputting a flat file and sending it to a business analyst to reload into one of several desktop-based BI applications. This results in a perpetual cycle of extracting, importing, analyzing, exporting, re-importing, and re-analyzing data. The whole process is cumbersome and inefficient; meaningful insights derived from AI and ML initiatives remain limited.

    • It’s hard to move beyond legacy analytics systems 

      For many O&G companies, there is a strong desire to adopt new data and analytics technologies; they acknowledge legacy tools simply aren’t equipped to quickly accommodate newer sources of data and perform advanced and prescriptive analytics. However, the difficulty of migrating from legacy systems often holds people back, no matter how siloed their data environment is. Many organizations have had their current desktop-based analytics solutions in place for years, in some cases decades, and the huge store of analytic models, dashboards, and reports they have created over that time cannot easily be migrated or re-created. 

    The three challenges identified above are tough. But that doesn’t make trying to solve them any less urgent. And from our perspective, this doesn’t make them any less solvable. The price of inaction is too high. No one can stand on the sidelines while the technology environment changes.

    Author: Brigette Casillas

    Source: Pyramid Analytics

  • Some expert advice on gaining organizational trust

    Some expert advice on gaining organizational trust

    Take a moment and ask yourself, what is your definition of trust and how do you know when you are trusted?

    Did the answers come quickly, or not? If you don’t have a ready definition, don’t worry — most people don’t. It’s just one of those things that we have an impression about. We know it when we feel it. Here’s the bad news. It’s hard to know when you are trusted and even harder to know how to build it. And on average, we need to earn a lot more trust than we currently have. The good news? It’s possible to earn trust with the right plan.

    What’s that you ask? Doesn’t ‘planning for trust’ sound Machiavellian? I can imagine that it must, but here’s more good news. The plan not only can help you build trust, but it can also help you make a few friends along the way. The tactics you use to build trust are largely the same ones you might use to build relationships and gain friends. The problem is that we leave too much to chance and we don’t always know what works. Don’t do that. Instead, do this:

    1. Understand the context in which you want to earn trust. If you are a CIO working on getting a seat at the table or building out a digital transformation plan, that context is far different from the context of traditional IT and how you may have built your brand so far.
    2. Recognize that trust is developed based on your expertise and knowledge within that context.
    3. Acknowledge that trust is also developed based on the rapport and relationships you build accordingly.

    Trust is contextual

    The determination of trustworthiness happens within a particular context. A pediatrician is someone trusted to treat a sick child, but not to manage a problematic investment portfolio. A CIO and their team may have built trust in the context of building and running robust and predictable back-office IT systems, but they may be untrusted to create innovative and flexible solutions for end consumers. Paradoxically, past success is not an indication of future success. Studies have shown that people who have been successful in the past are actually more likely to perform worse in a new context. They can become overconfident and less open to feedback.

    Trust is a measure of expertise and rapport

    CIOs may try to develop trust by first demonstrating expertise in all things digital. However, unlike consultants or contractors, who are external parties and can rely on expertise alone (part of an open social system), the CIO must rely on expertise plus rapport (part of a closed social system), but rapport leads the way. Focusing on expertise can diminish perceptions of warmth. Instead of coming across as an expert, CIOs risk coming across as overconfident or arrogant. Building rapport requires empathy, listening, curiosity, and genuine interest. Compare that to how one demonstrates expertise using logic, opinions, and arguments. It is wiser to have built some rapport before asserting opinions and arguments.

    Author: Ed Gabrys

    Source: Gartner

  • The 5 dimensions that help your business with a successful technological transformation

    The 5 dimensions that help your business with a successful technological transformation

    Businesses that have mastered the ability to change quickly share one common denominator: technology is transforming their business. Technology can be a transformative engine that gives your organization the power to learn, adapt and respond at the pace of change.

    Today’s IT leaders have many tools to enable speed and flexibility, including Lean IT, Agile, DevOps and Cloud First among others. However, these concepts alone rarely deliver the technology transformation that organizations need because companies are tempted to think of transformation as a predominantly organizational journey. Organizations need to think much more holistically in order to lead a technology transformation and enable a flexible and efficient business.

    There are five essential components, the 5 dimensions, that can lead to a successful technology transformation. Each dimension allows you to learn something unique about your organization, somewhat similar to an archeologist digging through an archeological tell. The 5 dimensions can be used to drive a holistic technology transformation that fits your historical and cultural context.

    Here's a brief look at the 5 dimensions and how they can serve you:

    1. Business alignment 

    Far too many organizations build their technology strategies by aligning with the tactics of their business operations. The result is strategic dissonance, as IT resources are not correctly prioritized to meet strategic business priorities. This misalignment leads to new architectural debt. Today's tech leaders need to understand the organization's business model and build a technology strategy that unlocks and empowers that model, ensuring alignment along the way.

    2. Architectural debt 

    Most organizations suffer from technical debt: systems built for expediency instead of best practices. Architectural debt, on the other hand, is the systemic root cause in the creation of technical debt. A recent survey by IDG and Insight Enterprises found that 64% of executives cited legacy infrastructure and processes as a barrier to IT and digital transformation. ‘Legacy infrastructure and processes’ is just another way of describing architectural debt. Debt is an important concept for technology organizations because it constrains flexibility and results in an IT organization managed by the inertia of their systems. If you want to lead an IT or digital transformation, you must quantify your architectural debt and pay down (minimize) or pay off (eliminate) that debt in order for your transformation to be both successful and sustainable.

    3. Operational maturity 

    IT organizations exist on a spectrum of maturity, classified into three distinct phases: operators, automators, and innovators. Operational maturity is a critical enabler of an organization’s ability to execute their vision or goals. There is a high correlation between business value and operational maturity. Mature IT organizations are focused on high quality, business value-added activities. An IT organization’s capabilities directly correlate with its phase of maturity along our spectrum. You must look at the people, processes, technologies and artifacts to understand where change must occur in order to increase operational maturity.

    4. Data maturity

    Clive Humby, U.K. mathematician and architect of Tesco's Clubcard, famously said in 2006 that 'Data is the new oil… It’s valuable, but if unrefined it cannot really be used'. More than a decade later, The Economist called data the world’s most valuable resource. Many organizations are sitting on mountains of unrefined data, uncertain how they should be storing, processing or utilizing that valuable resource. Top-performing organizations that use data to drive their business and technology decisions have a distinct competitive advantage, today and tomorrow.

    5. Organizational dexterity 

    Your organization’s capacity for innovation and change directly correlates with its dexterity. To quote Peter Drucker: 'In times of turbulence, the biggest danger is to act with yesterday’s logic'. Organizations falter when they have institutionalized a culture of yesterday’s logic. An agile organization isn’t just a decentralized organization, it’s an organization that has the capability to learn and unlearn, demonstrates complex problem solving, emotional intelligence and much more.

    We live and work in turbulent times, with more volatility on the horizon. Is your technology ready? How about your organization? The 5 dimensions play a critical role in building a holistic understanding of your organization. Seeing the whole picture enables you to build a pragmatic path forward that leads to a true technology transformation.

    Author: Alex Shegda

    Source: Information-management

  • The challenge AI creates for IT and business leaders

    The challenge AI creates for IT and business leaders

    Artificial intelligence (AI) and AI-augmented data analytics have captured the imagination of everyone from kindergarten to the boardroom, as they change the ways we live, shop, consume news, and govern ourselves. From an IT-centric viewpoint, these technologies are changing our business models. They’re also creating fierce competition to retain the limited number of people with the skills needed to transform AI into competitive advantage. IT and business leaders across industries all face the same challenge: how to close the skills gap that’s been created by advancements in AI and data analytics.

    Overcoming this challenge is essential to compete in today’s AI-enabled and disruption-obsessed tech environment. There is a wealth of technology platforms and resources available to help businesses become more data-driven and competitive. To reap technology’s full benefits, though, IT leaders need to reskill their staffs and attract top talent equipped with the right data skills and mindsets. Achieving this won’t happen overnight; it will require support and investment from senior leadership. The strategy outlined below will increase the likelihood of these efforts succeeding.

    Obtain senior management buy-in before proceeding

    Making major changes in an organization requires the support of senior management because projects of this magnitude will have significant business, staffing and budget implications. IT professionals should build their cases for change from a business, not a technology, vantage point. They need to focus on how their plans will create competitive advantage by reducing lost opportunity costs, improving the success rate of new development projects, and enabling new business models.

    First, there needs to be a clear link between IT spending and specific revenue streams: how will each IT dollar spent impact initiatives across departments, whether IT, marketing and sales, or HR and accounting? This encourages good user behavior by linking requests to costs, and encourages management to ask questions like the following (a back-of-the-envelope sketch of the arithmetic appears after the list):

    • How much does an application outage cost per hour?
    • What does it cost to shrink an application’s RTO from 4 hours to 2 hours?
    • What are its effects on customer relationships, stock prices, revenues, etc.?
    • And, last but certainly not least, can that money be better spent elsewhere?
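    Here is the kind of back-of-the-envelope sketch referred to above; every figure is a hypothetical placeholder to be swapped for your own numbers:

    ```python
    # Rough arithmetic behind the questions above (all figures hypothetical).
    outage_cost_per_hour = 50_000   # revenue + productivity lost per hour down
    outages_per_year = 2            # expected significant outages per year
    current_rto_hours = 4
    target_rto_hours = 2

    annual_cost_now = outage_cost_per_hour * current_rto_hours * outages_per_year
    annual_cost_target = outage_cost_per_hour * target_rto_hours * outages_per_year
    annual_savings = annual_cost_now - annual_cost_target

    print(f"Current expected outage cost: ${annual_cost_now:,}/year")
    print(f"With a 2-hour RTO:            ${annual_cost_target:,}/year")
    print(f"Headroom to fund the fix:     ${annual_savings:,}/year")
    # If shrinking the RTO costs less than the savings, the spend links
    # directly to a revenue stream; if not, the money may be better spent elsewhere.
    ```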

    Finally, IT needs to set realistic expectations with senior management regarding the difficulty of retraining and hiring staff, as well as developing and testing new capabilities. Many IT organizations that have provided infrastructure for decades lack the skills needed to exploit data analytics to their fullest advantage. From a recruiting perspective, many still struggle with the process of creating job descriptions that align with the revised role of IT. The list of new job titles is long and often fuzzy, encompassing everything from Chief Data Officer to Cloud Engineer to IoT Architect. Investing in training and development for existing staff while also allocating resources to recruit for new roles can be a time-consuming and costly undertaking. However, it’s an investment worth making when done wisely, helping to create a more competitive business model. IT needs to be ready to sell this into the C-suite or risk losing out on the data-driven economy and being outpaced by competitors.

    Use consultants

    Treat the need to reskill your staff with a sense of urgency. Your competitors are, so don’t pinch pennies. Consultants can shorten your time to market with new services built on data analytics and AI/ML by helping to identify missing skills and assisting in creating job descriptions and profiles of ideal candidates. This profile should include technical skills and personality traits, education, certifications, prior work experience, and other factors such as a willingness to work evenings or weekends, and career expectations.

    Competent consultants can also help you avoid products that do not fit your requirements by helping to assess functionality, scale, performance, ease of use, etc. In doing so, they help avoid pitfalls that their previous clients encountered as they leveraged their data and AI to a competitive advantage. They can also help you create a shortlist of possible solutions and identify technology and marketing trends that may indicate changes in your strategies.

    Build relationships with local colleges and universities

    Schools are redesigning their curriculums to satisfy the need for technical professionals with skills in data analytics, AI/ML and cyber-security, and in helping users leverage these technologies into competitive advantage. The lofty salaries commanded by graduates with these skills mean there is fierce competition for them, as previously noted, so you want to be first in making them job offers. The best ways to gain access to them are building relationships with department heads and individual professors, offering professors consulting engagements where they make good business sense, sponsoring research projects that align with your business needs, and establishing an intern program. Internships not only expose potential new hires to your company, they introduce AI-related skills to existing employees, which can help management identify those with the potential to grow into new roles.

    While delivering critical business insights for significant competitive advantage, data analytics and AI/ML are giving CIOs and other technical leaders opportunities to reskill their staffs and engage with a whole new generation of data-savvy candidates. It doesn’t stop at just training and recruitment, though. Leaders need to invest in the right tools and technologies that empower their workforce to harness the full potential of data and AI. Done well, these projects will transform IT’s role within an organization from being a provider of infrastructure to being a source of competitive advantage. Since mastery of these technologies is not optional, now is a great time to start the process.

    Author: Stanley Zaffos

    Source: Insidebigdata

  • The Challenges of Moving to a Cloud Environment

    The Challenges of Moving to a Cloud Environment

    While no business could have fully prepared for the COVID-19 pandemic’s impact, those with strong cloud-based strategies have been able to adapt to the remote-work reality. But even for companies that have migrated to the cloud or are in the process, a dispersed workforce presents challenges when you consider the trade-off between a streamlined, cohesive work process and network security. Despite this, the move from on-premise to cloud-based solutions isn’t slowing; cloud migration remains as desirable as ever.

    In fact, recent research points to increasing public cloud adoption over the next year, even amid, or perhaps as a result of, the pandemic and an overall downturn in IT spending. According to Instinet, 68% of CIOs indicated cloud services would become more of a priority for their businesses and reported a reduction in on-premise workloads, from 59% relying on on-prem assets in 2019 to an estimated 35% by 2021.

    For businesses, cloud and SaaS services offer an easy way for employees to collaborate and access the information they need outside the confines of a physical office space. For employees, these solutions are desirable in part because they’re so easy to use. Even when not sanctioned by an employer, all it takes is an email address or credit card to sign up, and an employee can start a CRM package, open a Dropbox, create an iCloud account, or carry out a range of other activities. While it sounds benign, any of these services could become a channel for sharing company information, from trade secrets to intellectual property and personally identifiable information.

    In order to enable employees to get work done and safeguard sensitive information, organizations must find a way to both connect and manage systems and access. Cloud migration is a big undertaking, and far too often organizations overlook what a crucial part identity governance plays in implementing successful and sustainable cloud migration initiatives. By baking identity governance into your plan from the get-go you can avoid some of the main security pitfalls of transitioning to the cloud.

    One major challenge is employee buy-in. It may sound counterintuitive, as the cloud is meant to streamline work processes, but learning new systems and working out permissions is a learning curve companies need to account for. People want to get work done as quickly and efficiently as possible, and adding another roadblock between them and what they need can result in bypassed security protocols. Organizations that have not already done so should implement safeguards like multi-factor authentication (MFA), but also consider making the second factor something easy to access, like a code sent to a mobile device or something the person has at all times, rather than a security question they need to remember or a physical token they need to carry.
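    As one concrete option for that easy-to-access second factor, time-based one-time passwords (TOTP) pair a server-side secret with an authenticator app on the employee’s phone. A minimal sketch using the third-party pyotp library (the library choice is our assumption, not a recommendation from the article):

    ```python
    # pip install pyotp
    import pyotp

    # Generated once per user at enrolment and stored server-side;
    # the user adds the same secret to an authenticator app on their phone.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    code = totp.now()  # what the user's phone would display right now
    print("Current code:", code)

    # At login, the server checks the submitted code against the same secret.
    print("Valid?", totp.verify(code))
    ```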

    A good cloud migration strategy is not just about wrangling your employees, though—it’s about choosing your cloud partner wisely. When you rely on cloud solutions, you’re entrusting another party with your valuable customer and company data. Even if the information is compromised under their care, it’s your business that will pay the price at both a financial and reputational cost. Before embarking on a cloud journey be clear about your prospective cloud provider’s security practices, and don’t just make them tell you—have them show you. Ask where your data will be stored, what measures they take to protect it, and what practices they use to keep it secure.

    Another challenge beyond vendor selection and employee onboarding is simply keeping up with the pace of technology. The last few years have looked like an arms race to the cloud, and as a result, a lot of projects fail. Migrating all your data with different levels of sensitivity and access privilege should be done intentionally, and many bite off more than they can chew. This causes mistakes and headaches in the long run, and the worst part is, it’s easily avoidable. Leverage third-party resources that have identity expertise, such as an outside consultant or an analyst firm to help you define your cloud requirements. Make sure stakeholders—leadership, investors, department heads, etc.—are involved in executing cloud projects, as they span the business.

    The work doesn’t stop there, though. Once you do have a solid strategy, select a vendor to partner with, and start onboarding and training employees, think ahead about how you’ll maintain a healthy security posture. Consider using a cloud access security broker (independent software that sits between cloud service users and cloud applications and monitors all activity), or an ethical hacker to help identify weak areas and enforce security policies. For highly regulated industries, such as healthcare and life sciences or finance, managing evolving threats becomes especially important. By not complying with strict laws and requirements to protect sensitive information, you could be setting yourself up for a world of hurt.

    Security is a top reason that organizations stall their cloud endeavors—and for good reason. However, with the promise of better IT processes, increased productivity and collaboration, and a host of other benefits, the rewards of cloud migration far outweigh the risks. Success takes due diligence and a digestible strategy, so be sure to do the homework, tweak as you go, and remember: it’s a marathon, not a sprint.

    Author: Jackson Shaw

    Source: Open Data Science

  • The future of cybersecurity threatened by the emergence of IoT devices

    Imagine being able to communicate effortlessly with the devices around you. This means having your devices fully automated and connected, sharing data through the use of sensors. This will definitely improve the quality of life and make our day-to-day activities much easier. It will also make businesses more efficient and help drive new business models.

    Well, there is no need to imagine as this is already a reality. These are the wonders of the innovation brought about by the Internet of Things (IoT), which simply refers to the network of devices, such as vehicles and home appliances, that contain electronics, software, sensors, actuators and connectivity that allows them to connect, interact and exchange data. The emergence of IoT brings about numerous benefits, but also poses a huge threat to security as it creates new opportunities for all the information it gathers to be compromised.

    Cybersecurity is already at the top of the agenda for many industries, but the scale and scope of IoT deployments escalate security challenges, making it harder than ever to protect businesses and consumers from cyber attacks. Intelligent organizations already need to protect their data and information, and cybersecurity is growing more important than ever with the emergence of IoT devices. Although IoT developments have made life easier on so many levels, they have also brought serious security implications. The scale of connected devices greatly increases the overall complexity of cybersecurity, while the scope of the IoT, which operates not as independent devices but as an ecosystem, magnifies these challenges: any data breach can cause significant damage to an entire business’s data.

    As HP found out, 70% of Internet of Things devices are vulnerable to external attacks, and the technical vulnerability of most of these devices only escalates the threat. With the IoT’s constant evolution and the little attention paid to security, the potential for damaging cyber attacks will only increase in the future. The implementation of IoT networks opens up the grid to malicious cyber attacks, and any compromise of the network could lead to significant data leakage.

    8 IoT threats to cybersecurity in the future


    1. Complexity

    The wide variety of devices connected to a network brings a correspondingly wide variety of risks, and this diverse and sprawling ecosystem worsens cybersecurity worries.

    2. Volume of Data

    With the IoT’s great need for data, it opens up nearly every part of our lives to the Internet, raising the threat of data manipulation. As a result, we must consider what this kind of access to the Internet means for your digital and personal security, as the availability of numerous access points directly increases the risk of a breach or hack. Unified attacks can bring down a system or a network of data that is relied upon by millions. IoT is an incredible idea with the potential to change our lives dramatically, but it brings with it a flurry of concerns that will stretch your abilities and require you to be on your toes at all times.

    3. Continuous Expansion

    The IoT evolution shows no sign of slowing down anytime soon; in fact, it continues to evolve and expand rapidly, which makes it difficult for cybersecurity to keep up with the pace.

    4. Over-Dependence On the Cloud

    IoT relies heavily on cloud infrastructure, which makes the cloud itself a prime target for cyber attacks. With this knowledge, it’s important to look for more ways to reduce those threats. Cloud configuration and logging will need much closer monitoring, and this monitoring can be supplemented with external tools such as antivirus software and VPNs, which should be reviewed and compared carefully. These reviews and comparisons will enable you to choose the tools best suited to your devices and needs, and using such tools will go a long way toward securing your internet connections.

    5. Privacy Issues

    Privacy issues arise from the collection of personal data combined with a lack of proper protection for that data.

    6. Deficiency In Authentication

    This area concerns ineffective mechanisms for authenticating to the IoT user interface and poor authorization mechanisms that allow a user to gain a higher level of access than intended. For example, a large amount of data is insufficiently encrypted and transmitted via wireless networks, many of which are public and lacking in security.

    7. Insecurity

    Over the past two years, AT&T’s Security Operations Center has logged a 458% increase in vulnerability scans of IoT devices. The risk is that an IoT device could be easy to attack, allowing unauthorized access to the device or its data. Most IoT manufacturers concentrate more on the efficiency of a device and less on its security, leaving devices vulnerable to cyberattacks. It is also difficult to secure these devices after they become an end product, which only increases the challenges of cybersecurity.

    8. Industrial IoT

    According to Forcepoint, in 2019 attackers will break into industrial IoT devices by attacking the underlying cloud infrastructures. This target is more desirable for an attacker, as access to the underlying systems of these multi-tenanted, multi-customer environments represents a much bigger payday.

    What does the future hold?

    Due to the aforementioned IoT-related weaknesses, which give cybercriminals more access to manipulate connected devices, it’s clear that IoT is painting a scary future for cybersecurity. However, it’s noteworthy that no system can ever be perfect. A continuous effort has to be put into work in order to provide more effective cybersecurity measures to ensure more safety in our day-to-day use of the IoT devices around us.

    Author: Joseph Chuckwube

    Source: SAP

  • The human impact of data literacy

    The human impact of data literacy

    What if I told you only 32% of business executives said that they’re able to create measurable value from data, and just 27% said their data and analytics projects produce actionable insights? Let me put it another way: How excited would you be if I said I made you some chocolate chip cookies, but I only put in 32% of the required sugar and 27% of the required flour?

    I sure hope you wouldn’t eat those cookies. The cookies would be underprepared and not correctly baked with all the necessary ingredients for tasty success. To make an analogy, there are companies creating data and analytics (think: making cookies) without the necessary cultural and organizational ingredients to derive the greatest value from their creations.

    To help others better understand how data literacy – properly and programmatically implemented – can encourage organizations to use these needed ingredients, I recently co-presented a webinar with Martha Bennett, VP and Principal Analyst, from Forrester, and Rishi Muchhala, Manager of Enterprise Intelligence, from Nemours Children’s Health System. The webinar had thousands of attendees, and we received many good questions. I’ve formulated them and provided detailed answers below.

    Question topic 1: What about the data culture of an organization?

    This was a recurring theme in each of the questions that were asked and for good reason. The number one obstacle to data literacy success has nothing to do with data, technology or the software you deploy; it has everything to do with your culture and the people in your organization. Now, how many of you reading this think changing a culture is easy? If so, trust me – it’s not.

    Changing a culture is definitely not easy. It involves changing the DNA of an organization, so that people embrace – not just accept – data. This means data fluency, data literacy, analytical competence and data mentoring must be encouraged and reinforced at multiple touchpoints throughout the organization. Part of the solution is convincing people at all levels that data is empowering.

    Question topic 2: What are key areas to focus on in a data literacy program?

    This question is very large in scope, and you could get lost trying to address all facets of a data literacy program. Below are a few key areas a data literacy program should concentrate on.

    • Leadership – For any data literacy program to succeed, it must have leadership buy-in. The leaders of any organization set the tone and agenda for cultural change, marking how to measure it, conveying its progress and extolling its virtues.
    • Tailored learning – Remember that each individual is at his or her own data literacy stage, and we cannot expect a program to succeed if we try to fit everyone into the same puzzle space. One size does not fit all – people learn at different speeds in different ways, and you should provide for differing learning experiences that nurture data literacy growth across that spectrum.
    • Curiosity, creativity and critical thinking – Work hard to foster the '3 Cs of Data Literacy', which form the foundational pillars of nearly all data literacy programs. People should have a strong desire to know and understand, as well as engage in divergent and novel thinking. This is more likely to occur when the tenets of such thinking are embedded in every part of a data literacy program.

    Mind you: I am not recommending that everyone go back to school, study statistics and so forth. But, I am saying we need a culture that encourages the questioning and challenging of assumptions.

    Question topic 3: Who should lead the data literacy effort in the company?

    This is another great question. I have been approached by people who wonder if a grassroots movement among the employee base is the key to data literacy success. I have been approached by people who wonder if it is the executive team that leads the charge. The short answer is both.

    In order for your data literacy program to succeed, you must have leadership and executive buy-in. By having buy-in from the executive team, you ensure the workforce understands the company is behind the data literacy initiative. Then, create excitement through grassroots work and data literacy evangelists. These two techniques help organizations drive a holistic and inclusive approach to data literacy.

    Conclusion

    The human impact of data literacy cannot be overemphasized. A workforce and society empowered by data leads to smarter, better-informed decision making, which makes us less prone to errors, groupthink and orthodoxy. This means we will be more open to challenging others’ practices that are not supported by evidence and also more accepting of data-based feedback that challenges our own approaches. In short, as a society, increased data literacy can only help us grow, as professionals and people, enriching and deepening our perspectives.

    Author: Jordan Morrow

    Source: Qlik

  • The importance of ensuring AI has a positive impact on your organization

    The importance of ensuring AI has a positive impact on your organization

    Arijit Sengupta, founder and CEO of Aible, explains how AI is changing and why a single AI model is no longer smart business.

    There’s lots of buzz about artificial intelligence, but as Arijit Sengupta, founder and CEO of Aible, points out, “Everyone has heard a lot about AI, but the AI we’ve been hearing about is not the AI that delivers business impact.” Where is AI headed? Why is a single AI model no longer the right approach? How can your enterprise make the most of this technology?

    AI needs to deliver context-specific recommendations at the moment a business user is making a decision. We’ve moved away from traditional analytics and BI, which look backwards, to a forward-looking technology. That’s a fundamental shift.

    What one emerging technology are you most excited about and think has the greatest potential? What’s so special about this technology?

    Context-specific AI has the greatest potential to change business for the better. The first generation of AI was completely divorced from the context of the business. It didn’t take into account the unique cost-benefit tradeoffs and capacity constraints of an enterprise. Traditional AI assumed that all costs and benefits were equal, but in business, the benefit of a correct prediction is almost never equal to the cost of a wrong prediction.

    For example, what if the benefit of winning a deal is 100 times the cost of unnecessarily pursuing a deal? You might be willing to pursue and lose 99 deals for a single win. An AI that only finds 1 win in 100 tries would be very inaccurate based on model metrics, although it would boost your net revenue. That’s what you want from AI.
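
    To make the arithmetic concrete, here is a minimal sketch with invented dollar figures (the interview gives only the 100-to-1 benefit-to-cost ratio), showing why a model with seemingly terrible precision can still be worth deploying:

    ```python
    # Illustrative only: the dollar amounts below are hypothetical assumptions.

    def net_value(pursued: int, wins: int, pursuit_cost: float, win_benefit: float) -> float:
        """Net value = benefit from won deals minus the cost of all pursuits."""
        return wins * win_benefit - pursued * pursuit_cost

    cost = 1_000.0          # assumed cost of pursuing one deal
    benefit = 100 * cost    # winning is worth 100x the pursuit cost

    # A model that wins only 1 deal in 100 pursuits looks awful by precision,
    # yet at a 100x ratio it already breaks even...
    print(net_value(100, 1, cost, benefit))   # 0.0

    # ...and 2 wins in 100 pursuits (still just "2% precision") is clearly profitable.
    print(net_value(100, 2, cost, benefit))   # 100000.0
    ```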

    The second generation of AI has a laser focus on the specific business reality of a company. As Forrester and other analysts have pointed out, AI that focuses on data science metrics such as model accuracy often doesn’t deliver business impact.

    What is the single biggest challenge enterprises face today? How do most enterprises respond (and is it working)?

    Solving the last-mile problem of AI is the single biggest business challenge facing companies today. Right now, most business managers don’t have a way to understand how a predictive model would impact their business. That’s a fundamentally different question than finding out what the AI has learned.

    Just because I tell you how a car works doesn’t mean you know how to drive a car. In fact, in order to drive a car, you often don’t need to know all of the details about how a car works. In the first generation of AI, we obsessed over explaining how the car works in great detail. That’s what was considered “explainable AI.”

    What we are shifting to now is the ability for businesses to understand how the car affects their lives. Enterprises need to know how the AI affects their business outcomes under different business scenarios. Without this knowledge, you can’t get AI adopted because you’re asking business owners to play Russian roulette. You’re not giving them the information they need to understand how a given AI model will affect their KPI. You’re just giving them a few models and telling them to hope for the best.

    Is there a new technology in data or analytics that is creating more challenges than most people realize? How should enterprises adjust their approach to it?

    Traditional AI built on model accuracy can actually be incredibly harmful to a business. AI that’s trained to optimize model accuracy is often very conservative, and that can put a business into a death spiral. A conservative model will tell you to go after fewer and fewer customers so you’re assured of closing almost every deal you pursue, but that often means leaving a lot of money on the table and slowly destroying your business. AI that maximizes accuracy at the expense of business impact is worse than useless: it destroys value.

    What initiative is your organization spending the most time/resources on today? In other words, what internal project(s) is your enterprise focused on so that your company (not your customers) benefit from your own data or business analytics?

    We’re an early-stage startup with a relatively small volume of data, but we believe in getting started with AI quickly rather than waiting until we have amassed a ton of data. We first started using AI to predict which customers were likely to go from a first contact to a first meeting and which were likely to click on an email.

    Over time, we’ve collected more data and been able to optimize our marketing spending across different channels and figure out exactly which customers to focus on. If we had waited until we had a lot of data to get started, we wouldn’t have progressed as far as we have. By getting started with AI quickly, we were able to improve our AI process much faster.

    Where do you see analytics and data management headed in 2020 and beyond? What’s just over the horizon that we haven’t heard much about yet?

    Everyone has heard a lot about AI, but the AI we’ve been hearing about is not the AI that delivers business impact. The AI we’ve been hearing about is the AI of labs that’s abstracted from business realities.

    What’s just over the horizon that people are beginning to wake up to is that to get business impact, you have to have a very different kind of AI. Creating a single AI model doesn’t make any sense because business realities constantly change. What you need to do is create a portfolio of AI models that are tuned to different business realities. You need a different model if your cost to pursue a customer goes up 10 percent or if your average deal size goes up 20 percent. If you create a portfolio of AI models, your business will be much more resilient to change - and the only thing you can count on in business is change.
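
    As a loose illustration of the portfolio idea (a hypothetical structure; the interview does not describe an implementation), deployment could tag each trained model with the business regime it was tuned for and pick whichever matches current conditions:

    ```python
    # Hypothetical sketch of a model portfolio; not Aible's actual product design.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class TunedModel:
        name: str
        max_pursuit_cost: float   # regime bound: cost per pursued customer
        min_avg_deal_size: float  # regime bound: average deal size
        predict: Callable[..., bool]

    def pick_model(portfolio: list[TunedModel], pursuit_cost: float,
                   avg_deal_size: float) -> TunedModel:
        """Return the first model tuned for the current cost/deal-size regime."""
        for m in portfolio:
            if pursuit_cost <= m.max_pursuit_cost and avg_deal_size >= m.min_avg_deal_size:
                return m
        return portfolio[-1]  # fall back to the most conservative model

    # Example: pursuit costs rise 10% past what the aggressive model was tuned
    # for, so the more selective model gets deployed instead.
    aggressive = TunedModel("aggressive", 1_000, 0, lambda x: True)
    selective = TunedModel("selective", 2_000, 0, lambda x: False)
    print(pick_model([aggressive, selective], 1_100, 50_000).name)  # selective
    ```

    The point is not the plumbing but the principle: deployment becomes a function of current business conditions rather than a one-time choice.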

    Can you describe your solution and the problem it solves for enterprises?

    Aible’s AI platform ensures business adoption by giving users tools tailored to their existing skills and needs. Aible overcomes the last-mile problem by enabling end users to customize models and see how they affect the business. Aible lets you get started quickly with the data you have by fully automating the machine learning process; team members can contribute their unique business insights to AI projects. Uniquely, Aible delivers dynamically balanced AI models so you always deploy the right model at the right time. Aible ensures data security by running in your secure AWS or Azure account or on premises and never sees your data or trained models.

    Author: James E. Powell

    Source: TDWI

  • The omnipresent challenge of collaborating effectively

    The omnipresent challenge of collaborating effectively

    Whether we’re talking about collaboration tools or just the idea of collaboration, at this point it should not be “news.” We’ve collaborated forever, and the tools have been around for well over 20 years.

    And yet it seems we still struggle to figure out how to collaborate effectively.

    Maybe it is our org structures, where competing goals, or just different leaders, cause conflicts.

    Maybe it is biases and underlying tensions, where we either seek to be part of the crowd or worry about what an effective collaboration means to us as individuals.

    Maybe it is that we just assume we all know how to collaborate and yet fail to make it work.

    Whatever the reason, organizations that talk the collaboration talk often fail to walk the walk effectively.

    Let me give you an example.

    There are all kinds of research and commentary suggesting that effective collaboration requires, at a minimum, a well-declared goal/purpose and clearly defined roles for “collaborators.”

    But let’s look at it in practice using, you guessed it, Gartner’s Enterprise Technology Adoption Profiles (ETAs).

    I’m gonna make some connections between a few different research studies, but bear with me. There are 3 of our profiles where the “control” attribute is based on a Collaborative Approach between business and IT:

    • SCD – Strict Planners, Collaborative, Dynamic pace of change
    • ACR – Accommodating Planners, Collaborative, Responsive pace of change
    • FCM – Flexible Planners, Collaborative, Measured Pace of Change

    In our study of high quality deals (situations where both the customer and vendor are happy), we saw one of these groups stand out in terms of the % of high quality deals among our respondents:

    • SCD – 50%
    • ACR – 15%
    • FCM – 12%

    A big, big difference. That stood out, but then we did a second survey, where we asked respondents how much they agreed with a statement that basically said “we regret nearly every technology product we purchase via subscription.” The results (higher is worse, of course):

    • SCD – 28%
    • ACR- 71%
    • FCM – 71%

    Again, a huge difference. It seems that our SCD group has figured out how to collaborate. Their strict approach to technology planning and focus on moving as fast as is prudently possible helps them stand out. The other two seem to muddle along, trying to delay or avoid decisions as much as possible (yes, we have other research that shows those profiles have lots of no-decisions).

    But just recently, I found some other data in our research. I’ll be writing about this with some colleagues, but we recently asked about roles responsible for decision making or influencing decisions around SaaS and IaaS/PaaS. What jumps out for me is that all of these ETA groups involve a diverse set of roles in decisions, with a range of 8 to 11 roles typically involved.

    But there was a big difference: the average number of roles designated as decision makers. The means for the SCD group were 3.69 (SaaS) and 3.20 (IaaS/PaaS). The means for the ACR group were 5.12 and 4.77. Finally, the means for the FCM group were 5.82 and 5.45. The two groups that struggle with effective decisions have significantly more people designated as decision makers (vs. influencers). More decision makers is not a bad thing, provided the specific decision responsibilities are clear. But if they are muddled, it backfires.

    Putting the data from the different studies together, it seems that the ACR and FCM groups have a lot of unclear roles and responsibilities, leading to paralysis, passing the buck, or just getting stuck (BTW, another finding from our studies: these two groups have many more no-decisions than any other ETA group).

    It is also important to remember that the ACR and FCM groups are two of the largest ETA groups, typically accounting for well over 40% of the market. For vendors, this is both a warning signal and an opportunity. Collaboration, done right, is powerful (look at the SCD group). You have an opportunity to help customers that want to be collaborative, but haven’t figured out how, to do it more effectively. It will make a big difference for you, and them.

    Author: Hank Barnes

    Source: Gartner

  • The risk of undervaluing creativity

    The risk of undervaluing creativity

    Agencies’ creative perspective, the very currency of the business, is at risk and can only be realized by shifting billions from tech to fund creative differentiation.

    “The value of agency creativity is at risk of disappearing”

    The marketing industry is woefully out of balance, from agency/client relationships to new business requirements and compensation. The healthy tension of creativity that once balanced the needs of the brand with the needs of its customers, the commercial effectiveness of the work versus its cultural impact, and the needs of agency economics versus the client’s growth is all eroding. These are now one-sided issues. The tension is no longer healthy. Nowhere is this more evident than in agency economics. Agencies today barely grow at the current rate of inflation. Insourcing, margin compression, cost-cutting, new competitors, and tech priorities threaten the existence of agencies and undermine their value.

    “Customer experience has stagnated”

    Strong evidence of creativity’s decline is already visible. Customer experience has stagnated. Forrester’s Customer Experience Index (CX Index™), a study of 100,000 consumers and 300 brands that has been run for more than a decade and acts as a barometer for CX performance, is flat for the fourth consecutive year. Most brands are stuck in the middle, struggling to improve over competitors. Not a single brand is rated as delivering an excellent experience. Forrester determined that there are four types of CX performance: the Languishers, Lapsers, Locksteppers, and Laggards. No brand is performing well. Worse still, for every 1-point drop in CX Index score, companies lose 2% of their returns. It’s only a matter of time before companies’ growth is impacted.

    “We’ve commoditized the brand and homogenized experiences”

    The issue is that the work looks, feels, and behaves too similarly. The industry’s obsession with meeting every customer need and want for ease and convenience by using technology has left little room for creative differentiation. That has come at a cost. The front door to your brand is a web or app experience that is virtually indistinguishable from the next brand’s. Fashion experiences look the same. Quick-service restaurant and coffee apps allow you to order ahead and skip the line. All airline apps allow travelers to check in, manage travel, and use a mobile device as their boarding pass. What can make one brand different from another when the experience is built from the same common technology platform, designed to solve the same user or category need, and programmed for the same two devices? Creativity.

    “We’ve overfunded technology and underfunded creativity”

    Unfortunately, just when creativity is needed the most, business leaders are investing in it the least. Forrester forecasts that spending for adtech, martech, data, and analytics will grow between 9% and 11% through 2022. Agency spending will only grow a mere 2.4%. And client budgeting and priorities are only part of the problem. Agencies are underfunding creativity, too. As of 2014, agencies had spent $12 billion-plus on data and technology resources and acquisitions. While the agency data platforms do power media and audience activation, all but one fail to integrate with the creative process. And creative departments remain skeptical and dismissive.

    “It’s time to fund creative differentiation”

    Forrester developed an ROI for creative agency investment that determined that moving a portion of the marketing budget out of technology and into agency creativity will bring a higher return on investment compared to currently projected spending levels. This serves as a six-year growth plan for CMOs that ultimately helps achieve 20% growth for the entire industry. These are not new dollars but rather a reallocation of currently projected spending that maintains significant adtech and martech investments.

    “It’s time to reinvent creativity”

    To deliver clients the growth they need and customers the experiences they demand, agencies must innovate their structures, capabilities, workforce, and processes. Structurally, data, technology, media, and creative should all come together and put creative problem-solving at the center. This means the newly acquired data, tech, and operating agencies should also come together. And especially, it means agency leaders will need to make consolidation and coordination a priority. Tough decisions must be made in the name of agency brand coherence and a model that is easier for clients to engage. Training today’s workforce to become tomorrow’s data-, technology-, and creative-literate talent is critical. And creative departments must embrace data- and tech-driven creativity.

    We’re living during one of the most interesting times in the history of the industry, with the opportunity to shape and define it. A whole new era of amazing marketing is only possible if we fund the balance of creativity and technology. Take up the mantle to modernize the industry. Reinvent the creative process.

    Author: Jay Pattisall

    Source: Forrester

  • Using data successfully: the role of data democratization

    Using data successfully: the role of data democratization

    An effective culture to underpin your strategy

    A business that looks to become truly data-driven knows that employees are more likely to back the wider strategy if they have access to data that helps them do their jobs better. Data democratization and the positive culture it can create is, therefore, critical to the long-term success of any organization.

    According to a recent report, Data Strategy and Culture: Paving the Way to the Cloud, senior decision-makers are confident that they’re opening up access to data sufficiently.

    So do your employees at all levels actually have adequate access to data to boost their decision-making? Does the data at their disposal and how they work with it turn employees into strong advocates for your organization’s data strategy? I seek to address these questions.

    This blog focuses on the crossover between data strategy and deployment decisions. It covers:

    • Why data democratization is critical to developing a positive data culture
    • What the main barriers to this are
    • What else you can learn

    Data democratization: room for improvement

    Successful organizations identify the key capabilities that are required to execute their data strategy effectively. Infrastructure decisions are an important part of this as any limitations can cause frustration and poor engagement — ultimately, the wrong choice can restrict how well an employee can perform in their role.

    Almost four out of five respondents to our survey say their current IT infrastructure makes it challenging to democratize data in their organization. This is a significant obstacle to be overcome. There are additional barriers, too, such as a lack of relevant data skills or too many new data sources.

    At this point, businesses have to focus on which deployment model best meets their needs. On the topic of data democratization, many will naturally think of the benefits the cloud can bring. The right deployment model allows for data sharing in a secure and cost-effective manner across all levels and departments. It allows people, and therefore the company, to perform at their best.

    Don’t limit your potential

    Despite the importance of this, almost half (46%) of respondents to our latest research believe that the democratization of data isn’t feasible for them.

    This could be a big risk. If your technology infrastructure doesn’t allow you to open up access to data across the whole business, you’re stopping your organization from becoming truly data-driven. This could ultimately mean that insights can’t be gathered quickly enough, projects stall, and the edge on competitors is lost.

    Make the data work

    There is a clear need for organizations to carefully consider which deployment option gives them the freedom needed to effectively open up access to data. Yet, the story doesn’t end once a decision has been made.

    Teams must constantly monitor whether employees are able to work with the data at their disposal effectively. Can they get the insights they need from the data? Is there an ambition to increase the spread of data democratization within the organization?

    So when it comes to optimizing the success of your data strategy, data democratization is a key step in the process, and your company needs to get it right.

    The report investigates all of the key points raised in this blog and explains how developing a positive data culture starts with data democratization. This is the point when you secure your employees’ backing of the project. Only then are you truly ready to choose the right deployment model.

    Author: Mathias Golombek

    Source: Dataversity

  • Using the right workforce options to develop AI with the help of data

    Using the right workforce options to develop AI with the help of data

    While it may seem like artificial intelligence (AI) has hit the jackpot, a lot of work needs to be done before its potential can really come to life. In our modern take on the 20th century space race, AI developers are hard at work on the next big breakthrough that will solve a problem and establish their expertise in the market. It takes a lot of hard work for innovators to deliver on their vision for AI, and it’s the data that serves as the lifeblood for advancement.  

    One of the biggest challenges AI developers face today is processing all the data that feeds into machine learning systems, a task that requires a reliable workforce with relevant domain expertise and high standards for quality. To address these obstacles and get ahead, many innovators are taking a page from the enterprise playbook, where alternative workforce models can provide a competitive edge in a crowded market.

    Alternative workforce options

    Deloitte’s 2018 Global Human Capital Trends study found that only 42% of organizations surveyed said their workforce is made up of traditional salaried employees. Employers expect their dependence on contract, freelance and gig workers to dramatically increase over the next few years. Accelerating this trend is the pressure business leaders face to improve their workforce ecosystem, as alternative workforce options bring the possibility for companies to advance services, move faster and leverage new skills.

    While AI developers might be tempted to tap into new workforce solutions, identifying the right approach for their unique needs demands careful consideration. Here’s an overview of common workforce options and considerations for companies to select the right strategy for cleaning and structuring the messy, raw data that holds the potential to add rocket fuel to your AI efforts:

    • In-house employees: The first line of defense for most companies, internal teams can typically manage data needs with reasonably good quality. However, these processes often grow more difficult and costlier to manage as things progress, calling for a change of plans when it’s time to scale. That’s when companies are likely to turn to alternative workforce options to help structure data for AI development.
    • Contractors and freelancers: This is a common alternative to in-house teams, but business leaders will want to factor in the extra time it will take to source and manage their freelance team. One-third of Deloitte’s survey respondents said their human resources (HR) departments are not involved in sourcing (39%) or hiring (35%) decisions for contract employees, which 'suggests that these workers are not subject to the cultural, skills, and other forms of assessments used for full-time employees'. That can be a problem when it comes to ensuring quality work, so companies should allocate additional time for sourcing, training and management.
    • Crowdsourcing: Crowdsourcing leverages the cloud to send data tasks to a large number of people at once. Quality is established using consensus, which means several people complete the same task and the answer provided by the majority of the workers is chosen as correct (a minimal code sketch of this majority vote follows this list). Crowd workers are paid based on the number of tasks they complete on the platform provided by the workforce vendor, so it can take more time to process data outputs than it would with an in-house team. This can make crowdsourcing a less viable option for companies that are looking to scale quickly, particularly if their work requires a high level of quality, as with data that provides the intelligence for a self-driving car, for example.
    • Managed cloud workers: A solution that has emerged over the last decade, combining the quality of a trained, in-house team with the scalability of the crowd. It’s ideally suited for data work because dedicated teams stick with projects for longer periods, developing expertise in a company’s business rules over time. That means they can increase their context and domain knowledge while providing consistently high data quality. However, teams need to be managed in ways that optimize productivity and engagement, and that takes deliberate effort. Companies should look for partners with tested procedures for communication and process.
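
    Here is the consensus mechanism from the crowdsourcing option above as a minimal sketch; the five-vote example and the idea of flagging low agreement for review are illustrative assumptions:

    ```python
    # Minimal sketch of consensus-based quality control in crowdsourced labeling:
    # each task goes to several workers, and the majority answer wins.
    from collections import Counter

    def consensus_label(answers: list[str]) -> tuple[str, float]:
        """Return the majority answer and the share of workers who agreed."""
        counts = Counter(answers)
        label, votes = counts.most_common(1)[0]
        return label, votes / len(answers)

    # Example: five workers label the same image.
    label, agreement = consensus_label(["cat", "cat", "dog", "cat", "cat"])
    print(label, agreement)  # cat 0.8 -> accept; low agreement could trigger re-review
    ```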

    Getting down to business

    From founders and data scientists to product owners and engineers, AI developers are fighting an uphill battle. They need all the support they can get, and that includes a dedicated team to process the data that serves as the lifeblood of AI and machine learning systems. When you combine the training and management challenges that AI developers face, workforce choices might just be the factor that determines success. With the right workforce strategy, companies will have the flexibility to respond to changes in market conditions, product development and business requirements.

    As with the space race, the pursuit of AI in the real world holds untold promise, but victory won’t come easy. Progress is hard-won, and innovators who identify strong workforce partners will have the tools and talent they need to test their models, fail faster and ultimately get it right quicker. Companies that make this process a priority now can ensure they’re in the best position to break away from the competition as the AI race continues.

    Author: Mark Sears

    Source: Dataconomy

  • What is Machine Vision and How Does it Help Businesses Grow?

    What is Machine Vision and How Does it Help Businesses Grow?

    The idea of computers having eyes and a human-like brain is old news. And machine vision was the first to bring this sci-fi idea into being.

    This engineering discipline leverages evolving technology to automatically inspect and analyze objects under study. AI-based machine vision boasts a myriad of use cases, from barcode reading and pattern matching to defect detection and sorting.

    But how exactly can this tech wonder speed up your business growth? And what is machine vision in its nature? Let’s unravel these questions further in the article.

    What is Machine Vision in Artificial Intelligence?

    Since the advent of AI-powered technologies, machine vision, also known as an inspection system, has been gaining momentum as it moves toward its full potential.

    This discipline refers to a scientific branch of artificial intelligence, in particular robotics and related technologies. It is aimed at obtaining images of real-world objects, processing them, and using the gathered data to solve various kinds of applied tasks with partial or no human participation.

    In layman’s terms, it can be described as the ability of machines to see.

    Bloomberg suggests that its global market is projected to hit around $18bn by 2025, and Allied Market Research forecasts the global machine vision systems market to grow to $74bn by 2027. This technology is on the rise today and is expected to keep its stronghold.

    Machine Vision vs Computer Vision

    A computer vision system and inspection system are often cited interchangeably to reflect a similar concept. Indeed, they both rely on AI, but what is the difference between the two? To solve the machine vision vs computer vision dilemma, let’s get grounded in the basics first.

    Computer vision is both a discipline and a set of related technologies. It is a subfield of artificial intelligence that deals with how computers can visually sense objective reality and develop a profound understanding from digital images or videos.

    The most popular of all computer vision applications are facial recognition systems. You must have come across those image transformation apps that let you swap your gender to see what you’d look like if you were male or female. Or you can post a photo from a party on social media, and this technology recognizes a friend in it in a split second and suggests tagging him or her.

    Our second candidate, machine vision, follows a different pattern, since it is about the application of knowledge and technology. It helps make the production of goods and services more efficient, using the same principles as computer vision.

    To make the machine vision vs computer vision comparison more tangible, let’s imagine the former as the body of a system. In this case, computer vision takes the form of the retina, optic nerve, brain, and central nervous system. Thus, a machine vision system uses a camera to view an image, whereas computer vision algorithms transform the media source into machine-readable code.

    Therefore, those two tech advancements aren’t mutually exclusive. Instead, they amplify each other’s capabilities and use Artificial Intelligence to address the same problem.

    Basic Principles of Machine Vision in AI

    In the 1980s and 1990s, the machine vision systems market began to expand. Back in the early days, inspection systems were in their infancy and successfully recognized only 65-70% of the items. Today, a powerful combination of machine vision software and hardware works together to enable complex systems. Let’s have a look at the typical components of this procedure:

    • Sensors or cameras
    • Lens
    • Lighting systems
    • Software and computer to process the images
    • Algorithms to identify the patterns
    • Physical inputs/outputs as the means of communication

    In short, the process of AI-based machine vision goes through the following steps:

    • The lens captures the image and transmits it to the sensor as a light projection.
    • The system performs automated processing and analysis of the image.
    • The system decides what to do with the image next.
    • Then, the results are communicated through physical outputs.

    Now that we’ve touched upon the main steps of this process, let’s dwell on how these systems work in more detail.

    Machine Vision Systems: Step-by-Step

    Here’s a breakdown of how machine vision AI works in an industrial workflow.

    Step 1. It all starts with the lens capturing the image and transmitting it to the sensor as a light projection. To optimize the vision system, the video camera must have a powerful zoom lens. Although there are many types of lenses, machine vision companies typically use lenses with a fixed focal length for ease of control.

    Step 2. The camera sensor turns the light into a digital image, which is then transferred to the processor for further analysis.

    Step 3. Then, the software tools come on stage. Some of them are used before image analysis (preprocessing); others determine the given properties of the object under study. At the preprocessing stage, effects can be applied to the image to sharpen the edges, increase the contrast or fill in gaps. This is done to improve the capabilities of the other software tools, which come into use later.

    Step 4. After that, computer vision models analyze the digital images against a defined set of criteria to identify defects, sort products, or complete a variety of other tasks.

    Step 5. Finally, the loop closes with an I/O signal or data that is forwarded to a logging device.

    As you see, it is quite a straightforward process, yet it performs mundane tasks more quickly and efficiently than humans ever could.
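
    For a rough sense of how steps 2 through 5 could map to code, here is a minimal sketch using OpenCV, one common library (the article names no specific tooling). The file name, blur kernel, thresholds, and contour-area criterion are all invented for illustration:

    ```python
    # Minimal, illustrative inspection pipeline (pip install opencv-python).
    import cv2

    # Step 2: read the digitized frame (from disk here, instead of a live sensor).
    image = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name
    if image is None:
        raise SystemExit("part.png not found")

    # Step 3: preprocessing - reduce noise, then emphasize edges.
    blurred = cv2.GaussianBlur(image, (5, 5), 0)
    edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

    # Step 4: analysis - find contours and flag any outside an expected size range.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    defective = any(not (1_000 < cv2.contourArea(c) < 50_000) for c in contours)

    # Step 5: in a real system the pass/fail result would drive a physical output
    # (e.g. a reject arm via a PLC); here we just print it.
    print("REJECT" if defective else "PASS")
    ```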

    Why Use Machine Vision Software For Business?

    Since the 1970s, machine vision has grown from a theoretical viewpoint into the lifeblood of manufacturing, with a variety of industrial applications today. Machine vision technology allows computers to identify people, places, and objects in images with an accuracy that is comparable to or even greater than human ability.

    From a business standpoint, inspection systems, often based on deep learning models, automate the extraction, analysis, and classification of useful data contained in a digital image or series of images.

    Nowadays, any business producing items to a particular standard can avail itself of vision capabilities as part of its manufacturing process.

    In particular, the differentiators of machine vision software include:

    • Optimized quality, through thorough inspection of the products
    • Minimized production waste, by identifying errors early in the process
    • Maximized throughput, by identifying inefficiencies in the process and enabling organizations to reach their full performance potential
    • Minimized part damage, by removing physical contact between the test system and the parts being tested
    • Improved safety, by eliminating the human factor during the manufacturing process

    Now let’s move on from mere rhetoric to how machine vision AI is implemented in various industries.

    Machine Vision AI: Popular Use Cases

    Although the machine vision process is most often associated with manufacturing, the reach of this approach has transcended industrial applications. Today, this technology spans a gamut of fields, reaching into our daily lives. Below, we’re sharing some of the most common vision trends in various industries.

    Manufacturing

    As we’ve mentioned, manufacturing is the major task generator for inspection systems. Hence, this field incorporates the largest number of applications and makes the largest contribution to inspection systems.

    The complexity of machine vision AI varies from Industry 4.0 to a small sawmill with input wood volume control.

    Below, you’ll find the most popular applications of vision system cameras:

    • Production lines (barcode scanning, quality inspection)
    • Robotics (vision-guided robots)
    • 3D scanning (three-dimensional inspection and measurement of complex 3D free formed surfaces)
    • Logistics (including track trace)
    • Product yield inspection (automated quality control)
    • Barcode scanning (sorting, measuring, identifying, and delivering items)

    Traffic Management

    Currently, traffic engineers can benefit from these innovations to optimize and enhance traffic management. In particular, machine vision is applied to the following areas:

    • Traffic monitoring
    • ITS projects
    • Infrastructure condition monitoring (asphalt, railroad, overhead contact line scanning)

    Thanks to the reliability of the hardware, engineers can execute these tasks seamlessly, around the clock.

    Entertainment and Film Industry

    This AI-enabled technology has also been revamping the visual media and entertainment sectors. VR/AR content, volumetric capture, augmented reality vision, and much more owe their inception to machine vision cameras.

    These cameras can synchronize with each other and see not only in color but also in the infrared range.

    Science

    A growing number of science centers, universities, SCBs, and schools use machine vision for scientific and educational purposes. The market is inundated with cameras of various characteristics and resolutions.

    Cameras with the following characteristics enable scientists to target a wide range of problems:

    • High-speed cameras
    • High-resolution cameras
    • Water and temperature resistant cameras
    • Monochrome cameras with 12-bit color depth (versus the usual 8-bit with up to 256 color gradations)
    • High spectrum sensitivity, etc.

    Sports

    Among others, machine vision companies also supply AI equipment for sports performance analysis. Photo-finish and track-and-trace systems have found broad implementation in the sporting field.

    Thus, in 2017 Wimbledon partnered with IBM to capture data surrounding performance and drive improvement.

    Final Word

    We have stepped into a wonderful era of tech breakthroughs. An era when computers can mimic human language and machines have gained the ability to identify and recognize a variety of objects. Vision technologies, in particular, have fast-forwarded the AI revolution and redefined daily business processes.

    Today, vision technology spans lots of industries and fields, including manufacturing, science, logistics, and others. Global businesses can leverage tech advancement to reduce defects, drive down costs, and improve on-site safety.

    Author: Tatsiana Isakova

    Source: InData Labs

  • What to expect for data governance in 2020?

    What to expect for data governance in 2020?

    Data governance has always been a complicated issue for most organizations. That won’t change in a big way in 2020. In fact, the increasing prevalence of technologies like artificial intelligence (AI) and machine learning (ML) may expose the pain points even more. Don’t take that to mean that companies aren’t becoming more mature in their approach to Data Governance, though.

    AI, ML, the Internet of Things (IoT), and full process digitization will be a focus for organizations in 2020. Companies see them as required capabilities in the future and so are willing to invest in more digital innovation. 'This is expanding the governance lens and I’m seeing AI Governance becoming a reality in leading organizations', said Kelle O’Neal, founder and CEO of First San Francisco Partners. This trend shows that companies are seeing value in Data Governance so they’re extending successful practices into other areas of their business, she said.

    Organizations are realizing that AI is only successful when built upon a solid data foundation, thus driving the need for data governance, agreed Donna Burbank, managing director at Global Data Strategy:

    'I’ve had venture capital organizations approach us to train their AI startups in the foundations of data governance as a condition for investment', she said. 'I see that as an extremely positive sign pointing to the widespread need and adoption of data governance principles'.

    And yet poor data quality resulting from problems with data governance bedevils AI and ML outcomes, and there’s no sign that will change next year.

    'Artificial intelligence and machine learning have been way oversold. Data quality gets in the way of getting good results and organizations spend way, way more time cleaning things up', said Thomas C. Redman, Ph.D., 'the Data Doc' and President of Data Quality Solutions. He estimates that more than 80% of AI and ML programs continue to fail because of this.

    Governance defined… yet?

    One question that many companies will continue to grapple with in the new year is figuring out just what data governance is. In simple terms, said Redman, it’s a senior oversight function whose leaders advise the board or senior management about whether a data-related program is designed in the best interest of the company and is operating as designed. And as he sees it, no one is doing that yet.

    'There’s all talk about data as the most important asset, but having that oversight level would be essential if that statement were correct', he said. It’s not about plugging in various tools but about thinking of just what data governance is … and what it isn’t:

    'The term ‘governance’ is being used for everything from moving data from here to there to something about how you operate analytics. That’s not the proper use of the term'.

    Getting roles and responsibilities right is critical, he said. Data governance should be business-led and IT supported, Burbank remarked: 

    'All areas of the business need to have accountability for the data in their domain and establishing data stewardship roles is critical to ensuring accountability at all levels of the organization from strategic to tactical'.

    Chief Data Officer (CDO) roles are becoming more common, and the office of the CDO does best when it reports up through a business function like operations, strategy, or shared services, said O’Neal, or even finance if that team is influential in driving enterprise programs that result in corporate growth.

    Organizations that have matured their data governance practices will grow from a program culture to a data culture, which is one:

    'Where new employees start learning about data governance as part of their new-hire training, and data governance and management are part of the conversation at the board level', said O’Neal.

    What will data governance look like in 2020?

    It’s true that there haven’t been drastic changes in how far we’ve come with data governance over the past year, but O’Neal finds that companies are showing progress:

    'More and more companies are moving from ‘what is data governance and why should I do it,’ past creating a strategy, into not just implementation but also operationalization, where their data governance is really embedded with other project, decision-making, and ‘business as usual’ operations', she said.

    In terms of a formal, structured approach, the DAMA DMBoK is gaining wide acceptance, which is a positive step in aligning best practices, Burbank said:

    'While data governance is certainly not a ‘cookie cutter’ approach that can be simply taken from a book, the DMBOK does offer a good foundation on which organizations can build and customize to align with their own unique organizational needs and culture'.

    In 2019, Global Data Strategy supported data governance for a diverse array of sectors, including social services, education, manufacturing, insurance, building, and construction. 'It’s no longer just the traditional sectors like finance who understand the value of data', she said.

    Big value in small wins

    It’s really hard to impose Data Governance frameworks on big data at enterprise scale. It is better to start with small data, and Redman is optimistic that more companies will do so in 2020.

    'Practically everyone sees the logic in small data projects', he said. 'Suppose that only half of a hundred small data projects succeed, that’s a huge number of wins', with positive implications for cost savings and improvements in areas like customer service. And solving more of these leads to learning about what it takes to solve big data problems. 'If you build the organizational muscle you need doing small data projects you can tackle big data projects'.

    Following the classic rule of thinking big and starting small in order to have the proper data governance framework and foundation in place is what works, Burbank said. Establishing small 'quick wins' shows continual value across the organization.

    Tools to help

    2018 saw astounding growth in the data catalog market, O’Neal said. Data catalogs provide information about each piece of data, such as location of entities and data lineage. So, if you haven’t thought about that yet, it’s time to do that this year, she said.
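
    As a rough illustration of what a single catalog entry might record (field names here are hypothetical, not taken from any particular catalog product):

    ```python
    # Hypothetical sketch of a data catalog entry; fields are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class CatalogEntry:
        name: str                  # business-friendly dataset name
        location: str              # where the data physically lives
        owner: str                 # accountable data steward
        description: str = ""      # plain-language definition for business users
        lineage: list[str] = field(default_factory=list)  # upstream sources

    entry = CatalogEntry(
        name="customer_orders",
        location="warehouse.sales.orders",
        owner="sales-ops",
        description="One row per confirmed customer order.",
        lineage=["crm.raw_orders", "erp.invoices"],
    )
    print(entry.name, "<-", ", ".join(entry.lineage))
    ```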

    The good news is that the modern tools for Metadata Management and data cataloguing are much more user-friendly and approachable, according to Burbank:

    'Which is a great advancement for giving business users self-service capability and accountability for metadata and governance'.

    Redman noted that 'you can love your data governance tools, and I do too. But if you approach the problem wrong it doesn’t matter what tools you have'.

    What’s up next

    In 2020, the organizations that are able to get their own data governance in order will reach out to others in the industry to establish cross-organization data governance and data sharing agreements:

    'For example, organizations in the social services or medical arena are looking to provide cohesive support for individuals across organizations that provide the best level of service, while at the same time protecting privacy', Burbank said. 'It’s an interesting challenge, and an area of growth and opportunity in the data governance space'.

    There’s an opportunity this year for companies that are moderately mature in data governance to think about how to embed practices in the business processes and decision-making structure of the organization. Places to look for embedding opportunities, O’Neal commented, are new project initiation and project management, investment approval and funding, customer creation and on-boarding, product development and launch, and vendor management/procurement.

    Expect data analytics and BI to continue to be large drivers for data governance:

    'As more organizations want to become data-driven', Burbank said, 'they are realizing that the dashboards used to drive business decision-making must be well-governed and well-understood with full data lineage, metadata definitions, and so on'.

    Author: Jennifer Zaino

    Source: Dataversity

  • Why agile learning is essential to your business

    Why agile learning is essential to your business

    The digital deadlock is affecting many organizations today, big and small, and across all industries. Vast amounts of technology investments are being poured into the engines of aggressive digital strategies, but are delivering little in the way of progress. In fact, many are 'stuck in their journey'. What’s holding them back?

    IDC has looked into this very closely in the past few years and has found that the 'digital skills gap' (when demand for IT skills is not met with adequate supply) is a top obstacle for those organizations in their digital agenda. Interestingly, the challenge is not only in recruitment, but most crucially in bringing the current workforce up to speed with new skills. Employees are not learning fast enough.

    Our surveys show that the impact of the skills gap is broadly felt across the organization, from quality performance to customer satisfaction to business revenue growth. In fact, IDC estimates it will affect 90% of all European companies, resulting in $91 billion in lost revenue in 2020.

    The skills gap is now a board-level issue, and employers are determined to tackle the problem themselves by reskilling their own workforce. If colleges and professional schools are not providing an adequate supply of IT professionals, corporate training programs and internal mobility could fill the gap.

    This is a significant shift by employers in their training practices and policies. After a decade of austerity following the global financial crisis in 2008, they have now realized that learning means business.

    Is the workforce ready for the jobs of the future? Welcome to agile learning

    IDC believes agile learning is the way forward for any digital organization because it aligns skills and required training with business value and strategy. It is permanently evolving, keeping pace with new market needs and technology developments.

    From content and format to channels of delivery, agile learning is business relevant while driving a superior employee experience.

    Agile learning has the following common traits:

    1. Employee focused: Training needs to be applied to the task and woven into the flow of work (easily digestible). It ultimately has to help employees “get the job done” and achieve better performance (impactful). This could include the consumption of bite-size content (even in minutes), any time and by multiple channels, to fit work demands.
    2. Business relevant: Training cannot be decided unilaterally by the employee, manager, or HR. It has to be a cross-functional effort to ensure that career development goals and training needs are aligned with business requirements: the right materials, to the right employees, at the right time.
    3. AI/ML enabled: Training can be enhanced by intelligent technologies in multiple ways. AI/ML can help employees by providing career pathway recommendations; for employers, it can identify training that addresses the skills gap. In the not-so-distant future, intelligent technologies will be able to measure the impact of training on performance and business outcomes, helping to make it business relevant.

    Agile learning will be ingrained in our work culture moving forward, helping us to become more competent in our jobs (upskilling) or even to move into new jobs (reskilling). It can also prepare us for the new jobs of the future, those that have not even been created yet. In this respect, IDC expects micro-degrees to become increasingly popular.

    Micro-degrees can be useful to equip employees, reasonably quickly, for new jobs such as a flying car developer or an algorithm bias auditor. Developed in partnership with academia, industry, and employers, micro-degrees complement lectures with on-the-job training.

    Agile learning affects us all. As the retirement age rises, we should be able to expect significant mobility throughout our careers. Agile learning will be part of a lifelong learning work culture, mandated by the C-suite and instilled into the organization.

    To quote the World Economic Forum’s Future of Jobs Report 2018, 'By 2025, 75 million current jobs will be displaced by the shift in the division of labour between humans, machines and algorithms, but 133 million new jobs will be created as well'.

    Your current job might be one of those 75 million. Act now to enjoy the Future of Work with the other 133 million.

    Author: Angela Salmeron

    Source: IDC UK

  • Why it is key for leaders to infuse data and analytics into organizations

    Why it is key for leaders to infuse data and analytics into organizations

    Data and analytics are proving more important to companies’ strategies than ever before, according to a survey by Harvard Business Review Analytic Services. However, many organizations still fall short of achieving their analytics goals, owing to a skills gap and issues with data access and usage. That’s because almost 75% of organizations don’t have a leadership that completely backs a data-driven culture or nurtures and supports analytics-led innovation. Alarmingly, 24% of respondents say their company culture tends to limit access to information, and 20% think that organizational structure impedes use of analyzed data.

    That figure just hints at part of the problem: 19% of those surveyed blame a lack of leadership prioritization, and 11% say that the failure of previous data projects has led to disillusion and disinvestment in data and analytics. The end result of these combined issues is that 74% of companies experience leadership or cultural reluctance to use data and analytics to their fullest.

    Senior executives are failing to lead by example and embrace data and analytics. In turn, their teams have failed to adopt data-led approaches to their work, neglecting intelligence that could provide beneficial strategic insights and missing opportunities to drive growth, increase revenue, and evolve their businesses.

    We asked some of our experts what leaders should do to put this right. Read on to find out why execs of all kinds must be data evangelists.

    Execs driving evolution: Infusing data across the entire organization

    To maximize the benefits of data and analytics for any organization, our experts agree that business leaders must foster a cultural shift in their companies. Thinking differently and encouraging new habits throughout a business starts at the top.

    Achieving this requires them to develop a vision around being an intelligence-led company. Execs should support the education of colleagues and advocate an organizational culture that adopts the use of analytics as more than just a best practice. They need to adopt technological solutions that infuse analytics into every process, product, and service.

    “First and foremost, it’s an issue of C-suite leadership,” observes Guy Levy-Yurista, former Sisense Chief Strategy Officer. He explains: “Typically, they don’t concern themselves with data and analytics. It’s something they prefer to outsource to data specialists. This has to change if they want their businesses to survive. For an organization to become data-driven, the culture needs to change, and that change must be led by those at the top. The C-suite must embrace, and be seen to embrace, data and analytics. When the top leads, the rest will follow.”

    Envisioning a data-focused company

    Guy calls for companies to build a two- or three-year cohesive strategy that inculcates the use of data and analytics throughout the organization. He says, “Every company must have an embedded data strategy that takes into account the working practices and all of the data needs of every division.”

    This involves taking a fresh approach to data in order to get better results. 

    “Data-driven culture doesn’t mean ‘bring charts to meetings’ or ‘make decisions with numbers,’” explains Scott Castle, Sisense’s VP and General Manager, Internal Analytics. “It means implementing a hypothesis-driven culture. Identify theories, test them, rigorously seek to disprove them while rapidly implementing those that show promise. Make decisions with evidence. Don’t let your team search out favorable statistics. Encourage them to look at the complete data picture and come to conclusions based on the preponderance of the evidence.”
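
    To make the hypothesis-driven approach concrete, here is a minimal sketch of what 'rigorously seek to disprove' can look like in practice: a two-proportion z-test on a hypothetical A/B experiment. The figures, names, and choice of test are illustrative assumptions, not a method prescribed by Sisense.

    ```python
    # A minimal sketch of a hypothesis-driven decision: treat a proposed change
    # as a theory, run an experiment, and try to disprove it with a
    # two-proportion z-test. All figures below are hypothetical.
    import math

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Two-sided p-value for H0: the two conversion rates are equal."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF, computed via the
        # error function so no external libraries are needed.
        return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

    # Hypothetical experiment: control converts 480/10,000; variant 540/10,000.
    p_value = two_proportion_z_test(480, 10_000, 540, 10_000)
    print(f"p-value: {p_value:.4f}")  # ~0.054: promising, but not yet proven
    ```

    Note how the borderline result argues for gathering more data rather than a hasty rollout, which is exactly the 'preponderance of the evidence' discipline described above.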

    To this end, Charles Holive, Managing Director of Sisense’s Strategy Consulting Business, calls for the appointment of a chief data officer in every leadership team to be the main advocate for data-driven working practices, and he says they should have revenue targets. He concludes: “This is not an initiative. It’s a way forward, a mandatory muscle for all companies to develop by infusing analytics in everything they do internally and externally, to overall increase returns on investments for their companies and their customers.”

    Analytics success stories: Companies infusing analytics to win their industries

    Smarte Carte has done an excellent job of bringing all of its data together and putting it into the hands of its field team, so everyone is working with near real-time data from their mobile devices. This helps ensure better forecasting, reduces product/kiosk downtime, and ensures that its people have the answers they need when and where they need them.

    Another huge example of a company leading with data is Amazon: The Seattle tech giant is an extremely data-driven company, with customer service at its core. Amazon measures the effectiveness of almost everything it does, including innovation. Guy observes that Amazon gives employees license to innovate almost at will. (Indeed, it’s an unofficial motto around the “Everything Store” that employees should innovate their own jobs away.)

    It’s becoming increasingly important to measure innovation with data. Provided it can be measured, and provided the numbers show a benefit, an innovation becomes regular operating procedure. With this in mind, Guy recommends embracing innovation as a critical driver of success. 

    “Innovation can be inexact and inefficient, often by design,” says Guy. “So, any company needs to create a team and allocate a budget that’s dedicated to innovation and that has the latitude to examine data further, stretch parameters, and explore whether there are new possibilities out there.”

    Pressure drives evolution: Companies transforming under COVID-19

    The coronavirus pandemic forced companies to think differently and improve their agility. Innovation became critical for nearly every business. Scott describes it as “a perfect example of a sudden market change that required every business to reconsider its fundamental assumptions.”

    Some companies, like those in air travel and hospitality, saw demand for their products almost completely disappear overnight, and others, like Zoom and grocery stores, saw it scale unexpectedly. They all required quick responses to adjust to, capitalize upon, or even simply survive in the new reality. Recall the supply chain problems that paralyzed supermarkets in April 2020: Organizations faced with totally new market dynamics needed to test new hypotheses and run experiments quickly — and those that did, using data and analytics, survived and thrived. 

    The pandemic brought a new focus to data and its timeliness because conditions could change daily; a dashboard being updated once a week or month was no longer acceptable. The customer experience changed as well, almost overnight, and will only continue to evolve.

    Taking a macro view, Charles observes that, “Many markets got to be reset through the pandemic … giving an opportunity to large, small, and new players to reinvent themselves and tackle the market from a redefined environment. It’s been surprising to me to see how fast companies, doing it through the value and data-driven approach, went on to win more.”

    Lead with change, or change leaders

    The consensus is clear — organizations can’t stand still. To flourish, they must be led using actionable intelligence, derived from data and analytics. They must infuse analytics into their practices, their products, their services, and even alter their organization’s DNA if necessary, to become modern businesses.

    To do that, they need their leaders to become evangelists for analytics. They must expedite the infusion of analytics everywhere and enable everyone to use actionable intelligence. The choice is stark for the leaders of every business: Do this for your organization to survive and thrive, or die.

    Author: Adam Murray

    Source: Sisense

  • Why startups are often cautious about investing in big data

    Why startups are often cautious about investing in big data

    In recent years, the term Big Data has become the talk of the town, or should we say, the planet. By definition, big data analytics is the complex process of analyzing large volumes of data to uncover hidden information: common patterns, unusual relationships, market trends, and, above all, customer preferences. These insights are taken into careful consideration and big decisions are made on the strength of them, with high hopes of success.

    When described as such, it seems that the average entrepreneur would jump at the opportunity to use big data for their startup, yet that is not as common as one might expect. Why? 

    We’re going to offer several possible explanations for why startup owners are not keen on investing in Big Data.

    Obsolete mindsets vs. infinite mindsets

    According to Dr. Tom Davenport, a renowned academic researcher, entrepreneurs are not analytical by nature, and they can therefore overlook the importance of data analytics. Dr. Davenport argues that most startup leaders rely on good old gut feeling when making decisions. He adds that when attitudes toward Big Data do change, it is most often because of the successful examples of companies using analytics to their advantage, such as Amazon, Netflix, Facebook, or Google. Another reason is that a startup’s niche may make analytics unavoidable for any company operating in it.

    On this view, the reluctance of some startup owners toward data analytics is simply part of their character. While it is true that quite a number of entrepreneurs are used to the “old ways”, we’d argue that a self-aware, modern entrepreneur with an infinite mindset will consider the advantages of Big Data at the very moment they found the startup, or even before.

    On the other hand, a startup leader who studies 21st-century leadership is likely well aware of Big Data’s downsides too. Some entrepreneurs avoid it precisely because they are well informed.

    Overwhelming for the inexperienced

    Another reason why a startup company may be missing out on valuable information about consumer behavior is that not every startup has skilled staff.

    A startup, as the name says, is a company at the beginning of its exciting journey. Thus, a number of positions in the company have yet to be filled by professionals. Data analytics can overwhelm researchers and analysts alike, not least because the underlying data is often unstructured.

    Pricey 

    Of course, there is always the option of hiring a professional to do detailed analytics, but that requires extra funds. Even open-source technology carries heavy costs in staffing, maintenance, hardware, and so on. Going over budget is common in data analytics, and the companies providing data services are pressured into charging more as a result.

    Doing it on your own also comes down to big costs. First, there’s the software tool you’ll be using. Then, all that data needs to be properly stored, which is another expense. The workforce managing this also costs money. 
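
    To see how quickly these components add up, here is a purely illustrative back-of-envelope calculation; every figure is a made-up placeholder to be replaced with real quotes.

    ```python
    # Back-of-envelope cost sketch for an in-house analytics setup.
    # Every number is a hypothetical placeholder, not a market price.
    MONTHLY_COSTS_USD = {
        "analytics_software_licenses": 1_500,   # assumed SaaS tier
        "data_storage_and_compute": 2_000,      # assumed cloud bill
        "analyst_salaries": 16_000,             # e.g., two analysts (assumed)
    }

    total = sum(MONTHLY_COSTS_USD.values())
    print(f"Illustrative monthly total: ${total:,}")       # $19,500
    print(f"Illustrative annual total: ${total * 12:,}")   # $234,000
    ```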

    For new startup companies, this may be too much to handle. Hence, the benefits of Big Data are sometimes not overlooked, but simply set aside for better days.

    Privacy-invading

    Big Data cuts both ways: it provides insight into others’ data, but it can leave a company exposed too. Startups are especially vulnerable unless they establish strict cybersecurity policies and training from the very beginning; otherwise, their own data may be manipulated.

    Even worse, should the client’s privacy be invaded, that would automatically harm the company’s reputation, potentially leading to fines and lawsuits. A data breach can have grave consequences for the startup.

    Time management issues

    Moreover, it takes time to discover the customer and market trends relevant to a particular company, so analytics rarely delivers quick, immediately usable results. For a startup, fast progress is essential. Why risk data collection outlasting the startup itself?

    Next, technology changes at an extremely rapid pace; software bought today may become obsolete within months. A startup could therefore wait even longer for results, and the final results could be wrong or incomplete.

    The bottom line

    Startups must be risk-taking and careful at the same time. Big Data services are appealing and popular for a reason: they help companies overtake the competition and earn higher revenues. Nevertheless, it may be wise not to invest too heavily in data analytics until a startup gains firm ground.

    Author: Milos Mudric

    Source: SmartDataCollective

  • Why we must work together to gain safety and trust in the digital identity age 

    Why we must work together to gain safety and trust in the digital identity age 

    As consumers across the globe become increasingly aware of their digital identity and personal data rights and further regulations take hold, it’s unsurprising that Google has announced it will not be replacing third-party cookies with identifiers and email addresses.

    Advertisers now need to look for new ways to engage valuable customers on a one-to-one basis. Digital targeting and measurement strategies that the industry has grown up around will need to be rebuilt for a privacy-first world.

    This is both a challenge and an opportunity for the industry – to champion privacy while finding new and innovative ways to provide marketers and consumers with relevant, targeted ad experiences. The industry needs to determine the best path forward and partner to develop strategic identity solutions, enabling publishers to maximize the value of their first-party data, help advertisers meet their business goals, and build consumer trust in digital advertising.

    A new vision for a new digital identity ecosystem

    Collaboration between partners within the digital identity and advertising ecosystem is now more important than ever. If advertisers want to increase the effectiveness of their campaigns across the whole of the Internet, they need to work with partners who can join up these conversations without operating a walled garden. Greater collaboration is also vital for local premium publishers to continue developing creative, engaging content for consumers, which is the foundation of their ongoing success.

    The central principle for navigating this changing landscape is that the digital advertising industry must decide, consistently, where it is going with respect to identity: how it will handle identity in the face of the death of third-party cookies, the rise in regulation, and the evolving ways advertising is bought and sold today.

    Increasing regulation around data privacy – such as the GDPR in Europe – has been one of the biggest drivers of change in our industry. Advertisers will therefore want to work with companies that adhere to data regulations and encourage transparency within the supply chain. On top of that, many brands will need to feel a sense of ‘safety through familiarity.’ When discussing compliance, it helps to work with a partner facing similar challenges, protocols, and internal processes. For example, a bank or a telecommunications company will want partners that can demonstrate their security frameworks meet the country’s data privacy standards, as well as its own privacy standards.

    Cookies have been relied on for a very long time, yet over the past year or two we’ve seen brilliant performance from solutions that don’t rely on them. As things stand, however, there is no silver bullet for identity and no single solution, and the problem won’t be solved for some time. What’s needed now is a deliberate, multi-pronged approach. While first-party data goes some way toward this, brands can achieve market-leading performance and competitive advantage even by using strong, innovative contextual solutions. It’s important for brands not to stand still at this point; testing innovative new solutions means you’ll be well equipped to deal with what comes next.

    Adopting new models to meet changing needs

    For publishers, this means that they need to look at how they can use their proprietary assets to evolve their business models and package and sell their inventory in a way that best meets the needs of the buyer in our rapidly changing digital advertising landscape.

    Developing different ways to generate and acquire authenticated first-party data will be one key area of focus for publishers. Many are already doing that as they look to build out subscriber bases. This means that if a person uses their email address every time they visit a site, the publisher can use it as a persistent identifier. From here, they can start to build a profile of that user and what their interests are. By better understanding individual users, publishers’ inventory becomes more valuable to advertising partners, as they can effectively target specific audience profiles. 
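
    As an illustration, one common pattern for turning a signed-in email into a persistent identifier is to normalize the address and hash it. The sketch below assumes that pattern; the normalization rules and function names are illustrative, since each identity framework defines its own.

    ```python
    # Illustrative sketch: derive a stable, pseudonymous identifier from a
    # signed-in email. Normalize-then-hash is a common pattern; real identity
    # frameworks each define their own normalization and hashing rules.
    import hashlib

    def email_to_identifier(email: str) -> str:
        """Normalize an email address and return a stable SHA-256 identifier."""
        normalized = email.strip().lower()  # trim whitespace, lowercase
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    # The same user yields the same ID on every visit, so the publisher can
    # attach interest data to it without storing the raw address.
    assert email_to_identifier(" Reader@Example.com") == email_to_identifier("reader@example.com")
    print(email_to_identifier("reader@example.com")[:16], "...")
    ```

    In practice, publishers would also salt or encrypt such identifiers and honor consent signals before sharing them with partners; this sketch omits those safeguards.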

    Publishers can also use their assets in alternative ways, such as building up contextual solutions. The ability to build contextual profiles has advanced greatly since the early days of simply placing adverts for mortgages in financial publications. Today there is much more accurate contextual information about specific articles, and publishers should be utilizing it. You can even use contextual solutions to match the sentiment of a piece: for example, a brand selling retro cameras can target context that evokes a feeling of nostalgia. 
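
    As a toy illustration of the principle, the sketch below scores an article against ad categories by keyword overlap. Real contextual engines use far richer NLP and sentiment models; the categories and keyword lists here are invented.

    ```python
    # Toy contextual matcher: pick the ad category whose keywords overlap
    # the article text the most. Categories and keywords are invented.
    import re

    AD_CATEGORIES = {
        "retro_cameras": {"nostalgia", "film", "vintage", "analog", "memories"},
        "mortgages": {"mortgage", "interest", "rates", "housing", "loan"},
    }

    def best_category(article_text: str) -> str:
        """Return the category with the largest keyword overlap."""
        words = set(re.findall(r"[a-z]+", article_text.lower()))
        scores = {cat: len(words & kws) for cat, kws in AD_CATEGORIES.items()}
        return max(scores, key=scores.get)

    article = "A vintage film photo essay, full of analog memories and nostalgia."
    print(best_category(article))  # -> retro_cameras
    ```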

    In the future, publishers will need to consider device-based advertising. If we consider the devices that will support advertising or do already support advertising, very little of that is cookie-based anyway. A raft of different devices will come into play here, such as smart speakers, CTV, and even wearable tech. None of this will be dependent on a cookie, so there needs to be continued investment in exploring these areas and the new audiences they offer. 

    With the right data protection and privacy controls in place, and the right partners on board, it remains possible to give consumers real choice and insight into the value exchange of advertising and content. By these means, we can also ensure that publishers and marketers achieve the outcomes they need. At this point, the worst thing you can do is stand still and wait for something to happen around you. Your audience is still there online, so it’s important to take all the steps necessary to continue connecting with them.

    Author: Karan Singh

    Source: Dataconomy
