3 items tagged "Data quality"

  • An overview of Morgan Stanley's surge toward data quality

    Jeff McMillan, chief analytics and data officer at Morgan Stanley, has long worried about the risks of relying solely on data. If the data put into an institution's system is inaccurate or out of date, the system will give customers the wrong advice. At a firm like Morgan Stanley, that just isn't an option.

    As a result, Morgan Stanley has been overhauling its approach to data. Chief among its goals is improving data quality in core business processing.

    “The acceleration of data volume and the opportunity this data presents for efficiency and product innovation is expanding dramatically,” said Gerard Hester, head of the bank’s data center of excellence. “We want to be sure we are ahead of the game.”

    The data center of excellence was established in 2018. Hester describes it as a hub with spokes out to all parts of the organization, including equities, fixed income, research, banking, investment management, wealth management, legal, compliance, risk, finance and operations. Each division has its own data requirements.

    “Being able to pull all this data together across the firm we think will help Morgan Stanley’s franchise internally as well as the product we can offer to our clients,” Hester said.

    The firm hopes that improved data quality will let the bank build higher-quality artificial intelligence and machine learning tools to deliver insights and guide business decisions. One product expected to benefit from this is the 'next best action' tool the bank developed for its financial advisers.

    This next best action uses machine learning and predictive analytics to analyze research reports and market data, identify investment possibilities, and match them to individual clients’ preferences. Financial advisers can choose to use the next best action’s suggestions or not.
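    The matching step described above can be illustrated with a minimal sketch. All names, data and scoring rules here are hypothetical placeholders, not Morgan Stanley's actual system: the idea is simply to score each investment idea against a client's stated preferences and surface the top matches, which the adviser can accept or ignore.

```python
from dataclasses import dataclass

@dataclass
class Idea:
    ticker: str
    sector: str
    risk: int  # 1 (low) .. 5 (high)

@dataclass
class Client:
    name: str
    preferred_sectors: set
    max_risk: int

def score(idea: Idea, client: Client) -> float:
    """Toy relevance score: reward a sector match, penalize excess risk."""
    s = 1.0 if idea.sector in client.preferred_sectors else 0.0
    s -= max(0, idea.risk - client.max_risk) * 0.5
    return s

def next_best_actions(ideas, client, top_n=2):
    """Rank ideas by score and keep only positively scored ones."""
    ranked = sorted(ideas, key=lambda i: score(i, client), reverse=True)
    return [i.ticker for i in ranked if score(i, client) > 0][:top_n]

ideas = [Idea("ACME", "tech", 4), Idea("UTIL", "utilities", 1), Idea("BIO", "biotech", 5)]
client = Client("A. Jones", {"tech", "utilities"}, max_risk=3)
print(next_best_actions(ideas, client))  # → ['UTIL', 'ACME']
```

    A production system would of course learn such scores from data rather than hard-code them; the sketch only shows why data quality matters here: a wrong sector or risk label in any input record directly changes what gets recommended.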

    Another tool that could benefit from better data is an internal virtual assistant called 'ask research'. Ask research provides quick answers to routine questions like, “What’s Google’s earnings per share?” or “Send me your latest model for Google.” This technology is currently being tested in several departments, including wealth management.
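    A virtual assistant of this kind can be sketched as intent matching over a knowledge base. Everything below is a hypothetical toy (the metric value is a placeholder, and the real 'ask research' presumably queries the firm's research databases), but it shows the routing idea:

```python
import re

# Hypothetical mini knowledge base with a placeholder value.
METRICS = {("GOOGL", "eps"): 5.80}

def ask_research(question: str) -> str:
    """Route a routine question to a metric lookup (toy intent matching)."""
    q = question.lower()
    mentions_google = re.search(r"googl?e?", q) is not None
    if mentions_google and ("earnings per share" in q or "eps" in q):
        return f"GOOGL EPS: {METRICS[('GOOGL', 'eps')]}"
    return "Sorry, this sketch only answers EPS questions."

print(ask_research("What's Google's earnings per share?"))
```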

    New data strategy

    Better data quality is just one of the goals of the revamp. Another is to have tighter control and oversight over where and how data is being used, and to ensure the right data is being used to deliver new products to clients.

    To make this happen, the bank recently created a new data strategy with three pillars. The first is working with each business area to understand its data issues and begin to address them.

    “We have made significant progress in the last nine months working with a number of our businesses, specifically our equities business,” Hester said.

    The second pillar is tools and innovation that improve data access and security. The third pillar is an identity framework.

    At the end of February, the bank hired Liezel McCord to oversee data policy within the new strategy. Until recently, McCord was an external consultant helping Morgan Stanley with its Brexit strategy. One of McCord’s responsibilities will be to improve data ownership, to hold data owners accountable when the data they create is wrong and to give them credit when it’s right.

    “It’s incredibly important that we have clear ownership of the data,” Hester said. “Imagine you’re joining lots of pieces of data. If the quality isn’t high for one of those sources of data, that could undermine the work you’re trying to do.”

    Data owners will be held accountable for the accuracy, security and quality of the data they contribute, and for making sure that any issues are addressed.

    Trend of data quality projects

    Arindam Choudhury, the banking and capital markets leader at Capgemini, said many banks are refocusing on data as it gets distributed in new applications.

    Some are driven by regulatory concerns, he said. For example, the Basel Committee on Banking Supervision's standard number 239 (principles for effective risk data aggregation and risk reporting) is pushing some institutions to make data management changes.

    “In the first go-round, people complied with it, but as point-to-point interfaces and applications, which was not very cost effective,” Choudhury said. “So now people are looking at moving to the cloud or a data lake, they’re looking at a more rationalized way and a more cost-effective way of implementing those principles.”

    Another trend pushing banks to get their data house in order is competition from fintechs.

    “One challenge that almost every financial services organization has today is they’re being disintermediated by a lot of the fintechs, so they’re looking at assets that can be used to either partner with these fintechs or protect or even grow their business,” Choudhury said. “So they’re taking a closer look at the data access they have. Organizations are starting to look at data as a strategic asset and try to find ways to monetize it.”

    A third driver is the desire for better analytics and reports.

    "There’s a strong trend toward centralizing and figuring out, where does this data come from, what is the provenance of this data, who touched it, what kinds of rules did we apply to it?” Choudhury said. That, he said, could lead to explainable, valid and trustworthy AI.

    Author: Penny Crosman

    Source: Information-management

  • Business Intelligence Trends for 2017

    Analyst and consulting firm Business Application Research Centre (BARC) has come out with the top BI trends based on a survey of 2,800 BI professionals. Compared to last year, there were no significant changes in the ranking of the importance of BI trends, indicating that no major market shifts or disruptions are expected to impact this sector.
     
    With the growing advancement and disruption in IT, eight meta trends influence the strategies, investments and operations of enterprises worldwide: digitalization, consumerization, agility, security, analytics, cloud, mobile and artificial intelligence. All of these meta trends are major drivers of the growing demand for data management, business intelligence and analytics (BI), and their growth sets the direction for this industry. The top three of the 21 trends for 2017 were:
    • Data discovery and visualization,
    • Self-service BI and
    • Data quality and master data management
    Data labs and data science, cloud BI and data as a product were the least important trends for 2017.
    Data discovery and visualization, along with predictive analytics, are some of the most desired BI functions that users want in a self-service mode. But the report suggested that organizations should also have an underlying tool and data governance framework to ensure control over data.
     
    In 2016, BI was mostly used in the finance department, followed by management and sales, and usage rates in these areas varied only slightly over the last three years. But there was a surge in BI usage in production and operations departments, which grew from 20% in 2008 to 53% in 2016.
     
    "While BI has always been strong in sales and finance, production and operations departments have traditionally been more cautious about adopting it,” says Carsten Bange, CEO of BARC. “But with the general trend for using data to support decision-making, this has all changed. Technology for areas such as event processing and real-time data integration and visualization has become more widely available in recent years. Also, the wave of big data from the Internet of Things and the Industrial Internet has increased awareness and demand for analytics, and will likely continue to drive further BI usage in production and operations."
     
    Customer analysis was the #1 investment area for new BI projects, with 40% of respondents investing their BI budgets in customer behavior analysis and 32% in developing a unified view of customers.
    • “With areas such as accounting and finance more or less under control, companies are moving to other areas of the enterprise, in particular to gain a better understanding of customer, market and competitive dynamics,” said Carsten Bange.
    • Many BI trends of the past have become critical BI components in the present.
    • Many organizations were also considering trends like collaboration and sensor data analysis as critical BI components. About 20% of respondents were already using BI trends like collaboration and spatial/location analysis.
    • About 12% were using cloud BI and more were planning to employ it in the future. IBM's Watson and Salesforce's Einstein are gearing up to meet this growth.
    • Only 10% of the respondents used social media analysis.
    • Sensor data analysis is also growing, driven by the huge volumes of data generated by the millions of IoT devices used by the telecom, utilities and transportation industries. According to the survey, the transport and telecoms industries would lead in leveraging sensor data in 2017.
    The biggest new investments in BI are planned in the manufacturing and utilities industries in 2017.
     
    Source: readitquick.com, November 14, 2016
  • Machine learning, AI, and the increasing attention for data quality

    Data quality has been going through a renaissance recently.

    As a growing number of organizations increase efforts to transition computing infrastructure to the cloud and invest in cutting-edge machine learning and AI initiatives, they are finding that the main barrier to success is the quality of their data.

    The old saying “garbage in, garbage out” has never been more relevant. With the speed and scale of today’s analytics workloads and the businesses that they support, the costs associated with poor data quality are also higher than ever.

    This is reflected in a massive uptick in media coverage of the topic. Over the past few months, data quality has been the focus of feature articles in The Wall Street Journal, Forbes, Harvard Business Review, MIT Sloan Management Review and others. The common theme is that the success of machine learning and AI is completely dependent on data quality. A quote from Thomas Redman summarizes this dependency well: "If your data is bad, your machine learning tools are useless."

    The development of new approaches towards data quality

    The need to accelerate data quality assessment, remediation and monitoring has never been more critical for organizations and they are finding that the traditional approaches to data quality don’t provide the speed, scale and agility required by today’s businesses.

    For this reason, data preparation vendor Trifacta recently announced an expansion into data quality, unveiling two major new platform capabilities: active profiling and smart cleaning. This is the first time Trifacta has expanded its focus beyond data preparation. By adding new data quality functionality, the business aims to handle a wider set of data management tasks as part of a modern DataOps platform.

    Legacy approaches to data quality involve many manual, disparate activities as part of a broader process. Dedicated data quality teams, often disconnected from the business context of the data they are working with, manage the process of profiling, fixing and continually monitoring data quality in operational workflows. Each step must be managed in a completely separate interface. It’s hard to iteratively move back-and-forth between steps such as profiling and remediation. Worst of all, the individuals doing the work of managing data quality often don’t have the appropriate context for the data to make informed decisions when business rules change or new situations arise.

    Trifacta uses interactive visualizations and machine intelligence to guide users, highlighting data quality issues and providing intelligent suggestions on how to address them. Profiling, user interaction, intelligent suggestions and guided decision-making are all interconnected, each driving the others. Users can seamlessly transition back and forth between steps to ensure their work is correct. This guided approach lowers the barriers to users and helps to democratize the work beyond siloed data quality teams, allowing those with the business context to own and deliver quality outputs with greater efficiency to downstream analytics initiatives.
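    The interplay of profiling and suggestion can be illustrated generically. The following is not Trifacta's API, just a hedged sketch of the idea: profile each column, flag quality issues, and emit suggested fixes for a user to accept or reject.

```python
def profile(rows):
    """Per-column profile: null count and the set of observed value types."""
    report = {}
    for col in rows[0].keys():
        values = [r.get(col) for r in rows]
        nulls = sum(v is None or v == "" for v in values)
        types = {type(v).__name__ for v in values if v not in (None, "")}
        report[col] = {"nulls": nulls, "types": sorted(types)}
    return report

def suggest(report):
    """Naive rule-based suggestions, standing in for the 'smart' part."""
    tips = []
    for col, stats in report.items():
        if stats["nulls"]:
            tips.append(f"{col}: fill or drop {stats['nulls']} missing value(s)")
        if len(stats["types"]) > 1:
            tips.append(f"{col}: mixed types {stats['types']}, consider casting")
    return tips

rows = [{"id": 1, "amount": "9.99"}, {"id": 2, "amount": 12.5}, {"id": 3, "amount": ""}]
for tip in suggest(profile(rows)):
    print(tip)
```

    The point of coupling the two steps in one loop, as described above, is that each accepted suggestion changes the profile, which in turn changes the next round of suggestions.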

    New data platform capabilities like this are only a first (albeit significant) step into data quality. Keep your eyes open and expect more developments towards data quality in the near future!

    Author: Will Davis

    Source: Trifacta
