Joining up the information disconnect

Today's business executives at large enterprises have a ravenous appetite for information about their business performance, but are frustrated by the inability of the IT infrastructure to keep up with their changing demands.

Part of the cause is the habit of some IT departments of applying a straitjacket of formal processes to all applications, with careful separation of the chain of stages from specification to design to coding to testing to production (so-called waterfall methodologies, where each link in the chain of project stages is signed off before proceeding to the next). Of course, such a cautious approach makes perfect sense for large transaction processing systems, where the generic processes remain constant, upgrading to a new package release is in itself a major project, and the consequences of a failure are dramatic. Such systems automate well-defined business processes (or else they would not be packages) which by their nature do not change often; take finance, for example, where double-entry bookkeeping was invented by the Venetians.

However, such rigidity can be a problem for business intelligence applications, where the business needs are much more fluid and where this month's burning issue for management is entirely different from last month's. The consequence has been frustration on the part of businesspeople who have invested millions in business intelligence applications, or in the data warehouses that feed these applications with business information, only to find that these systems cannot quickly adapt to new requirements. IT departments can hardly be blamed for being cautious, since most data warehouses (databases that store a copy of the company's data for the purposes of analysis) cannot exactly turn on a dime when it comes to major changes in requirements. This is due to their rigid structures, whose initial build and development is usually based on systematic yet time-consuming waterfall project methodologies. Designing and building the data warehouse against a formal set of requirements, carefully defined at the start of the project, is an inevitably slow process.
Traditional data warehouses are typically built in 12-18 months, yet the business can change its requirements several times within this time frame. The rewards for those customers who boldly embrace an iterative or evolutionary approach to data warehousing can be substantial. This alternative approach is based on a flexible data warehouse structure that is created and updated through data warehouse life cycle management (DWLM) software. This data warehouse automation software has enabled companies to deploy the first phase of their data warehouse in just 30 days, then add further feeds and new reports incrementally. At Intelsat, this application has had huge business success, identifying significant additional saleable network capacity through greatly improved matching of capacity to forward contracts. A later phase of the same project improved billing capacity through better analysis of customer receivables processes. Finally, improved system capacity information led to a tangible improvement in free cash flow for Intelsat. One key to success was the iterative approach taken to the project, which allowed new feeds to be added and new reports to be produced incrementally, taking advantage of the flexibility of the deployed data warehouse technology to deal rapidly with change. In another example, Cadbury Schweppes fully integrated management information for its acquisition Orangina within just two months. This allowed the company to gauge Orangina's business performance within the standard Cadbury Schweppes formats and also allowed the acquired company to halve its billing cycle. Again, the key was a rapid, iterative development approach in conjunction with a flexible data warehouse technology. In a similar example, Shell was able to bring the financials of acquired company Pennzoil into its main reporting system within just three months.
Let's face it: most companies spend more than three months just deciding which technology to select for a project, never mind going live in that time frame. In yet another example, HBOS, the leading U.K.-based financial services conglomerate, was able to produce a complete view of its GBP 1.7 billion procurement spend following the merger of Halifax and Bank of Scotland within three months, again by using an iterative development approach with a flexible data warehouse technology, and achieved very large business benefits quickly. This project has gone on to add four further phases to the initial deployment, for example better analysis of exceptional ordering patterns and the addition of supply chain information. By delivering measurable benefits to the business users so quickly, the project team was able to gain the confidence of its sponsors. It seems self-evident that this is a better approach than the traditional big-bang implementation a year or more after the project begins, by which time the business needs may themselves have changed. These examples show that by taking full advantage of the flexibility of modern data warehouse technology, forward-thinking IT departments can become heroes in the eyes of the business, delivering real benefits to their customers much faster than they are accustomed to with traditional approaches. Using a waterfall methodology, with its sequential chain of requirements, design, and development stages, for fast-changing data warehouse applications is to miss opportunities to deliver rapid benefits to your customers. To paraphrase Karl Marx: IT departments of the world, unite; you have nothing to lose but your chains.

Source: www.datawarehouse.com