The financial crisis of 2008 made it painfully clear that critical decision-making in financial services hinges on the quality of analytical data. This, in turn, has cast new scrutiny on the data center and firms' analytical capabilities, and has spurred demand for new approaches. In many cases, the traditional data warehousing and analytical capabilities in place during the crisis (many of which remain to this day) failed to meet demands during this critical period.
At the same time, increased competition has ramped up the pressure for banks to support their operations with analytical information that can be delivered in time for in-transaction decision-making. This is in stark contrast to traditional, "offline" analytical applications used solely to support senior management decisions.
In today's transformed environment, many financial services organizations are waking up to a new reality: It is time to rethink and rework their data management strategies.
Financial services organizations, and their CIOs, are finding that they can no longer approach data management with a "data provider" mindset that is disconnected from downstream functional needs. Instead, to efficiently serve multiple, and often intersecting, user communities, they need a holistic mindset that spans the entire data lifecycle, from operational data to analytical systems and reporting. And time is of the essence: institutions no longer have the luxury of large, multi-year initiatives to identify and address data management issues using a "big-bang/big data project" approach.
Where should today's financial institution CIO start? Three core principles point the way forward:
1. Understand the Use Cases. The core principle driving the data warehouse should not be "data for data's sake" but a clear, unwavering focus on the end uses the warehouse environment supports. The first step is to clearly identify the key, top-level analytical solution areas in financial services. These areas represent the constituencies in any financial services institution that own analytical data flows, and hence consume data from the operational systems that feed those flows.
Second, understand the use cases that exist at the intersections of those areas. This is the key idea: it means moving beyond a siloed view of departmental data to a holistic view that includes all cross-functional use cases. Increasingly, this is a strategic imperative for financial institutions as a new class of regulatory and competitive mandates emerges. Any "single source of truth" data platform must therefore recognize and account for these emerging needs, such as liquidity risk, regulatory and economic capital, and customer profitability.
2. Understand the Processes that Produce or Consume Data. The second critical point in rethinking data warehousing is the need to truly appreciate the computational methods and techniques that are used in financial services analytical processes. These computations place specific, clear demands on the underlying data provider (and hence the data model).
For example, stress testing a balance sheet requires clearly identifying the specific scenarios used in a stress test, the scenarios used to develop or calibrate a computational model, and the detailed dataset corresponding to each scenario -- all of which must be represented in the underlying data model.
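To make this concrete, here is a minimal sketch of such a data model, using Python dataclasses as stand-ins for warehouse tables. All names (Scenario, ScenarioDataset, the risk-factor keys) are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass(frozen=True)
class Scenario:
    """A named scenario, tagged with its purpose so that stress-test
    scenarios stay distinct from model-calibration scenarios."""
    scenario_id: str
    purpose: str                 # e.g. "stress_test" or "model_calibration"
    shocks: Dict[str, float]     # risk factor -> shift, e.g. {"USD_3M_RATE": 0.02}

@dataclass
class ScenarioDataset:
    """The detailed balance-sheet positions associated with one scenario,
    kept together so results remain traceable to their inputs."""
    scenario: Scenario
    positions: List[Dict[str, float]] = field(default_factory=list)

    def total_value(self) -> float:
        return sum(p["value"] for p in self.positions)

# Usage: each scenario carries its own dataset, so any computed result
# can be traced back to the exact scenario and inputs that produced it.
rate_shock = Scenario("S1", "stress_test", {"USD_3M_RATE": 0.02})
ds = ScenarioDataset(rate_shock, positions=[{"value": 100.0}, {"value": -40.0}])
print(ds.total_value())
```

The point of the sketch is structural: the scenario identifier, its purpose, and its detailed dataset are first-class entities in the model rather than implicit conventions in spreadsheet names.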
Knowledge of the computational processes that depend on the warehouse expands the warehouse from being merely an "external" data provisioning platform, to a central platform that encompasses all these analytical data flows, thereby guaranteeing consistency, traceability, and verifiability of both the data inputs and the generated results.
3. Understand the Ultimate Use of Analytical Outputs. The warehouse environment needs to support not just an enhanced approach to data provisioning and computations, but also the complete set of use cases with respect to analytical outputs/results used in reporting and business intelligence delivery.
The discipline of data warehousing in financial services is fundamentally shifting from an assembly of generic components and tools toward a holistic, integrated platform that supports the unique analytical needs of financial services institutions worldwide. Key tenets for creating this new environment include:
A Focus on End-to-End Data Flows to Support Key Use Cases. Rather than merely being a provider of operational/business data for downstream analytical consumers, the warehouse should be a single foundation that supports end-to-end analytical processing, including data sourcing, calculation and aggregation processes, and results/reporting for every use case.
Minimally, this requires the IT organization and the user communities to work together to understand and agree on the end-to-end data flow for each key use case: the source data required, the calculations and aggregations applied, and the results and reports produced.
Purpose-built Analytical Platform Rather than a Collection of Tools. Custom assembly of a data-warehousing environment has historically proven costly and prone to high failure rates partially because of ill-defined and often overreaching scope. A more reasonable way to address this problem is to utilize a unified analytical platform that can support all the key requirements and usage patterns of a typical financial services institution -- rather than attempting to combine general-purpose tools to achieve the needs of the institution.
Scalable, Integrated Infrastructure. At a technical level, the warehouse platform should be deeply integrated with the underlying infrastructure -- specifically, able to leverage the power of the infrastructure to scale out in a flexible, transparent, and cost-efficient manner.
In summary, it is time for financial services institutions to recognize that the new reality of data management requires a shift from passive data warehouses to integrated analytical platforms.
Venkat Krishnamurthy is Director, Product Management, Oracle Financial Services Global Business Unit.