Data Aggregation: Why Some Banks Succeed & Others Fail
A January report by bank regulators shows that most of the top 20 banks still get a failing grade when it comes to data aggregation. The reasons are not hard to decipher. With cost cutting the focus at most banks, investing in IT infrastructure upgrades and data initiatives has not been a top priority. Also, data governance, the cornerstone of effective data management and aggregation, is looked upon as a distraction from the business of running the bank. However, it is not all gloomy out there when it comes to effective data management. Some banks are showing real progress from where they were five years ago. There are three things these banks are doing right to ensure that their data initiatives are cost effective, have top management sponsorship, and are tightly integrated with business operations.
1. Engage with regulators to come up with cost-effective action plans
Most institutions understand that dealing with data aggregation problems requires an action plan. Few, however, would think to approach regulators for help with it. Some organizations are doing just that, realizing that regulators can play a crucial part in translating regulations and principles into specific requirements and in shaping cost-effective action plans.
Anecdote: A large Midwestern brokerage firm was hit with multiple matters requiring attention (MRAs) and matters requiring immediate attention (MRIAs) because of reporting errors. The consulting company it hired to validate the problem suggested that the firm involve the regulator. The regulator not only helped define the minimum set of capabilities required for compliance, but also helped shift the focus to the capabilities that maximize business value. The firm was pleasantly surprised that this collaboration produced an action plan requiring less cost and effort than the one it had originally drafted.
2. Integrate data governance into risk governance
Ask any bank the key reason data aggregation projects fail, and it will point to data governance. The major reason cited for the failure of data governance is lack of support from the board and senior management. As companies try to cut costs, data aggregation projects are often passed over for other initiatives related to regulatory compliance or risk management.
Well, if you can’t beat them, join them. Some organizations are doing just that -- integrating their data governance with risk governance programs that already have board and senior management support.
Anecdote: A large Midwestern bank integrated data governance into its operational risk governance program, with several interesting results. Data problems are recorded as risks, and their potential losses are computed. As part of its risk policy, the board reviews risks with high potential losses, and these are addressed with urgency. This integration has helped the organization ensure that data issues get the attention of the highest levels of management and the board, as well as appropriate funding.
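To make the mechanics concrete, here is a minimal sketch of how data issues might be logged as operational risks and escalated by potential loss. The issue names, loss figures, and review threshold are entirely hypothetical illustrations, not details from the bank's actual program.

```python
# Hypothetical risk-register entries: each data problem is logged as an
# operational risk with an estimated potential loss. All names and
# figures below are illustrative assumptions.
data_risks = [
    {"issue": "duplicate counterparty records", "potential_loss_usd": 2_500_000},
    {"issue": "stale market data in risk reports", "potential_loss_usd": 400_000},
    {"issue": "missing trade timestamps", "potential_loss_usd": 75_000},
]

# Per the policy described above, risks whose potential loss exceeds a
# threshold are escalated to the board for review.
BOARD_REVIEW_THRESHOLD_USD = 1_000_000

board_agenda = [r for r in data_risks
                if r["potential_loss_usd"] >= BOARD_REVIEW_THRESHOLD_USD]
for risk in board_agenda:
    print(f"Escalate to board: {risk['issue']} "
          f"(${risk['potential_loss_usd']:,})")
```

The design point is simply that once a data issue carries a dollar figure, the existing risk-escalation machinery routes it to senior management with no special pleading required.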
3. Define data quality metrics in terms of operational efficiency parameters
The traditional metrics for data quality relate to accuracy, consistency, completeness, and latency. But what do these mean to operations? Although it could be argued that data and processes are two sides of the same coin, most organizations have a difficult time connecting data-related metrics to operational efficiency. Some organizations are discarding the traditional metrics and adopting operational ones instead. This is helping them change the perception of data integration projects -- from expensive “change the bank” initiatives to cost-saving “run the bank” ones.
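For contrast, here is a minimal sketch of what two of the traditional metrics -- a completeness ratio and a freshness (latency) check -- might look like over a hypothetical record set. The records, field names, and as-of date are illustrative assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical customer records; None marks a missing value.
records = [
    {"account_id": "A1", "balance": 100.0, "updated": datetime(2016, 3, 1)},
    {"account_id": "A2", "balance": None,  "updated": datetime(2016, 2, 1)},
    {"account_id": "A3", "balance": 250.0, "updated": datetime(2016, 3, 2)},
]

# Completeness: share of records with no missing fields.
complete = sum(1 for r in records
               if all(v is not None for v in r.values()))
completeness_pct = 100.0 * complete / len(records)

# Latency: share of records updated within 7 days of a hypothetical
# as-of date.
as_of = datetime(2016, 3, 3)
fresh = sum(1 for r in records
            if as_of - r["updated"] <= timedelta(days=7))
freshness_pct = 100.0 * fresh / len(records)

print(f"Completeness: {completeness_pct:.0f}%  Freshness: {freshness_pct:.0f}%")
# These figures are accurate as far as they go, but they say nothing
# about what the gaps cost operations -- the gap the operational
# metrics in the next example are meant to close.
```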
Anecdote: A large US investment bank has had a number of successes kicking off data aggregation projects by defining operational metrics to monitor data quality. For example, in a fraud operation the metrics tracked are the average time a fraud operator takes to determine whether an alert is genuine fraud, and the number of false-positive alerts per month. Although expressed in operational terms, the first metric depends on data completeness and accuracy, while the second deals with data latency. The data initiative is also perceived as cost saving, since it reduces the effort of the fraud operator, who can now handle more alerts and fraud cases.
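As a rough illustration, here is a minimal sketch of how those two metrics might be computed from alert records. The schema, timestamps, and disposition labels are hypothetical, not the bank's actual system.

```python
from datetime import datetime

# Hypothetical fraud-alert records; field names are illustrative only.
# "opened"/"resolved" drive the resolution-time metric, "disposition"
# drives the false-positive metric.
alerts = [
    {"opened": datetime(2016, 3, 1, 9, 0),
     "resolved": datetime(2016, 3, 1, 9, 40), "disposition": "fraud"},
    {"opened": datetime(2016, 3, 1, 10, 5),
     "resolved": datetime(2016, 3, 1, 10, 20), "disposition": "false_positive"},
    {"opened": datetime(2016, 3, 2, 8, 30),
     "resolved": datetime(2016, 3, 2, 9, 15), "disposition": "false_positive"},
]

# Metric 1: average minutes an operator takes to resolve an alert.
# Per the article, this depends on data completeness and accuracy:
# missing or wrong fields force the operator to chase data manually.
resolution_minutes = [
    (a["resolved"] - a["opened"]).total_seconds() / 60 for a in alerts
]
avg_resolution = sum(resolution_minutes) / len(resolution_minutes)

# Metric 2: false-positive alerts in the period, which the article
# ties to data latency -- stale inputs generate spurious alerts.
false_positives = sum(
    1 for a in alerts if a["disposition"] == "false_positive"
)

print(f"Average resolution time: {avg_resolution:.1f} minutes")
print(f"False-positive alerts this period: {false_positives}")
```

The appeal of metrics like these is that an operations manager can read them directly as workload and cost, while the data team can still trace each one back to an underlying quality dimension.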
It is clear from the above that, with the right partnerships and business cases, some organizations are using innovative methods to ensure that data aggregation initiatives get the attention they need. What innovative methods is your institution using to make sure it does not get a failing grade in the regulators' next survey?
Thadi Murali, CFA, is a principal consultant in Capco's banking practice. He has more than 18 years of consulting experience focusing on risk, compliance, and IT across banking, asset management, and security operations.