Michael Flynn
Commentary

What Does Big Data Mean for Banks?

Why traditional data management strategies and storage solutions are holding back big data projects.

In a recent Saïd Business School study, 63% of banks recognized proficiency in big data as a competitive advantage. However, 91% indicated that they lack the key skills necessary to execute more effectively, and only 3% reported that their organizations had deployed big data initiatives on a continuous basis. Many banks are trying, but few appear to be succeeding.

Why are banks struggling?
When faced with the requirements of a new big data initiative, banks too often fall back on prior experience, attempting to apply familiar technologies and software development lifecycle (SDLC) methodologies to deployment.

Traditional technologies, particularly the industry’s most common data stores (e.g., relational databases), were designed to enforce structure and optimize processing performance within a constrained hardware environment. As a result, many bank technologists are accustomed to transforming data to meet these constraints, aggregating it to satisfy scalability limitations and normalizing it to satisfy schema restrictions.
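
To illustrate that pattern concretely (a minimal, hypothetical Python sketch, not drawn from any particular bank; the table and column names are invented), the snippet below defines a rigid relational schema and loads only pre-aggregated daily summaries, so transaction-level detail never reaches the store:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Fixed schema: any new upstream attribute requires a migration.
    cur.execute("""
        CREATE TABLE daily_balance_summary (
            account_id  TEXT NOT NULL,
            business_dt TEXT NOT NULL,
            txn_count   INTEGER,
            net_amount  REAL,
            PRIMARY KEY (account_id, business_dt)
        )
    """)

    # Transactions are rolled up before loading; the individual records (and
    # their lineage back to source systems) are no longer recoverable here.
    cur.execute(
        "INSERT INTO daily_balance_summary VALUES (?, ?, ?, ?)",
        ("ACCT-001", "2014-06-30", 42, -1250.75),
    )
    conn.commit()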

Aggregation and normalization of data in this manner can result in several weaknesses:

  • Rigid schemas leave little flexibility to respond to upstream and downstream data changes.
  • Data lineage may be lost once data are aggregated and summarized.
  • Data governance is likely weakened when several constituents share responsibility for an extended, multi-stage data flow.

These weaknesses are detrimental to the success of big data initiatives, where data-structure flexibility, deeper data granularity, and improved data traceability are core to execution and analytical effectiveness. Traditional data-management architectures are usually not viable within this context.
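
For contrast, here is an equally hypothetical Python sketch (field names invented) of the document-style alternative: each transaction is kept as a self-describing record, so new attributes and lineage metadata travel with the data instead of being normalized or aggregated away.

    import json

    raw_transaction = {
        "account_id": "ACCT-001",
        "amount": -42.10,
        "currency": "USD",
        "booked_at": "2014-06-30T10:30:00Z",
        # Lineage travels with the record instead of being lost to aggregation.
        "lineage": {
            "source_system": "core-banking-eu",
            "source_file": "txn_20140630_0001.csv",
            "loaded_at": "2014-06-30T23:05:00Z",
        },
        # An upstream change simply adds a field; no schema migration is needed.
        "merchant_category": "5411",
    }

    print(json.dumps(raw_transaction, indent=2))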

Traditional approaches to project management, implementation, and change management are also often insufficient. In previous decades, systems were developed under the assumption of a relatively steady state after deployment, with requirements unlikely to change significantly over time. The requirements of big data initiatives, however, are often guided by the insights derived from the data itself and are far more likely to evolve. Any rigidity in the deployment approach therefore poses an immediate risk of stagnation and failed adaptation.

New approach required
Big data represents a new way that banks can interact with and leverage their data. As a result, banks need to shift the paradigm for designing, developing, deploying, and maintaining big data solutions.

A wave of technologies has emerged to provide the flexibility and scalability required to support the shift. New approaches to data storage (e.g., NoSQL databases) can eliminate the burden of up-front structure definition and enable cheap storage. Maturing distributed-computation frameworks (e.g., Hadoop) can deliver the performance expected of a modern platform while operating on data at a scale never before attempted. New visualization and reporting platforms offer a view into information that was not previously accessible. And these tools are at virtually everyone’s fingertips.
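
As one illustrative sketch of the distributed-computation idea -- using PySpark, one framework in the Hadoop ecosystem, and assuming it is installed; the input path and field names are hypothetical -- raw transaction-level records can be aggregated at query time across a cluster while the granular data remain available for other analyses:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("txn-aggregation-sketch").getOrCreate()

    # Read raw, transaction-level JSON records; the schema is inferred, not imposed.
    txns = spark.read.json("hdfs:///data/raw/transactions/")

    # Aggregate at query time, across the cluster; the granular records stay
    # available for any other analysis.
    daily_totals = (
        txns.groupBy("account_id", F.to_date("booked_at").alias("business_dt"))
            .agg(F.sum("amount").alias("net_amount"),
                 F.count("*").alias("txn_count"))
    )

    daily_totals.show()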

Just as banks need to reevaluate technologies, the approach to big data implementation also needs to change. Agile development methodologies have evolved to provide rapid, iterative, and incremental deployment of solutions in a way that aligns well with the speed at which the underlying data are measured, understood, and parsed. While seemingly counterintuitive given the scale of the information involved and the complexity of the analysis, effectively executed big data development programs greatly shrink time to market and reduce development costs relative to traditional SDLC.

The building blocks of comprehensive big data frameworks are readily available; it is time for banks to take the plunge.

Moving from concept to reality
Introducing any new framework requires an investment in people, processes, tools, and technologies to bring them to an acceptable level of competency and capability. Banks need to be open and willing to change; they need to be prudent and practical in their rollout strategy; and they need to be open to failure -- at least on a small scale, so that large-scale success can be achieved.

Big data is no myth, and the opportunity is significant. It just takes an open mind, a different approach, and the right selection of supporting technologies to bring the concept to realization.

Michael B. Flynn is a Managing Director in AlixPartners' Information Management Services Community. He oversees technology and information management advisory services for the financial services sector, including commercial and retail banking, capital markets, asset ...
