
Commentary by Ross Wainwright

5 Best-Practices for Bringing Big Data to Banking

If there is one thing banks do not lack, it is data.

The volume and velocity of data continue to increase exponentially. IDC research indicates that the global volume of data will grow from 130 exabytes to 40,000 exabytes by 2020, and banking is one of the vertical industries where this growth will be most pronounced. Digital consumers are demanding much more from their financial institutions in terms of quality of service and benefits. To hold onto this audience, banks are investing heavily in integrated channels, CRM, and data collection to better understand and communicate with customers. The result is a flood of data that must be properly managed to be of any use.

Regulatory requirements such as Dodd-Frank are also pushing banks toward more integrated, granular data. Meeting compliance obligations, along with achieving the desired balance of risk and return, requires highly detailed financial and transactional data -- more detail than was previously needed, or even possible to process, before the arrival of advanced analytical tools.

Big data is the new reality for banks big and small. They need to parse large datasets efficiently for the insight and information that can guide the business; it is this insight that leads to better, faster decisions. Database innovations such as in-memory computing are changing how structured and unstructured data can be consolidated and analyzed. For banks, that means real-time, personalized offers to customers, faster identification of fraud, and a more granular view of credit and liquidity risks. Most banks already have analytical systems in place, but big data takes analysis to a macro level, with deeper insight drawn from a multitude of sources.

So how does one adopt a big-data framework? As the number of sources and the variety of data grow, it becomes much harder to get an accurate read on all the information that is out there. Here are five best practices for managing big data in financial services:

1. Start with key business problems. Analyzing data in a vacuum will marginalize its potential value. Identify critical business issues upfront, such as cross-selling, fraud detection, and risk management, and explore how the insight associated with those issues can be leveraged. 

2. Collect clean data. To be of any value, datasets need to be as clean and accurate as possible. Automated processes can be put in place to filter out bad data at the point of capture or later in the pipeline. They should detect and eliminate errors, duplications, and other inconsistencies, leaving a single, accurate depiction of the truth (a minimal cleaning sketch appears after this list).

3. Speed is essential. The value of data can decay quickly with time, which makes real-time accessibility critical. When evaluating a big-data system, weigh how well it reduces data latency for the people dealing with customers on the front end. Combine that speed with easy-to-use, intuitive interfaces that make querying and visualizing complex datasets more inviting; this will ensure the system actually gets used (a simple freshness check is sketched after this list).

4. Pull a diverse set of data to achieve greater value. To get an accurate reading of a situation, you need an all-encompassing view that takes in variables of all shapes and sizes. Look at external, unstructured data such as social media and websites for customer sentiment, and at internal, structured sources such as risk data and payment traffic. The broader the field, the more likely your reading of the situation will be correct. Don't dismiss new datasets out of hand until you have thoroughly tested their potential (a sketch of blending internal and external signals also follows this list).

5. Apply human oversight to automated processes. A human touch is needed to guide and shape the data into useful insight. Appoint a Chief Data Officer who oversees the bank's data administration and data mining across the entire organization; he or she should be an equal member of the executive team, able to implement strategies and procedures at the enterprise level. Also loop in a broad range of disciplines to contribute as big-data scientists -- their backgrounds and unique perspectives can add depth to the analysis.
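
As a concrete illustration of the kind of automated filtering described in item 2, here is a minimal cleaning sketch. It assumes captured records arrive as a pandas DataFrame; the field names and validation rules are illustrative assumptions, not a prescribed toolset.

```python
# A minimal data-cleaning sketch, assuming records arrive as a pandas
# DataFrame; all field names and rules below are illustrative.
import pandas as pd

def clean_records(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize key fields, drop bad rows, and remove duplicate records."""
    df = df.copy()

    # Normalize formatting inconsistencies before comparing rows.
    df["account_id"] = df["account_id"].str.strip()
    df["email"] = df["email"].str.strip().str.lower()

    # Drop rows missing the values downstream processes depend on.
    df = df.dropna(subset=["account_id", "amount"])

    # Eliminate duplicate captures of the same transaction.
    df = df.drop_duplicates(subset=["account_id", "transaction_ts"], keep="first")
    return df

# Example: a raw capture containing one duplicate and one incomplete row.
raw = pd.DataFrame({
    "account_id": [" 1001", "1001", "1002", None],
    "transaction_ts": ["2014-05-01", "2014-05-01", "2014-05-02", "2014-05-03"],
    "amount": [250.0, 250.0, 80.0, 75.0],
    "email": ["A@Bank.com", "a@bank.com", "b@bank.com", "c@bank.com"],
})
print(clean_records(raw))  # two clean, unique rows remain
```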
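
For item 3, one simple way to keep decaying data from driving front-end decisions is to check how old a customer snapshot is before acting on it. The five-minute freshness budget and field names below are assumptions chosen purely for illustration.

```python
# A minimal freshness check, assuming each customer snapshot records when it
# was last refreshed; the threshold is an illustrative assumption.
from datetime import datetime, timedelta, timezone

MAX_DATA_AGE = timedelta(minutes=5)

def is_fresh(snapshot, now=None):
    """Return True only if the snapshot is recent enough to act on."""
    now = now or datetime.now(timezone.utc)
    return now - snapshot["refreshed_at"] <= MAX_DATA_AGE

snapshot = {
    "customer_id": "1001",
    "refreshed_at": datetime.now(timezone.utc) - timedelta(minutes=2),
    "recommended_offer": "travel rewards card",
}

if is_fresh(snapshot):
    print("Present offer:", snapshot["recommended_offer"])
else:
    print("Refresh the customer's data before presenting an offer.")
```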
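
Finally, for item 4, the sketch below blends an internal, structured risk view with an external sentiment signal that has already been scored from unstructured sources such as social media. The join key, column names, and thresholds are illustrative assumptions.

```python
# A minimal blend of internal and external signals, assuming sentiment has
# already been scored per customer; names and thresholds are illustrative.
import pandas as pd

# Internal, structured sources: risk score and recent payment traffic.
internal = pd.DataFrame({
    "customer_id": ["1001", "1002", "1003"],
    "risk_score": [0.2, 0.7, 0.4],
    "monthly_payment_volume": [5200.0, 800.0, 2300.0],
})

# External, unstructured source reduced to a per-customer sentiment score
# in the range -1 (negative) to 1 (positive).
external = pd.DataFrame({
    "customer_id": ["1001", "1002", "1004"],
    "sentiment": [0.6, -0.4, 0.1],
})

# An outer join keeps customers present in only one source, so gaps stay visible.
combined = internal.merge(external, on="customer_id", how="outer")

# A naive composite view: flag customers who look risky on either dimension.
combined["attention_flag"] = (combined["risk_score"].fillna(0) > 0.6) | (
    combined["sentiment"].fillna(0) < -0.2
)
print(combined)
```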

Big data needs to be front and center in the IT plans of all financial institutions. As the financial sector becomes increasingly competitive, the value contained within these datasets can determine success or failure. Following these steps will help ensure you have a system that leverages big data rather than one that leaves you paralyzed by it.

Ross Wainwright is global head of financial services industries for SAP. He is responsible for SAP's end-to-end footprint and ambition in FSI, which today covers more than 17,000 FSI clients. His team's objective is to help customers transform their legacy ...
