Bank Systems & Technology is part of the Informa Tech Division of Informa PLC




Q&A with RainStor Advisory Council Chairman Frank Fanzilli

As chairman of the RainStor Advisory Council, Frank Fanzilli is applying his expertise to help the financial services sector manage "Big Data," the enormous task of retaining and referencing huge stores of data.



As the former managing director and global CIO of Credit Suisse First Boston, Frank Fanzilli defined the global investment bank's technology architecture for 18 years (he retired in 2002). Today, as chairman of the RainStor Advisory Council, which advises the San Francisco-based data retention provider, he is applying his expertise to help the financial services sector manage "Big Data," the enormous task of retaining and referencing huge stores of data. In an exclusive interview with BS&T, Fanzilli discusses how regulation is driving firms' data management strategies.

BS&T: What are some of the challenges facing CIOs as they work to modernize their data infrastructures? What are some of the opportunities?

Fanzilli: Today's CIO is under constant pressure to deliver technology projects on time and on budget. External challenges have become even more difficult with stringent government-led regulations, all aimed at protecting the consumer, preserving data, and providing greater transparency into trades and transactions.

Most financial services and banking organizations face annual data growth rates of 50 percent on the low end but often as high as 200 percent. Because of these high data growth rates, data management and infrastructure solutions are reviewed on a more frequent basis than was the case even in the recent past. And in the post-credit crunch reality, banks and financial services institutions are experiencing increased M&A activity, which causes even greater complexity around managing large data sets spread across disparate applications and databases.
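The compounding effect of those growth rates is easy to underestimate. As a rough illustration (the 100 TB starting point and five-year horizon are hypothetical; only the 50 and 200 percent annual rates come from the interview), a quick projection in Python:

```python
def project_storage(start_tb, annual_growth_rate, years):
    """Compound a starting data volume forward at a fixed annual growth rate."""
    return start_tb * (1 + annual_growth_rate) ** years

start = 100  # TB, assumed starting estate

low = project_storage(start, 0.50, 5)   # 50% per year, the low end cited
high = project_storage(start, 2.00, 5)  # 200% per year, the high end cited

print(f"Low-growth estate after 5 years:  {low:,.0f} TB")   # ~759 TB
print(f"High-growth estate after 5 years: {high:,.0f} TB")  # 24,300 TB
```

Even at the low end, the estate grows more than sevenfold in five years, which is why infrastructure reviews now happen far more frequently than they once did.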

The time has come for IT to evaluate solutions for these data types with the goal of scaling in a cost-effective way. Relying on the traditional RDBMS [relational database management system] for all types of data across the enterprise is no longer feasible. There is an emerging need for data retention solutions specifically optimized to store large volumes of data online for ongoing query access. Reducing the overall footprint of retaining critical data has at least two advantages: You can significantly reduce data infrastructure costs while also improving the sustainability of that data. Efficient data storage effectively frees up resources and budget for projects to grow the business.

BS&T: How have regulations affected the ways in which financial institutions manage data?

Fanzilli: The Dodd-Frank Wall Street Reform and Consumer Protection Act ... is the most ambitious U.S. financial system reform in decades. With this new law and 2002's Sarbanes-Oxley Act, there is increased emphasis on transparency and corporate governance. More important, however, is the immediate impact on IT, which needs to be in lockstep with the financial services business more than ever. Data management solutions are required not only to provide the obvious high availability, scalability and performance, but also to ensure accessibility, retention and recovery with a high degree of precision and timeliness. And data needs to be accessible in the right format for external regulators and auditors, as well as for ongoing customer interactions and internal decision makers.

Because of these reform bills, IT needs to carefully think through all aspects of data management and retention. Data needs to be retained online in a tamper-proof, immutable format, accessible for time periods that are both internally required and externally dictated. Data architecture reviews also need to weigh the advantages and disadvantages of managing systems on premises versus outsourcing them to an external service provider or even the cloud.
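To make "tamper-proof, immutable" concrete: one common technique (illustrative only, not a description of any vendor's product) is an append-only log in which each record's hash chains to its predecessor, so any retroactive edit breaks verification. A minimal Python sketch:

```python
import hashlib
import json

class ImmutableLog:
    """Append-only record store. Each entry's SHA-256 digest covers both
    the record and the previous entry's digest, so editing or deleting
    any historical record invalidates every digest that follows it."""

    GENESIS = "0" * 64  # placeholder hash before the first entry

    def __init__(self):
        self._entries = []          # list of (payload, digest) pairs
        self._last_hash = self.GENESIS

    def append(self, record: dict) -> str:
        payload = json.dumps(record, sort_keys=True)
        digest = hashlib.sha256((self._last_hash + payload).encode()).hexdigest()
        self._entries.append((payload, digest))
        self._last_hash = digest
        return digest

    def verify(self) -> bool:
        """Recompute the chain from the start; False means tampering."""
        prev = self.GENESIS
        for payload, digest in self._entries:
            if hashlib.sha256((prev + payload).encode()).hexdigest() != digest:
                return False
            prev = digest
        return True
```

Production systems layer this kind of integrity check on top of WORM (write once, read many) storage; the sketch shows only the detection side.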

The most significant direct impact regulation has had on how IT manages data and database systems is the strict time span over which records must be retained. Traditional database systems cannot enforce auto-purge based on preconfigured business rules and lack other data-retention capabilities, such as tagging records for future lookup. What's required is a repository, or set of repositories, able to manage and retain large volumes of data with unprecedented scalability, and to store and handle that data efficiently enough to control total cost of ownership -- at an order of magnitude lower cost than today's approaches.
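The auto-purge behavior described here can be sketched simply. The record types and retention periods below are hypothetical examples; real schedules are dictated by regulation and firm policy, not hard-coded:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record type -> days to retain.
RETENTION_RULES = {"trade": 7 * 365, "email": 3 * 365}

def purge_expired(records, today):
    """Return only the records still inside their retention window.

    Each record is a dict with a 'type' and a 'created' date. Records
    whose type has no configured rule are kept, erring on the side of
    retention rather than accidental destruction.
    """
    kept = []
    for rec in records:
        days = RETENTION_RULES.get(rec["type"])
        if days is None or rec["created"] + timedelta(days=days) >= today:
            kept.append(rec)
    return kept
```

The point of pushing such rules into the retention layer is that purging then happens uniformly and auditably, rather than ad hoc per application.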

BS&T: What has been the No. 1 change in data management in the past decade? What does the future have in store for data management?

Fanzilli: The cloud as we know it did not even exist 10 years ago. Cloud capabilities are more attractive and more viable now than ever before. Of course, the cloud poses security challenges for sensitive financial and personal data, but encryption and security technologies continually evolve, as does the quality of service providers. It's only a matter of time before the cloud becomes a trusted and mainstream approach.

The economics of cloud make sense and should be investigated for specific data management use cases. Specialized data repositories coexisting alongside both online transaction processing (OLTP) and online analytical processing (OLAP) systems -- which we call OLDR, or "online data retention" -- are easy to set up and maintain at significantly lower cost.

Copyright © 2018 UBM Electronics, A UBM company, All rights reserved. Privacy Policy | Terms of Service