Bank Systems & Technology is part of the Informa Tech Division of Informa PLC

Deena M. Amato-McCoy

Data Takes on a Life of Its Own

Information life-cycle management is a cost-effective, manageable way for banks to organize data from creation to deletion.



As regulators continue to ratchet up the pressure on the financial services industry, it is more critical than ever for banks to have access to current and historical data. But as regulators demand ever-more-granular levels of detail, traditional storage options are no longer sufficient. By implementing an information life-cycle management (ILM) strategy, however, banks can tap automated solutions to prioritize and store critical data in cost-effective, highly accessible tiered repositories.

Whether banks are collecting data to facilitate customer transactions, improve business operations, create new products or respond to market trends, they are swimming in volumes of actionable information. And the data comes from various disparate sources. "There is an avalanche of data due to sources like electronic check images, EFT [electronic funds transfer] transaction records, mortgage applications and so on," relates Richard Winston, a Dallas-based senior executive in Accenture's financial services practice. "As this data piles up, banks face a variety of retention requirements."

For example, within the next 10 years, as many as 34 billion check images could be presented for payment, according to Paul Abbott, director of industry solutions for worldwide banking and finance, Mobius Management Systems, a Rye, N.Y.-based provider of content management solutions. "Yet, states are requiring banks to keep the original data on file," he says.

While "the length of time required to retain data is governed by regulations," Abbott continues, "market demand is challenging banks' storage operations as well." Consumer information, such as account statements and customer transaction histories, is taking up large amounts of storage, he explains. Similarly, internal reports also are filling data repositories across the industry.

To limit the storage burden, banks must determine how long to keep this information. "Banks may not need certain pieces of information this year, but 17 years from now regulators may want to see this," Accenture's Winston says. "They need a business strategy to respond to these potential requests."

And banks need not look far for motivation to ensure their retention strategies meet regulators' demands. In May, for example, Morgan Stanley agreed to pay $15 million to settle Securities and Exchange Commission charges stemming from the company's failure to produce e-mails relevant to an SEC investigation.

Often, in such cases, "It is not that the information did not exist," contends Bruce Backa, chief executive officer of Nashua, N.H.-based NTP Software, a provider of storage management solutions. "Rather, their environment is so big and contains so much information that they could not find what they needed."

Similarly, banks are challenged by how to cost-effectively store information and easily retrieve it when needed. "There is a clear challenge," Backa continues. "The business wants all data available in real time, ... [but] banks still cannot afford to keep all data in real-time formats."

More important, banks are struggling with how to get ahead of the problem. "Banks are in need of a strategy that keeps them in charge of their destiny -- not just react to the needs of their user community," says Backa. "There are costs involved, and banks often underestimate the cost on their back end. For example, banks could spend close to $3 million in response to a $1 million lawsuit," he adds. "They need to manage their storage costs and somehow understand and prepare for their future needs."

While the challenge is daunting, banks have begun to address the problem, adopting information life-cycle management strategies and automating their back offices with hardware and software solutions that store mission-critical information in tiered repositories. ILM manages the flow of strategic information from its creation and initial storage until it becomes obsolete or is ready for deletion. Often described as a framework for assessing the business value of data, ILM enables banks to prioritize data and move information between tiers.



Enter: ILM

ILM products automatically organize data into separate tiers according to specified policies and criteria. Just as important, ILM is an integration strategy that enables banks to tie together the different "flavors" of storage components available on their mainframe platforms.

"Software resides within the infrastructure of the various storage hardware components on a bank's mainframe," explains Accenture's Winston. "Regardless of whether a bank uses storage options from Sun Microsystems or IBM, for example, workflow operations and interfaces integrate storage options and support data storage."

Since an ILM strategy comprises multiple tiers, banks must establish a storage hierarchy. The first step is to evaluate which information must reside in primary online storage. Typically, this information is critical to business operations and must be available for real-time or near-real-time retrieval. Customer statements, for example, often are stored in this tier.

"My bank only keeps customer statements online for 30 days. After that, consumer requests for the documents often fall off," says Mobius' Abbott. "Then this data can be moved to a tier that allows information to be retrieved in 30-second intervals compared to real-time access."

The next storage tier holds data that only requires second-degree protection. While this information also is available on demand, "If there is a mishap or technical error it will not be a problem to retrieve data within 24 or 48 hours," NTP's Backa says.

The final storage tier before disposition is considered a reporting tier. "This data is used for manipulation purposes, or for read-only situations," Backa notes. Since the data stored at this level does not have to be available in real time, the tier may leverage robotic tape or offline tape. "The final step is disposition. Information can reside here forever or be deleted," Backa adds.
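The tier walkthrough above amounts to a rules-based policy that maps a record's age to a storage tier. A minimal sketch in Python; the tier names and age thresholds are hypothetical (the article specifies only the 30-day online example), and real policies would also weigh regulatory retention rules:

```python
from datetime import date, timedelta

# Hypothetical tiers and age cutoffs for illustration only; actual
# thresholds are set per institution and per regulation.
TIER_POLICY = [
    ("primary-online", timedelta(days=30)),   # real-time retrieval
    ("secondary", timedelta(days=365)),       # on demand; 24-48 hr recovery window
    ("reporting", timedelta(days=365 * 7)),   # read-only; robotic or offline tape
]

def assign_tier(created: date, today: date) -> str:
    """Map a record to a storage tier by age; older data reaches disposition."""
    age = today - created
    for tier, max_age in TIER_POLICY:
        if age <= max_age:
            return tier
    return "disposition"  # retain indefinitely or delete per retention rules

# A two-week-old statement stays in primary online storage;
# eight-year-old data has aged past every tier.
print(assign_tier(date(2006, 6, 1), date(2006, 6, 15)))
print(assign_tier(date(1998, 1, 1), date(2006, 6, 15)))
```

In a production system the policy would key on more than age alone, such as document type, legal holds, and access frequency, but the tier-by-tier fall-through is the core mechanic.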

In addition to the retrieval benefits offered by its automated, rules-based storage, ILM also offers significant potential cost savings, as the tiered environment enables banks to cost-effectively archive data. "A large percentage of information technology budgets is being spent not on storing current transaction data but on reference data," says James Redd Wilson, XP product marketing manager for Palo Alto, Calif.-based Hewlett-Packard. This data need not be available in real time and, while top-tier data storage carries the highest price tag, prices fall as data moves to lower tiers with slower response times, he explains.

"Intermediate storage is slower and has a lower cost," adds Accenture's Winston. "This could cost 20 percent less than primary storage. As data keeps moving down a tier, automated retrieval time is slower and therefore less costly for banks. It is the evolution in data management."

And as the cost of storage options decreases, ILM is becoming more popular. "The price per terabyte [of storage] decreases by 35 percent to 40 percent each year, and ILM helps banks take advantage of these savings," asserts Bill Braun, vice president of information systems for the Texas Credit Union League (TCUL), a state trade association that represents approximately 600 nonprofit Texas credit unions and their 7 million member-owners.
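Braun's figure compounds quickly. A quick calculation, using an arbitrary starting price of 100 per terabyte (the starting price is illustrative, not from the article):

```python
def price_after(start: float, annual_drop: float, years: int) -> float:
    """Price per terabyte after compounding an annual percentage decline."""
    return start * (1 - annual_drop) ** years

# At the 35 to 40 percent annual declines Braun cites, a terabyte costs
# roughly a fifth to a quarter of today's price within three years.
print(price_after(100.0, 0.35, 3))
print(price_after(100.0, 0.40, 3))
```

That compounding is why deferring lower-priority data to cheaper tiers, rather than buying more top-tier capacity today, captures most of the savings.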

You've Got Stored E-Mail

E-mail is a prime area for tiered storage. "Customers send inquiries or complaints to the bank via e-mail -- now the message has a life in the bank," Accenture's Winston says. "ILM can help banks create a strategy of where to keep e-mails, reconstruct conversations and decide how long to keep it," he asserts.

When Farmers Branch, Texas-based TCUL (www.tcul.coop) wanted to centralize and back up its ever-growing volumes of e-mail data, the company added a solution from Toronto-based Fortiva, a provider of managed e-mail archiving services for regulatory compliance and legal discovery. As e-mail comes into the company, it resides on TCUL's e-mail server, but it also is copied to Fortiva's storage appliance. "It may seem like a simple storage solution, but it becomes difficult for banks to manage their infrastructure to back up this data and determine how much storage they need," asserts Chris Tebo, Fortiva's CTO.

By archiving e-mails "in a central repository, we gain more control," adds TCUL's Braun. The Fortiva solution optimizes storage by keeping a single file of data, even if the e-mail is sent to multiple recipients. The solution also enables the company to establish the appropriate time to dispose of the e-mail.
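Single-instance storage of this kind is typically implemented by keying stored message bodies on a content hash, so any number of recipients reference one physical copy. A minimal sketch of that mechanic; the class and its layout are illustrative assumptions, not Fortiva's actual design:

```python
import hashlib

class SingleInstanceStore:
    """Keep one stored copy per unique message body; recipients share it.

    Illustrative single-instance archiving sketch, not a vendor implementation.
    """

    def __init__(self):
        self._blobs = {}  # content hash -> message body (stored once)
        self._refs = {}   # content hash -> set of recipients referencing it

    def archive(self, body: bytes, recipient: str) -> str:
        digest = hashlib.sha256(body).hexdigest()
        self._blobs.setdefault(digest, body)          # store body only once
        self._refs.setdefault(digest, set()).add(recipient)
        return digest

store = SingleInstanceStore()
msg = b"Quarterly statement attached."
for user in ("alice", "bob", "carol"):
    store.archive(msg, user)
# Three recipients, but only one stored copy of the message body.
```

Hashing the content rather than the message ID is what lets the same attachment sent to thousands of employees consume the storage of a single copy.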

"You need to keep copies or you lose control," Braun says. "The better the [archiving] solution you have, the more likely you can dissuade people from taking e-mails out of the system and disposing of the correspondence. When that happens, you lose control."

Currently, TCUL has approximately 100 gigabytes of information stored and archived in the Fortiva solution. While TCUL implemented the solution just six months ago, Braun already plans to recommend it to the league's credit union members. "Our role is to find solutions for our members that we deem as best-in-class and simple to deploy," he explains. "Later this year, we plan to endorse and market Fortiva's solution to our credit union members."

A Single Archive Is Key

To keep its storage environment under control, Cleveland-based KeyCorp ($84 billion in total assets) uses the ViewDirect TCM enterprisewide content management and archival system from Mobius. The solution provides 13,000 internal users with secure access to monthly content comprising 80 million checks, 36 million statements and 175 million report pages, according to Mobius. This configuration also enables the bank's external customers to securely access their check images, statements and other account information via the Web.

"We wanted one electronic archive to support our requirements for regulatory compliance, internal operations and customer service," said David Harris, KeyCorp's vice president and division manager, in a prepared statement. "With Mobius' ViewDirect TCM integrated content repository as the cornerstone, we've achieved that objective."

KeyCorp replaced its check microfilm process with an imaging solution centered on the ViewDirect TCM content repository and e-presentment facilities. Images now are scanned, indexed and stored for 30 days to 90 days. Then they are compressed and migrated to virtual tape storage silos for near-line access.
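That age-based migration step can be expressed as a simple batch rule: once an image ages past its online window, compress it and retarget it at near-line storage. A sketch with a hypothetical image-record layout (the field names, the 90-day cutoff chosen from the article's 30-to-90-day range, and the use of zlib are all assumptions for illustration):

```python
import zlib
from datetime import date

NEARLINE_AFTER_DAYS = 90  # upper end of the 30-to-90-day online window

def migrate_if_due(image: dict, today: date) -> dict:
    """Compress an aged check image and mark it for virtual-tape storage.

    Hypothetical record layout; a real archive would track indexes,
    checksums and silo addresses alongside the image data.
    """
    if (today - image["stored"]).days >= NEARLINE_AFTER_DAYS:
        image["data"] = zlib.compress(image["data"])
        image["location"] = "virtual-tape"
    return image

img = {"id": "chk-001", "stored": date(2006, 1, 5), "data": b"TIFF bytes..." * 100}
migrate_if_due(img, date(2006, 6, 1))  # well past 90 days, so it migrates
```

Running such a rule nightly over the online tier is what keeps primary storage bounded while leaving every image retrievable from tape.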

By making these documents available to customers online, KeyCorp reduced service center calls, as well as its printing and mailing costs. Specifically, the bank estimates that its hard-dollar savings amount to more than $12 million a year. This is based on reductions in mailed customer documents, tape storage consolidation and elimination of microfiche. "For example, we used to spend $92,000 a month on microfiche," said Allyn Pytel, Key's senior vice president of media and output management, in the statement. "In January, our bill was $350."

Continuing Evolution

Despite ILM's obvious advantages, experts acknowledge that the related technologies still are evolving and there is room for improvement. For example, banks are struggling with piecing together the individual technology solutions that make up an ILM initiative.

"Pieces are still fairly disconnected," says NTP's Backa. "There needs to be different technology across each storage tier to ensure a single view of data, yet those pieces are still disconnected from a technology perspective." To leverage ILM, banks need a complete view of corporate information on an enterprisewide basis. To achieve this, the technologies must work together, Backa suggests.

Exacerbating the problem is the increasing globalization of the banking industry. "As banks deal with global regulatory compliances like Basel II, Sarbanes-Oxley and MiFID [Markets in Financial Instruments Directive], they are playing a whole new ball game," says Anne Ambrose, director, brokerage, trading and investment management, worldwide financial services industry, HP. "Banks need to keep massive amounts of data on hand to be able to audit financial transactions. This can only happen with enterprisewide access to market data." This remains a challenge for banks that are geographically dispersed across the globe.

Still, stresses NTP's Backa, while ILM software and hardware are still emerging, banks should not wait to pursue an ILM strategy. "Banks should be evaluating their existing data strategies and determining where their current integration holes lie," he says.

"In the end, banks need a strategy where data continuously flows and each platform can work together from beginning to end," Backa adds. "The industry is just beginning to do this."

Copyright © 2018 UBM Electronics, A UBM company. All rights reserved.