Expanding data volumes and regulatory changes are putting ever more pressure on banks' core processing systems. Yet banks must continue to ensure that their platforms can handle their original tasks: serving customers well and delivering quality products quickly.
According to Boston-based Aite Group, to combat high maintenance costs, speed product launches, improve compatibility with third-party applications and increase access to data for compliance reporting, U.S. banks and major credit unions will spend $4.2 billion over the next three years on new core banking systems. Citing the cost of replacement and banks' fear of disruption, however, Aite stresses that only 4 percent of financial institutions will deploy a new system this year.
Similarly, many banks historically have chosen to bolt point solutions onto existing legacy systems rather than rip out and replace aging platforms. But now, they're finding themselves tangled in a new dilemma. These nested solutions are creating information silos, making it difficult to comply with reporting rules, for example. As a result, banks are seeking tighter integration with new open architectures, enabling them to extend the life of existing core systems while keeping operating costs low, upholding customer service and complying with regulations.
Every bank relies on its core processing platform to process day-to-day customer transactions, from checking deposits to mortgage payments. However, the definition of -- and the resulting demands on -- a core processing system are changing. "Many banks are adding their new customer systems under the umbrella of core processing systems," explains Bob Hunt, research director, retail banking practice, TowerGroup (Needham, Mass.). "More than a customer relationship management solution, customer information systems combine CRM data as well as all nonfinancial related data into a central information file ... that reveals [a customer's] relationship with the bank."
This expanded functionality is the result of decades of technology enhancements. The banking industry has made significant investments over the past 25 years in new delivery channels, such as ATMs, call centers and the Internet, and banks have had to create new functionality around existing systems. Some banks even created separate platforms to support the information delivery of these channels. "The fact is that consumers want to be able to bank at any time of day, and systems need to be available to accommodate that," says Mark Forbis, VP and CIO for Monett, Mo.-based Jack Henry & Associates. "For banks to ensure services like this, multichannel integration and integration of related processing systems are critical."
And as consumers rely on these new delivery channels more and more, the levels of data moving through the pipeline also are increasing. "Data volumes in the financial sector continue to grow by more than 50 percent every 18 to 24 months," says Michael Mullaley, director of enterprise networks for networking solutions provider Ciena (Linthicum, Md.).
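Mullaley's growth rate compounds quickly. A back-of-the-envelope calculation (illustrative only, using the article's "50 percent every 18 to 24 months" figure) shows what that means over a typical three-year planning horizon:

```python
# Compound data-volume growth implied by "50 percent every 18 to 24 months."
# The figures are illustrative, applying the growth rate quoted in the article.

def growth_multiple(rate: float, period_months: float, horizon_months: float) -> float:
    """Total growth multiple over a horizon, compounding once per period."""
    return (1 + rate) ** (horizon_months / period_months)

# Over a three-year (36-month) horizon:
fast = growth_multiple(0.50, 18, 36)   # 1.5 ** 2 = 2.25x
slow = growth_multiple(0.50, 24, 36)   # 1.5 ** 1.5, roughly 1.84x
print(f"Volumes in 3 years: {slow:.2f}x to {fast:.2f}x today's levels")
```

In other words, a platform sized for today's transmissions must handle roughly double the volume by the time a three-year replacement project is complete.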
Charlotte, N.C.-based Wachovia ($23.69 billion in assets), for example, currently handles 350,000 external and 850,000 internal data transmissions each month. And these levels are boosted further by federal regulations. As new compliance regulations come down the pike, banks are forced to integrate point solutions that can replicate operations and produce reporting across different operating systems.
Consider the Check 21 legislation. Banks welcomed the idea of electronic checks that reduce paper, operating costs and clearing times. "Within the next two or three years the process will eliminate traditional back-office check preparation, and the move to an electronic process will foster more real-time data access," explains TowerGroup's Hunt. Yet it takes time, capital and labor to reengineer processes to take advantage of it.
"The industry was glad Check 21 came about," says Roger G. Leblond, EVP, Lone Star National Bank (Pharr, Texas; $1.4 billion in assets). "However, there was no infrastructure in place to support it." Leblond likens the event to the industry's move to "envelope-less" ATM deposits. "ATM manufacturers developed the capability to support this functionality, and that was great," he explains. "Yet, if this functionality is not integrated into our core product, then all the vendor's work was for nothing."
Of course, acquisitions further exacerbate integration challenges. Besides dealing with their own aging systems, banks often take on new legacy platforms that do not communicate well with existing systems and channel applications. Further, multiple versions of departmental solutions often are maintained in many places throughout the organization, according to Paul Danola, president of account processing service provider Metavante Financial Solutions Group (Milwaukee). And this adds new burdens on core banking operations. "Systems that cannot support immediate updates of account information and/or require long outages for batch processing" take a toll on customer service and internal productivity, says Danola.
Some banks have chosen to bite the bullet and update systems. However, with a price tag of approximately "$100 million to $200 million per bank, this is not something that banks enter into lightly," says TowerGroup's Hunt. "Whether they choose to replace or modernize systems, these upgrades can impact customer service and affect the bottom line." As a result, many banks have created homegrown interfaces to "get a few more years out of the system," he says.
But this leads to other challenges. Banks are burdened by layers of customization combined with the accelerating retirement of the workforce that wrote the code, explains Eric Livingston, Charlotte-based senior executive for Accenture's North America banking practice. Meanwhile, each new industry regulation places new weight on all of those previous customizations, driving up delivery time and cost, he adds.
While they still are hesitant to rip and replace existing systems, banks also realize that traditional point solution strategies are not ideal. Thus, companies are exploring new ways to reengineer processes and expand solutions across the organization. "They are clearly trying to remove point-to-point architecture and replace that with a platform that will leverage all of a company's content assets and delivery channels to improve service levels," says Frank Sanchez, president of enterprise solutions for financial technology processing systems provider Fidelity National Information Services (Jacksonville, Fla.).
The ideal solutions will help companies break down silos, reuse data and business logic, and, often, deliver one view of the customer. "As a result, banks are opting for more-open, scalable architecture that allows them to evolve wherever their business leads them," says Rudy Wolfs, CIO, ING Direct (Wilmington, Del.), a division of ING Group N.V. (Amsterdam; US$91.14 billion in assets). One of the secrets of ING's success, according to Wolfs, is to avoid silos by tightly integrating core processes in an open, component-based architecture. "We don't have the ability to physically touch the customer's hand, so based on our business model, we need to have a unique approach when delivering our telephone and Web experiences," Wolfs says.
By adding the Profile core banking platform from Fidelity National Information Services, ING is able to leverage "a lot of built-in functionality, including replication, journaling and security," Wolfs relates. "It also delivers the speed needed for a fast, real-time customer experience." Profile also integrates ING's Web and call center channels, ensuring that all transactions are updated in real time regardless of how a customer conducts business with ING. "Channel logs record transactions and update the results of transactions within seconds," Wolfs adds. "Customers can see the results of their transactions instantly."
Salem, Ore.-based Pioneer Bank & Trust ($225 million in assets) has a similar story. Facing the Fed's image exchange mandate and other regulatory hurdles, Pioneer knew its disparate systems would not support compliance. The bank decided its best bet was to find an open platform that could work with systems from a variety of vendors, according to Mary Ann Vasend, Pioneer's COO.
By using online banking, core processing, financial management, MoneyVest reserves management and document management systems all from Brookfield, Wis.-based Fiserv's Information Technology Inc. (ITI) division, "everything is on one system," says Vasend. "You no longer need to go back and forth between different systems to get all the information you need."
Since adding ITI's integrated solutions in May 2004, Pioneer has decreased costs, experienced smoother transactions at teller windows and increased processing efficiency. "There have also been fewer system outages," says Joyce Boettger, assistant vice president, operations administrator, Pioneer. "In the event of an outage, there are also easier resolutions because we are only dealing with one vendor."
Tight integration also was a priority for Lone Star National Bank. The company transitioned to a core system from Jack Henry & Associates three years ago. "The company's ancillary products are already tightly integrated to the core system," says the bank's Leblond, who adds that the system supports the bank's item processing, check processing, Internet banking, ATMs, voice response and telephone center.
Components of Integration
To achieve tighter integration, many banks are implementing component-based architecture. In this model, the bank's core system remains in place while integrated components are wrapped around it. "The architecture allows for more shared resources," says ING's Wolfs. "Simultaneously, the core still runs independently, so there is limited impact on the back end."
ING's component-based shop comes in handy during the bank's regression and maintenance testing. "Given that all the pieces are centralized, we need to make sure our test results do not impact other systems. That is the No. 1 issue we run into today," Wolfs says. "But using component-based architecture allows for more-isolated, discrete sets of functionality that can be updated with no impact on other areas."
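The isolation Wolfs describes can be sketched in code: each channel component talks to the core only through a narrow interface, so any one component can be replaced or tested without touching the others. This is an illustrative sketch only, not ING's actual architecture; all class and method names are hypothetical.

```python
# Illustrative sketch of component-based architecture: channel components
# depend only on a narrow core interface, so each can be updated or tested
# in isolation. All names here are hypothetical, not any vendor's real API.
from abc import ABC, abstractmethod

class CoreBanking(ABC):
    """The narrow interface that channel components are allowed to call."""
    @abstractmethod
    def post_transaction(self, account: str, amount: int) -> int:
        """Post a transaction; return the new balance (in cents)."""

class InMemoryCore(CoreBanking):
    """Stand-in for the real core; a test double implements the same contract."""
    def __init__(self):
        self.balances = {}
    def post_transaction(self, account, amount):
        self.balances[account] = self.balances.get(account, 0) + amount
        return self.balances[account]

class WebChannel:
    """A channel component wrapped around the core, not built into it."""
    def __init__(self, core: CoreBanking):
        self.core = core
    def deposit(self, account, cents):
        return self.core.post_transaction(account, cents)

class PhoneChannel:
    def __init__(self, core: CoreBanking):
        self.core = core
    def deposit(self, account, cents):
        return self.core.post_transaction(account, cents)

# Both channels share one core, but swapping or regression-testing one
# channel never requires changes to the other or to the core itself.
core = InMemoryCore()
web, phone = WebChannel(core), PhoneChannel(core)
web.deposit("acct-1", 10_000)
print(phone.deposit("acct-1", 5_000))  # both channels see the same state
```

Because each component depends only on the `CoreBanking` contract, a regression test can substitute a test double for the core, which is the "isolated, discrete sets of functionality" benefit Wolfs cites.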
The most popular component-based architecture, and the hottest buzzword in the industry, is SOA, or service-oriented architecture. The strategy is a services layer, or interface, that enables companies to leverage data and transaction capabilities of existing systems across an enterprise through reusable services. As companies wrap SOA interfaces around legacy systems, they often use Web-based front ends to access data on existing back ends, helping to modernize legacy systems and automate manual processes.
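The wrapping pattern described above can be sketched as a thin service layer over a legacy routine: the legacy code is untouched, while the service exposes its data in a reusable, channel-neutral form that any front end can consume. This is a minimal illustration; the fixed-width record format and all names are invented for the example.

```python
# Minimal sketch of an SOA-style service layer: a reusable service wraps a
# legacy routine so Web and call-center front ends share one interface.
# The fixed-width record layout and every name here are invented examples.
import json

def legacy_balance_lookup(account: str) -> str:
    """Stand-in for a legacy core routine that returns a fixed-width record."""
    records = {"0001": "0001JONES     0001523 75"}
    return records[account]

def balance_service(account: str) -> str:
    """Service layer: parses the legacy record into a channel-neutral reply."""
    rec = legacy_balance_lookup(account)
    return json.dumps({
        "account": rec[0:4],
        "name": rec[4:14].strip(),
        "balance": int(rec[14:21]) + int(rec[22:24]) / 100,  # dollars + cents
    })

# Any front end -- Web, voice response, teller -- consumes the same service:
print(balance_service("0001"))
```

The point of the pattern is that the legacy routine never changes; only the wrapper knows its record format, so every channel reuses one parsing of the core's data.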
"Banks have a need and desire to get all content within core systems and deliver information across the distribution channels they have," says Fidelity National's Sanchez. "SOA can accomplish this."
Open, Web-based SOA interfaces are speeding up implementation times and lowering the barrier to entry for developing robust, well-layered applications, "resulting in the promise of greater cross-system interoperability than was generally seen in the past," says Metavante's Danola.
"Often, it takes less than $1 million to get a full service layer of applications," says Shane Tulloch, CEO of SOA applications provider SEEC (Pittsburgh). "It is a fast, flexible way to connect business components and create automation."
However, experts warn that SOA may not be a silver bullet. "While the promise of SOA is very real, some companies tend to get carried away and try to apply it across systems -- even in areas it doesn't need to be," says Daniel Chait, managing director of technology consulting firm Lab49 (New York). "Clearly, SOA has the potential to reuse existing solutions, provide additional functionality and enhance integration across a conglomerate enterprise," he adds. "However, SOA is not the hammer for every nail."
As the industry evaluates its options, the key is to adopt solutions that provide more flexibility while upholding security and data integrity. But regardless of the configuration, the message is clear: Banks will need to trade in aging, inflexible systems in order to meet customer expectations and federal regulations in a global marketplace. "We are at a point where U.S. core banking systems will need to make a quantum leap to adapt to the emerging market forces of rising customer expectations, globalization and the electronification and acceleration of payments," Metavante's Danola adds. "The core banking systems that successfully adapt to these changes will likely survive as the core banking platforms of the next several decades."