Banks' batch-driven IT architectures aren't ready to handle the real-time needs of the industry, according to Stanley Young, a partner at Capco (New York). Given the increased data demands from customers, internal users and regulators alike, many financial institutions are taking a hard look at what it would take to transform to real-time systems.
In fact, banks that stand still may face a revenue decline from their increasingly fickle customers. "Banks' profitability will be affected as transaction charges come under pressure in the future," says Young. "They're going to look for better ways of adding value to what they do for their clients."
For corporations, that boils down to real-time information. "Treasurers want to know instantly -- almost intra-day -- what their position is," says Young. "They want access to real-time data, almost transaction by transaction."
Not only that, corporate treasurers and CFOs want analysis based upon that real-time data, designed to answer important questions affecting their ultimate financial performance: "If they have a 'fail' [failed transaction] or a late invoice or late payment in one part of the world, what does that do to their liquidity position? How does that affect their foreign exchange exposure?" asks Young.
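The kind of analysis Young describes can be sketched in a few lines. This is a minimal illustration, not a treasury system: the invoice records, amounts, and FX rates below are entirely hypothetical, and a real platform would pull both from live feeds.

```python
from collections import defaultdict

# Hypothetical expected cash inflows; amounts and IDs are illustrative only.
expected_inflows = [
    {"id": "INV-001", "currency": "EUR", "amount": 2_000_000, "status": "settled"},
    {"id": "INV-002", "currency": "EUR", "amount": 1_500_000, "status": "failed"},
    {"id": "INV-003", "currency": "JPY", "amount": 300_000_000, "status": "settled"},
]

def liquidity_position(inflows):
    """Net settled cash per currency. A fail or late payment simply never
    adds to the position, shrinking available liquidity in that currency."""
    position = defaultdict(float)
    for tx in inflows:
        if tx["status"] == "settled":
            position[tx["currency"]] += tx["amount"]
    return dict(position)

def fx_exposure(position, rates=None):
    """Translate each currency position into a base currency (USD here),
    so one real-time feed answers both the liquidity and FX questions."""
    rates = rates or {"EUR": 1.10, "JPY": 0.0090, "USD": 1.0}  # assumed rates
    return {ccy: amt * rates[ccy] for ccy, amt in position.items()}

pos = liquidity_position(expected_inflows)
print(pos)            # EUR position excludes the failed 1.5M invoice
print(fx_exposure(pos))
```

Run transaction by transaction, the same two functions give the intra-day view treasurers are asking for; run once a month over a warehouse, they give only the batch report.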
It's not just customer demand driving the need for real-time systems. There are also substantial computational requirements associated with risk management, fraud detection and anti-money laundering. "When you talk to the really sharp banks about this, they use multiple methodologies," says Lawrence Ryan, director of the financial services industry practice at HP (Palo Alto, Calif.), which has partnered with Capco. "Especially when you get into areas like neural networks and pattern-matching, those things are hugely processor-intensive."
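Even the simplest of those multiple methodologies has to run per transaction rather than per batch. Below is a hedged sketch of one elementary real-time screen, a velocity check over a sliding time window; the class name, thresholds, and account IDs are invented for illustration, and the neural-network and pattern-matching models Ryan mentions would run alongside rules like this at far greater computational cost.

```python
from collections import deque

class VelocityCheck:
    """Minimal real-time screen: flag an account that makes more than
    `max_tx` transactions inside a sliding `window_s`-second window.
    Real fraud/AML systems layer many such rules plus statistical models."""
    def __init__(self, max_tx=3, window_s=60):
        self.max_tx = max_tx
        self.window_s = window_s
        self.history = {}  # account -> deque of recent timestamps

    def check(self, account, ts):
        q = self.history.setdefault(account, deque())
        # Drop timestamps that have aged out of the window.
        while q and ts - q[0] > self.window_s:
            q.popleft()
        q.append(ts)
        return len(q) > self.max_tx  # True -> flag for review

vc = VelocityCheck(max_tx=3, window_s=60)
flags = [vc.check("acct-42", t) for t in (0, 10, 20, 30, 300)]
print(flags)  # the fourth rapid transaction is flagged; the later one is not
```

The point of the sketch is latency: a check like this is cheap per event but only useful if it sees the transaction the moment it arrives, which an overnight batch cannot provide.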
But despite the customer demand and risk management needs, most financial institutions aren't ready to handle real-time information. That's due to a dependence on mainframe systems and overnight batch processes. "The banking model is very much, at the moment, a batch process," says Young. "They're running reports on a monthly basis."
Adds Young, "That model has got to change."
However, transforming the operational model from batch-processing to real-time systems requires a substantial shift in IT architecture. It's not that the banks can't receive transaction information quickly enough through the Internet, SWIFTNet, or other inter-bank networks, but rather that they cannot instantly act upon such real-time information. "The pipes are there now -- it's not a network bandwidth issue," says Young.
Instead, the bottleneck has shifted to the processing side. "What happens when that transaction gets within the four walls of a bank?" asks Young. "It goes into a humongous database and then sits there."
The common response? Build another humongous database. "Some of the banks are creating data warehouses so that they can manipulate and manage that data, but it's still not real-time," says Young.
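The architectural difference Young is pointing at can be reduced to a toy contrast. This is a sketch under obvious simplifications (in-memory lists standing in for the "humongous database", a single running total standing in for business intelligence); the function and class names are invented.

```python
# Batch style: the data sits in a store, and a scheduled job
# recomputes the report from scratch -- accurate, but stale between runs.
def batch_report(transactions):
    total = 0.0
    for tx in transactions:
        total += tx["amount"]
    return total

# Event-driven style: a running aggregate is updated as each
# transaction arrives, so the current figure is always queryable.
class RunningPosition:
    def __init__(self):
        self.total = 0.0

    def on_transaction(self, tx):
        self.total += tx["amount"]
        return self.total

store = []
rp = RunningPosition()
for amt in (100.0, -40.0, 25.0):
    tx = {"amount": amt}
    store.append(tx)       # the warehouse still gets its copy
    rp.on_transaction(tx)  # but the position is current immediately

assert batch_report(store) == rp.total  # same answer, different latency
print(rp.total)  # 85.0
```

Both styles produce the same number; the difference the banks are wrestling with is when that number is available and what it costs to keep it current.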
"Given the pressure that banks are under in terms of transaction costs, the ability to aggregate data, create business intelligence out of that data and get it back out to clients is going to be extremely important in how they compete in the future," says Young. "With that in mind, they've got to have the processing power to do those kinds of things."