Barings, Daiwa, Sumitomo: a trilogy of operational failure that still haunts many bankers as their worst nightmare realized. But as financial institutions continue to invest in increasingly complex technologies and ever-new delivery channels, they open a whole new spectrum of operational risk, a loose-limbed concept that covers potential losses from business interruption, technological failure, natural disasters, errors, lawsuits, trade fixing, faulty compliance, fraud and damage to reputation, often the intangible fallout from these events.
Hard to define and measure, operational risk has traditionally occupied a netherworld below market and credit risk. But headline-grabbing financial fiascoes, decentralized control, the surge in e-commerce and the emergence of new products and business lines have raised its profile.
"Things changed a lot slower 10 years ago, and it was easier for us to stay on top of technology," said Brian Nappi, vice president and manager of audit risk at Summit Bancorp. Previously, Nappi said, his group would show up at the various business units once a year, spend a month or two on the audit, hand in the report and return 12 months later. "Now we have to meet with them monthly, just to keep abreast of all the changes."
The Internet has turned some classically mundane, low-risk activities, such as lending, into high risk. "It creates a whole new risk profile," said Bill Langley, executive vice president and general auditor at Wachovia Corp. "All of a sudden, your customer is not in front of you but at the end of a terminal somewhere. As auditors, we have to recognize that and react to it, which might mean spending time on something that traditionally has not represented all that much risk. We need to be sure management has responded by putting in other forms of mitigating that risk, such as some type of call back or confirmation to the customer to make sure you know who you're dealing with."
And the new applications and greater interconnectivity increase the possibility of privacy and security breaches, and fraud.
"There's all kinds of risks a bank is exposed to when you look into these new channels, but the risks aren't necessarily new. The Internet brings a more public connectivity than we've had in the past," said Peter Murphy, senior vice president of information protection at Bank of America. "When you increase connectivity, you automatically increase your risk."
Software failure can destroy entire portions of a network, and large-scale network outages bring huge losses. The increasing use of technology in back-office support systems makes banks more vulnerable. Distributed denial-of-service attacks, such as those from the hacker community that brought down 11 sites, including Yahoo, eBay and E*Trade, a year ago, pose enormous threats.
"As these distributed denial of service tools have become more point and click, it takes less and less technical sophistication to aim one of these guns at a company and turn it loose," Murphy said. "We've seen tools as unsophisticated as 'Download this Web page and change the setting in your browser and access this company's Web page.' The browser just repeatedly refreshes itself every second or two."
The interconnectivity of business-to-business electronic marketplaces also threatens business continuity. "One party relies on another, who relies on another and the domino effect becomes staggering," said Dave Singer, vice president of technology, Internet and e-business risk at Royal Bank of Canada, giving as an example Royal Bank's participation with IBM and the Fluor Corp. in TradeMC, a global electronic procurement market. "Here we have three organizations creating a marketplace. What happens if a smaller company puts an order in that needs a letter of credit that Royal Bank is supposed to approve, and the server goes down? You can't afford a failure. We could put somebody out of business."
GAUGING THE RISK
Keeping pace with the technological change inside the bank has meant new challenges for auditors. For one, it necessitates more reviews of the technological controls. "There are so many different techniques in e-commerce. The technological component requires more attention and a greater degree of sophistication by the reviewer," said Dan Phelps, senior vice president and general auditor at Chicago-based Northern Trust. "The audit group needs to work with risk management and the product development people to assess the control environment."

Maintaining reconcilements is another operational risk that comes with automation, said Langley. "Most banks will do a great job of automating the front office but the back-office, nonsexy stuff...no one wants to pay a whole lot of money or spend a lot of time on that. As a consequence, you may have this Cadillac front end but a Studebaker on the back end. Eventually, the volume catches up and you've got a train wreck."
Banks had wrestled with operational risk long before the Web came down the pike, it's just the nature of it that's changed. "As you layer new forms of technology, you don't give up anything of the old world," said Jean E. Davis, senior executive vice president of operations at Winston-Salem, N.C.-based Wachovia. "You're still handling a lot of paper items, doing a lot of things through your branch network, but you've multiplied the ways your customer comes to you."
The financial services industry sank $18 million into operational risk technologies in 1999, up from $11 million the year before, according to Meridien Research. Most of the spending occurred in North America and the United Kingdom, although banks in France and Germany have begun to catch up.
Until recently, banks trailed insurance companies and securities houses in establishing operational risk initiatives, according to Meridien. In 1997, Meridien estimated that of the leading 500 financial institutions, only 11% of banks had operational risk measures in place, compared with 16% of insurance companies and 12% of securities firms. By 1999, Meridien found that the percentage of banks tackling operational risk at some level exceeded 70%, surpassing both insurance and securities firms. Meridien predicted that at least 80% to 90% of the top 500 banks would begin initiatives within the next two or three years.
"Interest has skyrocketed in the last 12 months," said Tara McLenaghen, vice president and head of operational risk at Royal Bank. "It's the increasing use of technology, in back-office systems, support systems, information systems, in collecting information on our clients and, of course, the Internet era, that all feeds into it."
A push to comply with the privacy provisions of the Gramm-Leach-Bliley Act has also focused attention on operational risk, as has the move toward mergers and consolidation, which heightens the potential for systems slips.
"When you're integrating two banks' operating systems into one, you've got huge opportunities to have accounts not balance, not reconcile," said Wachovia's Davis.
Another major motivator came in mid-1999, when the Bank for International Settlements' Basel Committee announced that it intended to levy capital charges against banks for operational risk, just as it does for market and credit risk. "There was a fear that regulators would start assigning capital to operational risk, and banks didn't have a good understanding of it," said Deborah Williams, research director at Meridien. The Basel Committee will publish new regulatory capital requirements for operational risk sometime this year.
Many banks use a percentage of their noninterest-related expenses, typically 30%, to come up with a capital charge. "That's just a number thrown out," said Wachovia's Langley. "It might be right for a lot of institutions, but you'll never know unless you take the time and do the homework." And that's the tough part.
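The rule of thumb Langley describes is simple arithmetic. A minimal sketch, using the 30% factor cited above and a hypothetical expense figure (the dollar amount is invented for illustration, not drawn from any bank in the article):

```python
# Rough capital charge as a flat percentage of noninterest-related
# expenses -- the "30%" rule of thumb. All figures are hypothetical.
OPERATIONAL_RISK_FACTOR = 0.30

def operational_capital_charge(noninterest_expenses: float,
                               factor: float = OPERATIONAL_RISK_FACTOR) -> float:
    """Capital set aside for operational risk under the percentage rule."""
    return factor * noninterest_expenses

# A bank with $2.4 billion in annual noninterest expenses would hold
# roughly $720 million against operational risk under this rule.
charge = operational_capital_charge(2_400_000_000)
print(f"${charge:,.0f}")
```

As Langley's comment suggests, the factor itself is the weak point: nothing ties 30% to any particular institution's actual loss experience.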
Its catchall character and dearth of historical data have stymied efforts to quantify operational risk. Market and credit risk draw on pools of observable market and peer data that serve as a baseline for benchmark comparisons, but operational risk lacks such fundamental underpinnings. "There's no golden code out there for measuring the risk, and everybody's kind of taking a shot at it and coming up with something that works for their institution. It's all art at the moment," said Langley.
RISK BY THE NUMBERS
Even if it can't be pinned down, operational risk has still had an impact, accounting for almost 30% of all derivative losses, or about $5 billion, in 1999, according to Capital Markets Risk Advisors. Not only has it triggered some of the financial industry's most colossal catastrophes, like Barings and NatWest, but it has also touched off a number of smaller day-to-day losses, such as penalty interest charges from failed settlements, which add up.
"The amounts can be quite substantial and quite often have a significant impact on the profitability of the business," said Andrew Gray, a partner and operational risk expert at PricewaterhouseCoopers in London. "We've done some studies, and, cumulatively, some of the large, global U.S. investment banks have lost up to $300 million a year this way."
For these daily, attritional losses, banks can usually gather data and determine their frequency and severity, provided they can detect them, said Meridien's Williams. "The difficulty comes when you know there's a possibility of a loss, but it has never occurred or you don't have enough data points to estimate what the severity or frequency might be."
Trying to quantify the extreme "rogue trader" type of losses or determine their probability, said Williams, is somewhat akin to estimating the damages and likelihood of a catastrophic earthquake hitting Manhattan, except in the case of the earthquake, banks could assess the risk from insurance actuarial tables. "But they've not invented an insurance company that would normally underwrite a Nick Leeson type of thing," she said, referring to the 28-year-old trader who brought down 233-year-old Barings.
To overcome the data shortage, financial institutions have formed consortia, such as the Multinational Operational Risk Exchange (MORE), to share nonpublic operational loss data and risk characteristics anonymously. Sponsored by NetRisk, a risk and performance management company, the Global Association of Risk Professionals, and the Risk Management Association, MORE compiles the risk and loss data in NetRisk's RiskOps database. The data will be used mainly for statistical modeling and benchmarking by the approximately 17 member financial institutions. Participating banks, which have already submitted third quarter 2000 data, include FleetBoston, Canadian Imperial Bank of Commerce (which also has licensed the RiskOps software), Toronto-Dominion Bank and Royal Bank of Canada.
"We hope the data will give us a sense of what the average experience is," said Royal Bank's McLenaghen. "Even if something hasn't happened to us, we're hoping to learn from what happens to other institutions."
A few vendors have come out with technologies that help companies quantify or assess their risk, but none yet work as a standalone application, according to Meridien. J.P. Morgan & Co., for example, developed Horizon. Used internally throughout Morgan for about a year and a half and distributed by Ernst & Young, Horizon replaced paper-based risk self-assessments with automated controls and procedures monitored from one location. Designed to help management pinpoint vulnerabilities at every level, its Web-based technology allows for standardized reporting formats.
Morgan has added an e-mail component, "so that when action plans are coming due, people get notified," said Craig Spielmann, vice president and global controls officer at Manhattan-based Morgan, as well as an action plan update screen. A just-released version includes an audit module and allows for greater self-assessment of the degree of risk, not just control quality.
The J.P. Morgan-Chase merger gave Horizon a boost, as one of Chase's businesses, which Spielmann declined to name, will use it. Spielmann said Horizon also has agreements with a European bank, an offshore bank, an insurance company, a major brokerage and an energy company.
Operational Risk Inc.-which, like NetRisk, grew out of Bankers Trust-focused exclusively on operational risk software but ceased operations late last summer.
Algorithmics markets its Algo WatchDog as a near full-service operational risk system. Geared specifically to financial institutions, its system employs models to predict losses, calculate capital and identify vulnerabilities. It also simulates risks and loss scenarios, making use of internal and external data.
A standalone product in its own right, it will soon become part of Algorithmics' Algo suite of market and credit risk products. Company spokesman Dave Paolini concedes that the market for operational risk products "has not expanded as fast as we hoped," but Algorithmics has signed up six major bank clients for Algo WatchDog, including Paribas, he said. The Big 5 accounting firms have proposed solutions too.
WHO GETS BLAMED?
The increasing technological complexity in banking poses other threats that may not have the initial punch of a virus attack or a credit-card-number-theft-and-blackmailing scheme but could still cause damage.
With banks offering customers an ever-greater number of services, they're relying more on outside providers to deliver them.
"You suddenly expose yourself not just to the risk that you as a company can manage your people, processes and technology, but you become dependent on your service providers as well," said Wachovia's Davis.
To contain this risk, Wachovia's audit group created a database that maintains a complete file on every outside company with which the bank has a relationship, a contract or any type of partnership. "A few years ago, that was not nearly as required," said Davis.
To offer its customers online transactional account aggregation, Royal Bank partnered with, and took a stake in, CashEdge, raising its reputational risk a considerable notch, said Royal Bank's Singer. "If anything goes wrong with privacy or security, who's going to get blamed? It's not CashEdge, the small company out in California. It's Royal Bank."
Liability questions haven't been completely resolved, said Sai Huda, president and CEO of ComplianceCoach.com, a regulatory compliance resource and training company in Washington, D.C., and San Diego. "The regulators are still struggling with that and haven't issued any guidance. Even the Gramm-Leach-Bliley Act doesn't quite address that directly. That's an issue that's evolving."
The Office of the Comptroller of the Currency emphasizes the need for proper due diligence. "As there are more dependencies, it's imperative that the bank and the service providers define the responsibilities and take strategies to ensure these scenarios don't happen," said Clifford A. Wilkie, director of the OCC's bank technology division.
The Internet also raises questions about legal jurisdiction when business transactions cross borders. "How do the regulators assess a server that might be in Canada conducting business that might be in the U.S.? There really hasn't been a clear statement from them, other than they know this is an issue and they want to look at it," said Singer.
France, for example, has for a long time claimed jurisdiction over matters that involved its nationals, but other countries have not drawn up specific legislation about how they're going to treat jurisdiction, said Matthew Norris, a technology underwriter at Lloyd's Hiscox Syndicate in London. The uncertainty, he said, increases a bank's risk. "And the longer you sit in court, the more you increase your financial exposure."
A MODEL APPROACH TO RISK
Royal Bank of Canada has developed a methodology that will allow it to quantify operational risk once it has the data, "which is really the next problem," said Tara McLenaghen, head of operational risk at Toronto-based Royal Bank. "That requires us to collect loss data across the entire organization in a systematic and consistent way, and we'll need two or three years of consistent data before we can model with any kind of confidence. But we know how to do it now, which was the first big step."
Royal Bank has based its system on actuarial models, similar to what an insurance company would use, and will categorize loss data by business line and loss type, noting, for example, whether it was a write-down, fraud, client reimbursement, regulatory fine or legal cost.
Ideally, if the bank does that for each line of business, it should begin to see an underlying pattern. "We've got the models all figured out, and we have a loss database set up," said McLenaghen.
After some basic actuarial calculations, the bank should be able to come up with a distribution for each type of loss in each line of business, "and we'll hold enough capital to make sure we're covered with 99.5% confidence. We're only now trying to figure out how to add up what our operational losses are in any given category. We've never really understood where our biggest losses are. What we're trying to do is give each business unit some tools to help them rate their operational risk. We'll start adding up how much we're losing to different types of risk and feed that information back to the businesses."
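The frequency/severity approach McLenaghen outlines can be sketched as a small Monte Carlo simulation. Everything below is illustrative: the Poisson event frequency, the lognormal severity parameters and the loss category are invented, not Royal Bank's actual figures; only the 99.5% confidence level comes from the bank's description.

```python
import math
import random

random.seed(7)  # reproducible illustration

def draw_poisson(lam: float) -> int:
    """Number of loss events in a year (Knuth's algorithm); lam > 0."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def simulate_annual_loss(freq_mean: float, sev_mu: float, sev_sigma: float) -> float:
    """One simulated year: Poisson-many events, each with a lognormal severity."""
    events = draw_poisson(freq_mean)
    return sum(random.lognormvariate(sev_mu, sev_sigma) for _ in range(events))

def capital_at_confidence(annual_losses: list, confidence: float = 0.995) -> float:
    """Empirical quantile of the simulated annual-loss distribution."""
    ordered = sorted(annual_losses)
    idx = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[idx]

# Hypothetical category: "client reimbursement" losses, ~12 events a year,
# median severity e^11 (about $60,000), with a heavy right tail.
years = [simulate_annual_loss(12, 11.0, 1.5) for _ in range(10_000)]
print(f"99.5% capital estimate: ${capital_at_confidence(years):,.0f}")
```

In practice, each business line and loss type (write-down, fraud, client reimbursement, regulatory fine, legal cost) would get its own fitted frequency and severity distributions, and the per-category figures would be aggregated; the empirical quantile here stands in for the "basic actuarial calculations" the bank describes.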