Bank Systems & Technology is part of the Informa Tech Division of Informa PLC




Bank Austria Revisits Risk Models Post-Crisis

The Austrian bank has already developed a universal market data engine for sending accurate market data to its risk models; now it is focusing on credit basis risk and new rules from the BIS and IASB.

The global credit crisis has caused banks all over the world to reconsider the way they measure and monitor risk and Bank Austria is no exception. "With hindsight one can say that the credit crunch has changed the way we manage risk, but to go even further, I'd say it redefined the kind of risk you want to manage," says Peter Schweighofer, head of section markets at the bank, who spoke to Bank Systems & Technology in an interview last week. "Let me give you an example: the crisis led to an increase in importance of risk types that have been considered negligible previously. We are now monitoring credit basis risk across currencies, and across different countries' tax rules, which is of equal or even greater importance than the interest rate risk measured in terms of basis point value."
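Basis point value, the traditional interest-rate-risk yardstick Schweighofer contrasts with credit basis risk, is the change in a position's value for a one-basis-point yield shift. A minimal, illustrative Python sketch (the bond and its parameters are invented for the example, not taken from the bank):

```python
# Illustrative only: BPV of a plain annual-coupon bond, computed by
# repricing after bumping the yield by one basis point (0.0001).

def bond_price(face, coupon_rate, yield_rate, years):
    """Price a bond with annual coupons by discounting each cash flow."""
    coupon = face * coupon_rate
    return sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1)) \
        + face / (1 + yield_rate) ** years

def basis_point_value(face, coupon_rate, yield_rate, years):
    """BPV: price change for a 1bp upward shift in yield."""
    base = bond_price(face, coupon_rate, yield_rate, years)
    bumped = bond_price(face, coupon_rate, yield_rate + 0.0001, years)
    return base - bumped  # positive: price falls as yields rise

print(basis_point_value(face=100.0, coupon_rate=0.05, yield_rate=0.04, years=10))
```

A portfolio's BPV is just the sum of these per-position numbers, which is what makes it easy to monitor but, as Schweighofer notes, blind to basis risk between related curves.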

Quants in Bank Austria's risk management department wrote and verified the new models using Matlab from The MathWorks. "The process of identifying the problem, developing the model, testing the model, and incorporating it into IT, is a very short, slim and fast process," says Schweighofer. "You're not having to draw up huge documents explaining the model and describing the problems for IT, which then has to develop something and have it reviewed by the business; that's a long process."

Bank Austria also uses Matlab as a data calculation engine for computing the near-time and end-of-day derived market data required for market risk and performance management. The bank has a central market data repository where it collects external and internal data related to market movements, such as stock prices, interest rates, FX rates, and issuer ratings. Sources include Thomson Reuters, Bloomberg, Markit, and SuperDerivatives, as well as internal sources such as the bank's trading floor. To cleanse the data, the bank has been working on a sophisticated routine that performs pattern matching on this market data to identify misalignments across asset classes and instruments, enhancing data quality. The routine automatically corrects abnormalities and fills missing data by constructing new data points. Schweighofer says that by using Matlab, the bank was able to reduce its development time by 50 percent, improve risk management, and reduce operational, audit and maintenance costs.
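The bank's actual cleansing routine is in Matlab and not public, but the two steps described, repairing missing points and flagging abnormal ones, can be sketched in Python (the threshold and the median-deviation test are assumptions for illustration, not the bank's method):

```python
# Hedged sketch of a market-data scrubbing step: fill interior gaps by
# linear interpolation, then flag daily returns that sit far outside the
# series' typical move (a median-absolute-deviation outlier test).
import statistics

def clean_series(prices, threshold=4.0):
    """Return (filled prices, per-return outlier flags)."""
    filled = list(prices)
    for i, p in enumerate(filled):
        if p is None:  # assumes gaps are interior, not at the series ends
            lo = max(j for j in range(i) if filled[j] is not None)
            hi = min(j for j in range(i + 1, len(filled)) if filled[j] is not None)
            w = (i - lo) / (hi - lo)
            filled[i] = filled[lo] * (1 - w) + filled[hi] * w
    returns = [filled[i] / filled[i - 1] - 1 for i in range(1, len(filled))]
    med = statistics.median(returns)
    mad = statistics.median(abs(r - med) for r in returns)
    flags = [abs(r - med) > threshold * mad for r in returns]
    return filled, flags

# A spike at day 5 gets flagged on the way in and the way out:
filled, flags = clean_series([100, 101, 100, 101, 100, 150, 100, 101, 100, 101])
```

A production routine would also cross-check against related instruments (the "misalignment across asset classes" the article mentions) rather than each series in isolation.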

To calculate Value at Risk and perform stress tests, Schweighofer's team uses an internally developed model programmed in C++.
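The bank's VaR engine is proprietary C++ and its internals are not described here; as a stand-in, one common approach is historical simulation, revaluing the position under each observed one-day return and reading off a loss quantile:

```python
# Generic historical-simulation VaR (not Bank Austria's model): sort the
# simulated P&L outcomes and take the loss at the tail quantile.

def historical_var(position_value, daily_returns, confidence=0.99):
    """One-day VaR: the loss not exceeded with the given confidence."""
    pnl = sorted(position_value * r for r in daily_returns)  # worst first
    cutoff = int((1 - confidence) * len(pnl))  # index of the tail quantile
    return -pnl[cutoff]  # report the loss as a positive number

# Toy history of 100 daily returns for a 1m position:
returns = [-0.05, -0.02, -0.01, 0.0, 0.0, 0.01, 0.01, 0.02, 0.02, 0.03] * 10
print(historical_var(1_000_000, returns, confidence=0.99))  # 50000.0
```

Stress testing replaces the historical return set with deliberately severe hypothetical scenarios and reuses the same revaluation machinery.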

The latest areas of risk focus for the bank have been influenced by new rules from the IASB Expert Advisory Panel on measuring and disclosing the fair value of financial instruments in markets that are no longer active, which stipulate the calculation and usage of fair value adjustments for credit risk and liquidity risk, and by Basel framework amendments that provide guidelines for computing capital for incremental risk in the trading book.

"These documents have two things in common: they try to bridge the gaps of previous frameworks, and in order to meet their requirements, you need to provide sound and reliable market data," Schweighofer says. Here again, the market data engine helps ensure the soundness and reliability of the market data.

One potential change in the works for Bank Austria's use of MathWorks software is to run it in parallel computing mode. "The calculations we run with Matlab are fast, yet for some tasks (e.g. the end of day confirmation/scrubbing/verification of market data) we have a very short time span (15-20 minutes)," Schweighofer says. "Here, every minute is a minute won for the human user. With that perspective, we are not thinking of using a cluster but rather leveraging the multicore architecture of the application server."
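In Matlab the multicore route Schweighofer describes would typically mean parfor from the Parallel Computing Toolbox; the same pattern, independent per-series work fanned out across cores, looks like this in Python (the scrub function is a placeholder invented for the example):

```python
# Sketch of fanning independent market-data series out across CPU cores,
# analogous to Matlab's parfor. The per-series work here is a trivial
# placeholder; the real end-of-day scrubbing is far richer.
from multiprocessing import Pool

def scrub(series):
    """Placeholder check: the largest absolute day-over-day jump."""
    return max(abs(series[i] - series[i - 1]) for i in range(1, len(series)))

if __name__ == "__main__":
    batches = [[100, 101, 99], [50, 51, 70], [10, 10, 11]]
    with Pool() as pool:  # one worker process per core by default
        results = pool.map(scrub, batches)  # series processed in parallel
    print(results)  # [2, 19, 1]
```

Because each series is scrubbed independently, the 15-20 minute end-of-day window shrinks roughly in proportion to the number of cores, which is exactly the win Schweighofer is after without standing up a cluster.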

The bank is also considering adding a richer user interface and an enhanced data quality management feature for error detection and correction. "The recent months have shown that a crucial, if not the biggest, factor in the risk model is the input market data," says Schweighofer. "Not even the best risk model can compensate for failures in market data; I guess in accounting they'd call such an approach GIGO: garbage in, garbage out. This effect gains even further importance since models for measuring our counterparty risk in the OTC business require longer and equally weighted time series."
