The FRTB demands a careful read because the changes required are in the name – fundamental. Lynn Strongin Dodds looks at the latest set of revisions and what banks should be doing to prepare.
Earlier this year, the Basel Committee on Banking Supervision (BCBS) published the long-awaited revised standards for minimum capital requirements for market risk under the Fundamental Review of the Trading Book (FRTB). Although not as onerous as the original recommendations, they will still have far-reaching consequences for assets that are perceived as risky, such as OTC derivatives.
“It is important to remember that FRTB is only a proposal at this point in time, but many in the investment banking industry are concerned about the implications of these proposals because they claim that FRTB would largely eliminate the ability of banks to take risks in the marketplace,” says Russell Dinnage, lead consultant at consultancy Greyspark.
Overall, the FRTB, which is due to come into force in 2019, is a complete overhaul of the way banks assess market risk in the trading book. The new framework is designed to tackle both the internal modelling of market risk, which was partially revised in 2009 as part of Basel 2.5, and the standardised approach, which has been untouched for the past seven years. The main objectives are to harmonise the treatment of market risk across national jurisdictions and to rectify some of the issues that caused banks to falter during the financial crisis.
There are many moving parts but, in essence, the FRTB requires banks to change the calibration methods for risk models, replace Value at Risk (VaR) with expected shortfall in internal models, incorporate liquidity horizons into market risk measurement and rethink how hedges are treated. “It is forcing financial institutions to widen their remit and look at the whole bank’s P&L,” says Peter Farley, Marketing Strategist, Capital Markets at Misys.
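To make the change of risk measure concrete, the sketch below contrasts a 99% Value at Risk with a 97.5% expected shortfall (the confidence level set in the new standard) on a purely hypothetical daily P&L series. It is an illustrative Python calculation only, not a regulatory one.

```python
# Illustrative comparison of 99% VaR and 97.5% expected shortfall (ES)
# on a hypothetical one-day P&L history. The data are made up; this is a
# sketch of the two concepts, not a regulatory calculation.
import numpy as np

rng = np.random.default_rng(42)
pnl = rng.standard_t(df=4, size=1000) * 1_000_000   # fat-tailed daily P&L in EUR
losses = -pnl                                       # express losses as positive numbers

var_99 = np.percentile(losses, 99)                  # loss exceeded on roughly 1% of days
es_975 = losses[losses >= np.percentile(losses, 97.5)].mean()  # average of the worst 2.5% of days

print(f"99% VaR:                  {var_99:,.0f}")
print(f"97.5% expected shortfall: {es_975:,.0f}")
```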
According to Thomas Ehmer, senior manager at Baringa Partners, there are three main sections to the FRTB – the Trading Book/Banking Book boundary; the revised standardised approach; and the internal models-based approach. He notes that among the main problems with the past regime were that the Trading Book/Banking Book boundary was too permeable while the standardised approach was not risk sensitive enough. In addition, the internal models approach treated liquidity inconsistently, relied on a pro-cyclical calibration and had a patchy track record in capturing risks.
Given the scale of the overhaul and the complexity of the issues, it is not surprising that the FRTB was hotly debated. It took three rounds of consultations, four quantitative impact studies and around five years of discussions to agree on the final text. Looking at the different components separately, the new framework imposes stricter rules for internal transfers between the trading and banking books, which will reduce the potential for regulatory arbitrage.
Banks will also have to work to a presumptive list of instruments that should be placed in the trading book unless a justifiable reason exists not to do so. Under the old regime, firms were effectively encouraged to push non-performing trading book holdings into the banking book, thereby reducing the amount of required capital cover by 80%. Now, however, it will be trickier to switch instruments between the books.
The other significant alteration is the push towards the standardised approach, which is in line with current regulatory thinking on the use of internal models more generally. However, the revised rules for capital calculations rely on a bank’s pricing models to capture more detailed risk factors than the current format does. For debt instruments and commodities, capital charges will be determined by more granular risk-weight tables, while equity risk weights will be driven by the issuer’s geographic location and financial soundness. Positions will also be subject to an additional ‘Default Risk Charge’ and a ‘Residual Risk Add-on’.
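To give a flavour of the mechanics, the sketch below aggregates risk-weighted sensitivities within a single bucket using the sensitivities-based structure of the revised standardised approach. The risk weights and correlation used are illustrative placeholders, not the calibrated figures in the Basel text.

```python
# Sketch of within-bucket aggregation under a sensitivities-based approach.
# Risk weights and the intra-bucket correlation are illustrative placeholders,
# not the calibrated values from the Basel standard.
import math

def bucket_charge(sensitivities, risk_weights, rho):
    """Risk-weight each net sensitivity, then aggregate with a flat
    intra-bucket correlation: K_b = sqrt(max(0, sum_k sum_l rho_kl * WS_k * WS_l))."""
    ws = [s * rw for s, rw in zip(sensitivities, risk_weights)]
    total = 0.0
    for k, ws_k in enumerate(ws):
        for l, ws_l in enumerate(ws):
            corr = 1.0 if k == l else rho
            total += corr * ws_k * ws_l
    return math.sqrt(max(total, 0.0))

# Two hypothetical credit spread sensitivities (EUR per basis point move)
print(f"Bucket capital charge: {bucket_charge([150_000, -40_000], [0.05, 0.05], rho=0.35):,.0f}")
```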
“In general, the direction of travel is towards a standardised rather than an internal model-based approach, but that is not a surprise given what happened in 2007, 08 and 09,” says Richard Bennett, head of Regulatory Reporting, EMEA, at Wolters Kluwer. “If a financial institution uses its own internal model, it will need to calculate its capital requirements and generate the reporting, but it will be subject to regulatory approval at the trading desk level rather than the portfolio level.”
Banks will have to prove the value of their internal models through back testing and P&L attribution assessments using daily model results. Those that generate inaccurate results will face harsher penalties than under the old system and be forced back onto the standardised approach for a minimum of 12 months.
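By way of illustration, the sketch below runs a simplified P&L attribution check for one desk, comparing the hypothetical P&L from front-office pricing with the risk-theoretical P&L produced by the risk model. The data are fabricated, and the ratio tests and thresholds shown follow those described in the January 2016 text, treated here as indicative only.

```python
# Sketch of a P&L attribution test for a single desk: compare the risk
# model's ("risk-theoretical") P&L with the front-office ("hypothetical")
# P&L. Data are fabricated; the thresholds are indicative of the ratio
# tests described in the January 2016 text.
import numpy as np

rng = np.random.default_rng(7)
hypothetical_pnl = rng.normal(0, 1_000_000, size=250)                        # daily desk P&L
risk_theoretical_pnl = hypothetical_pnl + rng.normal(0, 150_000, size=250)   # model misses some risk factors

unexplained = hypothetical_pnl - risk_theoretical_pnl

mean_ratio = unexplained.mean() / hypothetical_pnl.std(ddof=1)
variance_ratio = unexplained.var(ddof=1) / hypothetical_pnl.var(ddof=1)

print(f"mean(unexplained) / std(hypothetical): {mean_ratio:+.2%}  (indicative target: within +/-10%)")
print(f"var(unexplained) / var(hypothetical):  {variance_ratio:.2%}   (indicative target: below 20%)")
```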
Internal models will also be subject to more detailed liquidity horizons – the time needed to sell or hedge an asset during market stress without adversely affecting prices. While riskier assets will suffer, credit exposures have been the main beneficiaries of the newly revised roadmap. For example, high-yield sovereign credit spread risk and investment-grade corporate credit spread risk have moved from the 60-day liquidity bucket to a newly introduced 40-day bucket. High-yield corporate credit spread risk and small-cap equity price volatility have also shifted from the 120-day to the 60-day bucket, while “other credit spreads” have moved from the 250-day bucket to the 120-day bucket.
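The liquidity horizons feed straight into the expected shortfall number: the standard scales a base 10-day measure across the 10, 20, 40, 60 and 120-day buckets using a square-root-of-time adjustment. The sketch below applies that scaling to invented per-bucket ES figures, so the output is illustrative only.

```python
# Sketch of how a base 10-day expected shortfall is scaled across the
# FRTB liquidity-horizon buckets (10, 20, 40, 60, 120 days). The per-bucket
# ES contributions are made-up numbers for illustration.
import math

BASE_HORIZON = 10                       # days
HORIZONS = [10, 20, 40, 60, 120]        # liquidity-horizon buckets in the standard

# ES at the 10-day base horizon, restricted to risk factors whose liquidity
# horizon is at least as long as each bucket (hypothetical figures, in EUR).
es_per_bucket = [5_000_000, 3_200_000, 1_800_000, 900_000, 400_000]

total_sq = 0.0
prev = 0
for lh, es_j in zip(HORIZONS, es_per_bucket):
    scale = math.sqrt((lh - prev) / BASE_HORIZON)   # incremental square-root-of-time factor
    total_sq += (es_j * scale) ** 2
    prev = lh

print(f"Liquidity-adjusted ES: {math.sqrt(total_sq):,.0f}")
```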
Banks sticking with internal models will also be squeezed by the introduction of non-modellable risk factors: under the final framework, only those risks that meet strict data availability and quality criteria are considered modellable, according to Daniel Mayer, a manager in Deloitte’s risk advisory business. All other risks must be accounted for by a catch-all capital charge calculated using risk-specific stress scenarios. If the BCBS’ 2015 quantitative impact study is anything to go by, this charge could be significant.
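As a rough sketch of that catch-all charge, the example below aggregates stress scenario capital for hypothetical non-modellable risk factors, assuming the treatment described in the 2016 text: zero-correlation (root-sum-of-squares) pooling for idiosyncratic credit spread factors and a simple sum for the rest. All the individual charges are invented numbers.

```python
# Sketch of aggregating stress-scenario capital for non-modellable risk
# factors (NMRFs), assuming the 2016 treatment: root sum of squares for
# idiosyncratic credit spread NMRFs, simple sum for all other NMRFs.
# The individual charges below are invented for illustration.
import math

idiosyncratic_credit_charges = [400_000, 250_000, 150_000]   # one charge per NMRF, in EUR
other_nmrf_charges = [900_000, 600_000]

ses = math.sqrt(sum(c ** 2 for c in idiosyncratic_credit_charges)) + sum(other_nmrf_charges)
print(f"Aggregate NMRF stress-scenario charge: {ses:,.0f}")
```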
“This is more than just a recalibration exercise,” says Zeshan Choudhry, partner at Deloitte. “It requires a significant re-engineering of processes from front to back that will involve multiple internal and external stakeholders. It covers not just the building of the models but also back testing, P&L attribution and risk tolerances.”
Numbers crunched by the Basel Committee show that capital requirements will be roughly 40% higher under the standardised approach than under the internal models approach. Breaking it down by risk component, the additional capital ranges from about 20% (for non-securitisation credit spread risk and foreign exchange risk) to about 50% for equity risk. The highest standard deviations are seen in interest rate risk and foreign exchange risk – the latter being applicable to banks with only a banking book as well.
FRTB’s implementation would result in wide-scale management-level, operational and technology changes within many banks, according to Dinnage. Traditionally, this information has been tucked away in quarterly or annual reports, but the FRTB would require trading desks, business units, the middle office and senior management to know the risks they are running in the market and to present a picture of their P&L on a daily basis.
This will lead to banks carefully reviewing the instruments they trade. “The liquidity horizons force banks to think about the capital consumption of every trade and how trades should be structured,” says Sven Ludwig, Managing Director & Head of SME Risk Management and Analytics, EMEA, at FIS (formerly Sungard). “Broader, more complex products such as OTC products will have longer liquidity horizons and they will be more costly to trade.”
Mayer believes the introduction of non-modellable risk factors also means that “illiquid credit names will have a higher charge than more liquid assets”. “They will need more data but it is not as readily available for these types of instruments, and trading could become more volatile,” he adds.
The art of hedging will also come under greater scrutiny. “Under the current regime, the heads of trading know what needs to be done from a risk management perspective,” says Bennett. “However, that knowledge and experience has not had to go through a standardised model. Now, they will have to do a cost-benefit analysis of the different hedges they want to put in place and at which end of the curve.”
On the operational front, banks, as with many regulations, are at different stages of preparation.
Farley also believes that there will need to be a stronger link between the risk function in the middle office and the front office. “Banks will now have to do 30 to 40 times the number of calculations in real time and they need to ensure that those two functions are aligned. One of the problems is that the technology is not there, and even if banks have the capability – half of the tier one banks we spoke to do – they will have to invest in greater analytical capabilities to do the number crunching.”
The outcomes will vary by bank and depend on their starting points and activities, according to Ehmer. Larger tier one institutions are ahead of the game and have mobilised, but smaller banks need to understand where they stand on the “pain threshold” and what the optimal ways to implement the rules are. “They may not have the internal resources to do so and will look to outsource to third-party providers,” he adds.