The derivatives industry has the tools, and clear economic and regulatory drivers are pushing firms to press ahead with plans to optimise their risk management operations now. In a Q&A, Donal Gallagher, Co-Head of the Quantitative Services division at Acadia, explores the opportunities an integrated risk management approach offers firms, why tackling data centralisation is step one and why the benefits are not just about risk mitigation but also about improving the bottom line.
Q. Why is now the time to consider adopting an integrated risk management approach?
A. Firms have a new economic imperative to optimise collectively across networks. Where they used to calculate everything to do with risk management solely within their own operating environments, new standard measures of risk are driving the optimisation of initial margin (IM) and capital across the industry. For example, the International Swaps and Derivatives Association's (ISDA) Standard Initial Margin Model (SIMM), the Basel III Standardised Approach for Counterparty Credit Risk (SA-CCR) and the Fundamental Review of the Trading Book (FRTB) each provide a standard target variable for IM, credit risk and market risk respectively, making it in everyone's interest to collaborate and optimise their positions. Optimisation means taking redundant risk out of the system, not just within the organisation but across the industry.
Capital is expensive, costing a tier one firm a minimum of 10%-11% per annum, and weak return on capital can harm a firm's share price. Previously, large institutions ran their own internal models, but the industry has reached a point of maturity where models have been standardised and the new target variables enable optimisation to take place centrally across a network of firms subject to the same regulation. Buy-side firms are affected by these changes too, in terms of both pricing and counterparties' appetite to do certain trades.
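To put the 10%-11% cost of capital in concrete terms, here is a minimal sketch; the capital amount is invented for illustration and does not refer to any actual firm:

```python
# Illustrative only: the 10.5% rate is the midpoint of the 10%-11% per annum
# range cited above, and the capital figures are hypothetical.
COST_OF_CAPITAL = 0.105

def annual_capital_cost(capital_required: float, rate: float = COST_OF_CAPITAL) -> float:
    """Annual cost, in the same currency units, of holding regulatory capital."""
    return capital_required * rate

# A hypothetical desk holding $500m of regulatory capital:
before = annual_capital_cost(500e6)   # roughly $52.5m per year
# If optimisation removes redundant risk and frees $150m of capital:
after = annual_capital_cost(350e6)
saving = before - after               # capital cost saved each year
```

The point of the arithmetic is that every unit of redundant risk removed translates directly into an annual saving at the firm's cost of capital, which is why optimisation has become an economic imperative rather than a purely regulatory one.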
Q. What are the most significant obstacles firms face when optimising risk management?
A. The biggest challenge is that large financial institutions in general have multiple systems and data silos, segmented by geography, trading line or asset class. Silos arise for various reasons, especially historical acquisitions, and make it difficult for firms to pull together an integrated data set that represents the positions of the institution at the global level.
All these new regulations cut across the whole institution at the group level, and pulling data together in a standardised format for multiple purposes has been a big challenge for financial institutions. ISDA has come up with the standardised methodology and Acadia provides a margin and collateral data repository that centralises and optimises risk management, which has laid the foundation for firms to adopt an integrated approach. But the next step, pulling the data out of systems and into the standard format, is a heavier lift.
Risk departments spend around 80%-85% of their time generating risk numbers across geographies, asset classes and so on, which leaves only 15%-20% of their time for understanding what the risk positions actually are. If they can get that data into a clean format, or outsource the computations, they can get it all completed much faster and spend more time on active risk management.
The technology exists to do data transformation in a straightforward manner; however, the more systems a firm has, the more complex it is to integrate them all across geographies, asset classes and technology teams. Nonetheless, regulatory drivers are pushing firms in this direction.
Q. For a large firm with a complex web of different systems and processes, what would it need to do to adopt an integrated data and risk management approach?
A. It starts with defining the data standard, or data dictionary. What data do they need, in what format? Once they have done that, they look at where they can get the data. Even if there are multiple systems, there is clever plumbing that can take all that data and put it into the end format.
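As a sketch of what one entry in such a data dictionary might look like, here is a hypothetical standardised trade record with a minimal consistency check; the field names and checks are illustrative assumptions, not taken from any published standard:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical data dictionary entry for an OTC derivative position.
# Field names are illustrative, not drawn from any published standard.
@dataclass
class TradeRecord:
    trade_id: str      # unique identifier across all source systems
    counterparty: str  # legal entity identifier (LEI) of the counterparty
    asset_class: str   # e.g. "Rates", "Credit", "Equity", "FX", "Commodity"
    notional: float    # notional amount in the trade currency
    currency: str      # ISO 4217 code, e.g. "USD"
    maturity: date     # final maturity of the trade

def validate(record: TradeRecord) -> bool:
    """Minimal consistency checks a firm might run when mapping a source
    system's output into the standard format."""
    return (
        bool(record.trade_id)
        and len(record.currency) == 3
        and record.notional != 0
    )
```

The "clever plumbing" then reduces to mapping each source system's output into this one shape and validating it on the way in, so every downstream consumer works from a single representation.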
Then they must select a model. If firms are looking for a blueprint, the ISDA Common Domain Model (CDM) is a standard representation of the trade lifecycle, so they can see a representation of their balance sheet in quasi real time.
Once they have defined the data dictionary and selected a model, it becomes a regular IT project after that, albeit a complex one, with the potential to greatly reduce operating costs.
A couple of firms have mastered this and have one standard representation of the balance sheet that then flows through all the processes of the organisation.
Q. What about the technology side?
A. The technology used doesn't really matter. A technology stack might include Python, XML for data representation and a big MongoDB database behind it. But the concepts are independent of the technology used to implement them.
Firms that have grown through multiple acquisitions may have many systems running and will face a bigger challenge than smaller, more nimble firms. Regulators have seen this problem for a long time and know that the firms that struggle to provide good data are the ones that have data "spaghetti". Regulations such as the Basel Committee on Banking Supervision's BCBS 239, which requires a golden data source representing the entire firm, have been putting pressure on firms to sort this out for more than 10 years.
Firms gain massive operating efficiency and business nimbleness from having a centralised view of their world and a standard operating stack, but implementing it is a business decision. Firms can always pull things together and get the standards done but having that efficient tech stack enables them to really reap the benefits of integrated risk management.
Q. How does Acadia work with clients?
A. Acadia is in the business of industry standard shared computations. It started with collateral processing and moving initial margin. Now we have moved into risk calculation, valuations, capital and all the things you can do once you have that central representation of the OTC derivatives portfolio.
Acadia either provides just the data standard to firms that want to do the computations themselves or runs the whole project for firms that prefer to hand over the raw data and outsource the rest.
Once firms have the standard data, they can compute anything. Computations used to be hard because models had to be designed in-house and firms needed to run and maintain expensive data centres to do the computations. Acadia provides one central, clean source of market data and a central web service that is infinitely scalable at much lower cost than any firm can provide for itself.
Over time, the models have become available open source and are now more like the standard maths that the industry runs on. What makes a bank unique is its client relationships and its access to capital and markets. It is about creating a unique and profitable business model—it is not about producing risk numbers.
Q. What are the main benefits to firms of centralising their data?
A. Leveraging a central infrastructure frees up a lot of time and resources because firms no longer need to do so much of the work inhouse.
It also enables firms to access services that they could not otherwise, such as new network optimisation services. Two firms know what their positions facing each other are, but neither knows what the other's positions are with a third counterparty. There could be a set of offsetting trades among the three parties that would shrink the redundant risk, but this can only be identified by an entity with a central view of all the risk.
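The three-party case can be made concrete with a toy sketch of multilateral compression; the exposures below are invented for illustration:

```python
# Toy multilateral compression: exposures form a cycle A -> B -> C -> A.
# Any amount common to the whole cycle is redundant risk that can be torn up,
# but no single pair of firms can see the full cycle on its own -- only an
# entity with a central view of all three bilateral positions can.
exposures = {("A", "B"): 100.0, ("B", "C"): 80.0, ("C", "A"): 60.0}

def compress_cycle(exposures: dict) -> dict:
    """Reduce every leg of the cycle by the smallest exposure in it,
    leaving each party's net position unchanged."""
    redundant = min(exposures.values())
    return {pair: amount - redundant for pair, amount in exposures.items()}

compressed = compress_cycle(exposures)
# Gross exposure falls from 240 to 60, and the C -> A leg disappears
# entirely, while every party's net position is exactly what it was before.
```

Each firm's net position is preserved because every leg touching it shrinks by the same amount, which is why compression removes only redundant risk.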
Q. How might an integrated risk management approach help firms manage IM requirements under UMR?
A. Firms that have to post IM will almost certainly have to use Acadia because it is the place where everyone submits their IM. But firms have a choice whether to compute those sensitivities themselves or to ask Acadia to do it. Acadia can handle all the necessary computations related to IM, as well as the capital calculation, for a far lower price than firms could achieve internally. Outsourcing is cost effective because of the shared costs of market data, infrastructure and data standards. And it allows firms to focus on the areas of risk management that are most relevant to the institution's business model.
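As a rough illustration of the kind of sensitivity-based computation involved, here is a toy margin aggregation in the spirit of SIMM: risk sensitivities are multiplied by risk weights and then combined under a correlation assumption. The weights, correlation and sensitivities below are invented for illustration and are not actual ISDA SIMM parameters or methodology:

```python
import math

def toy_margin(sensitivities, risk_weights, rho):
    """Aggregate weighted sensitivities as sqrt(sum_ij corr_ij * ws_i * ws_j),
    with a single flat correlation rho between distinct risk factors.
    Illustrative only -- not the actual ISDA SIMM methodology or parameters."""
    ws = [s * w for s, w in zip(sensitivities, risk_weights)]
    total = 0.0
    for i, wi in enumerate(ws):
        for j, wj in enumerate(ws):
            corr = 1.0 if i == j else rho
            total += corr * wi * wj
    return math.sqrt(total)

# Two hypothetical rate sensitivities (per basis point) with invented weights;
# the opposite signs mean the positions partly offset, reducing the margin.
margin = toy_margin([1000.0, -400.0], [50.0, 50.0], rho=0.5)
```

Even in this toy form, the structure shows why a shared, standardised model matters: both counterparties run the same aggregation over the same submitted sensitivities and arrive at the same IM number.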
Once the data has been centralised, it can be used in many different ways throughout trade lifecycle management, from IM to variation margin (VM) to computing capital. Firms do not have to do that themselves anymore. It took a while, but there is now sufficient maturity in risk models, data, computation and information security, as well as the appetite for firms to outsource within an infrastructure they can trust.
Q. What is one piece of advice you have for our readers for improving risk management in 2022?
A. Optimise, because everybody else is doing it. Get rid of redundant risk, capital, IM and VM. For the last ten years the industry has been responding to regulatory change. Now that the environment is more stable, the next five to ten years will be about how best to steer and optimise within this new regulatory framework.