In a Q&A, SunGard’s Tony Scianna explains how financial institutions can build data management infrastructure to support new regulatory requirements and enterprise-wide risk management practices.
Q. What did the recent financial crisis reveal about the ways in which data management can be improved to better support new risk management strategies?
The financial services industry's reliance on silo-based infrastructure for trading and post-trade processing has evolved over time, so most banks manage each asset class separately, in its own environment with its own technology and data model.
The collapse of Lehman Brothers revealed the extent of this separation, especially with regard to over-the-counter (OTC) derivatives, which were managed both on and off balance sheet, processed via various systems and valued under different pricing methodologies. As a consequence, many financial institutions found it very difficult to capture all of the transaction data and understand their true exposure to Lehman Brothers. Also, because Lehman Brothers was such a big market player, especially in the OTC derivatives space, many financial institutions found themselves on the other side of transactions, which meant many had to quickly hedge or cover exposures when the bank defaulted.
The fact that many banks reported losses several quarters after the default of Lehman Brothers showed that many financial institutions could not get a handle on their true exposures and liabilities, because they were unable to collect data quickly and on an enterprise-wide basis.
Q. What are the current challenges in existing infrastructure used to manage transaction and customer data specifically?
Firstly, the introduction of new instruments or products is part of the evolution of financial services, and this is what makes the industry so interesting and dynamic. Given this constant introduction of new instruments, the fact that there are multiple applications and all kinds of disparate technologies to manage different asset classes is not surprising. However, that is part of the problem, because it prevents a holistic view of positions, trade activity and thus risk exposure.
Specifically, if a firm has different types of instruments managed in completely different environments, the challenge is bringing the data together so it can have a comprehensive, single view of the risk and exposure to any given customer, counterparty, asset class, or even a particular market. This type of analysis on an enterprise-wide basis is crucial, as the recent financial crisis and ongoing market volatility show. For instance, today there is more emphasis on finding a firm's enterprise-wide exposure to a given factor, such as a single country like Greece or Egypt.
The second challenge is that firms have historically invested in getting to the 'speed of light' for trade execution. But now, the financial crisis has brought to light the fact that the industry needs to focus on and invest in the middle and back office.
Legacy applications are usually batch-based and process overnight, which means that despite having real-time capability for trade execution, the firm is waiting until the next day to find out what its positions and exposures are. Today, there is more focus on giving the middle and back office access to transaction data on executed trades and positions in as close to real time as possible. There are many ways to gain more timely access to this information, both to enable a firm to leverage trading activities and to ensure that the data captured is reconciled to the back office, providing a more comprehensive view of activities and exposures.
Q. What is needed now to meet new regulatory requirements?
Beyond real-time access, a firm needs to first capture all the pertinent data required in order to support the calculations or methodologies used to provide reporting to meet current and future regulatory requirements.
This means a firm needs to capture every transaction and then cross-reference the data across customer accounts, multiple applications and asset classes. For instance, a customer may have multiple accounts with a bank in various asset classes with different account numbers. The bank needs to be able to cross-reference all that information so it can view the customer account holistically.
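The cross-referencing described above can be sketched as a mapping from per-system account numbers to a master customer identifier. This is a minimal illustration: the system names, account numbers and mapping table are assumptions invented for the example, not real data or a specific product's API.

```python
from collections import defaultdict

# Hypothetical mapping from (system, account number) to a master customer ID.
# In practice this table would be maintained as reference data.
ACCOUNT_TO_CUSTOMER = {
    ("equities", "EQ-1001"): "CUST-42",
    ("fixed_income", "FI-2077"): "CUST-42",
    ("derivatives", "DV-9003"): "CUST-42",
}

def aggregate_by_customer(transactions):
    """Roll per-system transactions up into one view per customer."""
    by_customer = defaultdict(list)
    for txn in transactions:
        key = (txn["system"], txn["account"])
        customer = ACCOUNT_TO_CUSTOMER.get(key)
        if customer is None:
            continue  # unmapped accounts would be flagged for data-quality review
        by_customer[customer].append(txn)
    return by_customer

# Two accounts in different silos resolve to one customer view.
txns = [
    {"system": "equities", "account": "EQ-1001", "notional": 1_000_000},
    {"system": "derivatives", "account": "DV-9003", "notional": 250_000},
]
view = aggregate_by_customer(txns)
total_notional = sum(t["notional"] for t in view["CUST-42"])
```

The key design point is that the aggregation never trusts the silo-local account number on its own; everything is resolved through the reference-data mapping first.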
Also, in a case like Lehman Brothers’ default, the firm would need to view all the transactions open with Lehman Brothers and review the collateral deposited and the legal entity dealt with (Lehman Europe versus Lehman US, for example) to see its total exposure to the defaulting counterparty.
In addition to capturing the data and cross-referencing accounts, a firm needs to be able to cross-reference the instrument information. Different instruments carry different identifiers, such as an International Securities Identification Number (ISIN). A software solution must be able to cross-reference all those different types of instruments, complete with identifiers, even if the instrument is managed in different silos and systems. If a firm cannot identify the instrument properly, then it cannot know whether it is being valued accurately or consistently across the organization.
For example, a bank may have the same fixed-income instrument, but one instrument is being used as collateral to support a derivatives transaction and the other is held in the fixed-income silo. The firm needs to make sure it is using the same valuation calculation method to price both instruments so the values match. A standalone application can capture this information, or the firm can have a standard way of managing this across multiple applications.
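The two ideas above, resolving different identifiers to one canonical instrument and then pricing it with a single method wherever it is held, can be sketched together. All identifiers and the toy present-value pricing below are illustrative assumptions, not real security data or an actual valuation methodology.

```python
# Hypothetical lookup from (identifier type, value) to a canonical instrument ID.
IDENTIFIER_MAP = {
    ("ISIN", "US0000000001"): "INSTR-7",
    ("CUSIP", "000000001"): "INSTR-7",
    ("INTERNAL", "FI-SILO-88"): "INSTR-7",
}

def resolve_instrument(id_type, id_value):
    """Return the canonical instrument ID, or None if the identifier is unknown."""
    return IDENTIFIER_MAP.get((id_type, id_value))

def price_bond(face_value, annual_rate, years):
    """Deliberately simplified present-value pricing, applied uniformly to all books."""
    return face_value / (1 + annual_rate) ** years

# The same bond held in two places: as derivatives collateral (booked under an
# internal silo code) and in the fixed-income silo (booked under its ISIN).
collateral_instrument = resolve_instrument("INTERNAL", "FI-SILO-88")
silo_instrument = resolve_instrument("ISIN", "US0000000001")

# Because both bookings resolve to the same instrument, one pricing call
# values both, so the numbers match by construction.
collateral_value = price_bond(1_000_000, 0.05, 2)
silo_value = price_bond(1_000_000, 0.05, 2)
```

The point of the sketch is structural: consistency comes from resolving to a single instrument record before pricing, not from hoping two silos happen to agree.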
Once a bank has captured the data and normalized it, it can then use the data for reporting. For instance, with this view of the exposures, a team may be able to incorporate this into the workflow so that it can notify senior management if there is a particular account reaching a problematic level of exposure.
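The escalation workflow mentioned above can be sketched as a simple check of aggregated exposures against a limit. The limit and the counterparty figures are assumptions chosen for illustration, and a real implementation would feed the result into a notification or case-management system rather than return a list.

```python
# Hypothetical per-counterparty exposure limit, chosen only for illustration.
EXPOSURE_LIMIT = 10_000_000

def counterparties_to_escalate(exposures, limit=EXPOSURE_LIMIT):
    """Return counterparties whose aggregate exposure breaches the limit,
    sorted so the output is stable for reporting."""
    return sorted(name for name, total in exposures.items() if total > limit)

# Aggregate exposures produced by the normalization step described above.
exposures = {
    "Counterparty A": 4_500_000,
    "Counterparty B": 12_250_000,  # breaches the limit, so it is escalated
}
alerts = counterparties_to_escalate(exposures)
```

Running the check on each refresh of the normalized data is what turns the holistic view into an operational control rather than a static report.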
It is extremely important that when a firm brings the data together for a holistic view, it also ensures that there is good understanding of the instruments and that a consistent and standardized method, such as pricing, is applied to instruments held in various systems.
Having access to this information will help a firm make more intelligent decisions and hopefully reduce the amount of time it takes to get out of a bad situation.
Q. How can a financial institution build a flexible data management infrastructure capable of keeping up with a changing market and new reporting demands?
The financial services industry is constantly evolving, so it is very important to figure out a way to account for transactions in as near real-time as possible and manage data in a standardized way so it can be reported effectively. Real-time data access and standardization needs to be part of the thought process as instruments are being created and as firms launch new services to their client base.
The regulators are eventually going to require financial institutions to report on every transaction they do, on a global basis. Naturally, it will be difficult to navigate the different legal and jurisdictional issues around the world, but the recent financial crisis has proven that the world is interconnected. No matter where a firm is based or which regulators it reports to, financial institutions will have to report on transactions within a reasonable amount of time. The Markets in Financial Instruments Directive (MiFID) and the new Office of Financial Research (OFR) reporting requirements are coming up, so firms need to prepare to comply, because there will likely be capital-intensive consequences of failing to do so.
Q. Are there organizational changes that need to take place to support this new data management structure?
For many financial institutions, the focus today is on managing regulatory risk, which is a cultural change because most firms have not made the middle and back office a priority before. Historically, investment has centered on the front office, but new regulation and a new market environment have changed this.
Q. What are the other benefits gained through better management of data?
With better data management, a firm can provide customers with an enterprise-wide view of accounts and positions. Today, many customers receive multiple statements from the same dealer, so it would be very beneficial for a client to get a single, comprehensive statement. Also, the ability to capture information as close to real time as possible means banks can deliver more timely information and increase the transparency offered to clients.
Finally, an effective data management strategy with access to real-time information and a better view of enterprise-wide risks will help firms better prepare for regulatory risk as the regulators require more information, in more detail and delivered more quickly. Assisting in the preparation for new regulatory requirements and thus managing regulatory risk is the biggest benefit of improved data management.
Tony Scianna is deputy head of strategy for SunGard’s capital markets business. He represents SunGard in industry groups and provides market insight to help shape business strategy and product development.
Mr. Scianna was previously executive vice president of product management and marketing for SunGard’s Brokerage and Clearance business. Prior to joining SunGard, Mr. Scianna was the chief operating officer of Refco Securities. During his more than 25 years in the financial services industry, he has held positions in operations and technology at Hornblower, Weeks, Hemphill & Noyes, Loeb Rhodes and Spear, Leeds & Kellogg.
Mr. Scianna has extensive knowledge of global securities and derivatives processing, brokerage operations and data management. He also has deep experience in helping firms manage regulatory change, expand their businesses globally and improve their middle- and back-office operations.