In a Q&A, Interactive Data’s Darren Marsh explains how centralising reference data management improves a firm’s ability to understand its risk exposures, use capital more efficiently and meet new regulatory reporting requirements.
Q. What are the new motivations driving investment in reference data management? How have new regulation and the focus on risk management impacted reference data projects?
The two main drivers behind reference data projects are strengthening risk management and regulatory reporting – those two go very much hand-in-hand.
Following the financial crisis it was widely recognised that there had been a failure of risk management and controls, so the regulatory priority has been to develop the macroeconomic tools needed to monitor systemic risk. The thought process behind this is akin to long-range weather forecasting, where firms model potential extreme events; but to model risk effectively and accurately, regulators will need to process huge amounts of information within their calculation engines.
However, regulatory reporting relies on each firm’s own ability to provide positions and reference data in an aggregated, standardised view so that regulators can model risk at a systemic level.
This presents firms with a set of challenges around data quality, consistency and operational models.
Basel III introduces requirements for higher levels and quality of capital, formalises the maintenance of liquidity ratios, and sets out stress and scenario testing rules to ensure a firm has the capital needed to keep operating in times of single-name or market-wide stress.
In addition to the regulatory drivers, firms are beginning to recognise the potential business benefits that increased transparency can afford them.
By strengthening risk management processes to meet these new regulatory and capital demands, some financial institutions are looking to publish information about their sensitivity to risk in order to seek competitive advantage. At the same time, they are looking to leverage the enterprise-wide view of inter-relationships between financial instruments, issuers and market data required to support reporting, creating additional tools to monitor their investment strategy and to ensure the levels of risk taken are commensurate with the services they provide.
Q. How are financial institutions improving data management practices to comply with new regulatory reporting requirements and to improve risk management practices?
This is an opportunity for firms to look at how they manage structured and unstructured data within the organisation. Financial institutions need to be able to aggregate, consolidate and present data from across the enterprise in a standardised way – this is where the relationships between various data points become a critical component of the whole process.
There is a trend of grouping asset classes together by issuer to better understand the positions held and how they relate to each other by virtue of the issuing entity. So, for instance, firms are not just looking at an issuer’s equity and debt data, but also considering derivatives issued by that entity. Furthermore, understanding the correlations between related entities themselves is essential to working out the true risk exposure by ‘rolling up’ individual exposures to the ultimate parent entity.
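As a minimal sketch of that ‘roll-up’, the example below walks each entity up a hypothetical hierarchy to its ultimate parent and aggregates the exposures; all entity names and figures are illustrative assumptions, not a prescribed implementation.

```python
# Minimal sketch: rolling up issuer-level exposures to the ultimate
# parent entity. The hierarchy and figures are illustrative only.

# child entity -> immediate parent (entities absent from the map are roots)
parents = {
    "AcmeBank NY": "AcmeBank Holdings",
    "AcmeBank London": "AcmeBank Holdings",
    "Acme Finance BV": "AcmeBank London",
}

# exposure per issuing entity, e.g. in USD millions
exposures = {
    "AcmeBank NY": 120.0,
    "AcmeBank London": 80.0,
    "Acme Finance BV": 45.0,
    "Other Corp": 10.0,
}

def ultimate_parent(entity: str) -> str:
    """Walk up the hierarchy until a root entity is reached."""
    while entity in parents:
        entity = parents[entity]
    return entity

rolled_up: dict[str, float] = {}
for entity, amount in exposures.items():
    root = ultimate_parent(entity)
    rolled_up[root] = rolled_up.get(root, 0.0) + amount

print(rolled_up)
# {'AcmeBank Holdings': 245.0, 'Other Corp': 10.0}
```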
The same logic equally applies to understanding exposure to a single-name or market-wide stress event by virtue of the underlying securities held in an investment portfolio or fund. A risk manager’s ability to access exposure data quickly enough to mitigate that risk effectively depends on the availability and quality of the underlying data.
Technical development to support data management improvements will vary by firm. While some firms prefer a fully centralised model, others prefer a more federated approach where the objective is generally to provide a single view of the core components of the information that can be distributed throughout the organisation.

In this approach, there will be a core set of data components that every department needs, but the individual business functions will maintain responsibility for the additional data requirements specific to them. For example, both the legal and credit departments are involved in the on-boarding of a new client; both need the core attributes of issuer data (identification codes and entity linkage) to ascertain uniqueness, but the legal department will manage the specific contractual terms whilst the credit department will apply the core information required to determine creditworthiness, such as credit ratings. So, there are core competencies that are shared by different business functions, but departments also have their own special needs.
It may well be that the federated approach is more advantageous to some firms, as editorial responsibility for the centralised, core aspects of reference and entity data is managed separately from the specific data required by individual business functions such as credit and legal.
Whichever approach is preferred, the objective will be the same: to consolidate security-to-entity mappings and entity hierarchies into a single enterprise-wide view.
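A minimal sketch of what such a federated model might look like, assuming a shared core record with department-specific extensions; all class and field names here are hypothetical illustrations:

```python
# Minimal sketch of a federated entity-data model: a shared core record
# that every department uses, plus department-specific extensions.
# All class and field names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CoreEntityRecord:
    # core attributes needed enterprise-wide to ascertain uniqueness
    entity_id: str                          # internal code (or an LEI, once adopted)
    legal_name: str
    parent_entity_id: Optional[str] = None  # entity linkage

@dataclass
class LegalView:
    # owned by the legal department
    core: CoreEntityRecord
    contractual_terms: dict = field(default_factory=dict)

@dataclass
class CreditView:
    # owned by the credit department
    core: CoreEntityRecord
    credit_rating: str = "NR"               # e.g. applied during on-boarding
```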
Q. What are the main challenges firms face when improving reference data today? What are the challenges in moving away from silos, for instance? And are there other obstacles that financial institutions should be aware of?
The silo structure is one of the major reasons why data is so fragmented at many firms. The silo culture is primarily a result of growth, of acquiring organisations and absorbing different business functions. Also, banks have traditionally maintained data on a product basis, which gives a very vertical view of information, and the challenge is moving from this silo approach to a more holistic view.
The second challenge is the lack of standards within the entity data space. For the last 20 years there has been the ISIN identifier, which accurately identifies an issue, but there is no equivalent for entity data. This means disparate databases, systems and sources all present slightly different versions of names, and update addresses in different time frames. In this scenario, it is difficult for different areas of the organisation to aggregate all this data together.
A prevalence of manual processes built around specific business functions, and the fragmentation of essential data sets such as securities master and entity linkage data, all contribute to the operational challenges faced.
The operational challenges come down to identification. So, establishing a global legal entity identifier (LEI) is an obvious step in the right direction.
A unique code to identify business entities involved in financial transactions will reduce the amount of time required to aggregate data from various databases across the organisation, improving risk management and facilitating regulatory reporting.
Establishing global LEIs is an obvious starting point for the industry and has a major advantage over previous initiatives because of its regulatory influence.
Provided the proposed LEI standard (ISO 17442) has international regulatory buy-in, it will provide, at the very least, a common language the securities industry can use to code and link entities from internal and external data sources. For this reason, it is a real enabler, certainly at the very highest level. However, the LEI will not provide all the data linkages in itself; rather, because firms will potentially have a common identifier to use internally, they can evaluate all their data sources, wherever they sit, and perform the aggregation and reconciliation needed to raise the bar in terms of data quality.
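As a minimal sketch of how a common identifier enables that aggregation, the example below joins records from two hypothetical internal systems that spell the entity name differently; the LEI-style codes and figures are made up for illustration.

```python
# Minimal sketch: aggregating exposures across systems using a common
# entity identifier. Codes, names and figures are illustrative only.

trading_system = [
    {"lei": "5493001XXXXXXXXXXX12", "name": "Acme Bank Plc", "exposure": 50.0},
    {"lei": "529900YYYYYYYYYYYY55", "name": "Other Corp", "exposure": 10.0},
]

settlement_system = [
    {"lei": "5493001XXXXXXXXXXX12", "name": "ACME BANK PLC", "exposure": 25.0},
]

totals: dict[str, float] = {}
for record in trading_system + settlement_system:
    # The identifier, not the inconsistently spelled name, is the join key.
    totals[record["lei"]] = totals.get(record["lei"], 0.0) + record["exposure"]

print(totals)
# {'5493001XXXXXXXXXXX12': 75.0, '529900YYYYYYYYYYYY55': 10.0}
```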
Q. How are firms improving how they manage event triggers as part of renewed focus on time-sensitive risk monitoring and use of more sophisticated risk tools or practices?
Event triggers are certainly a growing area. And there is recognition that once the information is clean, an up-to-date data governance policy needs to be put in place to keep data quality high. There has always been a misconception in the market that counterparty data is relatively static, but when you consider that 20% of all companies in Companies House will have some kind of change to their core reference data (e.g. a change of name or address), this suggests that this type of information is quite mobile. Taking M&A activity into account, it is crucial that firms continue to monitor links between entities to understand where the exposure lies – whether to a single issuer or to an issuing group by virtue of its parent entity.
Event triggers therefore provide risk managers with the ability to track and apply a wide spectrum of events that can impact a firm’s holdings or positions.
There are a number of different triggers that would result in a firm reviewing or revisiting an entity master record. These triggers can come from a variety of sources and, as a result, are often managed by different business areas across a firm.
Outlining the specific events material to each business function within the entity data maintenance policy can help firms to maintain a holistic view of upcoming changes that may impact customer, operational or regulatory risks.
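A minimal sketch of what such a maintenance policy might look like in practice, routing incoming event triggers to the business functions that should review the entity master record; the event types and department names below are hypothetical, not a prescribed taxonomy.

```python
# Minimal sketch: routing entity-data event triggers to the business
# functions that should review the entity master record.
# Event types and department names are illustrative assumptions.

ROUTING = {
    "name_change":    ["legal", "operations"],
    "address_change": ["operations"],
    "merger":         ["credit", "legal", "risk"],
    "rating_change":  ["credit", "risk"],
}

def route_event(entity_id: str, event_type: str) -> list[str]:
    """Return the departments that should review this entity record."""
    reviewers = ROUTING.get(event_type, ["data_governance"])  # fallback owner
    for dept in reviewers:
        print(f"flag {entity_id} for review by {dept}: {event_type}")
    return reviewers

route_event("ENT-0042", "merger")
# flag ENT-0042 for review by credit: merger
# flag ENT-0042 for review by legal: merger
# flag ENT-0042 for review by risk: merger
```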
Q. What are the operational, risk management and other advantages of investing in reference data to support new regulatory reporting and risk management needs? Can this investment translate to a competitive advantage for firms?
Through early adoption of an enterprise-wide view of reference data, firms gain data consistency across the enterprise, and this can provide additional business benefits.
From a regulatory perspective, inaccurate identification of issuers can result in firms putting too much capital aside as part of their Basel II capital adequacy requirements: where a firm could not accurately identify the issuer of a holding, it had to apply a 100% risk weighting to that position. And the higher risk weighting means the firm has to set aside a lot more capital to mitigate the risk of that position.
Financial institutions are looking to correlate their trading databases with a single view of securities-to-entity data linkage in a front-to-back approach. By creating the mappings at the beginning of the process, they can identify and aggregate all of those positions to the actual issuer, calculate the risk weightings more accurately and quickly, and therefore reduce the amount of capital that needs to be set aside.
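A minimal sketch of the arithmetic behind this: a position whose issuer cannot be identified falls back to a conservative 100% risk weight, tying up far more capital than an identified one. The risk weights and the 8% capital ratio below are illustrative assumptions, not the actual Basel parameters for any given portfolio.

```python
# Minimal sketch: capital impact of issuer identification. Positions with
# an unidentified issuer fall back to a 100% risk weight. All weights,
# ratios and figures are illustrative assumptions.

positions = [
    {"security": "BOND-A", "issuer": "Acme Bank", "value": 100.0},
    {"security": "BOND-B", "issuer": None,        "value": 100.0},  # unidentified
]

risk_weights = {"Acme Bank": 0.20}   # illustrative weight for an identified issuer
CAPITAL_RATIO = 0.08                 # illustrative minimum capital ratio

def required_capital(position: dict) -> float:
    # unknown issuers default to a conservative 100% risk weight
    weight = risk_weights.get(position["issuer"], 1.00)
    return position["value"] * weight * CAPITAL_RATIO

for p in positions:
    print(p["security"], required_capital(p))
# BOND-A 1.6   (identified issuer, 20% weight)
# BOND-B 8.0   (unidentified issuer, 100% weight)
```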
With Basel III we are seeing calls for increased levels of capital adequacy, so more capital is going to be set aside and firms will naturally be focused on making more efficient use of the capital available. And again, a better handle on reference data provides the building blocks that underpin the risk calculation.
Providing a consistent and combined view of the securities and entity masters is an initial step within the operational hub, allowing firms to pass that information in a more timely fashion to downstream business functions. Knowledge and information are key in this space, and if a firm does not obtain information in a timely enough fashion then it is not able to make the decisions that will ultimately help the business. So it is all about making reference data available much earlier in the process and linking it to the data governance policies across the board.
Q. How can technology and service providers help a firm meet new reporting requirements and support centralised data management strategies?
Right now we are seeing more interest in alert-driven processes, including complex event processing and event-driven decision management. We are also seeing the use of semantic technology, which is a slightly different approach in that it creates matches between individual data points rather than relying on relational databases. The migration to semantic technology allows information to be brought together from completely different formats and databases, and allows end users to link it to structured documents and actively monitor market sentiment.
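As a rough illustration of the semantic approach, facts can be stored as subject-predicate-object triples rather than relational tables, so data from different sources can be merged and queried uniformly; the vocabulary and facts below are hypothetical.

```python
# Minimal sketch of the semantic approach: subject-predicate-object
# triples instead of relational tables. Vocabulary and facts are
# illustrative assumptions.

triples = [
    ("Acme Bank", "issued",        "BOND-A"),
    ("Acme Bank", "hasParent",     "AcmeBank Holdings"),
    ("BOND-A",    "hasAssetClass", "corporate debt"),
    ("NewsItem1", "mentions",      "Acme Bank"),   # fact from an unstructured source
]

def objects(subject: str, predicate: str) -> list[str]:
    """Query: all objects linked to a subject by a given predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects("Acme Bank", "issued"))   # ['BOND-A']
```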
From our perspective, this trend means service providers, such as Interactive Data, can package information essential to the risk management process in a different way, combining core entity relationships with a whole raft of ancillary information. This includes fundamentals, credit ratings and all the underlying granular securities data, combined into a single package and presented to a client in a dashboard view to help track risk exposure throughout the day.
Such a comprehensive data service will be a huge benefit, but it will also allow service providers to help clients from a workflow perspective. I believe we are seeing a transition away from clients processing batch files and then extracting the relevant information to supply to various business functions. Financial institutions would much rather consume data as and when it is required, so they can be more focused around the individual workflow processes that they maintain on a daily basis. This is a much more ‘on demand’ type of approach, and it is the way forward.
Interactive Data is a leading global provider of financial market data, analytics and related solutions to financial institutions, active traders and individual investors.
This article is provided for information purposes only. Nothing herein should be construed as legal or other professional advice or be relied upon as such.