Financial institutions are changing trading strategies to include more listed derivatives products in response to current market conditions, namely increased volatility and interest rate fluctuations. As such, the front office needs more data and analytical power to support the changes in trading activities and pre-trade operational requirements. In this Q&A, Linda Coffman, executive vice president at SmartStream, explores how access to a broader and deeper pool of exchange traded derivatives (ETDs) reference data can support changing trading strategies and boost a firm's competitiveness.
Q. How are you seeing trading activities changing around ETDs today?
A. Over the last few years, we have seen unprecedented periods of volatility, starting with the first few months of the pandemic, when derivative trading volumes were extremely high. Fast forward to the current day and we see many exchanges offering more ETD products than ever before. As a result, the number of ETD contracts with open interest is at an all-time high. In short, both ETD volumes and the variety of types of trades executed are on the rise.
Q. What new ETD products are coming to market?
A. The number of new commodity products, especially in the energy sector, has soared, along with equity index and ETF futures and options. In the first half of the year, new commodity futures and options products accounted for 45% of the new products across global ETD exchanges, while equity futures and options accounted for 38%. Together, the two accounted for most of the new products brought to market in the first half of this year.
If you look at the European Energy Exchange, for instance, in the first half of 2022, they created 21 new products whereas in the same period of 2023 they created 141 products.
Another example is the ICE European Futures Exchange where we saw 33 new products in the first half of 2022, while this year there have been 101.
In particular, May and June have been busy with new products coming online. This growth means data management teams across the world have been very busy trying to keep up with the demands of setting up new products and maintaining changes to existing derivatives data in a timely and accurate manner. In addition, the increase in trading volumes means firms will need to deal with an increased number of trade breaks, often caused by underlying reference data issues.
Q. What factors are contributing to the need for this growth in products?
A. There are several factors influencing the ETD market including the war in Ukraine, climate change and the pandemic. Exchanges are responding by issuing products that reflect current market conditions. For example, the war in Ukraine certainly created volatility in the energy sector which we feel directly impacted the number of new commodity products available.
Exchanges also recognise the opportunity to reach a much larger audience in today’s global marketplace. Some exchanges that typically service a more local market sector may be adding products to attract a more geographically diverse audience.
Of course, exchanges are always striving to achieve a competitive edge, which is also a driver as they seek to respond to a market need for more granular and diverse products.
Q. How can firms leverage reference data to better support trading opportunities?
A. Over the last year, there has been a growing need for broader and more granular reference data, and for that data to be received in a timely fashion. Firms need accurate data before markets open to ensure they're making correct pre-trade decisions. They also need the ability to execute quickly, because a few seconds' delay to sort out a data issue can be costly.
In the past, firms had time to ramp up if their models or strategies changed, including time to outsource the required data. Now everything must be done at a much quicker pace. Arbitrage across exchanges, for instance, is much more powerful if a firm is accurately comparing like for like – and the only way to do that is with detailed reference data. A financial instrument may look similar on the surface, but it is not until you get into the intricacies of the derivative's underlier that it becomes apparent the comparison is not like for like.
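To illustrate the point, the sketch below – with entirely hypothetical field names, symbols and identifiers, not a SmartStream schema or real exchange data – only treats two listed contracts as like for like when their underlier and contract details match, rather than relying on similar-looking names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FuturesContract:
    """Simplified, illustrative contract record; all fields and values are hypothetical."""
    exchange: str
    symbol: str
    underlier_id: str       # identifier of the underlying instrument
    contract_size: float    # units of the underlier per contract
    currency: str
    delivery_month: str     # e.g. "2023-12"

def is_like_for_like(a: FuturesContract, b: FuturesContract) -> bool:
    """Two contracts are comparable across exchanges only if the underlier,
    contract size, currency and delivery month all line up."""
    return (a.underlier_id == b.underlier_id
            and a.contract_size == b.contract_size
            and a.currency == b.currency
            and a.delivery_month == b.delivery_month)

# Two contracts on the same underlier that look similar but differ in contract size:
contract_a = FuturesContract("EXCH-A", "IDX1Z3", "XX0000000001", 10.0, "EUR", "2023-12")
contract_b = FuturesContract("EXCH-B", "IDXF3",  "XX0000000001", 50.0, "EUR", "2023-12")
print(is_like_for_like(contract_a, contract_b))  # False – not a like-for-like comparison
```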
Accurate reference data is the foundation on which firms are building their automation and AI applications today across the full trade life cycle.
Q. What are the limitations with basic reference data?
A. Many firms still support a narrow set of instrument reference data when it comes to ETDs. They have historically invested in other asset classes, such as fixed income, where they have deeper data models.
At SmartStream, we are currently seeing larger organisations undertaking projects to expand their derivatives data models. Sometimes these projects are attached to a move to the Cloud, for instance, where they take that opportunity to then build in a broader ETD data model to support their trading desk.
In the past, the focus was on clean reference data to drive straight-through processing (STP) in the back office. Today, firms recognise that a broader, more granular set of clean data is just as important for the front office to ensure a competitive edge. As an example, there is interest in a broader set of symbology data to connect the front office to margin and risk platforms. Overall, firms want to ensure that all internal and third-party systems can accurately communicate and identify the same security, given the highly automated infrastructure that they depend on to succeed.
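As a minimal sketch of what that symbology connection can look like – assuming a simple in-memory cross-reference with made-up identifiers, not any vendor's actual data model – each security record carries the identifiers the different systems use, so any one of them resolves to the same record:

```python
# Each record in the security master carries every identifier scheme in use;
# all identifiers below are made-up placeholders, not real codes.
SECURITY_MASTER = [
    {
        "internal_id":     "SEC-000123",
        "isin":            "XX0000000001",
        "exchange_symbol": "IDX1Z3",
        "clearing_code":   "CLR-0456",
    },
]

# Build one lookup per identifier scheme so each downstream system - front office,
# margin, risk - can ask in its own vocabulary and still land on the same security.
XREF = {
    scheme: {rec[scheme]: rec for rec in SECURITY_MASTER}
    for scheme in ("internal_id", "isin", "exchange_symbol", "clearing_code")
}

def resolve(scheme: str, value: str) -> dict:
    """Return the canonical security record for an identifier in any scheme."""
    return XREF[scheme][value]

# The front office's exchange symbol and the clearing code point at the same record.
assert resolve("exchange_symbol", "IDX1Z3") is resolve("clearing_code", "CLR-0456")
```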
Firms are also limited by their ability to capture changes. For example, firms need to react to exchange notifications immediately, and if they are not monitoring those changes and applying them in a way that can be leveraged in their systems, then it devalues their reference data. Missed notifications, and the resulting changes, can be quite costly to a firm, both directly and indirectly.
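A rough sketch of what applying such a notification might look like is shown below; the notification layout and reference store are purely illustrative assumptions, not any exchange's actual feed format:

```python
from datetime import datetime, timezone

# Illustrative in-memory reference store keyed by exchange symbol.
reference_data = {
    "IDX1Z3": {"tick_size": 0.25, "contract_size": 50, "last_updated": None},
}

def apply_notification(store: dict, notification: dict) -> None:
    """Apply an exchange change notification as soon as it arrives, recording
    when it was captured so missed or stale updates are easy to spot."""
    record = store.setdefault(notification["symbol"], {})
    record.update(notification["changes"])
    record["last_updated"] = datetime.now(timezone.utc).isoformat()

# Example: the exchange announces a tick-size change for an existing contract.
apply_notification(reference_data, {
    "symbol": "IDX1Z3",
    "changes": {"tick_size": 0.05},
})
print(reference_data["IDX1Z3"])
```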
Q. Are there any challenges firms need to work through to make sure they operate in a seamless fashion?
A. When connecting front, middle and back offices and the many third-party providers now utilised, reference data is often essential to connect the dots.
After spending significant budget on specialised software across the trade lifecycle, firms often find themselves unable to fully implement the software due to a few pieces of missing core reference data. Assumptions are made while procuring the software that the data is available, and it is not until implementation that the problems surface. It could simply be the need for a specific identifier such as an ISIN, or the need for attributes such as margin or clearing codes. Whatever the data needs are, having a more robust set of reference data can help firms avoid these costly speed bumps.
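One way to surface those gaps before implementation is a simple completeness check along the lines of the sketch below; the required fields mirror the examples mentioned above, while the record layout itself is a hypothetical illustration:

```python
# Required reference-data attributes for a downstream platform; adjust per system.
REQUIRED_FIELDS = ("isin", "margin_code", "clearing_code")

def missing_attributes(record: dict) -> list:
    """Return the names of required attributes that are absent or empty in a record."""
    return [field for field in REQUIRED_FIELDS if not record.get(field)]

# Illustrative ETD record with an empty margin code and no clearing code at all.
etd_record = {"symbol": "IDX1Z3", "isin": "XX0000000001", "margin_code": ""}
print(missing_attributes(etd_record))  # ['margin_code', 'clearing_code']
```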
Also, a lot of organisations are very siloed and need to move to a place where standardisation of data is the norm. Applying standards, whether through symbology or classifications, allows data systems to talk to one another. Any time a firm can introduce a standard data identifier, it allows its systems to better communicate with one another and with external parties, including regulators. This saves both the money and the effort of the resources who would otherwise have to manage the exceptions generated by a lack of standardisation.
Q. How can firms better leverage both technology and expertise (managed services) to gain access to, and use of, a broader set of reference data and tools?
A. A cost-efficient and flexible technology platform to manage reference data is imperative. Incoming data feeds change constantly – last year we saw the highest number of exchange feed updates – so exchange notifications need to be monitored diligently. Once changes are known, a firm must devote development and operational resources to address them. Outsourcing this work through managed services helps firms offset the costs and distractions those activities can bring, leaving in-house resources free to focus on trading strategies and core business activities.
Outsourcing also gives access to subject matter experts who can enrich the data and ensure data quality. A firm's technology platform is obviously important, but a firm also requires subject matter experts who understand complex derivatives and can collaborate with the business and technology teams to ensure downstream users are getting the data they need. Ensuring that high-quality data is feeding AI, financial and risk models is very important. Those subject matter experts may not be readily available in house, so at SmartStream we combine our subject matter experts with our utility technology to offer a unique managed service – it's all about humans plus technology.
Q. What are the main benefits from adopting more granular reference data?
A. For firms, more accurate output from their analytics, models and AI will certainly drive better revenue results with lower costs and more robust risk mitigation.
SmartStream participated in a project with a large bank analysing the impact of better reference data on trade breaks and data exceptions across European exchanges. The bank's exceptions decreased by 85% with better reference data, and it calculated the cost savings to be in the multi-millions of dollars. Regarding risk, having more complete and accurate underlying data allows a firm to understand its exposure accurately and react accordingly.
Q. How do you see ETD reference data evolving in the next year?
A. Volumes are going to continue to increase, and exchanges will grow and expand globally, which means more data to manage. As firms move to the Cloud, I believe they will continue to take the opportunity to look for operational inefficiencies and start addressing them.
With exchanges continuing to create more diverse products, firms will need to ensure their front, middle, and back offices can support those products, both from a technical and an operational perspective.