Lynn Strongin Dodds looks at the progress being made, the gaps still to be filled and how legacy and new technologies are coexisting as the industry strives to digitalise derivatives post-trade processing, applying newer technologies such as DLT and cloud to improve operational processing and reduce costs.
Despite all the publicity about digitalisation, technological advancement has been sluggish in the derivatives arena. Sell-side firms are still lumbered with ageing infrastructure, and the cost of modernising is high, especially in today's lofty inflationary environment. This is why financial organisations are not reinventing the proverbial wheel but looking to blend old and new to create greater efficiencies and cost savings.
Blending rather than re-invention
The problems have been well documented. For example, the workhorse mainframe still dominates the banking industry, with over 60% of organisations still reliant upon it, according to a study, Preparing for a Cloud-enabled, Data-driven World, by Celent and DTCC.
In the post-trade universe specifically, processes evolved organically and through mergers and acquisitions over time. This created layers of complexity and a patchwork of workflows, systems and technology. Unsurprisingly, this has made it challenging to manage trades in a timely and efficient manner, as well as to provide a holistic, accurate picture across execution and clearing.
“Over the past 15 years we have seen progress in trade lifecycle processing, back-office functions and settlement,” says Philippe Buron, chief technology officer at cloud-based data integrity and reconciliation service provider Duco. “However, the current state of play is that many investment banks are still operating with fragmented infrastructure and relying on manual processes in the derivatives space.”
Virginie O’Shea, founder of Firebrand Research, agrees, adding: “In terms of overall post-trade digitalisation outside of clearing, the pace has been slow and there has been little to motivate firms to change. There has been investment, but it is not enough and there are huge gaps, especially with PDF-based documentation such as terms and conditions documents, where it can be hard to monitor expiration dates or triggers and classification changes. If you do not have the right way to manage the underlying documentation, then it is difficult to manage the derivatives lifecycle and risk.”
O’Shea also points to disparate regulatory regimes and a lack of standardisation hampering innovation and efficiency. In the US, the Commodity Futures Trading Commission (CFTC) and the Securities and Exchange Commission (SEC) are forging their own paths, while in Europe there is the European Securities and Markets Authority (ESMA). That is not even mentioning the UK’s Financial Conduct Authority (FCA), the Australian Securities and Investments Commission (ASIC), the Monetary Authority of Singapore (MAS) and Hong Kong’s Securities and Futures Commission.
“There is a recognition that there needs to be a common framework for data and, while there are always national priorities, it would be better if they could work together for the good of the industry and for better monitoring of global systemic risk,” she says. “For example, our analysis shows that there is only about 49% similarity between the data sets required for the CFTC Rewrite and the EMIR Refit.”
Other pain points are commission management and the inability of outdated systems to accurately capture all the trade information required to precisely calculate and accrue execution, commission and exchange fees, according to market participants. However, it is not just the big-ticket items but also the smaller components that can cause problems.
Take rate cards, for instance. “Through M&A, some firms have acquired multiple legal entities using different brokerage services across different territories, which has resulted in a plethora of rate cards,” says Daniel Carpenter, CEO of Meritsoft, a Cognizant company. “This can mean one rate card per instrument, per market in some cases, and with spreadsheets still commonly being used to track them, they are not rationalised, and the true costs are obscured. Without full transparency across the organisation, it’s difficult to renegotiate outdated rates effectively, which impacts overall profitability.”
DLT – a slow burn
In the not-too-distant past, distributed ledger technology (DLT) was seen as a revolutionary solution, but over the years it has not delivered the promised goods. This is reflected in a recent study by Acuiti, DLT in Derivatives: Crypto innovation, traditional technology and the market of tomorrow, produced in partnership with ION.
The report confirms that, despite the efficiencies DLT will bring to derivatives, it is not a panacea for every process or asset class. It noted that the technology is unlikely to play out as a full-scale overhaul of traditional finance systems in the near future. Instead, firms are set to integrate DLT in tandem with other technology that has developed within established structures.
“For all of the justified hype surrounding DLT, technology developed in traditional finance has stood the test of time and supported the market for decades,” the report added. “Traditional technology and vendors are therefore likely to play a key role in the development of a digital market structure.”
As with many new innovations, DLT will evolve and there will be many failures along the way. One of the biggest and most heralded has been the Australian Securities Exchange’s (ASX) attempt at overhauling its systems. After a series of delays, ASX hit pause on a protracted project to replace its Clearing House Electronic Subregister System (Chess) with a new clearing and settlement platform based on blockchain technology. The decision, which will result in a write-off of between AU$245 million and AU$255 million ($163-$170 million), follows an independent review by Accenture, as stated in a recent ASX market announcement. The report identified multiple problems with the beleaguered project.
Lessons from the dot-com crisis
“In many ways the early development of blockchain and DLT has comparisons with the dot-com experience,” says Will Mitting, founder of Acuiti. “In the initial phases there was huge hype and valuations skyrocketed as people thought it would be very disruptive and quickly deliver a range of solutions. However, the dot-com crash showed that expectations had too tight a time horizon. Within a decade, far more had been achieved than was thought possible in the early days, but it took much longer than people originally expected.”
“Ultimately, I think DLT will take time to evolve and there will eventually be a hybrid between DLT and traditional technology,” he says, adding that, to date, much of the activity has been in small areas with simple workflows and a smaller number of participants, such as the repo market. The most recent example is HQLAx and J.P. Morgan, along with Wematch and Ownera, simulating a cross-chain delivery-versus-payment repo transaction.
A group press release in October announcing the demonstration of feasibility explains the process: “The demonstration showed how rights to securities, recorded in digital collateral records (DCRs) on the HQLAX ledger, and digital cash, recorded at J.P. Morgan, could be recorded and transferred using two different DLT platforms. The simulated transaction was negotiated in the Wematch trading front-end. Ownera connected Wematch and the two distributed ledgers using the open-source FINP2P routing protocol, ensuring the visibility of assets in Wematch, and coordinating the DvP settlement across the HQLAX and J.P. Morgan platforms.”
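For readers less familiar with the mechanics, the core idea of delivery-versus-payment across two ledgers can be sketched in a few lines of code. This is a deliberately simplified illustration of the all-or-nothing principle described above, not the actual FINP2P protocol or either platform's API; all class and function names here are hypothetical.

```python
class Ledger:
    """A toy single-asset ledger mapping account names to balances."""
    def __init__(self, name, balances):
        self.name = name
        self.balances = dict(balances)

    def can_debit(self, account, amount):
        return self.balances.get(account, 0) >= amount

    def transfer(self, src, dst, amount):
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount


def settle_dvp(securities_ledger, cash_ledger, seller, buyer, quantity, price):
    """Settle both legs only if each ledger can honour its side.

    Both legs are checked before either ledger is touched, so a failure
    on one side never leaves a half-settled trade -- the essence of DvP.
    """
    if not securities_ledger.can_debit(seller, quantity):
        return False
    if not cash_ledger.can_debit(buyer, price):
        return False
    securities_ledger.transfer(seller, buyer, quantity)  # delivery leg
    cash_ledger.transfer(buyer, seller, price)           # payment leg
    return True


# One ledger records digital collateral, the other digital cash,
# standing in for the two DLT platforms in the demonstration.
collateral = Ledger("collateral", {"BankA": 100})
cash = Ledger("cash", {"BankB": 1_000_000})

ok = settle_dvp(collateral, cash, seller="BankA", buyer="BankB",
                quantity=100, price=1_000_000)
```

In the live demonstration the equivalent coordination runs across two independent distributed ledgers, which is what makes a neutral routing layer such as FINP2P necessary.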
Looking ahead
Brian Collings, CEO at Torstone Technology, also believes that the applications for DLT are narrow. “I have always said that DLT itself was a solution looking for a problem,” he says. “What we are seeing is that firms are increasingly turning to cloud-based modular post-trade architecture. It has become more mature, and the advantages are that it can handle large volumes while also offering flexibility and scalability. Legacy systems have become ingrained in firms’ architecture, and they can pose a risk. They are too difficult to get rid of, though, and the cloud allows legacy systems to continue operating but surrounded by new technology.”
Marc Jay, CTO and head of engineering at Taskize, echoes these sentiments. “There has been a lot of investment in DLT, but firms have not seen the rewards. It is a long game, and the focus on DLT alone is not enough to address the problems of legacy systems. Firms are using cloud-based solutions as a bridge between old and new. It is more efficient and makes life easier. It means they do not have to patch their old systems up, and they can do more for less.”
Looking ahead, it is no surprise that the more liquid end of the derivatives sphere will continue to be primed for digitisation, while the thornier end, such as OTC, will continue to be a challenge. “I think what we will end with is the very plain vanilla on one side, where instruments are commoditised, easy to execute and users can trade with very little human intervention,” says David Raccat, UK CEO and co-founder of Wematch.
“The other stream will consist of the more bespoke products, which will have to be handled more physically before they go onto a platform. The question then is what tools can be offered for the trading, negotiation, data, back-office processes and liquidity,” he adds.