Learning Curve: ISDA Common Domain Model
Proponents of the ISDA Common Domain Model say that, if properly implemented, it could generate major cost savings for financial institutions. But what is it, what prompted its creation, and how could it work with distributed ledger technology?
GlobalCapital spoke to Stuart McClymont, managing director at JDX base60, a consultancy that has supported ISDA and its members on the CDM initiative.
What is the ISDA CDM and who are the major stakeholders involved?
The ISDA CDM is an initiative driven by ISDA and its members. It aims to develop an industry data standard for booking and representing OTC derivative products and the events that occur on these products throughout their lifecycle.
It is fundamentally a common language that all market participants, regardless of function — trading, risk, settlements, collateral, documentation — can use to store, manage, process and communicate derivative products more efficiently.
The major stakeholders involved are significant market participants like tier one investment banks, large asset and investment managers and large financial market infrastructure (FMI) providers.
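To make the idea of a single shared representation concrete, here is a minimal sketch of how a common product-and-event model might look. All of the type and field names below are hypothetical simplifications for illustration; the actual CDM defines a far richer model that ISDA distributes in machine-executable form.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class EventType(Enum):
    # A shared vocabulary of lifecycle events, usable by trading,
    # risk, settlements, collateral and documentation alike
    EXECUTION = "execution"
    PARTIAL_TERMINATION = "partial_termination"
    NOVATION = "novation"

@dataclass(frozen=True)
class InterestRateSwap:
    # Hypothetical, heavily simplified product representation
    trade_id: str
    notional: float
    currency: str
    fixed_rate: float
    effective_date: date
    termination_date: date

@dataclass(frozen=True)
class LifecycleEvent:
    # Every function reads and writes the same immutable record,
    # instead of maintaining its own interpretation of the trade
    event_type: EventType
    event_date: date
    trade: InterestRateSwap

swap = InterestRateSwap("T-001", 10_000_000.0, "USD", 0.025,
                        date(2024, 1, 15), date(2029, 1, 15))
event = LifecycleEvent(EventType.EXECUTION, date(2024, 1, 15), swap)
print(event.event_type.value, event.trade.trade_id)
```

Because every participant would hold structurally identical records like these, two counterparties' views of the same trade could be compared field by field rather than reconciled across divergent in-house formats.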
What's the history of the project?
ISDA and its members developed an industry legal standard 30 years ago to underpin the trading of OTC derivatives: the ISDA master agreement. This standard agreement allowed market participants to trade in much higher volumes and more efficiently, rather than negotiating lengthy bespoke legal terms for every trade.
This legal standard fueled derivatives trading. But the explosion in activity resulted in a duplicative and costly patchwork of technology, with every market participant storing, managing and processing these trades according to its own interpretation and needs.
This in turn resulted in a costly reconciliation environment and infrastructure between market participants to keep everyone’s views in line. Market participants have tried to reduce the costs of supporting derivatives through outsourcing, offshoring, redundancies and cutting businesses. But their cost-to-income ratios are still far too high because of this legacy technology.
ISDA and its members realised in 2017 that they needed a common data standard to consistently store, manage and process derivatives, and to capitalise on innovative new technology opportunities. A common data standard is critical to tackling the 30-year accumulation of legacy technology.
The premise behind the project is to create an industry data standard similar to when ISDA created its industry legal standard.
What sorts of cost savings could banks realise if the CDM is properly implemented?
A number of surveys over the years have estimated the cost of running derivative operations and technology support infrastructure across the industry at between $3bn and $7bn. Implementing a common data standard to drive smart contracts and programmable events, processed and managed on distributed ledger technology, could ultimately cut more than 50% of that cost.
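As a back-of-the-envelope check, the implied savings range follows directly from the figures above (treating 50% as the conservative lower bound of "more than 50%"):

```python
# Industry-wide cost estimates for derivative operations and
# technology support infrastructure (survey range cited above)
low_cost, high_cost = 3e9, 7e9

# "More than 50%" of that cost is the claimed potential saving
savings_low = 0.5 * low_cost    # 1.5bn
savings_high = 0.5 * high_cost  # 3.5bn
print(f"at least ${savings_low / 1e9:.1f}bn to ${savings_high / 1e9:.1f}bn")
```

So even at the bottom of the surveyed range, the claim amounts to savings of at least $1.5bn across the industry.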
Other than banks, what other financial institutions are participating in CDM development, and how could they benefit?
Market participants will only realise the full benefits if all of them are building, storing, managing, processing and leveraging a common data standard. Players from all corners of the industry need to participate, and are doing so: from the tier one investment banks to the large buy-side institutions, to CCPs, to trade repositories and innovative technology providers.
How far away are we from implementation?
In my view, this year we will see a number of proofs of concept and live production use of the CDM. It will be piloted, tested and used in certain cases. ISDA will then drive adoption of the new data standard over the next few years.
What potential is there for the use of distributed ledger technology with the CDM?
There is huge potential, as the whole concept of DLT is that people use it as common infrastructure to store, manage and process information. If we don’t leverage a common data standard when building on DLT, we will just create even more of a legacy technology burden and never realise the full opportunity of this innovative technology.
We would simply end up reconciling our own views of our data to this distributed view and never reduce the mountain of technology cost. Moreover, the amount of capital that could be unlocked by a simpler, more streamlined and efficient post-trade infrastructure dwarfs the cost numbers.