Solving the derivatives efficiency puzzle
The derivatives efficiency puzzle facing many organisations is not the volume of data they possess but how to maximise its utility.
From serving customers better, to streamlining operations and regulatory reporting, data is the lifeblood of every capital markets business. Moreover, many of the challenges facing financial organisations come down to their ability to marshal and process data. In the post-trade environment, the “rubbish in, rubbish out” axiom has become something of a cliché. However, its validity is borne out by the fact that almost every firm has invested in standardisation and automation of data processes and reporting.
Over recent months, a growing consensus has emerged among capital markets participants that the best way to get a grip on data is through cloud technology. This is a marked departure from the past, when many firms were concerned about both cost and security. Now those perceptions have reversed: the cloud is seen as more secure than on-premise solutions, and its flexibility (only use what you need) is acknowledged as a potential route to cost savings. In addition, the cloud offers scalability and significant collaboration potential. According to Deloitte, about 25% of global banks’ activities are already supported by the public cloud or use software hosted on the cloud, and over 40% of financial services activity may be cloud-hosted within a decade.
Probably the greatest potential benefits offered by cloud-based solutions relate to regulation. Whereas legacy technologies often require hardcoding to remap processes to new requirements, the cloud allows firms to adapt quickly to regulatory demands. The CFTC Rewrite, phase two of which is expected to come into force later this year, is one example. The Rewrite brings wide-ranging changes to the regulator’s swap data reporting framework, introducing new, harmonised data and reconciliation requirements as well as an obligation for firms to verify the completeness and accuracy of data held at Swap Data Repositories (SDRs).
The first phase of the Rewrite, which came into force in December, introduced requirements including unique transaction identifiers (UTIs) and collateral reporting, as well as more stringent rules on reconciliation. It also reduced the number of reportable fields from around 200 to 128. Phase two is expected to bring obligations to use unique product identifiers (UPIs) and ISO 20022 messaging standards, and possibly updated reconciliation requirements, including the need for reporting parties to correct any errors or omissions within seven business days of their identification.
In Europe, the EMIR REFIT goes fully live in early 2024 and will require many of the same adjustments envisaged by the CFTC, including those relating to messaging formats, UPIs and UTIs. These in turn will require dedicated linking fields, so that firms can connect open trades with trades executed after the new rules come into force.
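To make the identifier requirements concrete, the simplified Python sketch below shows the kind of record-level check they imply: a report carrying a UTI and, in due course, a UPI, plus an illustrative linking field connecting an open legacy trade to submissions made under the new rules. The field names and layout are assumptions made for the example, not the official CFTC or ESMA field definitions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SwapReport:
    uti: Optional[str]        # unique transaction identifier (phase one / REFIT)
    upi: Optional[str]        # unique product identifier (expected in phase two)
    prior_uti: Optional[str]  # illustrative linking field connecting an open
                              # trade to reports made under the new rules
    notional: float
    currency: str

def missing_identifiers(report: SwapReport) -> List[str]:
    """Return the identifier fields a report still lacks before submission."""
    gaps = []
    if not report.uti:
        gaps.append("uti")
    if not report.upi:
        gaps.append("upi")
    return gaps

# Example: an open legacy trade that already carries a UTI but no UPI yet.
legacy_trade = SwapReport(uti="UTI-2023-0001", upi=None, prior_uti=None,
                          notional=10_000_000, currency="USD")
print(missing_identifiers(legacy_trade))  # ['upi']
```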
Many of these changes are daunting, requiring market participants to remap their data processes, handle new types of data and accelerate error detection. Moreover, the regulation’s reach will extend well beyond trade counterparties, taking in SDRs, execution facilities and clearing houses. None of this will be easy across what are sometimes hundreds of systems, many of which are built on old databases. Reconciliations, which are inherently complex from a data perspective, will be particularly tricky, requiring analysts to pull data from multiple systems in order to comply.
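As a rough illustration of why reconciliation is the hard part, the sketch below compares a firm’s own view of its open swaps with a hypothetical SDR extract and stamps each break with a seven-business-day correction deadline. The data, field names and business-day logic are assumptions made for the example; real reconciliations span far more fields, systems and calendars.

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Naive business-day arithmetic: skips weekends, ignores public holidays."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday to Friday
            days -= 1
    return current

# Hypothetical views of the same portfolio, keyed by UTI, valued by notional.
firm_view = {"UTI-0001": 10_000_000, "UTI-0002": 5_000_000}
sdr_view = {"UTI-0001": 10_000_000, "UTI-0002": 4_500_000, "UTI-0003": 2_000_000}

as_of = date(2023, 9, 1)
for uti in sorted(firm_view.keys() | sdr_view.keys()):
    if firm_view.get(uti) != sdr_view.get(uti):
        deadline = add_business_days(as_of, 7)
        print(f"Break on {uti}: firm={firm_view.get(uti)}, "
              f"SDR={sdr_view.get(uti)}, correct by {deadline.isoformat()}")
```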
For many businesses, the most obvious way to create a more reliable approach would be to move post-trade processes to cloud-based platforms as far as possible. Not only would this make sense from a compliance perspective; it would also chime with the wider move to cloud-based technologies, for example to expand ecosystems or use artificial intelligence. That said, many in financial markets are just at the beginning of their cloud journeys. Indeed, 65% of banks face challenges in adopting cloud computing, according to Deloitte. Hurdles include migration of legacy data, the need for new approaches to resource scheduling, and the demands of switching over currently siloed applications.
To provide a way forward, a range of service providers have jumped into the space, offering financial institutions the opportunity to meet regulatory and operational requirements through on-demand, cloud-based solutions. These proprietary offerings include benefits such as faster aggregation, normalisation, and reconciliation of data for regulatory reporting, and the ability to ingest data from multiple systems and in multiple formats.
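As a loose illustration of what ingesting data from multiple systems and formats can involve, the sketch below maps a CSV extract and a JSON feed, each with its own field names, onto one common schema before any aggregation or reporting takes place. The schema and field names are invented for the example and do not reflect any particular vendor’s data model.

```python
import csv
import io
import json

def from_csv(text: str) -> list:
    """Normalise a CSV extract (e.g. from a legacy booking system)."""
    return [{"uti": row["uti"],
             "notional": float(row["notional"]),
             "currency": row["currency"]}
            for row in csv.DictReader(io.StringIO(text))]

def from_json(text: str) -> list:
    """Normalise a JSON feed that uses different field names."""
    return [{"uti": msg["tradeId"],
             "notional": float(msg["amt"]),
             "currency": msg["ccy"]}
            for msg in json.loads(text)]

csv_feed = "uti,notional,currency\nUTI-0001,10000000,USD\n"
json_feed = '[{"tradeId": "UTI-0002", "amt": 5000000, "ccy": "EUR"}]'

# One common schema, regardless of where the record came from.
normalised = from_csv(csv_feed) + from_json(json_feed)
print(normalised)
```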
In parallel, a range of market infrastructure providers are joining the party. The Options Clearing Corporation (OCC), the world’s largest equity derivatives clearing organisation, said in May it had migrated its clearing, risk management and data management systems to the cloud after it received a no-objection notice from the U.S. Securities and Exchange Commission. Nasdaq migrated its first options market to the cloud at the end of last year and is set to migrate a second by the end of 2023. Bolsa Electronica de Chile recently announced its commitment to migrating to the cloud in 2024.
All these initiatives amount to significant tailwinds and reflect a growing consensus that the cloud offers the most dynamic approach to capital market standardisation and interoperability. For market participants that don’t have their heads in the clouds, it may be time to reconsider.