Downsides of standardisation
What are the hidden dangers of standardisation, and can it undermine operational resilience and increase risk?
Despite significant progress in the management of risks across the business, financial institutions continue to be severely impacted by operational disruptions. Annual gross losses from operational incidents were €20.3 billion in 2021, and the number of annual loss events has averaged about 67,000 over the past five years. Moreover, since 2017, loss levels have remained relatively stable, suggesting that financial institutions have their work cut out to translate the desire for more resilience into reality.
When they think about operational resilience, decision-makers tend to focus on the potential impacts of extrinsic events such as the COVID pandemic, cyber-attacks (as seen at UK-based software firm ION in February) or telecommunications failures. In fact, the culprit is often much closer to home. The event type with the highest frequency for most business lines from 2016 to 2021 was execution, delivery, and process management, which was responsible for more than 112,000 events over that period. The causes of these events are most often failed transaction processing or process management with trade counterparties and vendors.
While trade fails are an inevitable part of doing business, the fact that firms have been unable to put a dent in fail rates over the past few years is a concern. Until now, most fails could be repaired or reprocessed relatively easily. However, new regulation is changing that, with financial penalties increasingly imposed for trades that fail to settle on time. The Central Securities Depositories Regulation’s (CSDR’s) settlement discipline regime, which went live in February last year, is one such piece of legislation, focused on boosting settlement discipline across asset classes. CSDR imposes cash penalties to prevent and address settlement fails.
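To make the mechanism concrete, a CSDR-style cash penalty accrues daily for as long as a trade fails, based on the value of the failing transaction. The sketch below illustrates the shape of the calculation only; the daily rates shown are illustrative assumptions, not the official CSDR penalty schedule, under which actual rates depend on instrument type and liquidity classification.

```python
# Hedged sketch of a CSDR-style cash penalty for a settlement fail.
# The daily rates below are ILLUSTRATIVE ASSUMPTIONS, not the official
# CSDR schedule; real rates vary by instrument type and liquidity.

ILLUSTRATIVE_DAILY_RATES = {
    "liquid_equity": 0.0001,  # 1.0 basis point per day (assumed)
    "bond": 0.00001,          # 0.1 basis point per day (assumed)
}

def cash_penalty(instrument_type: str, reference_price: float,
                 quantity: int, days_failed: int) -> float:
    """Penalty = daily rate x value of the failing trade x days failed."""
    rate = ILLUSTRATIVE_DAILY_RATES[instrument_type]
    trade_value = reference_price * quantity
    return rate * trade_value * days_failed

# A 10,000-share equity trade at EUR 50 failing for 3 business days
# accrues 0.0001 * 500,000 * 3 = EUR 150 under these assumed rates.
print(cash_penalty("liquid_equity", 50.0, 10_000, 3))
```

Because the penalty compounds with every day the fail remains open, even modest per-day rates make repair-and-reprocess workarounds materially more expensive than they were before the regime.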
As the industry comes under rising pressure in the post-trade environment, firms have made efforts to cut fail rates, with standardisation seen as the most likely antidote. If data and workflows are standardised across the industry, the theory goes, then fail rates should tumble. To achieve a more level playing field, firms have looked to replace manual or outdated ways of working with modern infrastructure, common processes, and new data standards and protocols. And decision-makers have been willing to ignore the fact that standardisation can be a double-edged sword: if processes are standardised, for example, they may reduce competition for the effective provision of post-trade services.
Of all asset classes, the one most associated with a lack of standardisation, as well as incomplete, inaccurate and inconsistent data, is derivatives. Primarily, this is the result of the inherent complexity of the asset class, and the ever-rising number of underlyings. The impact of this complexity is most acutely felt during periods of high volatility. According to FIA data, record trade volumes in March 2020 led to 15 times more contracts not allocated on trade date. This was mainly because of bottlenecks in the settlement and clearing process. Similar frictions emerged in the wake of the Russian invasion of Ukraine and related market volatility.
In 2022, the FIA formally launched the Derivatives Market Institute for Standards, Inc (DMIST). DMIST’s first project is to look at the potential to standardise timeframes in the allocation process, which it hopes will also foster development of standardised IT-based solutions. These, in turn, should mitigate operational risks and increase efficiency across the industry. Certainly, the initiative is an important step. However, many industry participants understand that it is a means to an end, rather than an end in itself. The ultimate goal, according to the FIA, is to move to real-time settlement, which would remove many of the risks, costs, and inefficiencies in the trading lifecycle.
Real-time processing is, for now, very much over the horizon. For most market participants, a more practical first step would be to get their data houses in order. This would involve initiatives such as agreeing a standard data template for end-to-end trade processing workflows, as well as for consistent trade and order references. Many firms testify that trade ID fields are a long way from where they need to be. Meanwhile, it is not uncommon for major banks to offshore tasks such as reconciliation, and achieve efficiency ‘through the back door’.
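A minimal sketch of what such a standard data template might look like follows. The field names and the ID format here are hypothetical, invented for illustration rather than drawn from any industry standard; the point is that one trade reference, assigned at execution, is carried unchanged through allocation, clearing, and settlement, so that reconciliation can key on it.

```python
# Hedged sketch of a standardised trade record with consistent ID fields.
# Field names and the "TRD-" reference format are illustrative assumptions,
# not an existing industry schema.
from dataclasses import dataclass
import uuid

@dataclass(frozen=True)
class TradeRecord:
    trade_id: str      # single reference carried through the whole lifecycle
    order_id: str      # links back to the originating order
    instrument: str    # e.g. an ISIN or exchange product code
    quantity: int
    price: float
    counterparty: str

def new_trade_id() -> str:
    """Generate a globally unique trade reference (assumed UUID-based scheme)."""
    return f"TRD-{uuid.uuid4()}"

trade = TradeRecord(
    trade_id=new_trade_id(),
    order_id="ORD-0001",
    instrument="DE0001234567",
    quantity=100,
    price=101.25,
    counterparty="BANK-A",
)
```

With a template like this agreed across counterparties, a fail can be traced to a single record rather than re-matched across inconsistent ID fields in each firm’s back office.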
One challenge in the listed derivative space is that the wider push for standardisation across asset classes is often not an ideal fit. While cross-asset standardisation achieves economies of scale, it can also lead to the favouring of lowest-common-denominator solutions, affecting even simple choices such as the use of CSV and custom XML files rather than the more widely adopted FIX protocol. What derivative post-trade professionals really want, of course, is a single source of truth for back office data that reflects all of the market’s complexity. For now, that is something standardisation is unable to deliver.
Source: ORX Annual Banking Loss Report 2022.