Does the EMIR Refit miss the point?

A decade after global regulators introduced new requirements for derivative trade reporting, financial markets are bracing themselves for a fresh wave of regulation.

The next iteration of reporting requirements aims to tighten up aspects of previous regimes and tackle the growing issue of data mismatches – a particular issue in recent more-volatile markets. The EMIR Refit, CFTC Rewrite, and similar initiatives in Asia, show that regulators are united in a desire to get serious about harmonized data. However, in the rush for more granularity, are the new rules missing the point? 

The CFTC will go live with the first phase of its Rewrite in December. Market participants will be required to submit new, updated data fields, initiate reporting of collateral valuations, and use unique transaction identifiers (UTI) in the reporting process for the first time. In addition, there will be new validation rules, revised reconciliation requirements, and a move to standardised messaging formats, as well as the adoption of unique product identifiers (UPI) in a second phase. There will also be a duty to correct errors within seven business days, ramping up the pressure on market participants to ensure they have sufficient capacity and infrastructure in place.

In Europe, the European Commission in June endorsed ESMA’s proposed technical standards for the EMIR Refit, which will require sell-side firms to report against 89 new data fields, adding action and event fields, as well as mandating the ISO 20022 messaging format, UPIs and UTIs. These in turn will require linking fields, so that when Refit goes fully live in early 2024, firms can connect open trades with trades executed after the new rules come into force.

Commentary on the Rewrite and Refit to date has focused on the additional burdens that the regulation will create. These range from simple data sourcing, and the need to collect more information on counterparties, to switching away from legacy reporting formats. ESMA is keen to see firms post higher reconciliation pairing rates (the average rate in 2021 was only about 60 percent) and is imposing new responsibilities on firms to be clear about who is responsible for generating UTIs. ESMA is also considering an alternative approach to reconciliation logic, which would introduce a two-day lag and a new event date field to account for counterparties reporting on trade date rather than T+1.
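The pairing step behind those statistics can be illustrated with a minimal, hypothetical sketch: two counterparties' submissions are matched by UTI before any field-by-field comparison. The `TradeReport` fields and `pairing_rate` helper below are illustrative only, not part of any regulatory specification.

```python
from dataclasses import dataclass

@dataclass
class TradeReport:
    uti: str          # unique transaction identifier shared by both counterparties
    notional: float
    event_date: str   # ISO date the reporting firm attributes to the event

def pairing_rate(our_reports, their_reports):
    """Share of our reports whose UTI appears in the counterparty's
    submissions -- the 'pairing' step that precedes field-level matching."""
    their_utis = {r.uti for r in their_reports}
    paired = sum(1 for r in our_reports if r.uti in their_utis)
    return paired / len(our_reports) if our_reports else 0.0
```

A report that fails to pair at this stage never even reaches the stage where data-field interpretation differences would show up, which is why regulators treat UTI responsibility as a prerequisite for everything else.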

In a report published in May, information provider Acuiti found that 69 percent of firms expected significant challenges in implementing Refit, including both acquiring the capabilities needed to comply and handling error correction and resubmissions. Meanwhile, some firms remain concerned about ambiguity, for example over how to interpret certain fields. One element that has attracted attention is lifecycle events, where commentators argue that the changes may have the counterproductive effect of heightening the risk of trade breaks between counterparties. Finally, despite the fact that the new reporting regimes are designed to boost global alignment, there is lingering concern among firms over potential divergence, for example between the EU and UK regimes in relation to XML schema.

As businesses plan to implement the new rules incrementally, one of the biggest challenges is the belief that counterparties may interpret data fields differently than they do, Acuiti reports. That is, there is scope for logic mismatches in a highly complex data environment. And these crossed lines would not be restricted to market participants. Already, for example, some firms have pointed out differences in how regulators in Europe, the US, and Asia (where the HKMA, JFSA and MAS are introducing new reporting regimes) approach UTIs, amid varying levels of adherence to IOSCO standards. In short, firms are concerned that harmonization is a lot easier in theory than in practice.

As firms beef up operational workflows to comply with the new regulation, they will need to either repurpose legacy systems or acquire new tools for data collection, marshalling, and reporting. In the rush to acquire sleek new systems, however, it is worth remembering that data mismatches sometimes have nothing to do with interpretation or complexity. One challenge that no regulation can truly abolish is basic human error at some point in the transaction process.

As much as regulators would like to create a seamless trade workflow from front to back, fatigue, miscommunication, and keying errors can never be eliminated entirely. Market participants will therefore always need, quite apart from staving off regulatory fatigue, to foster a culture of personal responsibility, care and double-checking to mitigate these risks. Human error aside, the imperative remains better automation of data quality and improved reconciliation processes.

This will be a critical success factor in coping with the upcoming regulatory changes, and tackling the legacy of ineffective reconciliation processes should be seen as an urgent priority.
