DCAM best practice: Identifiers and instrument ID

Published in Automated Trader Magazine Issue 37 Summer 2015

The capital markets sector is bedevilled by regulatory differences between the US, Europe and Asia. The lack of a common entity identification scheme (i.e. global LEIs), as well as of a common instrument ID scheme (e.g. for tagging collateralised debt obligations (CDOs), exchange-traded funds (ETFs) and the like), is a problem that the EDM Council is trying to address by encouraging adoption of its benchmarking 'how to' Data Management Capability Assessment Model (DCAM).

Michael Atkin, Managing Director, Enterprise Data Management (EDM) Council

DCAM is a purely data-driven, technology-focused methodology that aims to act as an industry-wide point of reference and 'how to' guide for data professionals.

The methodology is a synthesis of data management best practice. Banks, traders, asset managers - even infrastructure providers and other market participants - can score themselves against it to gauge the quality of their data.

DCAM maps to the data architecture, quality and governance concepts expressed in BCBS 239. This is a set of principles for risk data aggregation set out by the Basel Committee on Banking Supervision (BCBS) as part of its efforts to impose a strong "data control environment" on capital market participants, so that they can meet their enhanced post-crash reporting obligations and underpin global financial stability through better transparency.

BCBS 239: Key risk data aggregation rules

Adherence to the BCBS 239 risk data aggregation principles is mandatory and is one of the key forces driving financial institutions to implement enterprise-wide data management improvement programmes.

"BCBS 239 has got 14 principles in total, seven of which focus on data," says EDM Council's managing director, Michael Atkin. "They basically ask for cohesion via a governance mandate; an infrastructure mandate; and a data quality mandate. They all fit into the over-arching capital reserve/ collateral reporting regime, centralisation and transparency requirements laid out post-crash [and evident in Basel III, Dodd Frank, etc]. There is a technology element outlining the standardisation need, via FIX or ISO20022 XML messaging for instance, and a data content element which details taxonomies. The interesting thing is that all of this is supposed to play together."

Although such interoperability is vital, the challenges are enormous.

"The good news is that it's all the same data," continues Atkin, who also sits on the US CFTC's Technology Advisory Subcommittee on data standardisation and actively works with regulators and the regulated to try to ensure data uniformity. "If you define your data well to start with, tag it right and document and store it correctly, aligning it with your business rules, then you'll have the enhanced fast data quality that is needed for today's multi-asset capital markets. Your cost of compliance and operations will go down and you'll have minable data-rich business information that can deliver value - if you get it right."

This vision is dependent on ironing out regional regulatory differences. Entrenched market fragmentation and vested interests need to be overcome. The global financial markets will certainly look different under the weight of new post-crash regulations and the new real-time, intraday market drivers now coming to fruition. But whether the long-sought-after 'golden copy' and standardised utility platforms can deliver the hoped-for data nirvana remains debatable.