The Gateway to Algorithmic and Automated Trading

The Data Deluge

Published in Automated Trader Magazine Issue 37 Summer 2015

A data deluge is hitting capital markets as new surveillance, trade reporting, regulation and market practices come to fruition. Neil Ainger asks if firms can handle the huge volumes of trade reporting, intra-day collateral, repository and other forms of data now required on global financial markets.

Michael Atkin, Managing Director, EDM Council


"It's like hell on earth data-wise," says Michael Atkin, a renowned data professional and managing director of the Enterprise Data Management (EDM) Council trade body, in reference to the fragmented nature of over-the-counter (OTC) derivatives and other global capital markets.

The rising demand for data is being driven by a whole raft of new post-crash regulations such as the US Dodd-Frank rules and European Market Infrastructure Regulation (EMIR). These mean that the sell and buy side must store, report, consolidate and use data - often on a real-time basis - as never before. Better data quality, standardised utility-like platforms, efficiency and control are all required in this new environment.

The capital and collateral requirements coming in under Basel III are also driving a need for more intra-day information, and the reformed markets require more linkages to custodians and central counterparty clearing houses (CCPs). Still other regulations wait on the horizon, such as the EU's Markets in Financial Instruments Directive (MiFID) II and Market Abuse Directive (MAD) II, which aim to stop further Libor, FX and other such mis-selling scandals and rogue trading behaviour.

Compliance, allied to monitoring for risk concentration and market abuse, is the key driver of the data deluge.

Banks, asset managers, brokers, infrastructure providers and all other types of market participants - even national regulators themselves as the recipients of all this reporting data - are required to respond to the call for centralised reporting, clearing and enhanced transparency, emanating from the post-crash Pittsburgh G20 meeting back in 2009.

"If you try and respond in a fragmented way, with a sticking plaster approach, then you're going to drown," says Atkin. "Regulators are putting pressure on banks and others to manage their data better and present it to them in a more usable fashion. The demand covers capital market firms' risk assessments, collateral and other mandatory data, and often necessitates intra-day reporting. You also have to think about security and resiliency issues." (See DCAM best practice)

Source: Capco

According to Michael Cooper, chief technology officer (CTO) at BT Radianz, there is almost exponential growth in data volumes, driven by regulators' concern to capture all data. "Both structured and unstructured data - the latter requiring interpretation and often relating to social media and so-called 'big data' analytics - are being collected now for the prevention of market abuse and for forensic investigations afterwards," he says. "Market changes are also driving the data deluge (i.e. more margin calls, etc)."

Steve Colwill, director of Velocimetrics, a data analysis and benchmarking firm that offers real-time performance monitoring, rightly warns data should be seen as a competitive advantage in this new marketplace, adding that: "If data isn't clean, then it is just 'stuff'."

Robert Powell, Global Head of Compliance at Etrali


Moreover, it isn't just transaction reporting duties under EMIR, Dodd-Frank, or even the EU's Regulation on Wholesale Energy Market Integrity and Transparency (REMIT) rules that are driving the data deluge. The market abuse and conduct agenda, Know Your Customer (KYC), counterparty, tax beneficiary and other such storage requirements also matter, alongside centralised clearing and other transactional duties.

As Robert Powell, global head of compliance at Etrali warns: "EU rules, such as MAD II, cover all voice data and conversations, which need to be kept for surveillance purposes, regardless of where platform trades are executed or stored."

Compliance confidence

Stephen McGoldrick, co-chair of the EMEA regulatory subcommittee at the FIX Trading Community standards body, and a director of market structure at Deutsche Bank, accepts there is a rising data demand. "Yes, there will be a data deluge (in the wake of all these regulations)," he says, "but it's something we'll just have to cope with. I'm confident we can."

Speaking to Automated Trader at the FIX EMEA Trading Conference 2015 in London, McGoldrick wasn't despondent, commenting that: "Compliance is something we all have to achieve; so let's be positive about it and get the efficiency benefit of better data."

Stephen McGoldrick, Director of Market Structure, Deutsche Bank and Co-Chair of EMEA Regulatory Subcommittee, FIX Trading Community

The requirement to get a grip on data was illustrated recently when even the Bank of England (BoE) hired a chief data officer (CDO), Hany Choueiri, formerly of HSBC, at the turn of the year. There could be no more explicit demonstration of the UK central bank's determination to be able to handle the vast amount of data flowing into it, and of the wider elevation of data as a crucial issue within the industry.

"There is a data demand on us too as regulators," says Edwin Schooling Latter, head of Markets Infrastructure and Policy at the UK Financial Conduct Authority, on secondment from the BoE. "The FCA is investing more in technology because of it."

At the 2015 FIX EMEA Trading Conference, Schooling Latter told Automated Trader his organisation "certainly plans to" handle all the huge volumes of data flowing into it. He is confident the time and money invested in people, process and technology at the FCA will see a smooth transition to the new transactional data reporting, surveillance and other post-crash obligations.

Whether the same preparedness can be attributed to other, smaller national regulators is up for debate. Germany, France, the US and the UK will likely manage. "But there are 31 countries in the European Economic Area (EEA) and the likes of Iceland, Malta and Cyprus won't be able to invest in data handling capabilities," says Chris Pickles, of the Bloomberg Open Symbology Team. This may not matter if neighbours help out, and because their volumes are low, but it is a concern.

Tom Riesack, Managing Principal, Capco


According to Tom Riesack, managing principal at the Capco consultancy: "The first thing a major regulator needs to do is build up an analytical capacity to handle the data flows coming into it. This enables the regulator to make data-driven conclusions."

Notably, the same logic applies to large banks.

"Don't forget the consolidation challenge either," cautions Riesack, again focusing on regulators' data handling capabilities, though the point is equally valid for banks. "For instance, each EMIR derivative trade repository (TR) collects data in a different manner since the regime's inception in February 2014. Data is not yet being reconciled between TRs."

There were bound to be teething problems as regulations such as EMIR came into being, but Schooling Latter is clear about what the UK regulator wants data for. "Our objective is transparency and monitoring market abuse. Those are the key FCA objectives," says Schooling Latter.

Trade repositories

One of the immediate causes of the data deluge was the launch of TRs under EMIR in Europe in February 2014, covering derivatives, and before that of Swap Data Repositories (SDRs) under Dodd-Frank Title VII in the US. (Swap Execution Facilities (SEFs) separately provide pre-trade bid/offer information on execution platforms.)

Source: Capco

US players had to register as a swap dealer (SD) or major swap participant (MSP). The US Commodity Futures Trading Commission (CFTC) has led the way compliance-wise - too much so for some tastes - with rising complaints about a lack of regulatory coherence. It has been front-running the Securities and Exchange Commission's (SEC) own wider efforts, so perhaps the legal action from the International Swaps and Derivatives Association (ISDA) and other trade bodies, specifically targeting its cross-border guidance in December 2013, was inevitable.

In Europe, a handful of new TRs have been launched. Last year's initial wave of six approved by the European Securities and Markets Authority (ESMA) comprised: the UK-based ICE Trade Vault Europe Ltd (ICE TVEL); CME TR; the Regis-TR Iberclear/Clearstream joint venture; London Stock Exchange (LSE) UnaVista platform; Poland's KDPW; and the Depository Trust and Clearing Company (DTCC) repository.

In the US many of the same repositories are present, having started out there. The leading US players, with more joining all the time, include: Intercontinental Exchange (ICE) Trade Vault US; the DTCC Data Repository; and CME SDR.

All of the above repositories store OTC derivative trades in the hope that, if the worst happens and there is another Lehman-style collapse, unwinding the positions won't be so nightmarish. The repositories mean that a wave of data has been unleashed, with firms having to collate and store it. This data is passed on to the regulators as and when required, and remains stored for later investigation.

Away from transactional obligations, data monitoring for intra-day purposes, plus data mining and tracking for market abuse, or post-event forensics, are all crucial elements in the rising data mix. The data demand will only grow as MiFID II, MAD II and other such regulations come into full force in 2017, impacting other non-derivative segments of the capital markets.

"Meeting trade reporting requirements has been a big enough expense for many market participants," says Cian Ó Braonáin, global lead of Sapient Global Markets' regulatory reporting practice. "Despite sizable investments in internal systems, many banks are still grappling with data management challenges and inefficient trade reporting processes and governance.

"Buy side firms, leaning on their executing broker or clearing member to handle reporting on their behalf, must also now demonstrate to regulators that they're validating the delegated reporting against their own records."
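The validation step Ó Braonáin describes can be sketched as a simple reconciliation: the buy side firm compares the delegated reports received from its broker against its own trade records, field by field. The following Python sketch is purely illustrative - the trade-ID and field names (`trade_id`, `notional`, `counterparty_lei`) are hypothetical, and real EMIR reports carry far more fields:

```python
# Hypothetical sketch: reconcile delegated trade reports against internal
# records. Field names are illustrative, not an actual reporting schema.

def reconcile(internal_trades, delegated_reports, tolerance=0.01):
    """Return trades missing from the delegated feed, and trades whose
    reported fields disagree with the firm's own records."""
    reported = {r["trade_id"]: r for r in delegated_reports}
    missing, mismatched = [], []
    for trade in internal_trades:
        report = reported.get(trade["trade_id"])
        if report is None:
            missing.append(trade["trade_id"])
        elif (abs(trade["notional"] - report["notional"]) > tolerance
              or trade["counterparty_lei"] != report["counterparty_lei"]):
            mismatched.append(trade["trade_id"])
    return missing, mismatched

internal = [
    {"trade_id": "T1", "notional": 1_000_000.0, "counterparty_lei": "LEI-A"},
    {"trade_id": "T2", "notional": 250_000.0, "counterparty_lei": "LEI-B"},
]
delegated = [
    {"trade_id": "T1", "notional": 1_000_000.0, "counterparty_lei": "LEI-A"},
]

missing, mismatched = reconcile(internal, delegated)
print(missing, mismatched)  # → ['T2'] []
```

In practice the hard part is not the comparison itself but sourcing a clean internal record to compare against - which is exactly the data management challenge the banks above describe.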

Source: JWG Group

The data burden is borne out by Tim Thornton, chief data officer at Mitsubishi UFJ Fund Services. Commenting on the anniversary of EMIR TRs in February this year, he said: "Errors and mistakes during the reporting process are still common. Regulators have indicated they intend to clamp down on firms submitting inaccurate data, but fund managers have onerous reporting obligations, which mean processing significant quantities of data."

The requirement under EMIR that financial institutions, including asset managers, clear their OTC derivative transactions through CCPs is likely to come into force in August 2015, he adds.

"EMIR also imposes stringent risk mitigation procedures on firms' un-cleared OTC transactions including timely confirmation, dispute resolution, portfolio compression, daily valuation and recordkeeping. These rules are extra-territorial so firms must check to see if they're ensnared. The (new) deadline is fast approaching and fund managers must ensure they are well prepared to deal with these rules, or risk facing regulatory sanctions," said Thornton.

Collaboration

Collaborative initiatives designed to deal with the data deluge hitting capital markets firms are proliferating. Financial utilities; vendor-sourced shared services in the cloud (if they're secure against hack attacks); and new data services are coming to market all the time.

Data consolidation and much-needed standardisation initiatives are also on the rise. For example, in 2014 the old Bloomberg Global Identifier (BBGID) was rebranded as an open-standard Financial Instrument Global Identifier (FIGI) and endorsed by the Object Management Group (OMG), a non-profit standards consortium. Bloomberg wants FIGI to become the definitive global non-proprietary system for identifying securities instruments and firms. (See FIGIs at bottom.)

Source: JWG Group

DTCC is another major player in this space. It is launching the Clarient Entity Hub, a centralised reference data and document utility, for instance, and separately the DTCC-Euroclear Global Collateral joint venture to prepare for the coming market liquidity and collateral changes.

Additionally, the DTCC Deriv/SERV subsidiary is still relevant because it targets post-trade processing of derivatives through the firm's Trade Information Warehouse (TIW); via its Global Trade Repository (GTR), which admittedly faces the hurdle of regional regulatory differences; and the Equity Cashflow Matching (ECM) engine. Harmonising domestic and international differences is a stated aim for the firm.

Matt Stauffer, Chief Executive, DTCC Clarient


According to DTCC Clarient's CEO, Matt Stauffer, the Hub is designed to address global financial market participants' increasing need for greater control and transparency in the entity data management process. "It was developed in response to evolving risk management and regulatory requirements, many of which require a greater level of rigour in the collection and oversight of this data than in the past," he says.

These regulations include, but are not limited to, more robust global KYC processes, the Foreign Account Tax Compliance Act (FATCA), EMIR, Dodd-Frank, and other such rules, he adds.

Data partnerships

"There is a landscape of partnerships emerging as participants realise they can no longer serve all clients everywhere," says Sapient's Ó Braonáin. "Partnerships, whether defined on a one-to-one basis, as a 'utility', or a shared service, are helping to address weaknesses and gaps in business models, data, functional capability or geographic reach."

Examples he points to include: BNY providing central securities depository (CSD) services to CME; BNY with SIX for custodial services; Nasdaq NLX with LCH for cross-margining; and BNY with State Street and Eurex for listed/OTC clearing.

Collaborative undertakings such as these require shared data, standards and processing, plus mutual trust in each other's data quality, handling capabilities, security infrastructure and so on. A number of specifically data-focused initiatives that are more compliance driven, and not on a one-to-one basis, are also underway. Meanwhile, vendors are looking to serve the needs of smaller firms in the cloud. Larger players typically look to non-vendor aligned utilities or indeed each other for help.

Depending upon what part of the data spectrum a capital market firm wants to address - compliance, risk concentration or market abuse - and upon the size of the firm, various solutions are on offer. Accenture, for instance, has launched a transaction processing aggregator platform which is aimed at FIs wanting to pool their back office processing to cut IT and compliance costs.

Source: JWG Group

Accenture Post-Trade Processing received approval from the FCA in April. The post-trade utility has been set up in collaboration with Broadridge to support settlement, books and records, asset servicing, operational management and control, real-time data access and administrative accounting.

In the equities exchange arena, for instance, Euronext outsourced its real-time monitoring and pricing of derivatives instruments to Horizon Software in April, abandoning its expensive old proprietary system. Horizon will henceforth carry out this activity and calculate and disseminate settlement prices on an intra-day basis. There is, of course, no shortage of solutions.

Similarly, more than 40 clients are now using Trayport's online prototype REMIT reporting tool. Ancoa is also providing real-time market surveillance for LedgerX's Bitcoin options trading platform, enabling the exchange to detect and deal with potentially manipulative trading behaviours.

Larger capital market firms, with bigger IT and human resources, are not so dependent on vendors to help them meet their data demands, and typically look to themselves or shared utilities organised by bodies such as FIX or SWIFT, in which they have a stake. They're not keen on vendor-based cloud solutions. Bigger FIs particularly prize trade body utilities when they operate in non-competitive areas such as tax or financial crime compliance reporting duties.

JP Morgan has even launched its own algorithm to track any signs of market abuse among its own staff to identify rogue employees before they go astray, according to a Bloomberg report. Dozens of inputs, including whether workers skip compliance classes, violate personal trading rules or breach market-risk limits, will be fed into the algo pattern-spotting behavioural software. It is presently being tested in the trading business and will spread throughout the global investment-banking and asset-management divisions by 2016, a spokesman confirmed.
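The Bloomberg report gives only a flavour of the inputs. As a purely hypothetical illustration of how such pattern-spotting software might combine behavioural signals - the signal names, weights and threshold below are invented for this sketch, not JP Morgan's actual model, which would use far richer statistical methods:

```python
# Hypothetical rule-based behavioural scoring. Signals and weights are
# invented for illustration; a production system would use dozens of
# inputs and learned models rather than fixed weights.

WEIGHTS = {
    "skipped_compliance_class": 2.0,
    "personal_trading_violation": 5.0,
    "market_risk_limit_breach": 4.0,
}

def risk_score(events):
    """Sum the weighted flags recorded for one employee."""
    return sum(WEIGHTS.get(e, 0.0) for e in events)

def flag_employees(event_log, threshold=6.0):
    """Return IDs of employees whose cumulative score crosses the threshold."""
    return sorted(emp for emp, events in event_log.items()
                  if risk_score(events) >= threshold)

log = {
    "emp01": ["skipped_compliance_class"],
    "emp02": ["personal_trading_violation", "market_risk_limit_breach"],
}
print(flag_employees(log))  # → ['emp02']
```

The design point is that no single signal is damning; it is the accumulation of weighted flags across compliance, trading and risk systems that raises an alert - which in turn requires those systems' data to be consolidated, feeding the very deluge this article describes.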

It won't be easy to reform the capital markets, on both the sell and buy side, via utilities, vendors or in-house solutions, but the effort has begun. Surveillance, regulation and changed market circumstances are the key drivers of the data deluge.

FIGIs: Fruit of Bloomberg & OMG cooperation

Chris Pickles, Bloomberg Open Symbology Team


The Bloomberg Open Symbology (BSYM) project is an attempt to introduce better instrument and corporate identification tags in a multi-asset trading and investment environment. The Object Management Group (OMG) non-profit standards consortium adopted the 12-character BSYM identifier as the basis for its Financial Instrument Global Identifier (FIGI) specification in September 2014. This is intended to encourage global adoption and make it easier to identify securities such as equities, fixed income, indices, derivatives, currency and structured instruments across various venues. "The EDM Council is also supporting it as an open standard," says Chris Pickles of the Bloomberg Open Symbology team. Other standards are, of course, available too.

OMG hopes its FIGI - essentially a rebranding of the old Bloomberg Global Identifier (BBGID), which the firm stopped charging for in 2009 - will help introduce a standardised system for naming financial securities that is non-proprietary, offers broad coverage, and is free and universally available around the world without restrictive licence terms and fees.

As Pickles of the BSYM team tells it, International Securities Identification Numbers (ISINs), widely encouraged by the Association of National Numbering Agencies (ANNA) and used in the equity and bond markets, "haven't proved to be the common identifier standard that everyone hoped for". There are also so many other segmented identifiers out there - from differing Market Identifier Codes (MICs) for exchanges to Alternative Instrument Identifiers (Aii) for derivatives - that some standardisation is required, particularly with the BCBS 239 risk data aggregation and validity requirements in mind. "The FIGI could help to encourage the necessary linkages," says Pickles, who adds that there are 225 million financial instruments across all asset classes - from loans to swaps and so forth - at the moment. This number is growing fast, by five million a month, adding to the pressure to bring order to the system.
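The FIGI's published structure is simple enough to check mechanically: twelve characters drawn from digits and upper-case consonants (vowels are excluded so identifiers never form words), with 'G' fixed in the third position and a check digit at the end. A rough format check in Python, sketched from the OMG specification - note it validates structure only and omits the spec's check-digit arithmetic and reserved-prefix rules:

```python
import re

# Rough FIGI format check based on the OMG FIGI specification:
# 12 characters, digits and upper-case consonants only (no vowels),
# 'G' in position 3, and a trailing check digit. The spec's actual
# check-digit calculation and disallowed prefixes are not enforced here.
FIGI_PATTERN = re.compile(r"^[B-DF-HJ-NP-TV-Z0-9]{2}G[B-DF-HJ-NP-TV-Z0-9]{8}\d$")

def looks_like_figi(identifier: str) -> bool:
    """Return True if the string matches the basic FIGI shape."""
    return bool(FIGI_PATTERN.match(identifier))

print(looks_like_figi("BBG000BLNNH6"))  # well-formed FIGI shape → True
print(looks_like_figi("US4592001014"))  # an ISIN: no 'G' in position 3 → False
```

The contrast with the ISIN in the second call shows why the two systems cannot be confused at the format level, even before any registry lookup.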