July 26, 2018 - Paul Lashmet | Industry Solutions

Consolidated Audit Trail: Outside Looking In

The primary purpose of the Consolidated Audit Trail (CAT), a rule under the Securities Exchange Act, is to arm regulators with the data they need to effectively conduct market surveillance and investigations into suspicious trading activities across all national exchanges. The difference between CAT and current trade reporting regimes is that it covers more than just trade data. It also requires information about the client and about how trades are broken up and allocated to client accounts, which will enable regulators to understand the full lifecycle of a trade and with whom it has been transacted. If your organization lacks a comprehensive and current view of all of your data, and your internal surveillance processes are fragmented, you run the risk of external parties (the regulators) knowing more about your business than you do.

About CAT

The concept of the CAT became official in July of 2012 when the Securities and Exchange Commission (SEC) adopted Rule 613, requiring self-regulatory organizations (SROs) like FINRA to plan the creation, implementation, and maintenance of a central data repository to which all brokers and dealers report their equity and option trades. The SROs and their members put forth a plan, and it was adopted in November of 2016. Since then, a CAT processor has been selected and business clocks have been synchronized among SROs, brokers, and dealers. Coming up in November of 2018, large SRO members are expected to begin submitting data to the central repository. Smaller firms start submitting their data in 2019.

Some key differences between CAT and current reporting regimes such as Electronic Blue Sheets (EBS) and the Order Audit Trail System (OATS) are listed below.

  • Data linkages:  CAT requires customer account information to be provided and linked to the trade information.  It also requires that trade allocations, when a large trade is broken into smaller pieces that are then allocated across several accounts, are included.  OATS requires neither client nor allocation information. EBS requires both, but reports are only provided at the request of a regulator, while CAT requires daily reporting.
  • Timeliness of reporting:  In addition to daily reporting noted above, the timestamp granularity for recording trade events goes from seconds to milliseconds.
  • Data accuracy:  If regulators find exceptions to the data you provide, the response time from initial submission to remediation and resubmission decreases from five days (under OATS) to three (under CAT).

In short, CAT requires more and better data at higher granularity in less time than current regimes.  Regulators will have a broad and comprehensive view of what is happening in the market, which will place more outside scrutiny on your internal processes.
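To make the linkage and granularity requirements more concrete, below is a minimal, hypothetical sketch of what a reportable trade event might look like once customer links, allocation links, and millisecond timestamps are attached. The field names and structure are illustrative assumptions only; they are not taken from the CAT technical specifications.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import List

    @dataclass
    class Allocation:
        """One slice of a parent trade allocated to a client account (illustrative)."""
        account_id: str            # client account receiving the allocation
        quantity: int              # shares allocated to that account
        allocation_time: datetime  # event time captured to the millisecond

    @dataclass
    class ReportableTradeEvent:
        """Illustrative order/trade event carrying the linkages CAT expects."""
        event_id: str
        symbol: str
        side: str                  # "BUY" or "SELL"
        quantity: int
        event_time: datetime       # captured to the millisecond
        customer_id: str           # link to customer/account reference data
        allocations: List[Allocation] = field(default_factory=list)

        def is_fully_allocated(self) -> bool:
            # Simple completeness check before daily submission:
            # allocated quantity must add up to the parent trade quantity.
            return sum(a.quantity for a in self.allocations) == self.quantity

    # Example: a 10,000-share buy split across two client accounts.
    event = ReportableTradeEvent(
        event_id="T-0001",
        symbol="XYZ",
        side="BUY",
        quantity=10_000,
        event_time=datetime(2018, 11, 15, 14, 30, 5, 123000, tzinfo=timezone.utc),
        customer_id="CUST-42",
        allocations=[
            Allocation("ACCT-A", 6_000, datetime(2018, 11, 15, 15, 0, 1, 456000, tzinfo=timezone.utc)),
            Allocation("ACCT-B", 4_000, datetime(2018, 11, 15, 15, 0, 1, 789000, tzinfo=timezone.utc)),
        ],
    )
    assert event.is_fully_allocated()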

The Wisdom of the Crowd Will Highlight Your Data Quality Issues

Regulators could potentially spot your data quality issues before you do because, with CAT, they will have a full view of every other participant you deal with and can reconcile across the market.  If the party on the other side of the trade (you bought and they sold) reports account information that is up to date while you reported obsolete information, an exception is reported, and you have three days to respond.  Considering the volume of trades that you process daily, a lot of time could be spent responding. Also, with a 5% maximum allowable error rate, you could run out of goodwill with regulators very quickly.

Keeping up with changes across complex data linkages is one of the biggest challenges of conforming to CAT.  Trades are entered, amended, canceled, and settled in the course of normal business. Simultaneously, account information linked to a trade can be updated at any time for a variety of reasons, such as bankruptcy, mergers, account consolidations, or basic data corrections.  With allocations added to the mix, you get a dynamic environment of complex data linkages that need to be reported to the millisecond. Additionally, the underlying data sources are disparate because the trade execution platforms used in the front office are not the platforms that manage account data, allocations, and settlement functions.
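As a simplified illustration of that linkage problem, the sketch below compares a hypothetical account master (the upstream source) against trade records that were linked earlier (the downstream copy) and flags trades whose account attributes have since changed. The data model, field names, and rule are assumptions for illustration, not a prescribed CAT check.

    from datetime import datetime, timezone

    # Hypothetical upstream account master and downstream linked trade records.
    account_master = {
        "CUST-42": {"lei": "LEI-NEW-999",
                    "last_updated": datetime(2018, 11, 14, 9, 0, tzinfo=timezone.utc)},
    }

    linked_trades = [
        {"event_id": "T-0001", "customer_id": "CUST-42", "lei": "LEI-OLD-123",
         "linked_at": datetime(2018, 11, 13, 16, 0, tzinfo=timezone.utc)},
    ]

    def stale_linkages(trades, accounts):
        """Flag trades whose linked account attributes lag the account master."""
        flagged = []
        for t in trades:
            acct = accounts.get(t["customer_id"])
            if acct is None:
                flagged.append((t["event_id"], "unknown account"))
            elif acct["last_updated"] > t["linked_at"] and acct["lei"] != t["lei"]:
                flagged.append((t["event_id"], "account data changed after linkage"))
        return flagged

    print(stale_linkages(linked_trades, account_master))
    # [('T-0001', 'account data changed after linkage')]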

[Image: Consolidated Audit Trail heat map, visualizing the risk priority of flagged trades]

The accompanying image is an example of how you could stay ahead of ever-changing data linkages.  It is a heat map that visualizes results derived from advanced analytics models such as machine learning algorithms.  In this case, the algorithms comb through the account, trade, and allocation data, evaluating the full lifecycle of live trades.  If material changes occur in the upstream account data that are not reflected in the downstream trade data, then the relevant trades are flagged, risk-assessed, and prioritized.  The heat map visualizes that priority: the redder the box, the higher the risk and the more quickly the issue should be resolved. Clicking on a box provides granular details of the trades that must be reported to CAT by 8:00 am the following morning.
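The solution described above relies on machine learning models; as a simplified, rule-based stand-in, the sketch below shows how flagged trades could be scored and grouped into the cells of such a heat map. The weights, fields, and aggregation level (per desk) are illustrative assumptions.

    from collections import defaultdict

    # Flagged trades coming out of a change-detection step like the one sketched
    # earlier; the fields and weights here are illustrative, not the actual model.
    flagged = [
        {"event_id": "T-0001", "desk": "Equities", "hours_to_deadline": 4, "notional": 2_500_000},
        {"event_id": "T-0002", "desk": "Equities", "hours_to_deadline": 14, "notional": 50_000},
        {"event_id": "T-0003", "desk": "Options", "hours_to_deadline": 2, "notional": 900_000},
    ]

    def risk_score(trade):
        """Higher score = redder box: less time to the deadline and larger notional."""
        urgency = max(0.0, 1.0 - trade["hours_to_deadline"] / 24.0)
        size = min(1.0, trade["notional"] / 5_000_000)
        return round(0.6 * urgency + 0.4 * size, 2)

    # Aggregate to the level the heat map displays (here, per desk); each cell
    # keeps its drill-down list of individual trades.
    heatmap = defaultdict(list)
    for t in flagged:
        heatmap[t["desk"]].append((t["event_id"], risk_score(t)))

    for desk, cells in heatmap.items():
        print(desk, max(score for _, score in cells), sorted(cells, key=lambda c: -c[1]))

In practice the scores would come from the analytics models themselves; the point is that each cell retains the underlying trades so issues can be investigated and resolved before the 8:00 am reporting deadline.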

A constant, automated review of data quality across the full lifecycle of all trades will help ensure complete and valid reporting.  As an added bonus, if you do get an exception report, you have what is needed to reconstruct the trade and quickly defend or fix the submission.

External Surveillance Needs Internal Surveillance

External market surveillance, the surveillance performed by regulators, may spot suspicious behavior that is directly or indirectly connected to your firm.  Inquiries will be made, and you need to respond in as timely a manner as possible. The regulators already have the necessary trade information because you provided complete and valid data to CAT (see the section above).  What investigators will be looking for is additional insight into the intent of the transaction, and that requires internal data generated by your electronic communications, voice, and other surveillance platforms.

An internal surveillance program as described in the post "Cross-Functional Trade Surveillance" will optimize your response to regulatory inquiries while demonstrating that you have a robust and defensible process in place.  In that post, we explain what it takes to move from a fragmented surveillance program to a holistic one. The latter will enable analysts to visually explore data across surveillance platforms and correlate patterns of risky activity that would not normally be identified when looking at each area in isolation.

Move Beyond Traditional Data Platforms and Business Intelligence Tools

The volume and variety of data needed to build the solutions described above require you to move beyond traditional data platforms and business intelligence tools so that your surveillance program is robust, defensible, and financially sustainable, while responding to the cross-functional complexities of your organization.

Best-of-breed surveillance platforms, artificial intelligence (AI), and advanced analytics generate data that flag and evaluate risks buried deep within daily trading volume and communications. However, the cost, time, and personnel resources needed to store, process, and correlate that data across business functions and surveillance channels can be prohibitive.  Improving the economics of the strategy is a key factor in the program's success.

Modern big data environments such as data lakes need to be considered.  These solutions process more varieties and higher volumes of data at a significantly lower cost than traditional data platforms.

Native visual analytics, where the visual analytics engine runs within the data lake, increases the return on investment by enabling your team to drive the surveillance process in a timely, secure, and collaborative way.  All data and processing (including machine learning and AI) are directly accessible in the data lake, in their native format. Time-to-insight is accelerated because there are no delays from data transformation overhead. Data integrity is ensured because the data remains in its existing form, with no disruptive ETL operations on it, and teams collaborate on the same version.

While big data innovations have largely been in data management, empowering end users will be the key to successfully leveraging data to drive your business operations.  Surprise #3 from our data lake survey shows things heading in this direction.

