Regulatory compliance reaches ever more broadly and deeply into the daily functions of financial services organizations. This post introduces the concept of digital packaging and explains why we must reevaluate the front-end capabilities of any data-driven regulatory compliance program.
Robust and Defensible: Requiring a Deeper and Broader Burden of Proof
Financial regulators expect the processes you have in place to be strong and well tested, but that burden of proof can be difficult to meet. Here are some examples:
- Capitalization requirements demand a much broader and deeper evaluation of models, which must be proven against real transaction data that is both sufficient in volume and adequate in quality. (Example: Basel/Fundamental Review of the Trading Book (FRTB)).
- Institutions must correlate electronic communications with high-velocity, cross-jurisdictional trade volume in real time to identify abusive trading behavior. (Example: Trade Surveillance).
- Recordkeeping rules require the lifecycle of complex derivative trades to be reconstructed within 72 hours of a regulator’s request, inclusive of chat, voice, email, trade execution, and post-trade events. (Example: DFA/Trade Reconstruction).
- Best execution policies must be backed by test data that proves their accuracy and effectiveness. (Example: IIROC/Universal Market Integrity Rules 7.1).
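To make the trade surveillance example concrete, the core task is joining two data streams: trades and electronic communications. The sketch below is a minimal, illustrative pairing of each trade with same-trader messages inside a look-back window; the records, field names, and five-minute window are all hypothetical assumptions, not a description of any specific surveillance product.

```python
from datetime import datetime, timedelta

# Hypothetical records; traders, symbols, and timestamps are illustrative only.
trades = [
    {"trader": "T1", "symbol": "XYZ", "ts": datetime(2017, 3, 1, 9, 30, 5)},
    {"trader": "T2", "symbol": "ABC", "ts": datetime(2017, 3, 1, 10, 15, 0)},
]
messages = [
    {"trader": "T1", "ts": datetime(2017, 3, 1, 9, 29, 50), "text": "moving XYZ now"},
    {"trader": "T2", "ts": datetime(2017, 3, 1, 14, 0, 0), "text": "lunch?"},
]

def correlate(trades, messages, window=timedelta(minutes=5)):
    """Pair each trade with communications from the same trader that
    fall within `window` before the trade's execution timestamp."""
    hits = []
    for t in trades:
        for m in messages:
            same_trader = m["trader"] == t["trader"]
            in_window = timedelta(0) <= t["ts"] - m["ts"] <= window
            if same_trader and in_window:
                hits.append((t, m))
    return hits

flagged = correlate(trades, messages)  # only T1's trade has a message just before it
```

A production system would do this join across jurisdictions and at streaming scale (e.g., an as-of join keyed on trader and time), but the window-join logic is the same.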
The examples above present multidimensional challenges.
- Calculation Efficiency: Increased inputs into a growing number of interrelated calculations require hyper-efficient data processes. Model performance will be a critical element of business decisions.
- Data Volume, Variety, and Quality: Expanded risk models must be proven with real transaction data (historic and real-time, structured and unstructured, algorithmic), much of which is created dynamically.
- Organizational Functions: New models will require expanded data expertise across lines of business. Who will take the lead? Finance, Risk, or a hybrid? The desk structure may need to be reorganized.
The Front End: Packaging the Evidence
Multidimensional challenges require robust and integrated business processes that clearly show accuracy and effectiveness. Imagine a digital package that could show this in a transparent, secure, and interactive way. Here are some examples:
- A line of business spots a data quality issue during a data exploration exercise and sends the view to the appropriate data quality team. Because the view is not a static screenshot, that team can immediately drill down to the problem and begin remediation.
- A trader questions the validity of a risk model. A risk analyst evaluates algorithmic data directly against historic trade data to pinpoint and resolve the underlying issue.
- A regulator questions a complex derivative trade. The compliance team reconstructs every action, from the first conversation to post-trade events, into a clear and organized presentation.
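The trade reconstruction example above boils down to merging lifecycle events from many sources (chat, voice, email, execution, post-trade systems) into one chronological timeline. Here is a minimal sketch of that step; the event sources, fields, and timestamps are hypothetical placeholders.

```python
from datetime import datetime

# Hypothetical lifecycle events for a single derivative trade,
# drawn from different source systems; all values are illustrative.
events = [
    {"source": "execution",  "ts": datetime(2017, 3, 1, 10, 0),  "detail": "trade executed"},
    {"source": "chat",       "ts": datetime(2017, 3, 1, 9, 45),  "detail": "terms discussed"},
    {"source": "email",      "ts": datetime(2017, 3, 1, 9, 30),  "detail": "initial inquiry"},
    {"source": "post-trade", "ts": datetime(2017, 3, 1, 10, 30), "detail": "confirmation sent"},
]

def reconstruct(events):
    """Order all lifecycle events chronologically to produce a single
    audit-ready timeline for the trade."""
    return sorted(events, key=lambda e: e["ts"])

timeline = reconstruct(events)  # email -> chat -> execution -> post-trade
```

In practice the hard part is normalizing timestamps and identifiers across heterogeneous systems before this merge; the sorted timeline is then the backbone of the package handed to the regulator.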
Unified Data Discovery, Visualization, and Intelligence
Arcadia Data is a platform with the enhanced capabilities described above. It unifies data discovery, business intelligence, and real-time visualization in a single, integrated platform, giving users direct access to big data through an intuitive, self-service interface that runs natively on Apache Hadoop clusters and cloud environments to leverage their capabilities. Referring to the examples above, an end user can explore billions of rows of highly varied data, join views into interactive packages, and share those packages with relevant colleagues to solve problems in a timely, secure, and collaborative way.
Future articles will expand on this in detail as it relates to Fundamental Review of the Trading Book, Trade Surveillance, Trade Reconstruction, and Best Execution.
This post describes a paradigm shift from data monitoring and reporting to dynamic modeling, evaluation, remediation, and business adaptation. The front end of any data-driven regulatory compliance program must bring clarity to complex processes by allowing experts (analysts, compliance officers, COOs, CDOs) to paint a dynamic, interactive picture across billions of rows of data (structured, unstructured, real-time, and historical) relevant to their line of business. They can then collaborate with others, test assumptions, and package the results to prove a robust and defensible process.