Accelerate Regulatory Compliance

Arcadia Data accelerates regulatory compliance because it provides subject matter experts with the accessibility and transparency required to proactively mitigate risk, not just react to it.

Three core concepts must be considered to satisfy the heightened expectations of regulators, investors, and customers in the financial services industry:

  • Granularity is first and foremost: Data granularity must be maintained so that you can demonstrate modelability and ensure the fidelity of the data, two main concerns of regulators.
  • All data must be leveraged: The financial landscape is represented through all types of data: structured and unstructured, real-time and historical, traditional and alternative. A robust and defensible regulatory program must leverage all data types, big or small.
  • Subject matter experts drive solutions: Analysts in compliance, risk, finance, operations, and the front office understand the nuances of how to derive meaning from data. Cross-functional collaboration is key to understanding the full story of any transaction.

The examples below show how BI and visual analytics native to modern big data environments respond to the key directives above. Arcadia Data is your front end to regulatory compliance.

See how native visual analytics is applied to specific regulatory challenges such as CLAR, CAT, FRTB, and RENTD.


Demos

Cross Organizational Model Validation

Evaluate how entities perform against a series of adverse market scenarios. In the background, algorithms test different ways to free up liquidity, and those changes are reflected in the visualization. When you see a graph you like, you can package it as an app and include it in a management plan provided to auditors.
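
The scenario run described above can be sketched in a few lines. This is a hypothetical illustration only: the scenario names, haircuts, and liquidity floor are invented assumptions, not Arcadia Data functionality or actual regulatory parameters.

```python
# Hypothetical sketch: apply a set of adverse market scenarios to an entity's
# liquid-asset buffer and report which scenarios breach a minimum floor.
# Scenario haircuts and the floor are illustrative assumptions.
SCENARIOS = {"base": 0.00, "rates_spike": 0.10, "credit_crunch": 0.25, "severe": 0.40}
LIQUIDITY_FLOOR = 75.0  # assumed minimum post-stress buffer

def stress_buffer(liquid_assets, scenarios=SCENARIOS, floor=LIQUIDITY_FLOOR):
    """Return post-shock buffers per scenario and the scenarios that breach the floor."""
    results = {name: liquid_assets * (1 - haircut) for name, haircut in scenarios.items()}
    breaches = [name for name, buf in results.items() if buf < floor]
    return results, breaches

results, breaches = stress_buffer(liquid_assets=100.0)
print(breaches)  # scenarios where the post-stress buffer falls below the floor
```

A visual front end of the kind described would chart `results` per entity, so a scenario profile that passes review can be packaged for auditors.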

See Cross-Org Model Validation in action

Data Quality - Now

You don’t have to wait for your organization’s big data strategy to mature across multiple sources. Enhance your data quality initiatives by generating multi-functional visualizations from one large data source.

See it in action

Dynamic Data Quality

Identify material changes in upstream data sources that affect active transactions in order to prioritize remediation efforts. Join machine-learning data quality risk models with pre-trade electronic communications, trade execution, and post-trade events for timely and effective trade reconstruction.
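
The join and prioritization described above can be sketched in plain Python. The field names (`source_id`, `risk_delta`, `notional`) and the materiality threshold are illustrative assumptions, not a real schema.

```python
# Hypothetical sketch: flag active trades whose upstream data source shows a
# material data-quality risk change, so remediation can be prioritized.
# All field names and the threshold are illustrative assumptions.

MATERIAL_THRESHOLD = 0.2  # assumed cutoff for a "material" risk-score change

def prioritize_remediation(risk_scores, trades, threshold=MATERIAL_THRESHOLD):
    """Join model risk deltas with trade records and rank affected trades.

    risk_scores: {source_id: risk_delta} from an upstream data-quality model
    trades: iterable of dicts with 'trade_id', 'source_id', 'notional'
    Returns affected trades sorted by notional (largest exposure first).
    """
    affected = [
        {**t, "risk_delta": risk_scores[t["source_id"]]}
        for t in trades
        if risk_scores.get(t["source_id"], 0.0) >= threshold
    ]
    return sorted(affected, key=lambda t: t["notional"], reverse=True)

risk = {"src_pricing": 0.35, "src_ref_data": 0.05}
trades = [
    {"trade_id": "T1", "source_id": "src_pricing", "notional": 5_000_000},
    {"trade_id": "T2", "source_id": "src_ref_data", "notional": 9_000_000},
    {"trade_id": "T3", "source_id": "src_pricing", "notional": 12_000_000},
]
queue = prioritize_remediation(risk, trades)
print([t["trade_id"] for t in queue])  # largest affected exposure first
```

In a trade-reconstruction setting, the same join would be extended with communications and execution events keyed by trade identifier.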

See it in action

Correlation and Divergence

Visualize algorithmically generated data alongside billions of rows of transaction data to prove that it is both sufficient in volume and adequate in quality. Arcadia Data visual analytics can help evaluate the performance of multiple entities against a series of valuation models. Plan desk structure and organizational changes through an enterprise-wide lens while proving that models are based on real market data.
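
As a rough illustration of the correlation check, the sketch below compares a valuation model’s output with observed trade prices. The price series and the divergence tolerance are invented for the example; a real check would run over the full transaction history in the cluster.

```python
# Hypothetical sketch: measure how closely a valuation model's output tracks
# observed transaction prices, and flag divergence. Series values and the
# tolerance are illustrative assumptions.
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

model_prices = [100.1, 101.3, 99.8, 102.5, 101.0]
traded_prices = [100.0, 101.5, 99.6, 102.8, 100.9]

rho = pearson(model_prices, traded_prices)
DIVERGENCE_FLOOR = 0.9  # assumed tolerance for "model tracks the market"
print(f"correlation={rho:.3f}, diverging={rho < DIVERGENCE_FLOOR}")
```

A divergence flag like this is the kind of signal a desk-level dashboard would surface before auditors ask for it.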

See it in action

Native Visual Analytics Applied to Specific Regulatory Challenges

The tables below provide examples of specific regulatory challenges and how the capabilities of native visual analytics enable holistic solutions.

Comprehensive Liquidity Assessment and Review (CLAR) and Liquidity Monitoring Reports

Regulatory Challenge: Fidelity of aggregated data across entities. Data from multiple business sources (wholesale, retail, and commercial), product lines (deposits, savings, loans, etc.), and jurisdictions must be aggregated without loss of fidelity to the underlying detail.
Enablement via Native Visual Analytics: Enable analysts to connect directly to individual data sources (lines of business, product lines, jurisdictions, etc.) so that they can run analytics at a granular level and aggregate those results up to the enterprise level. This mitigates the errors that occur during the extract, transform, and load processes required by traditional BI platforms. Data fidelity is ensured because data is not moved.

Regulatory Challenge: Dynamic governance and oversight. Internal controls need to be flexible and integrated across the organization, continuously monitoring changes within assets and liabilities so that calculating, matching, and reporting can be done in a timely and accurate manner.
Enablement via Native Visual Analytics: Enhance the firm’s internal control framework with visual analytics and BI capabilities that monitor, measure, and report in real time, providing the most up-to-date analysis at any point in time. Dynamic governance and oversight is a vital area that regulators are increasingly reviewing and requiring financial institutions to strengthen.

Regulatory Challenge: Transparency of capital allocation. Stress testing increases the demand on firms to have robust liquidity systems and processes in place. Liquidity management and treasury operations are required to frequently monitor and assess the impact on liquidity requirements across business lines by allocating funding costs accurately.
Enablement via Native Visual Analytics: Together, the capabilities described above enable transparency and enterprise-wide visibility. Analytics can be run directly on the disparate data sources and types that make up a typical banking organization. Native visual analytics can juxtapose real-time and historical analysis at a granular level and across functions in one data application. This helps centralize both the capital allocation and data transparency processes.
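
The "aggregate without losing fidelity" idea can be illustrated in miniature: enterprise totals are computed as views over the granular records rather than as a transformed second copy of the data. The record fields and balances below are invented for the example.

```python
# Hypothetical sketch: granular liquidity positions stay at source-level
# detail, and rollups by any dimension are derived views over those rows.
# Fields and figures are illustrative assumptions.
from collections import defaultdict

positions = [  # one illustrative record per line of business / jurisdiction
    {"line": "Wholesale", "jurisdiction": "US", "product": "Deposits", "balance": 40.0},
    {"line": "Retail", "jurisdiction": "US", "product": "Savings", "balance": 25.0},
    {"line": "Retail", "jurisdiction": "UK", "product": "Loans", "balance": -15.0},
    {"line": "Commercial", "jurisdiction": "UK", "product": "Loans", "balance": -10.0},
]

def rollup(records, key):
    """Aggregate balances by any dimension while the source rows stay intact."""
    totals = defaultdict(float)
    for r in records:
        totals[r[key]] += r["balance"]
    return dict(totals)

by_line = rollup(positions, "line")
enterprise_total = sum(r["balance"] for r in positions)
print(by_line, enterprise_total)
```

Because the rollup is a function of the unmoved detail rows, any aggregate can be drilled back down to the records that produced it.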

Basel III Liquidity Coverage Ratio (LCR) and Net Stable Funding Ratio (NSFR)

Regulatory Challenge: Changes to both LCR and NSFR treatments are more favorable to operational, retail, and small business deposits (by assigning higher ASF weightings to these deposits) and to retail and small business loans (by assigning lower RSF weightings to these loans). Weightings must be correctly tagged and assigned, creating the need for a robust, aggregated data platform across entities. Data from multiple business sources (wholesale, retail, and commercial), product lines (deposits, savings, loans, etc.), and jurisdictions must be aggregated without loss of fidelity to the underlying detail.
Enablement via Native Visual Analytics: Data quality review can be optimized through visual analytics that run natively and simultaneously across multiple non-transposed data sources (i.e., lines of business, products, and algorithmic models). Data loss is mitigated and the requisite granularity is provided to verify that weightings have been correctly tagged and assigned to assets, ensuring correct calculation of the NSFR and LCR.

Regulatory Challenge: Dynamic internal controls framework. Internal controls need to be flexible and responsive, continuously monitoring changes in the treatment of operational deposits, stock borrowing transactions, and reverse repos under the LCR, and the alternative treatment for derivatives under the NSFR. It is essential for data users to have adequate tools to calculate and report these treatments in a timely and accurate manner.
Enablement via Native Visual Analytics: Real-time and historical analysis enables a robust, defensible, and dynamic internal control framework that accurately monitors, measures, and reports changes in the treatments of operational deposits, stock borrowings, and derivatives to satisfy LCR and NSFR requirements.

Regulatory Challenge: Meeting regulatory expectations. US banks are expected to comply with both the LCR and NSFR by January 2018. The BCBS did not align the two ratios’ implementation dates, as the LCR was required in January 2019. Nonetheless, banks are beginning to focus more on reviewing their short-term wholesale funding positions, which requires more data transparency across functions and lines of business.
Enablement via Native Visual Analytics: A self-service visual analytics platform enables subject matter experts to build their own data apps that adapt to short-term and long-term perspectives, regulatory changes, and heightened reporting expectations. The transparency required to ensure compliance with both the LCR and NSFR is facilitated when analysts are able to join data, test assumptions, and collaborate with others.
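
For context on the ASF/RSF weightings above: the NSFR is the ratio of available stable funding (ASF) to required stable funding (RSF), which must be at least 100%. The sketch below applies per-category weightings to illustrate the calculation; the factor values and balances are illustrative assumptions, not the official Basel III factors.

```python
# Hypothetical sketch of the NSFR calculation:
#   NSFR = available stable funding (ASF) / required stable funding (RSF) >= 100%
# The weightings and balances below are illustrative only; actual ASF/RSF
# factors come from the Basel III NSFR standard.
ASF_WEIGHTS = {"regulatory_capital": 1.00, "retail_deposits": 0.95, "wholesale_funding": 0.50}
RSF_WEIGHTS = {"retail_loans": 0.65, "corporate_loans": 0.85, "hqla": 0.05}

def nsfr(liabilities, assets):
    """Apply stable-funding weightings per category and return the ratio."""
    asf = sum(ASF_WEIGHTS[k] * v for k, v in liabilities.items())
    rsf = sum(RSF_WEIGHTS[k] * v for k, v in assets.items())
    return asf / rsf

ratio = nsfr(
    {"regulatory_capital": 10, "retail_deposits": 60, "wholesale_funding": 30},
    {"retail_loans": 50, "corporate_loans": 30, "hqla": 20},
)
print(f"NSFR = {ratio:.0%}, compliant = {ratio >= 1.0}")
```

Mis-tagging a single category shifts the whole ratio, which is why the granularity to verify each weighting assignment matters.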

Volcker Rule’s Reasonably Expected Near Term Demand (RENTD)

Regulatory Challenge: Adapt to change. Firms face the daunting challenge of aggregating enterprise-wide data for both market making and customer trading activities while the definitions of the two continue to evolve.
Enablement via Native Visual Analytics: Native visual analytics enables business and regulatory analysts to adapt to an evolving definition of "market making" through on-demand access to multiple data sources and by creating data apps that they can revise and edit as needed.

Regulatory Challenge: Measuring customer-facing activity. The essence of RENTD is not the measurement of product inventory but the alignment of that inventory with customer demand. This requires constant monitoring, measuring, and reporting of three factors: how fast products are bought and sold, how long those products remain in inventory, and the expected trend of demand for those products.
Enablement via Native Visual Analytics: Native visual analytics greatly enhances a firm’s ability to monitor, measure, and report changes in RENTD and its market-maker inventory, because direct access to both real-time and historical data sources gives business users deeper insight into the quantitative relationship between customer and desk trade activity.

Regulatory Challenge: Transparency of market making conduct. Firms need to distinguish transparently between trade activity done on behalf of a customer and direct-to-market activities within the context of RENTD.
Enablement via Native Visual Analytics: The correlation between customer trading demand and market-making activities can be made transparent with native visual analytics. Subject matter experts can join trade data from the respective sources on demand and enrich their analysis by connecting secondary data sources as needed. Transparency provides a timely and insightful understanding of how customer demand relates to trading inventory.
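
The three RENTD measures named above (turnover speed, holding time, demand trend) can be sketched as simple calculations. The function, field names, and figures are hypothetical, invented for illustration.

```python
# Hypothetical sketch of the three RENTD measures: how fast a product turns
# over, how long it sits in inventory, and the trend of customer demand.
# All names and values are illustrative assumptions.
def rentd_measures(daily_customer_volume, avg_inventory, inventory_ages):
    """Return turnover rate, average holding days, and a simple demand trend."""
    turnover = daily_customer_volume[-1] / avg_inventory  # latest demand per unit held
    avg_holding_days = sum(inventory_ages) / len(inventory_ages)
    # crude trend: compare the recent half of the window with the earlier half
    half = len(daily_customer_volume) // 2
    early = sum(daily_customer_volume[:half]) / half
    late = sum(daily_customer_volume[half:]) / (len(daily_customer_volume) - half)
    trend = "rising" if late > early else "flat_or_falling"
    return turnover, avg_holding_days, trend

turnover, holding, trend = rentd_measures(
    daily_customer_volume=[80, 90, 100, 120, 130, 140],
    avg_inventory=200,
    inventory_ages=[1, 2, 2, 3, 4],
)
print(turnover, holding, trend)
```

Tracked per desk and per product, measures like these are what let a firm show that inventory is sized to customer demand rather than to proprietary positioning.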