Native Visual Analytics for Regulatory Compliance

Summary

Native visual analytics accelerates regulatory compliance in the financial services industry because it gives subject matter experts the access and transparency needed to address the heightened expectations of regulators. This page describes the concept of native visual analytics and provides examples of its application to specific regulatory challenges.

Subject Matter Experts Drive Solutions

For the purposes of this page, subject matter experts are the end users who understand the nuances of how to derive meaning (and thus value) from data. They include compliance officers, front office traders, risk managers, finance analysts, back office operations staff, senior management, and anyone else who influences strategic decisions based on their use and understanding of data.

Native visual analytics refers to the ability to run analytics native to where the data resides. This capability is what differentiates native visual analytics from traditional business intelligence and visualization tools. In the native case, data stays in place, providing high-definition access to all granular data in its native state. It is a simplified, in-cluster and in-cloud architecture that runs analytics directly within Hadoop nodes, cloud instances, or other scale-out modern data platforms. Even when you are working with multiple data sources, the analytics and visualization engines are distributed at each source simultaneously. In the traditional BI case, data is moved from its sources to a separate place before the analytics are run. Analytics done after the move depend on getting that move right, in a timely manner, and on the first try. The experts are left waiting to make decisions instead of driving the solutions.
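
To make the contrast concrete, here is a minimal sketch, assuming a PySpark-style engine as one example of analytics that execute where the data lives; the paths and column names are hypothetical:

```python
# Minimal sketch: the query is pushed to the data rather than the data to the
# query. Paths, schemas, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("native-analytics-sketch").getOrCreate()

# Each source is read in place (in-cluster HDFS, in-cloud object storage);
# nothing is extracted into a separate BI store first.
trades = spark.read.parquet("hdfs:///data/trades")
positions = spark.read.parquet("s3a://bank-data/positions")

# The join and aggregation execute on the nodes that hold the data, so the
# granular records never leave their sources.
exposure = (
    trades.join(positions, "account_id")
          .groupBy("desk", "product")
          .agg(F.sum("notional").alias("total_notional"))
)
exposure.show()
```

The equivalent traditional BI workflow would first run an ETL job to copy both sources into the BI tool's own store, and the analysis would only ever be as fresh and as complete as that copy.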

The “native” aspect of native visual analytics is what empowers subject matter experts to drive solutions: they can run analytics where the data resides and build their own data apps, native to what they need and when they need it. It shifts the paradigm from passive analytics to data applications.

Core Concepts Critical to Satisfying Regulatory Requirements

Three core concepts are critical to satisfying heightened regulatory expectations in the financial services industry: a) granular first; b) leverage all data; and c) self-service.

  1. Granular First: Data granularity is maintained so that you can model the story from the ground up, ensuring the detail is there when you need to drill down again. This satisfies two regulatory requirements: modelability[1] and fidelity[2].
  2. Leverage All Data: The data that makes up financial products and activities is both unstructured (legal parameters and electronic communications) and structured (quantitative values), and it can be real time or historical. Native visual analytics is agnostic to data type, and its real-time and historical analysis capabilities are required to articulate the full story to regulators (e.g., electronic communications showing pre-trade intent are needed for DFA trade reconstruction, along with the lineage of post-trade activities).
  3. Self-Service: Self-service analytics with zero data movement enables business and regulatory analysts to adapt to evolving regulatory requirements through on-demand access to multiple data sources and by creating data apps that they can revise and edit as needed. Subject matter experts can join trade data from the respective sources on demand and enrich their analysis further by connecting alternative data sources as needed (see the sketch after this list).
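
As a concrete illustration of the granular-first and self-service concepts, the following minimal sketch (hypothetical data and field names; pandas is used purely for brevity) computes an aggregate on demand from row-level records and then drills back down into them:

```python
# Minimal sketch of "granular first": aggregates are computed on demand from
# row-level records, so the analyst can always drill back down to the rows
# behind any aggregate. Data and field names are hypothetical.
import pandas as pd

trades = pd.DataFrame({
    "desk":     ["rates", "rates", "fx", "fx"],
    "product":  ["swap", "swap", "spot", "forward"],
    "notional": [10_000_000, 5_000_000, 2_000_000, 3_000_000],
})

# Roll up for the report...
by_desk = trades.groupby("desk")["notional"].sum()
print(by_desk)

# ...and drill back down to the granular rows behind one aggregate,
# because the underlying detail was never discarded.
print(trades[trades["desk"] == "rates"])
```

Because the aggregate is derived from the granular rows each time, rather than from a pre-summarized extract, the detail needed for modelability and fidelity is always one step away.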

Native Visual Analytics Applied to Specific Regulatory Challenges

The tables below provide examples of specific regulatory challenges and show how the capabilities of native visual analytics enable holistic solutions.

Comprehensive Liquidity Assessment and Review (CLAR) and Liquidity Monitoring Reports

Regulatory Challenge: Fidelity of aggregated data across entities. Data from multiple business sources (wholesale, retail, and commercial), product lines (deposits, savings, loans, etc.), and jurisdictions needs to be aggregated without loss of fidelity to the underlying detail.
Enablement via Native Visual Analytics: Enable analysts to connect directly to individual data sources (lines of business, product lines, jurisdictions, etc.) so that they can run analytics at a granular level and aggregate those results up to the enterprise level. This mitigates the errors that occur during the extract, transform, and load processes required by traditional BI platforms. Data fidelity is ensured because data is not moved.

Regulatory Challenge: Dynamic governance and oversight. Internal controls need to be flexible and integrated across the organization, continuously monitoring changes in assets and liabilities so that calculating, matching, and reporting can be done in a timely and accurate manner.
Enablement via Native Visual Analytics: Enhance the firm’s internal control framework with visual analytics and BI capabilities that monitor, measure, and report in real time, providing the most up-to-date analysis at any point in time. Dynamic governance and oversight is an area that regulators are increasingly reviewing and requiring financial institutions to strengthen. (A minimal monitoring sketch follows this table.)

Regulatory Challenge: Transparency of capital allocation. Stress testing increases the demand on firms to have robust liquidity systems and processes in place. Liquidity management and treasury operations are required to frequently monitor and assess the impact on liquidity requirements across business lines by allocating funding costs accurately.
Enablement via Native Visual Analytics: The capabilities described above, together, enable transparency and enterprise-wide visibility. Analytics can be run directly on the disparate data sources and types that make up a typical banking organization. Native visual analytics provides the capability to juxtapose real-time and historical analysis at a granular level and across functions in one data application. This helps centralize both the capital allocation and data transparency processes.
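
The following is a minimal sketch of the continuous-monitoring idea from the table above, assuming a Spark Structured Streaming job; the feed is simulated with Spark's built-in rate source, and the entity and amount fields are fabricated for the demo (a real deployment would subscribe to the firm's own stream of asset and liability updates):

```python
# Minimal sketch of continuous monitoring over a stream of balance-sheet
# updates. The "rate" source only simulates events; entity and amount are
# fabricated for the demo.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("liquidity-monitor-sketch").getOrCreate()

updates = (
    spark.readStream.format("rate").option("rowsPerSecond", 10).load()
         # Fabricate an entity and a signed cash-flow amount per event.
         .withColumn("entity", F.col("value") % 3)
         .withColumn("amount", (F.col("value") % 100) - 50)
)

# Running net position per entity, refreshed as events arrive.
net = updates.groupBy("entity").agg(F.sum("amount").alias("net_position"))

query = net.writeStream.outputMode("complete").format("console").start()
query.awaitTermination(30)  # run the monitor briefly for the sketch
```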

Basel III Liquidity Coverage Ratio (LCR) and Net Stable Funding Ratio (NSFR)

Regulatory Challenge: Changes to both LCR and NSFR. Treatments are more favorable toward operational, retail, and small business deposits (assigning higher ASF weightings to these deposits) and toward retail and small business loans (assigning lower RSF weightings to these loans). Weightings must be correctly tagged and assigned, hence the need for a robust, aggregated data platform across entities. Data from multiple business sources (wholesale, retail, and commercial), product lines (deposits, savings, loans, etc.), and jurisdictions needs to be aggregated without loss of fidelity to the underlying detail.
Enablement via Native Visual Analytics: Data quality review can be optimized through visual analytics that run natively and simultaneously across multiple non-transposed data sources (e.g., lines of business, products, and algorithmic models). Data loss is mitigated and the requisite granularity is provided to verify that weightings have been correctly tagged and assigned to assets, ensuring correct calculation of the NSFR and LCR. (A worked NSFR example follows this table.)

Regulatory Challenge: Dynamic internal controls framework. Internal controls need to be flexible and responsive, continuously monitoring changes in the treatments of operational deposits, stock borrowing transactions, and reverse repos under the LCR, and the alternative treatment of derivatives under the NSFR. It is essential that data users have adequate tools to calculate and report these treatments in a timely and accurate manner.
Enablement via Native Visual Analytics: Real-time and historical analysis enables a robust, defensible, and dynamic internal control framework that accurately monitors, measures, and reports changes in the treatments of operational deposits, stock borrowings, and derivatives to satisfy LCR and NSFR requirements.

Regulatory Challenge: Meeting regulatory expectations. US banks are expected to comply with both the LCR and the NSFR by January 2018. The BCBS did not align the two ratios’ implementation dates, as its LCR timeline runs to January 2019. Nonetheless, banks are beginning to focus on reviewing their short-term wholesale funding positions, which requires more data transparency across functions and lines of business.
Enablement via Native Visual Analytics: A self-service visual analytics platform enables subject matter experts to build their own data apps that adapt to short-term and long-term perspectives, regulatory changes, and heightened reporting expectations. The transparency required to ensure compliance with both LCR and NSFR is facilitated when analysts are able to join data, test assumptions, and collaborate with others.
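
To ground the weighting discussion above, here is a worked sketch of the NSFR arithmetic: NSFR = available stable funding (ASF) / required stable funding (RSF), with compliance at 100% or above. The balances and weightings below are illustrative only; actual ASF and RSF factors are set by the Basel III framework and its national implementations:

```python
# Worked sketch of the NSFR calculation: NSFR = ASF / RSF, where each funding
# source and asset is tagged with a stability weighting. All balances and
# weightings here are illustrative, not regulatory values.

# (category, balance, ASF weighting) -- funding side
funding = [
    ("regulatory_capital",      100.0, 1.00),
    ("stable_retail_deposits",  400.0, 0.95),  # higher ASF for retail deposits
    ("wholesale_funding_<1y",   200.0, 0.50),
]

# (category, balance, RSF weighting) -- asset side
assets = [
    ("retail_small_biz_loans",  300.0, 0.50),  # lower RSF for these loans
    ("residential_mortgages",   250.0, 0.65),
    ("level_1_hqla",            100.0, 0.05),
]

asf = sum(balance * w for _, balance, w in funding)
rsf = sum(balance * w for _, balance, w in assets)
nsfr = asf / rsf
print(f"ASF={asf:.1f}  RSF={rsf:.1f}  NSFR={nsfr:.1%}  compliant={nsfr >= 1.0}")
```

Because each balance keeps its category tag, a misassigned weighting can be traced back to the underlying positions.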

Volcker Rule’s Reasonably Expected Near Term Demand (RENTD)

Regulatory Challenge: Adapt to change. Firms face the daunting challenge of aggregating enterprise-wide data for both market making and customer trading activities while the definitions of the two continue to evolve.
Enablement via Native Visual Analytics: Native visual analytics enables business and regulatory analysts to adapt to an evolving definition of “market making” through on-demand access to multiple data sources and by creating data apps that they can revise and edit as needed.

Regulatory Challenge: Measuring customer-facing activity. The essence of RENTD is not the measurement of product inventory but the alignment of that inventory to customer demand. This requires constant monitoring, measuring, and reporting of three factors: how fast products are bought and sold, how long those products remain in inventory, and the expected trend of demand for those products (see the sketch after this table).
Enablement via Native Visual Analytics: Native visual analytics greatly enhances a firm’s ability to monitor, measure, and report changes in RENTD and its market maker inventory because direct access to both real-time and historical data sources gives business users deeper insight into the quantitative relationship between customer and desk trade activity.

Regulatory Challenge: Transparency of market making conduct. Firms need to distinguish, in a transparent fashion, between trade activity done on behalf of a customer and direct-to-market activity within the context of RENTD.
Enablement via Native Visual Analytics: The correlation between customer trading demand and market making activities can be made transparent with native visual analytics. Subject matter experts can join trade data from the respective sources on demand and enrich their analysis further by connecting secondary data sources as needed. Transparency provides a timely and insightful understanding of how customer demand relates to trading inventory.
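
The following minimal sketch computes two of the RENTD factors named in the table above, holding period (a proxy for how fast products are bought and sold) and demand trend, from hypothetical trade records; a real implementation would derive these from desk and customer trade data read in place:

```python
# Minimal sketch of RENTD factors: holding period and customer demand trend.
# All products, dates, and demand figures are hypothetical.
from datetime import date
import statistics

# (product, bought_on, sold_on, weekly customer order counts)
inventory = [
    ("bond_A", date(2017, 3, 1), date(2017, 3, 6),  [12, 14, 15, 18]),
    ("bond_B", date(2017, 3, 2), date(2017, 3, 20), [9, 8, 8, 6]),
]

for product, bought, sold, demand in inventory:
    holding_days = (sold - bought).days  # how long the product sat in inventory
    # Crude demand trend: average week-over-week change in customer orders.
    trend = statistics.mean(b - a for a, b in zip(demand, demand[1:]))
    print(f"{product}: held {holding_days} days, demand trend {trend:+.1f}/week")
```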

[1] Modelability refers to the ability to prove that financial models and reporting are derived from real data. In the case of the Basel Fundamental Review of the Trading Book (FRTB), a modelable liquidity analysis is one in which every product has 24 or more pricing observations throughout the year, with the time between each observation no longer than 30 days. If the data is not modelable, then capital charges go up (non-modelable risk factors). You need granularity first to prove this. (A minimal check of this criterion is sketched after these footnotes.)
[2] Fidelity refers to the accuracy and exactness of data. The integrity of underlying data erodes when source data is moved, resulting in data loss and/or transformation.
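
As an illustration of the modelability test in footnote [1], here is a minimal check, assuming pricing observation dates are available at the granular level:

```python
# Minimal sketch of the FRTB modelability test from footnote [1]: a risk
# factor is modelable if it has at least 24 real pricing observations in the
# year and no gap between observations exceeds 30 days.
from datetime import date, timedelta

def is_modelable(observation_dates: list[date]) -> bool:
    dates = sorted(observation_dates)
    if len(dates) < 24:
        return False
    gaps = (b - a for a, b in zip(dates, dates[1:]))
    return all(gap <= timedelta(days=30) for gap in gaps)

# 26 fortnightly observations: enough observations, every gap is 14 days.
obs = [date(2017, 1, 2) + timedelta(days=14 * i) for i in range(26)]
print(is_modelable(obs))  # True
```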