Water metaphors are nothing new in financial markets, from liquidity to ‘dark pools.’ But when it comes to the impact of Big Data on finance, the data lake is at best an understatement. Tsunami is more like it.
Public news of compute errors and flash crash volatility notwithstanding, the day-to-day workings of capital markets and the global finance trade have reached planetary scale across many, many dimensions.
Winning the Deep Forensic Analysis Arms Race for Compliance
- Shailesh Ambike, Executive Co-Chair of Compliance & Legal Section (CLS) Education Sub-Committee of the Investment Industry Regulatory Organization of Canada (IIROC)
- Vamsi K Chemitiganti, GM – Financial Services at Hortonworks
- Shant Hovsepian, Co-Founder and CTO at Arcadia Data
Live: 6 October 2016, 10a PT / 1p ET
Transaction acceleration is nothing new; what is new is that banks are racing in parallel to keep the markets safe at any speed. Vamsi Chemitiganti, blogger and GM of Financial Services at our partner, Hortonworks, writes:
The rise of trade lifecycle automation across the Capital Markets value chain and the increasing use of technology across the lifecycle contributes to an environment where speeds and feeds are contributing to a huge number of securities changing hands (in huge quantities) in milliseconds across 25+ global venues of trading; automation leads to increase in trading volumes which adds substantially to the increased risk of fraud.
More transactions create more information, which creates more opportunity, for good and for ill. It’s like an arms race, with regulators and compliance risk managers in the same boat.
The planetary span of these markets means risk and compliance management needs a new way to pursue transparency: data navigation tools as powerful – and fast – as the currents within the markets.
Compliance analysts responsible for surveillance need a consistent, well-organized path through all trade data across all exchanges, along with historical and forensic data. Even real-time trader monitoring – made famous in Michael Lewis’ Flash Boys and the work done by analysts at Royal Bank of Canada – won’t catch anomalies that can only be detected by evaluating longer-running patterns. Back to Vamsi Chemitiganti’s blog post:
Backtesting of data has become a challenge – as [has] being able to replay data across historical intervals. This is key in mining for patterns of suspicious activity like bursty spikes in trading as well as certain patterns that could indicate illegal insider selling.
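One common way to mine for the “bursty spikes in trading” described above is a rolling z-score filter over historical volumes: flag any interval whose volume is many standard deviations above its trailing window. A minimal sketch, assuming per-interval volume counts and an illustrative threshold (the function name, window size, and cutoff are all assumptions, not any vendor’s API):

```python
# Sketch: flag "bursty" volume spikes with a rolling z-score.
# Window size and z-score threshold are illustrative assumptions.
from statistics import mean, stdev

def flag_bursts(volumes, window=20, z_threshold=4.0):
    """Return indices whose volume is an outlier vs. the trailing window."""
    flagged = []
    for i in range(window, len(volumes)):
        trailing = volumes[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        # Flag only when the spike towers over normal trailing variation.
        if sigma > 0 and (volumes[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# A tape of routine alternating volumes with one burst at index 24:
tape = [100, 110] * 12 + [5000] + [100, 110] * 5
print(flag_bursts(tape))  # → [24]
```

Replaying the same filter across historical intervals – the backtesting challenge the quote describes – is then a matter of sliding this computation over years of stored tape rather than just the live feed.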
Patterns are the name of the game. As pattern-recognizing creatures, human compliance analysts compete best with transaction volume when their visualization tools can navigate and plumb 100% of the data. Not static visualizations, but data applications that provide closed-loop navigation, end-to-end, from the surface to the depths, and into the ever-growing volume of historical data.
With Big Data architectures closing in on 100% data capture, analysts need tools that make it easier to both build and maintain their analytics. The first step: putting compute and data together, rather than creating unavoidable blind spots through extracts and summaries.
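The blind spot that extracts and summaries create is easy to demonstrate: a pre-aggregated rollup can smooth away exactly the anomaly an analyst needs to see. A toy illustration with hypothetical per-minute volumes (the numbers are invented for the sketch):

```python
# Sketch: a pre-aggregated extract hides the very spike it summarizes.
# Hypothetical per-minute trade volumes for one trading hour.
minute_volumes = [1000] * 59 + [50000]  # one anomalous minute

# The hourly summary an extract would ship downstream:
hourly_average = sum(minute_volumes) / len(minute_volumes)
print(round(hourly_average, 1))  # → 1816.7, mildly elevated at best

# Full-granularity data shows a 50x burst in a single minute:
print(max(minute_volumes))  # → 50000
```

Keeping compute next to the full-resolution data means the analyst can always drill from the summary down to the raw minute (or millisecond) that triggered it.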
Visual analytics computed without moving the data can view it top to bottom, not just at the surface. For analysts to navigate historical and real-time data as thoroughly as possible, visualization and risk/compliance data platform solutions must offer speed, scale, and fully transparent granularity.
After all, in rough seas, the Navy is only as good as its maps.