Tableau, Power BI, even Excel are excellent BI tools: trustworthy workhorses that can digest gigabytes of data and render them into accessible tables, diagrams and graphs. And yet, they all seem to stop short when it comes to performing financial analytics, particularly market risk analytics. Performance doesn’t hold up, risk-specific measures are near-impossible to integrate properly, and users always end up having to fall back on Excel to reconcile data from disparate systems.
Why does this seem to happen only in finance? And how is it still an issue in the era of Big Data?
Let’s paint a picture: a risk manager at a bank is asked by the head of trading to evaluate the financial results attributed to funding curves. The risk manager builds a simple quantitative model and begins sourcing and consolidating risk and market data across systems in order to perform the analysis. Next come data cleansing and model calibration, after the data from the different sources has been organized into separate CSV files, mapped, and consolidated again. The whole exercise likely takes a day or more.
When the risk manager looks at the results, something seems amiss, so he increases the granularity of every query down to trade level to find the error. He modifies the queries and adjusts the data according to the results. By the end of this exercise, the dataset has grown too large to process in Excel or to present in a typical data visualization tool, so it has to be broken into smaller pieces to create a report.
The risk manager then presents his findings, a loss, to his superior, who requests further detail on which products and currencies are driving the loss. Back at his desk, the risk manager adds the requested dimensions and reruns the queries, only to realize that what he reported as a loss was actually a profit.
Does this sound familiar? All of that time spent ultimately yielding the wrong results? Why not spend time getting conclusions out of the data rather than waste time on data cleanup and infrastructure issues?
What is the root cause?
In short: large data volumes and non-linear aggregation.
Risk management is built on a series of measures that, in many cases, do not aggregate linearly. Non-linear aggregation is something many technologies struggle with, especially those not designed for it. In addition, dataset sizes have grown dramatically in recent years, driven largely by regulatory requirements. While most BI platforms handle a few gigabytes very well, it takes specialized technology to analyze terabytes of data, which is now the norm at virtually every mid-size bank.
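To see concretely why such measures resist simple aggregation, consider historical Value at Risk (VaR). The sketch below uses hypothetical P&L scenario vectors for two trading desks; the figures, the desk names, and the 95% confidence level are illustrative only, not drawn from any real portfolio.

```python
# Minimal sketch (illustrative data): why VaR cannot be summed across desks.
def var_95(pnl):
    """Historical 95% VaR: magnitude of the loss at the 5% tail of the P&L vector."""
    worst_first = sorted(pnl)          # ascending: largest losses first
    tail_index = int(0.05 * len(pnl))  # position of the 5% quantile
    return -worst_first[tail_index]

# Hypothetical daily P&L scenarios for two desks over the same 20 dates.
desk_a = [-10, 2, 3, -4, 5, 1, -2, 6, -8, 4, 3, -1, 2, 5, -3, 7, -6, 2, 1, 4]
desk_b = [9, -3, 1, 5, -6, 2, 3, -5, 7, -2, 1, 4, -3, 2, 6, -8, 5, -1, 3, 2]

# Correct aggregation: sum P&L scenario by scenario, THEN take the quantile.
combined = [a + b for a, b in zip(desk_a, desk_b)]

print(var_95(desk_a))    # 8
print(var_95(desk_b))    # 6
print(var_95(combined))  # 1, not 8 + 6 = 14
```

Because the combined VaR (1 here) is far smaller than the sum of the standalone VaRs (14), a tool that rolls up pre-computed desk-level figures by addition reports the wrong number. The calculation has to go back to the scenario or trade level every time the user changes the aggregation, which is precisely where generic BI platforms run out of steam on large datasets.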
Non-linear calculations and aggregation are the main reasons why analytics platforms struggle with market risk analysis, but other factors include the lack of optimization for high-end hardware configuration and the lack of tools to easily share visualizations and insights among teams and departments.
The end result is an IT department that spends millions of dollars each year to run and maintain a range of analytics technologies on grossly outsized hardware platforms that still provide an unsatisfactory user experience for the business.
How to fix data analytics for Market Risk, once and for all?
ActiveViam’s Atoti is designed to readily handle large, complex financial queries and analytics:
- Atoti is backed by a best-in-class in-memory analytics tool capable of handling non-linear calculations on aggregated data with the ability to drill down to a granular level and deliver results through interactive and flexible dashboards.
- Atoti does not lose speed or performance when aggregating terabytes of data in real time.
- Atoti replaces your market risk analytics tools, while optimizing the hardware footprint throughout.
- Atoti is the most cost-effective system for market risk because it is in-memory with the ability to fully exploit the flexibility of the Cloud.
In sum, Atoti makes data analytics easy, with very little code to write: users can slice, dice, and filter results, and inject or redefine formulas at any point. The platform lets users spend their time iterating on the analytical model and drawing conclusions from the data rather than on infrastructure issues or data cleanup.
Having a single application that can be built to accommodate the needs of the desk, IT engineers and regulatory risk teams is a good way to harmonize approaches and reduce costs over time. Other solutions burden risk departments with upgrades that can take up to 18 months and cost hundreds of thousands of dollars in consulting fees – for a single change. For further information on Atoti, visit our website.