Managing transactions and analytics simultaneously in the high-speed world of financial services
HTAP databases can deliver the performance of an in-memory database with the persistence and reliability of a traditional operational database
Increasing trade volumes and periods of high market volatility create technology challenges for financial services firms. This is especially true for sell-side firms, which can experience extremely high transaction volumes because they split already large flows of incoming orders into an even greater number of smaller orders for execution.
At the same time, they must support a high number of concurrent analytic queries to provide order status, risk management, surveillance and other information for clients.
This requirement for multi-workload processing at high scale, coupled with the highest levels of performance and reliability, has historically been difficult to satisfy. Compounding the challenge, transaction volumes not only grow incrementally and predictably, but can also spike in response to unexpected world events.
A critical component of a sell-side firm's technology infrastructure is its transaction management and analytics platform. The platform must be reliable and highly available. A failure, or even a slowdown, of the platform can have severe consequences, since it can take many hours to rebuild order state and resume normal operations. In the meantime, the firm's ability to process additional trades and provide order status is compromised and financial losses mount.
To successfully handle growth and volatility without performance or availability issues, the platform must balance transactional workloads with the concurrent analytic demands of downstream applications at scale. Financial services organisations, particularly sell-side firms, must process millions of messages per second while simultaneously supporting thousands of analytic queries from hundreds of downstream systems that report on order state and run other analyses.
Currently, in-memory databases are widely used, primarily due to their ability to support high-performance data-insert operations and analytic workload processing. However, in-memory databases alone are not an ideal platform for transaction management and analytics for several reasons:
Limited ability to concurrently process transactional and analytic workloads at scale: In-memory databases are not designed to support multi-workload processing at high scale. As volumes increase, both transaction processing and analytic queries will eventually slow or stall.
Scale limitations: Since the data in an in-memory database is stored in main memory, the working data set is limited by the available amount of memory. This creates risk when transaction volumes spike, compromising the ability to process new orders once the available RAM is filled.
High costs: Since servers have hard memory limits, scaling in-memory databases beyond those limits requires procuring additional nodes - enough to sustain normal operations plus headroom for unexpected volatility - increasing overall costs.
System downtime: Since the data is held only in memory, a database server failure means the data resident on that server is lost and must be rebuilt before normal operations can resume.
Finding a solution
So, given these challenges, how can financial services organisations find a solution that enables them to simultaneously process transactional and analytic workloads at high scale? The answer comes in the form of the Hybrid Transactional/Analytical Processing (HTAP) database.
Traditionally, online transaction processing (OLTP) and online analytical processing (OLAP) workloads have been handled independently, by separate databases. However, operating separate databases creates complexity and latency because data must be moved from the OLTP environment to the OLAP environment for analysis. This has led to the development of a new kind of database which can process both OLTP and OLAP workloads in a single environment without having to copy the transactional data for analysis. HTAP databases are being used in multiple industries for their ability to uncover new insights, create new revenue opportunities and improve situational awareness and overall business agility for organisations.
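To make the distinction concrete, here is a minimal Python sketch of the HTAP access pattern: transactional inserts and an analytic aggregate run against the same table, with no extract-and-copy step in between. It uses the standard sqlite3 module and an illustrative orders table purely to keep the example self-contained; sqlite3 is not itself an HTAP database, and a real platform adds the scale, concurrency and durability discussed below.

import sqlite3

# One store serves both workloads: there is no copy step from an OLTP
# system into a separate OLAP warehouse. (sqlite3 is used only to keep
# the example self-contained; it is not itself an HTAP database.)
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        symbol   TEXT,
        side     TEXT,
        quantity INTEGER,
        price    REAL
    )
""")

# Transactional workload: new orders are written as they arrive.
conn.executemany(
    "INSERT INTO orders (symbol, side, quantity, price) VALUES (?, ?, ?, ?)",
    [
        ("ABC", "BUY", 100, 10.25),
        ("ABC", "SELL", 40, 10.30),
        ("XYZ", "BUY", 250, 7.10),
    ],
)
conn.commit()

# Analytic workload: an aggregate query runs directly against the same
# table, so it always sees up-to-the-moment order state.
for symbol, order_count, notional in conn.execute(
    "SELECT symbol, COUNT(*), SUM(quantity * price) FROM orders GROUP BY symbol"
):
    print(symbol, order_count, round(notional, 2))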
The best HTAP database platforms deliver the performance of an in-memory database with the persistence and reliability of a traditional operational database. They are optimised to accommodate high transactional workloads and a high volume of analytic queries on the transactional data concurrently, without incident or performance degradation, even during periods of market volatility.
They are built on a comprehensive, multi-model database management system (DBMS) that delivers fast transactional and analytic performance without sacrificing scalability, reliability or security. They can handle relational, object-oriented, document, key-value, hierarchical and multi-dimensional data objects in a common, persistent storage tier.
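As a loose illustration of what "multi-model over common storage" means, the toy Python sketch below stores each order once in a sparse, subscripted key-value structure and then exposes it both as a relational row and as a JSON document. The structure, field names and helper functions are invented for illustration; a real multi-model DBMS implements this idea in a shared, persistent storage engine rather than an in-process dictionary.

import json

# A toy, in-process illustration of the multi-model idea: a single order is
# stored once in a sparse, subscripted key-value structure (keys exist only
# where there is data), and different "models" are projections over it.
storage = {}  # {(collection, record_id, field): value}

def store_order(order_id, **fields):
    for field, value in fields.items():
        storage[("orders", order_id, field)] = value

def as_row(order_id, columns):
    """Relational view: a fixed set of columns, missing values as None."""
    return tuple(storage.get(("orders", order_id, c)) for c in columns)

def as_document(order_id):
    """Document view: only the fields actually present, rendered as JSON."""
    doc = {f: v for (coll, oid, f), v in storage.items()
           if coll == "orders" and oid == order_id}
    return json.dumps(doc)

store_order(1, symbol="ABC", side="BUY", quantity=100, price=10.25)
store_order(2, symbol="XYZ", side="SELL", quantity=40)  # no price yet, so nothing is stored for it

print(as_row(1, ["symbol", "side", "quantity", "price"]))  # ('ABC', 'BUY', 100, 10.25)
print(as_row(2, ["symbol", "side", "quantity", "price"]))  # ('XYZ', 'SELL', 40, None)
print(as_document(2))                                      # {"symbol": "XYZ", "side": "SELL", "quantity": 40}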
Moreover, the best of these embody features that make them attractive for mission-critical, high-performance transaction management and analytics applications. These include:
High performance for transactional workloads with built-in persistence: The ideal is a data platform whose database delivers transactional performance equal to, or greater than, that of in-memory databases, along with built-in persistence at scale.
Data is not lost when a machine is turned off, eliminating the need for database recovery or rebuilding efforts. An efficient, multi-dimensional data model with sparse storage techniques makes data access and updates faster, using fewer resources and less disk capacity.
High performance for analytic workloads: Seek out solutions that provide a range of analytic capabilities, including full SQL support, enabling firms to use their existing SQL-based applications with few or no changes. Because the database stores data in efficient multi-dimensional structures, SQL applications can achieve better performance than they would on a traditional relational database.
Consistent high performance for concurrent transactional and analytic workloads at scale: Ideally, solutions should provide the highest levels of performance for both transactional and analytic workloads concurrently, at high scale, without compromising performance for either type of workload. Since rising order volumes increase both the transactional and analytic load on the system, a data platform must scale to handle both without performance or availability issues. A simplified sketch of this concurrent access pattern appears below.
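To show the shape of that concurrent-workload pattern, the small Python sketch below runs one thread that sustains a stream of order inserts while a second thread repeatedly computes an analytic aggregate over the same table. It uses sqlite3 in WAL mode, with one connection per thread, purely to keep the example self-contained and runnable; the file name, batch sizes and schema are illustrative, and it is the HTAP platform that makes this pattern hold up at millions of messages per second.

import sqlite3
import threading
import time

DB_PATH = "orders_demo.db"  # illustrative file name

def connect():
    conn = sqlite3.connect(DB_PATH, timeout=5.0)
    conn.execute("PRAGMA journal_mode=WAL")  # readers do not block the writer
    return conn

def transactional_writer(n_batches, batch_size):
    # Transactional workload: a steady stream of order inserts.
    conn = connect()
    for _ in range(n_batches):
        rows = [("ABC", "BUY", 100, 10.25)] * batch_size
        conn.executemany(
            "INSERT INTO orders (symbol, side, quantity, price) VALUES (?, ?, ?, ?)",
            rows,
        )
        conn.commit()
    conn.close()

def analytic_reader(stop_event):
    # Analytic workload: aggregate queries over the same table, run
    # concurrently with the insert stream.
    conn = connect()
    while not stop_event.is_set():
        count, notional = conn.execute(
            "SELECT COUNT(*), COALESCE(SUM(quantity * price), 0) FROM orders"
        ).fetchone()
        print(f"orders so far: {count}, notional: {notional:.2f}")
        time.sleep(0.1)
    conn.close()

if __name__ == "__main__":
    setup = connect()
    setup.execute("DROP TABLE IF EXISTS orders")
    setup.execute(
        "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
        "symbol TEXT, side TEXT, quantity INTEGER, price REAL)"
    )
    setup.commit()
    setup.close()

    stop = threading.Event()
    reader = threading.Thread(target=analytic_reader, args=(stop,))
    writer = threading.Thread(target=transactional_writer, args=(50, 1000))
    reader.start()
    writer.start()
    writer.join()
    stop.set()
    reader.join()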
Positive prospects
As this article has highlighted, many financial services organisations are crying out for ways to process transactional and analytic workloads simultaneously, at high scale. Fortunately, help is now at hand. Thanks to the latest breed of data platforms for high-performance transaction management and analytics applications, transaction processing and analytic queries can be supported concurrently, at very high scale, with built-in durability, the highest levels of reliability - and at a low total cost of ownership.
Graeme Dillane is manager, financial services at InterSystems