Time Series Opportunities for Realtime Payments

Steve Wilcockson & Alex Weinrich
KX Systems
Nov 8, 2022 · 9 min read

Introduction

Analysts’ estimates vary (as ever) on the projected size of the global digital payments market. In terms of transaction volumes, a Deloitte & Capital IQ survey suggests they will reach $11.3 trillion by 2026, a Statista report suggests $12.5 trillion, while PwC suggests cashless payment volumes will double by 2030. Whatever the metric and the variation, it is a big number. And for payments companies processing the individual customer transactions that comprise it, it is a potentially informative treasure trove and risk mitigator. A classic, maybe the best, example of Big Data unlocking value?

The number of payment companies is growing rapidly worldwide, more than doubling in the last decade, with digital FinTech disruptors making big waves. As alternative payment vehicles like cryptocurrencies gather momentum, regulations evolve and digital technologies emerge, payment playing fields have leveled and are no longer dominated by the decades-established big players. Maintaining a competitive edge when customers want easy, fast, safe services at low or no cost in an increasingly competitive global landscape is not easy. Regulators demand diligent transaction processing to preserve market integrity, while shareholders demand competitive market volumes and profits. How can payments firms reconcile these tough and sometimes competing sets of requirements?

Data differentiates. Data on transactions in flight, such as where, when, how often and how much, can inform their immediate processing: whether a transaction is appropriate, suspicious, downright dangerous or just okay to proceed. And that same data, when combined with historical data, can facilitate customer profiling to identify revenue-generating ancillary services, ads, campaigns and offers that increase turnover and reduce customer churn. Let’s dive deeper into some payments challenges and explore opportunities that can arise from them.

Challenge 1 — Optimal Monitoring for Payment Fraud

Payments organizations deploy decision management platforms. They process everything, every dip, swipe and tap, often across the globe and across merchants. Crucially, this process aims to detect fraud. If fraud is suspected, an immediate decision might be taken to decline transactions on behalf of the customer, or to send them advice to let them make an appropriate decision, quickly.

That is the “simple” challenge faced. Underneath, there are further challenges.

Speed: Quoting one payments system architect: “We have lots of data to parse when a transaction comes through, which needs processing in milliseconds and stored. We can’t go to disk, or use a traditional database to get that data during the transaction. We have to get it quickly so we have for example a 40 terabyte in-memory data grid which we need to analyse.” The analysis needs to happen in real time, far faster than the human brain can process. If the check is too slow, it blows the transaction’s latency budget and fails.
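To make that latency budget concrete, here is a minimal sketch, in plain Python rather than any vendor stack, of the kind of in-memory velocity rule such a platform might evaluate on every swipe. The function name, window size and threshold are all illustrative assumptions, not an actual production rule:

```python
from collections import deque

# Hypothetical velocity rule: decline if a card has already made
# MAX_TXNS transactions within the trailing WINDOW_SECONDS.
WINDOW_SECONDS = 60
MAX_TXNS = 5

def velocity_ok(card_id, now, recent_by_card):
    """Return True if this transaction passes the velocity rule.

    recent_by_card maps card_id -> deque of recent transaction
    timestamps, standing in for the in-memory data grid described
    above: everything the check touches stays in RAM, never on disk.
    """
    window = recent_by_card.setdefault(card_id, deque())
    # Evict timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_TXNS:
        return False  # too many recent swipes: flag for review
    window.append(now)
    return True
```

The key design point is that both the lookup and the update are O(1) amortized per transaction, which is what makes a millisecond-scale decision feasible at tens of thousands of transactions per second.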

Scale: The numbers are jaw-dropping: tens of thousands of transactions every single second, dozens of services performed on each transaction within tens of milliseconds, at a 99.999% success rate. Just think about those numbers and what payment fraud practitioners worry about all day, all night, 365 days a year. And how all that data, petabytes of it, informs the artificial intelligence and machine learning algorithms that detect fraud. Algorithmic efficacy and data efficacy are interlinked. Quoting our architect: “Without the data, we can’t get the answers that we need. Our data scientists, data analysts, rule authors, all day they dream up ways to collect data and use it better.”

Revenues: Each time a fraud is detected, it benefits, perhaps even delights, both customer and regulator, improving brand reputation. However, should a fraudulent million-dollar transaction be missed and approved, the organization loses more than a million dollars. That is not okay. On the flip side, if a customer swipes legitimately but the system has not adapted to that customer’s spending preferences and the bank cancels the “unrecognized” transaction, the customer will use a different card from another organization.

Regulation: AML/KYC regulations emphasize the need to move from a “detective control” environment to a “preventative control” environment. Existing transaction monitoring (observing customer transactions in real time or retroactively to spot trends and red flags) and transaction screening (verifying customer identities and continuously screening their transactions) can in practice be too slow and inconsistent to support live checks on payments, given the low-latency needs and high volumes of data. Organizations thus depend on detecting fraudulent or sanctioned activity after the event, leaving them vulnerable to large fines and censure by the regulator.
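The detective-versus-preventative distinction is essentially about where the check sits relative to settlement. As a hedged illustration, a preventative control gates the payment before it is allowed to proceed; everything here (party names, the threshold, the function) is hypothetical, not a real sanctions workflow:

```python
# Illustrative sanctions/threshold screen run *before* authorization,
# so a bad payment is blocked rather than detected after the event.
SANCTIONED_PARTIES = {"ACME FRONT LLC", "SHELL CO 42"}  # made-up names

def screen_payment(payee_name, amount, threshold=10_000):
    """Return (approved, reason) before the transaction proceeds.

    A real screen would use fuzzy name matching against official
    sanctions lists; exact matching here keeps the sketch simple.
    """
    name = payee_name.strip().upper()
    if name in SANCTIONED_PARTIES:
        return False, "payee matches sanctions list"
    if amount >= threshold:
        return False, "amount exceeds review threshold"
    return True, "ok"
```

Run inline in the authorization path, a check like this converts an after-the-fact investigation into a refusal at the point of payment, which is the shift regulators are asking for.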

Costs: Any analysis of terabytes, whether straightforward monitoring or mission-critical complex machine learning, real-time or historical or both, impacts the bottom line of IT budgets. When disk space and memory are saved, data center compute is re-used and cloud instances are not needed, the consequent infrastructure cost savings please CFO, CIO, board and shareholder alike.

Optimizing the interaction between now (the transaction in the real-time stream) and historical fraud trends (the historical context in the warehouse) challenges most established systems. While well-architected data pipelines flow nicely from warehouse model to live monitoring, albeit with latency challenges and glitches, returning the message from stream back to model and warehouse can be overly manual.

Fig 1: Common Payments Architecture: From feature to stream, engineers traditionally might schedule feature materialization jobs to load data into a dedicated database, unload newly computed features to object storage, then write a Lambda function that triggers on the appearance of new data to load the data into the database. Any unexpected latency or glitch anywhere across the process means prospective monitoring failure.

Teams, therefore, need to maintain standing infrastructure, manually plug gaps in the workflows (reverting intelligence back into the model training and calibration processes, for example), while spending time and resource ensuring speed and continuity throughout. Teams must also look to the future when system demands change.

Challenge 2 — Optimal Digital Insights with Time Series Analytics

The amount of data flowing through payment companies is astronomical. Capturing such data is one task, but turning it into value is the ultimate goal. Leading and disruptor companies have grown organically and/or through acquisitions over time. With expansion comes some level of technical debt: fragmentation, legacy database environments and siloed systems without the interoperability expected of a modern technological architecture. For a payment company to run efficiently, processes must run flawlessly in real time. Identifying issues, high-traffic periods and potential risks is essential. Getting to that nirvana is non-trivial, but essential for competitive leadership.

When it comes to servicing customers directly with added value and providing corporate intelligence to the company’s stakeholders, each transaction is important (payment settlements involving payee and payer must never break), with real-time insights paramount when driving customer value. At a recent event, I thought I had made a purchase but it did not show up on my app. Consequently, I paid again, only to find on arriving home that I had paid twice. Such issues can cause serious reputational damage for a firm, along with unnecessary strain on payment recipients.

Furthermore, if organizations better understand my spending patterns, they can better target services. Am I spending more money on online gaming and entertainment than I am on household maintenance? Are those gaming spends binges, spikes or something more regular? Should they recommend that I go see a therapist? Might my grocery bill show the same signs of inflation as those predicted by governments and specialists? Should the payments firm work with their retailers to offer me a discount on coffee brand X or restaurant Y, which might encourage me to use their card more? All viable use cases offering information I might just find compelling enough to act on.
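Distinguishing a one-off spike from a generally elevated, binge-like baseline is a classic time-series question. As a minimal sketch, assuming daily spend totals per customer (the function, window and threshold are all illustrative), a trailing z-score flags days that stand out against that customer’s own recent history:

```python
import statistics

def spend_spikes(daily_spend, window=7, z=2.0):
    """Return indices of days whose spend is a spike versus the
    trailing window.

    A single outlier is flagged; a sustained binge shifts the trailing
    mean itself, so subsequent high days stop looking anomalous,
    which is roughly the spike-versus-binge distinction above.
    """
    flagged = []
    for i in range(window, len(daily_spend)):
        trail = daily_spend[i - window:i]
        mu = statistics.fmean(trail)
        sd = statistics.pstdev(trail)
        # Flag when today's spend exceeds the trailing mean by more
        # than z trailing standard deviations.
        if daily_spend[i] - mu > z * sd:
            flagged.append(i)
    return flagged
```

A production version would segment by merchant category and handle seasonality, but the shape of the computation, a rolling statistic over per-customer history, is the same.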

For payment companies to analyze duplicate transactions, assess sudden traffic spikes or monetize certain behaviors from their client base, they require the right tools. Data scientists and engineers are commonly unable to leverage the data they own. A system that allows analysts to have their data at their fingertips, and act upon it, can be challenging to develop. Data is often available only in batches, while combining historical trends with real-time information is often impossible in legacy database structures.

Some common issues may arise:

  • Slow data pipelines result in outdated or stale data.
  • Manual work: importing and exporting data makes workflows slow and time-consuming.
  • Inefficient, siloed engineering means poor interoperability between systems and across departments.
  • Inaccessible data: data sitting in legacy systems, or subsets of it, may not be easy to query.

Machine learning, as touched upon above, is an area many companies feel they can improve. Using historical data in conjunction with real-time data can give a company quite an edge in forward-looking analysis. However, historical data is often not as accessible as engineers would like, and most databases cannot handle joining these two types of data set. Better predictive insights can help prepare companies, generate needed intelligence and prevent risk events from happening.
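The join in question is typically an as-of join: for each live event, fetch the most recent historical value at or before its timestamp. Time-series databases support this natively (kdb+ exposes it as the aj operator); as a language-neutral sketch of the semantics, with all names and data invented for illustration:

```python
import bisect

def asof_join(txns, profile_times, profile_vals):
    """For each (timestamp, amount) transaction, attach the most
    recent historical profile value at or before that timestamp.

    profile_times must be sorted ascending. This mimics the as-of
    join used to enrich a live stream with historical context.
    """
    out = []
    for ts, amount in txns:
        # Rightmost profile entry with time <= ts, or None if the
        # transaction predates all history.
        i = bisect.bisect_right(profile_times, ts) - 1
        profile = profile_vals[i] if i >= 0 else None
        out.append((ts, amount, profile))
    return out
```

General-purpose relational databases have no efficient equivalent of this temporal lookup, which is why combining streams with history inside them tends to degenerate into slow batch exports.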

Where KX Comes In

KX has long been at the forefront of financial innovation, particularly in trading, risk management and surveillance. KX provides kdb+ technology for real-time analysis of any data, at speed or at rest, in the cloud or on-premises, providing firms with dynamic insights into the here and now in combination with what took place in the past. That same power and analytics processing capacity in financial transaction analysis can be utilised by retail payment companies.

KX leverages cloud technologies, services, and protocols through kdb Insights to ensure analytics applications are more scalable, robust and interconnected, easy to deploy and update, and bring continuity between research and production. Advanced analytics, with seamless Python integration and SQL querying of data, allows users to detect anomalies and derive insights — all in real-time.

Fig 2: KX: Kdb+ time-series database capabilities available in the cloud as kdb Insights, for scalable cloud deployments, leveraging ultra-efficient, fast, and numerically-optimized kdb+ capabilities.

KX: A Powerful Payments Intelligence Solution

A unified, scalable data store like KX, with powerful columnar capabilities for time series analysis, addresses the problem of data fragmentation by providing aggregation that enables instant, low-latency analysis of high-granularity real-time and historical data across the organization. Users can replace the delayed insight of batch-based analysis with instantaneous response to events, delivering insights sooner and saving the costs of batch storage and data duplication across data warehouses and databases, pleasing CIOs and CFOs. With 40 terabyte queries on payment streams, continuity between historical context and live monitoring is in reach: responsiveness and insights improve, infrastructure costs fall, and architectures are simplified.

Fig 3: A Simplified, Smaller, Continuous, Fast Time Series Architecture with KX

In addition, because KX holds both the live stream and extensive history in memory, practitioners can not only deploy feature context directly to the live stream but also, through Python interoperability, capture and direct new insights from new events in code they already have, carrying stream insights back to the core model and warehouse.

Conclusion: Opportunities Abound

Payments organizations have used streaming technologies, data warehouses and data lakes for years. However, legacy inefficiencies increase costs, add complexity, incur reputational risk and can prevent new opportunities from being brought to market. Fraud errors must be avoided where possible, with new services brought online to enthuse and retain clients. KX technology through kdb+ has driven capital markets for decades, providing users with a “continuous stream of truth.” Payment firms can leverage those same continuous streams of truth to add rigor and reliability to anomaly and fraud detection processes on the one hand, and deliver new customer insights on the other. What is more, they can start today, leveraging code they already use.

Don’t delay, Start today!

Steve Wilcockson
KX Systems

Loves the intersection of quantitative tech, data science and society, and all things postcolonial, from a First Nations-sympathetic white guy’s perspective.