Apache Kafka Guide #45 Bank Application example

Paul Ravvich
Apache Kafka At the Gates of Mastery
5 min read · May 2, 2024

Hi, this is Paul, and welcome to the #45 part of my Apache Kafka guide. Today we will discuss Bank Application examples as practical training in learning Apache Kafka.

Bank Application Task

Imagine a pioneering digital bank renowned for providing its users with real-time banking services. The bank is eager to introduce a cutting-edge feature designed to instantly alert users of unusually large transactions, potentially indicative of fraudulent activity. The necessary transaction data for this feature already exists within a SQL database. Users have the flexibility to set and modify transaction threshold limits directly through the bank’s application, which the system should immediately recognize and adapt to. Consequently, alerts must be dispatched to users without delay. Any significant lag in notification could hinder users from taking timely actions to prevent financial losses. The question then arises: How can we leverage Kafka to effectively implement this functionality?

  • The bank offers real-time banking services to its clients and aims to introduce a new feature for notifying clients about significant transactions.
  • Transaction records are already maintained in a database.
  • Users can set their thresholds for notifications.
  • Notifications are to be delivered instantaneously to users.

Solution with Apache Kafka

Alright, diving into the architecture, let’s start with the bank transactions. These transactions are already stored within an SQL database, and the primary goal is to transfer this data from the database to Kafka. For this task, a Kafka Connect source is the tool of choice. Notably, there’s a class of technology known as CDC (Change Data Capture) Connectors, with Debezium being a prominent example. Debezium is remarkable for its ability to seamlessly transfer transactions from the database to Kafka in real time, offering a highly efficient and structured data format.
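As a sketch, registering such a connector via the Kafka Connect REST API could look like the following, assuming a MySQL source; the hostnames, credentials, database, and table names here are all illustrative:

```json
{
  "name": "bank-transactions-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "bank-db",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "********",
    "database.server.id": "184054",
    "topic.prefix": "bank",
    "database.include.list": "bank",
    "table.include.list": "bank.transactions",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-changes.bank"
  }
}
```

With this in place, every insert into the transactions table appears as a change event on a Kafka topic, with no application code written at all.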

The strength of Kafka Connect Source Connectors is undeniable, offering a robust solution simplifies the data transfer process significantly. This approach underlines the convenience of leveraging Kafka Connect Source to achieve substantial outcomes with minimal effort. Furthermore, the conversation shifts towards the creation of a user settings topic. This involves users configuring thresholds within apps, which in turn communicate with an app threshold service. This service acts as a proxy producer, forwarding data to the user settings topic.
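The message such a threshold service might publish, keyed by user ID, can be sketched in plain Java as follows. The actual send would go through a `kafka-clients` `KafkaProducer`; the class and field names here are illustrative, not a fixed schema:

```java
// Sketch of the event a threshold service could publish to the
// user-settings topic; names and fields are illustrative.
public class UserSettingsEvent {
    final String userId;       // also used as the Kafka message key
    final long thresholdUsd;   // alert threshold chosen by the user
    final String occurredAt;   // when the user changed the setting

    public UserSettingsEvent(String userId, long thresholdUsd, String occurredAt) {
        this.userId = userId;
        this.thresholdUsd = thresholdUsd;
        this.occurredAt = occurredAt;
    }

    // Message value: a self-describing event, not just the new state.
    public String toJson() {
        return String.format(
            "{\"userId\":\"%s\",\"threshold\":%d,\"occurredAt\":\"%s\"}",
            userId, thresholdUsd, occurredAt);
    }
}
```

Keying by user ID keeps all settings changes for one user on the same partition, so downstream consumers see them in order.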

The next step involves establishing a user alert topic. This is where a Kafka Streams application becomes invaluable, processing both bank transactions and user settings to determine if a transaction exceeds the user threshold. Should this condition be met, a message is generated in the user alerts topic, which is then dispatched to the user. To accomplish the notification delivery, a small consumer notification service is envisioned. This service triggers alerts within the users’ apps, promptly informing them.
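The core of that Kafka Streams application is a join: user settings become a table of latest thresholds, and each transaction is checked against it. The following plain-Java model captures that logic without the Streams API itself (in the real app this would be a KStream-KTable join); all names are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java model of what the Kafka Streams app computes: the
// user-settings topic is materialized as a table of latest thresholds,
// and every bank transaction is checked against that table.
public class AlertLogic {
    private final Map<String, Long> thresholds = new HashMap<>();

    // Settings events arrive keyed by user; keep only the latest threshold,
    // just as a KTable keeps the latest value per key.
    public void onSettingsEvent(String userId, long thresholdUsd) {
        thresholds.put(userId, thresholdUsd);
    }

    // Returns a message for the user-alerts topic, or null if no alert.
    public String onTransaction(String userId, long amountUsd) {
        Long threshold = thresholds.get(userId);
        if (threshold != null && amountUsd > threshold) {
            return String.format(
                "User %s: transaction of $%d exceeds your $%d threshold",
                userId, amountUsd, threshold);
        }
        return null;
    }
}
```

The notification service then only needs to consume the user-alerts topic and push each message to the corresponding user's app.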

To summarize, while straightforward, the architecture emphasizes the significance of Kafka Connect and the utilization of source CDC Connectors like Debezium. This setup not only facilitates efficient data management but also enhances the overall system’s responsiveness to user-defined criteria, making it a powerful and adaptable architecture for handling bank transactions and user notifications.

Summary

Regarding the topic of bank transactions, I have several insights to share. First and foremost, the Kafka Connect source is a remarkable tool. The market is rich with a variety of CDC (Change Data Capture) Connectors designed specifically for Kafka Connect. Among these are connectors for PostgreSQL, Oracle, MySQL, SQL Server, MongoDB, and likely others. While I don’t possess a comprehensive list, the availability of these connectors significantly simplifies the process of integrating your company’s existing data into Kafka swiftly and efficiently.

Moving on to the Kafka Streams application, it’s important to note a key behavior: when a user modifies their settings, alerts for past transactions will not be triggered, because the system processes transactions only from the point of the change forward. On the topic of user thresholds, my view is that it is more advantageous to transmit events to the topic rather than states. To illustrate, consider the difference between conveying detailed event information, such as “User 1 activated a threshold of $10000 at 7:40 PM on June 1”, and simply stating a static condition like “User 1 has a threshold of $10000.” The state-only form discards the event’s context and presupposes that this detail will never be relevant, an assumption I find to be misplaced. Therefore, it’s beneficial to include as much information as possible in your events.
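To make the contrast concrete, here is what the two payloads might look like on the wire (field names are illustrative):

```
Event (context preserved):
{"userId":"user-1","threshold":10000,"occurredAt":"2024-06-01T19:40:00Z"}

State (context discarded):
{"userId":"user-1","threshold":10000}
```

The event form costs almost nothing extra today, but lets future consumers answer questions (when did the user change it? how often?) that the state form has already thrown away.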

The spotlight, however, really shines on the CDC (Change Data Capture). This area holds immense potential and capabilities, offering a rich landscape for exploration and implementation in the realm of bank transactions.

Topics on Bank Transactions:

  • Utilizing a Kafka Connect source is an excellent method for exposing data from existing databases to Kafka.
  • There is an abundance of CDC (change data capture) connectors available for technologies like PostgreSQL, Oracle, MySQL, SQL Server, MongoDB, etc.

Kafka Streams Application:

  • When users modify their settings, alerts for transactions that have already occurred will not be triggered.

Topics on User Thresholds:

  • It is more efficient to send events to the topic indicating a user has set a threshold (e.g., User 1 enabled a threshold of $10000 at 7:40 pm on June 1) than to send the user’s state (e.g., User 1: threshold $10000).

Thank you for reading until the end.

Paul Ravvich


Software Engineer with over 10 years of XP. Join me for tips on Programming, System Design, and productivity in tech! New articles every Tuesday and Thursday!