Kafka Konsumer 🔥 🚀

Easy implementation of a Kafka consumer with a built-in exception manager 💃

Abdulsamet İLERİ
Trendyol Tech

--

As the Trendyol Indexing Team, our architecture is heavily event-driven, so Kafka sits at the center of our system. Approximately 350 million events are processed daily across ~15 different Kafka consumer projects.

Figure: Daily processed events

In each of these consumer projects, we must

  • write boilerplate Kafka initialization code,
  • handle exceptions with an eventual retry strategy,
  • expose processed-event metrics, define alerts, and more!

Doing all of this across so many projects becomes quite difficult after a while. We searched existing open-source projects for a solution but didn’t find a suitable one, so we decided to develop a library (kafka-konsumer) under the Trendyol organization that accommodates all of these needs. 👊 Thanks to this library, we can solve security problems more effortlessly, keep a consistent codebase, and easily start new consumer projects while maintaining old ones. 💃

We had previously developed a library (kafka-cronsumer) that implements the exception strategy these consumer projects need. You can check this article and review its repository to learn how it works. That library has run in production without any problems and has been very successful. 😋

We wanted to extend this behavior beyond exceptional cases to the standard consuming flow as well. As a result, kafka-konsumer was born! 🤔

It takes Kafka connection information and a consume function; that's it. It allows us to focus only on business logic. Exception handling and metric exposure come built-in 🚀. We will look at these more closely below.

Features 🚀

  • Built-in retry/exception handling support
  • High-performance batch-consuming support: the implementation uses the same strategy mentioned in this article 🚀
  • Distributed tracing support for consume and produce operations 🚀
  • Several exposed metrics
  • SCRAM and plain-text authentication & TLS support
  • Kafka producer support

How to use it? 🛠

Simple Consumer

Figure: Simple Consumer
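
For reference, a minimal simple consumer looks roughly like the sketch below, adapted from the library's README at the time of writing. The broker address, topic, and group names are placeholders, and the import path and exact field names (for example a /v2 module suffix) may differ between kafka-konsumer versions.

package main

import (
	"fmt"

	"github.com/Trendyol/kafka-konsumer"
)

func main() {
	consumerCfg := &kafka.ConsumerConfig{
		Reader: kafka.ReaderConfig{
			Brokers: []string{"localhost:29092"},
			Topic:   "standart-topic",
			GroupID: "standart-cg",
		},
		ConsumeFn: consumeFn,
	}

	consumer, err := kafka.NewConsumer(consumerCfg)
	if err != nil {
		panic(err)
	}
	defer consumer.Stop()

	// Consume starts consuming in the background; the business logic lives in consumeFn.
	consumer.Consume()

	fmt.Println("Consumer started...!")
	select {} // block forever so the consumer keeps running
}

// consumeFn contains only the business logic; committing offsets,
// retries, and metrics are handled by the library.
func consumeFn(message kafka.Message) error {
	fmt.Printf("Message from %s: %s\n", message.Topic, string(message.Value))
	return nil
}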

Simple Consumer With Retry/Exception Option

Figure: Simple Consumer With Retry/Exception Option
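
A sketch of the retry/exception option, again adapted from the README, so field names such as RetryConfiguration, StartTimeCron, and WorkDuration may vary by version. When ConsumeFn returns an error, the message is sent to the exception topic and replayed by the embedded kafka-cronsumer:

package main

import (
	"errors"
	"time"

	"github.com/Trendyol/kafka-konsumer"
)

func main() {
	consumerCfg := &kafka.ConsumerConfig{
		Reader: kafka.ReaderConfig{
			Brokers: []string{"localhost:29092"},
			Topic:   "standart-topic",
			GroupID: "standart-cg",
		},
		// Failed messages are produced to the retry (exception) topic
		// and re-consumed on the configured cron schedule.
		RetryEnabled: true,
		RetryConfiguration: kafka.RetryConfiguration{
			Brokers:       []string{"localhost:29092"},
			Topic:         "retry-topic",
			StartTimeCron: "*/1 * * * *",    // run the retry worker every minute
			WorkDuration:  50 * time.Second, // work for 50s within each run
			MaxRetry:      3,                // stop retrying after 3 attempts
		},
		ConsumeFn: consumeFn,
	}

	consumer, err := kafka.NewConsumer(consumerCfg)
	if err != nil {
		panic(err)
	}
	defer consumer.Stop()

	consumer.Consume()
	select {}
}

func consumeFn(message kafka.Message) error {
	// Returning an error triggers the retry/exception flow.
	return errors.New("temporary downstream failure")
}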

Batch Consuming

Figure: Batch Consuming
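
A batch-consuming sketch under the same assumptions (the BatchConfiguration field names and the BatchConsumeFn signature are taken from the README and may differ in newer versions). Messages are accumulated until either the group limit or the group duration is reached, then handed to the batch function at once:

package main

import (
	"fmt"
	"time"

	"github.com/Trendyol/kafka-konsumer"
)

func main() {
	consumerCfg := &kafka.ConsumerConfig{
		Reader: kafka.ReaderConfig{
			Brokers: []string{"localhost:29092"},
			Topic:   "standart-topic",
			GroupID: "standart-cg",
		},
		BatchConfiguration: kafka.BatchConfiguration{
			MessageGroupLimit:    1000,        // flush after 1000 messages
			MessageGroupDuration: time.Second, // or after 1 second, whichever comes first
			BatchConsumeFn:       batchConsumeFn,
		},
	}

	consumer, err := kafka.NewConsumer(consumerCfg)
	if err != nil {
		panic(err)
	}
	defer consumer.Stop()

	consumer.Consume()
	select {}
}

// batchConsumeFn receives the whole group of messages in one call.
func batchConsumeFn(messages []kafka.Message) error {
	fmt.Printf("%d messages received in one batch\n", len(messages))
	return nil
}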

With Grafana & Prometheus

Currently, we are exposing the following metrics:

Figure: Exposed metrics

It is possible to create a Grafana dashboard and alert rules on them, for example:

Figure: Grafana Dashboard example
groups:
  - name: konsumer-alerts
    rules:
      - alert: UnprocessedMessageIncreasing
        expr: increase(kafka_konsumer_unprocessed_messages_total{job="konsumer"}[5m]) > 0
        for: 5m
        labels:
          severity: "critical"
        annotations:
          summary: "Kafka Konsumer unprocessed message increasing"
          description: "Kafka Konsumer unprocessed message increasing, current value: {{ $value }} <http://localhost:3000/d/DlIdtG_4z/kafka-konsumer-dashboard|Grafana>"

You can also find several ready-to-run examples in this directory.

Conclusion

At the time of writing, we have been using this library in production for a few months without any problems 😌 We can also spin up a new consumer project within an hour 😅 Thanks to all contributors: Emre Odabas, Anıl Mısırlıoğlu, Mehmet Sezer, Elif Seray Dönmez Çelik, and Nihat Alim.

We are always open to your feedback. You can create an issue or reach out to me or Emre Odabas. Thank you for reading 🎈

You can also check out other articles from Trendyol Tech below.

We love to share our knowledge. If you would like to share with us, you can apply for our open positions.
