Confluent Kafka integration with .Net Core

Introduction

Sheshnath Kumar
4 min read · Apr 20, 2020

Before integrating Kafka with .Net Core, let's cover some basics around Kafka and related components. If you want, you can skip directly to Integrate Confluent Kafka in .Net Core.

Apache Kafka — An open-source event streaming platform. It is a fast, scalable, fault-tolerant messaging system that can handle trillions of events per day in real time. Primarily designed for heavy reads, Kafka stores records in an append-only log data structure.

Confluent Platform — An enterprise-ready platform that complements Apache Kafka with advanced capabilities designed to accelerate application development and connectivity, enable event transformations through stream processing, simplify enterprise operations at scale, and meet stringent architectural requirements. Along with Apache Kafka it bundles additional services such as Kafka Connect, REST Proxy, KSQL, Control Center, Schema Registry, and multi-language client support.

Apache ZooKeeper — Used for maintaining centralized configuration information, naming, providing distributed synchronization, and providing group services. Apache Kafka uses ZooKeeper to manage cluster state such as controller election, topic configuration, quotas, and ACLs.

Let’s have a look at High-level Kafka architecture:

Broker — A server/node running Kafka; a cluster consists of one or more brokers. A broker is the mediator between message producers and consumers, ensuring messages are delivered to the correct parties. Brokers host the consumer group coordinator and the storage that guarantees delivery.

Topic — A category or feed name under which records are grouped. Producers use the topic name to publish messages.

Offset — An integer value used by brokers to track a consumer's current position within a topic partition.

Partition — A unit of parallelism; Kafka topics are divided into partitions to achieve higher throughput. Records are distributed across partitions based on a hash of the message key, so records with the same key always land in the same partition.

Message — A message (record) in Kafka is a key-value pair with associated metadata. The key (optional) is used for partitioning, the value carries the payload, and the metadata includes details such as a timestamp and headers.
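To make the key-to-partition mapping concrete, here is a simplified C# sketch. Kafka's actual default partitioner hashes the serialized key with murmur2, so this only illustrates the idea; the `PickPartition` name is made up for this example.

```csharp
using System;

class PartitionExample
{
    // Simplified stand-in for Kafka's partitioner: hash the key and
    // map it onto one of the topic's partitions. (The real default
    // partitioner uses murmur2 over the serialized key bytes.)
    public static int PickPartition(string key, int partitionCount)
    {
        // Mask off the sign bit so the modulo result is non-negative.
        int hash = key.GetHashCode() & 0x7FFFFFFF;
        return hash % partitionCount;
    }

    static void Main()
    {
        // The same key always maps to the same partition, which is
        // what gives Kafka its per-key ordering guarantee.
        int first = PickPartition("order-42", 4);
        int second = PickPartition("order-42", 4);
        Console.WriteLine(first == second); // True
    }
}
```

Because the mapping is deterministic, all records for a given key (say, one order id) preserve their relative order within a single partition.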

Producer — Publishes messages/events to Kafka topics.

Consumer — Subscribes to topics and reads messages/events.

Consumer Group — A group of related consumers that cooperate to consume a topic. Each consumer group has a unique group id, and Kafka tracks an offset per topic partition for each group.

Consumer Lag — The difference between the latest offset in a partition and the consumer's committed offset.
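Lag is simple arithmetic over offsets. A quick sketch (the `Lag` helper below is illustrative, not part of the Confluent.Kafka API):

```csharp
using System;

class LagExample
{
    // Lag for one partition: how far the consumer's committed offset
    // trails the partition's latest (log-end) offset.
    public static long Lag(long logEndOffset, long committedOffset) =>
        logEndOffset - committedOffset;

    static void Main()
    {
        // The broker has written up to offset 100; the consumer has
        // committed through offset 85, so it is 15 records behind.
        Console.WriteLine(Lag(100, 85)); // 15
    }
}
```

Monitoring this number per partition is the standard way to tell whether consumers are keeping up with producers.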

Integrate Confluent Kafka in .Net Core

Prerequisite:

ZooKeeper and Confluent Kafka installed and running on your local system, or access to a server where they are installed.

As an alternative, you can run the ZooKeeper and Kafka images in Docker.

  • Install Docker on your local machine.
  • Copy and save the code below as docker-compose.yml.

Run ZooKeeper, and Kafka with Docker-compose

---
version: '3.4'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
  broker:
    image: confluentinc/cp-server
    hostname: broker
    container_name: broker
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      KAFKA_CONFLUENT_LICENSE_TOPIC_REPLICATION_FACTOR: 1
      CONFLUENT_SUPPORT_CUSTOMER_ID: 'anonymous'

Now that docker-compose.yml is ready, follow the steps below to run ZooKeeper and Kafka on your local machine.

  • Open a command prompt and move to the directory where docker-compose.yml is kept.
  • Run the command docker-compose up -d
  • Then run docker-compose ps to check that both ZooKeeper and Kafka are up.
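Once both containers report as up, you can sanity-check the broker from inside the container with the kafka-topics CLI that ships in the Confluent images. The topic name demo-topic below is just an example:

```shell
# Create a test topic on the single-node broker
docker exec broker kafka-topics --bootstrap-server localhost:9092 \
  --create --topic demo-topic --partitions 1 --replication-factor 1

# List topics to confirm it exists
docker exec broker kafka-topics --bootstrap-server localhost:9092 --list
```

If the topic appears in the list, the broker is reachable and ready for the .Net clients below.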

Run the Application

Download and open this solution in Visual Studio and run it. The solution has two console projects: Producer and Consumer. Once you hit the run button, two console windows open. Type a message in the "Kafka Sample Producer" console and hit Enter. You will see the same message in the "Kafka Sample Consumer" console window. That's it: you are producing and consuming messages through Kafka.

Let’s have a walk through of the code changes.

  1. Install the NuGet package Confluent.Kafka in both the Producer and Consumer projects.
  2. Define ProducerConfig as below:
    var producerConfig = new ProducerConfig
    {
        BootstrapServers = "localhost:9092"
    };

    Here BootstrapServers is the host (localhost in this case) and port where Kafka is listening.
  3. Similarly, define ConsumerConfig as below:
    var consumerConfig = new ConsumerConfig
    {
        GroupId = "test-consumer-group",
        BootstrapServers = "localhost:9092"
    };

    Where GroupId is the consumer group id and BootstrapServers is the Kafka server address.
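Putting the two configs together, a minimal produce-and-consume round trip with Confluent.Kafka might look like the following sketch. It assumes the broker from the docker-compose file is reachable at localhost:9092 and uses a hypothetical topic name test-topic; it is not the downloadable sample's exact code.

```csharp
using System;
using System.Threading.Tasks;
using Confluent.Kafka;

class KafkaQuickstart
{
    const string Topic = "test-topic"; // hypothetical topic name for this sketch

    static async Task Main()
    {
        var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };
        var consumerConfig = new ConsumerConfig
        {
            GroupId = "test-consumer-group",
            BootstrapServers = "localhost:9092",
            // Read from the beginning when the group has no committed offset yet.
            AutoOffsetReset = AutoOffsetReset.Earliest
        };

        // Produce a single message; the DeliveryResult tells us where it landed.
        using (var producer = new ProducerBuilder<Null, string>(producerConfig).Build())
        {
            var result = await producer.ProduceAsync(Topic,
                new Message<Null, string> { Value = "hello kafka" });
            Console.WriteLine($"Delivered to {result.TopicPartitionOffset}");
        }

        // Consume it back from the same topic.
        using (var consumer = new ConsumerBuilder<Ignore, string>(consumerConfig).Build())
        {
            consumer.Subscribe(Topic);
            var record = consumer.Consume(TimeSpan.FromSeconds(10));
            if (record != null)
                Console.WriteLine($"Received: {record.Message.Value}");
            consumer.Close(); // leave the group cleanly so rebalancing is fast
        }
    }
}
```

ProducerBuilder/ConsumerBuilder are the standard Confluent.Kafka 1.x entry points; Null and Ignore are the library's marker types for messages that carry no key.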

Tools and Technologies

  • .Net Core (3.1) with C#
  • Confluent.Kafka 1.4.0
  • Visual Studio 16.5.2
  • Docker

These are the basics you need to start using Confluent Kafka with .Net Core. I'll be coming up with more topics that can help in developing enterprise solutions.

You can download code sample from here.

Related articles:

Producer And Consumer Idempotency with Confluent Kafka .Net Client

Keep learning! Thank you!!


Sheshnath Kumar

Cloud Solutions, Distributed Systems/Microservices, Gen AI