Integrating Kafka with .NET for Pub/Sub Messaging

Siva V
4 min read · Jun 25, 2024


Apache Kafka is a distributed streaming platform widely used for building real-time data pipelines and streaming applications. Its high throughput, scalability, and fault tolerance make it an excellent choice for pub/sub messaging systems.

This blog will provide a detailed guide on integrating Kafka with a .NET application for pub/sub messaging. We’ll cover the basics of Kafka, why it is beneficial, and walk through an example with complete code snippets.

Embark on a journey of continuous learning and exploration with DotNet-FullStack-Dev. Uncover more by visiting https://dotnet-fullstack-dev.blogspot.com, or reach out for further information.

What is Apache Kafka?

Apache Kafka is an open-source stream processing platform developed by LinkedIn and donated to the Apache Software Foundation. Kafka is primarily used for two purposes:

  1. Building real-time data pipelines: Reliable data transfer between systems or applications.
  2. Building real-time streaming applications: Processing streams of data in real-time.

Kafka uses a publish-subscribe model, where producers publish data to topics, and consumers subscribe to those topics to receive data.

Key Components of Kafka

  1. Producer: An application that sends messages to a Kafka topic.
  2. Consumer: An application that reads messages from a Kafka topic.
  3. Broker: A Kafka server that stores messages and serves client requests.
  4. Topic: A category or feed name to which records are published.
  5. Partition: A topic is split into partitions, which let data be distributed across brokers and consumed in parallel (see the sketch after this list).
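
To make these components concrete in .NET terms, here is a minimal sketch (using the Confluent.Kafka client introduced later in this post) of how topics and partitions appear in code; the broker address, topic name, and key are assumptions chosen only for illustration:

using System;
using Confluent.Kafka;

// Assumed broker address; producers and consumers connect to brokers via BootstrapServers.
var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

using var producer = new ProducerBuilder<string, string>(config).Build();

// By default the producer picks a partition (hashing the key if one is set);
// a specific partition of a topic can also be targeted explicitly.
var target = new TopicPartition("my-topic", new Partition(0));

var result = await producer.ProduceAsync(
    target,
    new Message<string, string> { Key = "order-42", Value = "created" });

// The delivery result reports which partition and offset the broker stored the record at.
Console.WriteLine($"Stored at {result.TopicPartitionOffset}");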

Why Use Kafka?

  • Scalability: Kafka can scale horizontally by adding more brokers.
  • Durability: Kafka replicates data across multiple brokers, ensuring data durability.
  • High Throughput: Kafka can handle a high number of reads and writes per second.
  • Fault Tolerance: Kafka’s distributed nature provides resilience against server failures.

Integrating Kafka with .NET

Prerequisites

  1. Kafka Setup: Ensure you have Kafka installed and running. You can download Kafka from the official website.
  2. .NET SDK: Ensure you have the .NET SDK installed. You can download it from the .NET website.
  3. Confluent.Kafka NuGet Package: We’ll use the Confluent.Kafka library to interact with Kafka from .NET. Install it using the following command:
dotnet add package Confluent.Kafka

Step 1: Setting Up Kafka Producer

Create a .NET console application and add the Confluent.Kafka package.

dotnet new console -n KafkaProducer
cd KafkaProducer
dotnet add package Confluent.Kafka

Create a file named Producer.cs and add the following code:

using System;
using System.Threading.Tasks;
using Confluent.Kafka;

class Producer
{
    public static async Task Main(string[] args)
    {
        // Point the producer at the local Kafka broker.
        var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

        using (var producer = new ProducerBuilder<Null, string>(config).Build())
        {
            try
            {
                // Send a single message and await the broker's delivery report.
                var dr = await producer.ProduceAsync(
                    "my-topic",
                    new Message<Null, string> { Value = "Hello, Kafka!" });

                Console.WriteLine($"Delivered '{dr.Value}' to '{dr.TopicPartitionOffset}'");
            }
            catch (ProduceException<Null, string> e)
            {
                Console.WriteLine($"Delivery failed: {e.Error.Reason}");
            }
        }
    }
}

This code sets up a Kafka producer that sends a message to the topic “my-topic”.
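
The example above sends a single message and awaits its delivery report. If you need to send many messages, a common pattern is the fire-and-forget Produce method with a delivery handler, followed by a final Flush. A hedged sketch (the keys and values below are made up purely for illustration):

using System;
using Confluent.Kafka;

var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

using var producer = new ProducerBuilder<string, string>(config).Build();

for (var i = 0; i < 100; i++)
{
    // Produce() queues the message and returns immediately;
    // the delivery handler runs once the broker acknowledges (or rejects) it.
    producer.Produce(
        "my-topic",
        new Message<string, string> { Key = $"key-{i}", Value = $"message {i}" },
        report =>
        {
            if (report.Error.IsError)
                Console.WriteLine($"Delivery failed: {report.Error.Reason}");
        });
}

// Wait (up to 10 seconds) for all queued messages to be delivered.
producer.Flush(TimeSpan.FromSeconds(10));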

Step 2: Setting Up Kafka Consumer

Create another .NET console application for the consumer and add the Confluent.Kafka package.

dotnet new console -n KafkaConsumer
cd KafkaConsumer
dotnet add package Confluent.Kafka

Create a file named Consumer.cs and add the following code:

using System;
using System.Threading;
using Confluent.Kafka;

class Consumer
{
    public static void Main(string[] args)
    {
        var config = new ConsumerConfig
        {
            GroupId = "test-consumer-group",
            BootstrapServers = "localhost:9092",
            // Start from the earliest message when the group has no committed offset.
            AutoOffsetReset = AutoOffsetReset.Earliest
        };

        // Let Ctrl+C break out of the consume loop cleanly instead of killing the process.
        using var cts = new CancellationTokenSource();
        Console.CancelKeyPress += (_, e) =>
        {
            e.Cancel = true;
            cts.Cancel();
        };

        using (var consumer = new ConsumerBuilder<Ignore, string>(config).Build())
        {
            consumer.Subscribe("my-topic");

            try
            {
                while (true)
                {
                    // Blocks until a message arrives or cancellation is requested.
                    var cr = consumer.Consume(cts.Token);
                    Console.WriteLine($"Consumed message '{cr.Message.Value}' at: '{cr.TopicPartitionOffset}'.");
                }
            }
            catch (OperationCanceledException)
            {
                // Thrown when Ctrl+C cancels the token; leave the group cleanly.
                consumer.Close();
            }
        }
    }
}

This code sets up a Kafka consumer that listens to the topic “my-topic” and prints any consumed messages.
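
By default, the client commits offsets automatically in the background. If you would rather commit only after your application has actually processed a message, a minimal sketch of manual commits looks like the following (an optional variation, not part of the walkthrough above; it assumes the same topic, group, and broker):

using System;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    GroupId = "test-consumer-group",
    BootstrapServers = "localhost:9092",
    AutoOffsetReset = AutoOffsetReset.Earliest,
    // Take control of when offsets are committed.
    EnableAutoCommit = false
};

using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("my-topic");

while (true)
{
    var cr = consumer.Consume();

    // Process the message first ...
    Console.WriteLine($"Processing '{cr.Message.Value}'");

    // ... then commit its offset, so a crash mid-processing re-delivers the message.
    consumer.Commit(cr);
}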

Step 3: Running the Producer and Consumer

Start Kafka: Ensure Kafka is running. You can start Kafka using the following commands:

# Start ZooKeeper
bin/zookeeper-server-start.sh config/zookeeper.properties

# Start Kafka broker
bin/kafka-server-start.sh config/server.properties

Create Topic: Create the topic “my-topic” if it doesn’t already exist.

bin/kafka-topics.sh --create --topic my-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
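
As an aside, the topic can also be created from .NET instead of the shell, via the library's admin client. A minimal sketch, assuming the same single-broker setup (hence a replication factor of 1):

using System;
using Confluent.Kafka;
using Confluent.Kafka.Admin;

var adminConfig = new AdminClientConfig { BootstrapServers = "localhost:9092" };

using var admin = new AdminClientBuilder(adminConfig).Build();

try
{
    // Mirrors the kafka-topics.sh command above: 1 partition, replication factor 1.
    await admin.CreateTopicsAsync(new[]
    {
        new TopicSpecification { Name = "my-topic", NumPartitions = 1, ReplicationFactor = 1 }
    });
}
catch (CreateTopicsException e)
{
    // Typically reported when the topic already exists.
    Console.WriteLine($"Topic creation failed: {e.Results[0].Error.Reason}");
}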

Run Producer: Navigate to the KafkaProducer directory and run the producer.

dotnet run

Run Consumer: In a separate terminal, navigate to the KafkaConsumer directory and run the consumer. It will print the message published by the producer.

dotnet run

Explanation of Code Snippets

Producer:

  • ProducerConfig: Configuration for the Kafka producer, including the BootstrapServers (a few other commonly used settings are sketched after this list).
  • ProducerBuilder: Builds the Kafka producer instance.
  • ProduceAsync: Asynchronously sends a message to the specified topic.
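
Beyond BootstrapServers, ProducerConfig exposes many optional settings. As a hedged example (these particular values are illustrative choices, not requirements of the walkthrough above), the reliability-related ones look like this:

using Confluent.Kafka;

var config = new ProducerConfig
{
    BootstrapServers = "localhost:9092",

    // Wait for all in-sync replicas to acknowledge each write.
    Acks = Acks.All,

    // Avoid duplicate writes when the client retries after a transient error.
    EnableIdempotence = true,

    // Fail a message if it cannot be delivered within this window (milliseconds).
    MessageTimeoutMs = 30000
};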

Consumer:

  • ConsumerConfig: Configuration for the Kafka consumer, including GroupId and BootstrapServers.
  • ConsumerBuilder: Builds the Kafka consumer instance.
  • Subscribe: Subscribes to the specified topic.
  • Consume: Retrieves the next message from the topic; calling it in a loop, as above, consumes continuously (a timeout-based variant is sketched after this list).
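
For reference, Consume also has an overload that takes a timeout and returns null when nothing arrives within that window, which is useful when the loop needs to do periodic housekeeping. A small sketch of that variant (same assumed topic, group, and broker as above):

using System;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    GroupId = "test-consumer-group",
    BootstrapServers = "localhost:9092",
    AutoOffsetReset = AutoOffsetReset.Earliest
};

using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("my-topic");

while (true)
{
    // Returns null if no message arrives within one second.
    var cr = consumer.Consume(TimeSpan.FromSeconds(1));

    if (cr == null)
    {
        // Idle tick: a good place for periodic housekeeping or a shutdown check.
        continue;
    }

    Console.WriteLine($"Consumed '{cr.Message.Value}' at {cr.TopicPartitionOffset}");
}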

Conclusion

Integrating Kafka with .NET for pub/sub messaging is straightforward with the Confluent.Kafka library. Kafka's high throughput, scalability, and fault tolerance make it an ideal choice for real-time data pipelines and streaming applications. By following the steps and code snippets provided in this blog, you can set up a Kafka producer and consumer in a .NET application, enabling efficient and reliable message processing.

You may also like: https://medium.com/@siva.veeravarapu/integrating-kibana-logging-in-a-net-application-bc63a4cde08d


Siva V

Techie with over 8 years of hands-on experience in MS technologies. On a journey of continuous learning and exploration with https://dotnet-fullstack-dev.blogspot.com/