A Comprehensive Guide to Integrating Kafka in a Spring Boot Application

Abhishek Ranjan
3 min read · Apr 15, 2023


Introduction:

Apache Kafka is a popular distributed streaming platform that allows you to build scalable, fault-tolerant, and high-throughput applications. In this guide, we’ll walk step by step through integrating Kafka into a Spring Boot application, complete with code samples and explanations. By the end of this tutorial, you’ll have a solid understanding of how to incorporate Kafka into your Spring Boot projects.

Prerequisites

To follow along with this guide, you should have:

  1. Basic knowledge of Java and Spring Boot
  2. Java Development Kit (JDK) 8 or later installed
  3. A suitable IDE, such as IntelliJ IDEA or Eclipse
  4. Apache Kafka installed and running on your local machine

Setting Up Your Spring Boot Project

To get started, we’ll create a new Spring Boot project with the following dependencies:

  1. Spring for Apache Kafka
  2. Spring Boot Starter Web

You can create the project using Spring Initializr, or you can manually add these dependencies in your Maven pom.xml or Gradle build.gradle file.

Maven:

<dependencies>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
</dependencies>

Gradle:

dependencies {
    implementation 'org.springframework.kafka:spring-kafka'
    implementation 'org.springframework.boot:spring-boot-starter-web'
}

Configuring Kafka in Spring Boot

Next, we need to configure Kafka in our Spring Boot application. We’ll start by adding the following properties to the application.properties file:

spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=my-group-id

Here, spring.kafka.bootstrap-servers specifies the address of your Kafka broker, and spring.kafka.consumer.group-id specifies the consumer group ID for your application.
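Depending on your setup, you may want a few more of Spring Boot’s standard Kafka properties. For example, the following configure the serializers and tell a new consumer group to start reading from the beginning of the topic (the values shown are illustrative defaults for this tutorial):

```properties
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
```

If you configure everything through properties like this, Spring Boot’s auto-configuration can supply the producer and consumer factories for you; the explicit Java configuration in the next sections shows how to define them yourself.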

Creating a Kafka Producer

To send messages to Kafka, we need to create a Kafka producer. First, create a new package called com.example.kafka.producer and then add the following KafkaProducerConfig class:

package com.example.kafka.producer;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Next, create a MessageProducer class in the same package to send messages to a Kafka topic:

package com.example.kafka.producer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class MessageProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String topic, String message) {
        kafkaTemplate.send(topic, message);
    }
}
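Note that KafkaTemplate.send() is asynchronous and returns a future, so a fire-and-forget call like the one above can silently drop failures. As a sketch, you could replace sendMessage with a variant that logs the outcome (this assumes Spring Kafka 3.x, where send() returns a CompletableFuture of SendResult; older versions return a ListenableFuture with a slightly different callback API):

```java
public void sendMessage(String topic, String message) {
    kafkaTemplate.send(topic, message)
        .whenComplete((result, ex) -> {
            if (ex != null) {
                // The broker rejected the record or the send timed out
                System.err.println("Failed to send message: " + ex.getMessage());
            } else {
                // RecordMetadata tells us where the record landed
                System.out.println("Sent to partition "
                    + result.getRecordMetadata().partition()
                    + " at offset " + result.getRecordMetadata().offset());
            }
        });
}
```

For a tutorial the fire-and-forget version is fine, but in production code you’ll usually want some form of error handling on the send path.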

Creating a Kafka Consumer

Now, let's create a Kafka consumer to receive messages from the Kafka topic. Create a new package called com.example.kafka.consumer and add the following KafkaConsumerConfig class:

package com.example.kafka.consumer;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group-id");
        configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(configProps);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

Create a MessageConsumer class in the same package to listen for messages from the Kafka topic:

package com.example.kafka.consumer;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class MessageConsumer {

    @KafkaListener(topics = "my-topic", groupId = "my-group-id")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}

Testing the Kafka Integration

Finally, let's test our Kafka integration by sending and receiving messages. Create a new REST controller in the com.example.kafka.controller package:

package com.example.kafka.controller;

import com.example.kafka.producer.MessageProducer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class KafkaController {

    @Autowired
    private MessageProducer messageProducer;

    @PostMapping("/send")
    public String sendMessage(@RequestParam("message") String message) {
        messageProducer.sendMessage("my-topic", message);
        return "Message sent: " + message;
    }
}

Run your Spring Boot application and use a tool like Postman or curl to send a POST request to http://localhost:8080/send?message=Hello_Kafka. The message will be sent to the Kafka topic, and the consumer will receive and print the message to the console.
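For example, with curl (this assumes the default Kafka and Spring Boot ports; the topic is auto-created on first use when the broker has auto.create.topics.enable=true, which is the default in a stock Kafka install — otherwise create it first with the kafka-topics tool from your Kafka installation directory):

```shell
# Create the topic explicitly (optional if auto-creation is enabled)
bin/kafka-topics.sh --create --topic my-topic \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Send a message through the REST endpoint
curl -X POST "http://localhost:8080/send?message=Hello_Kafka"
```

If everything is wired up correctly, the controller responds with "Message sent: Hello_Kafka" and the consumer logs "Received message: Hello_Kafka" to the application console.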

Conclusion

In this detailed guide, we've demonstrated how to integrate Kafka into a Spring Boot application with step-by-step instructions and code samples. Now you're equipped to leverage the power of Kafka in your Spring Boot projects.
