Run Apache Kafka in a Docker Container

Chakresh Tiwari
ShoutLoudz
May 11, 2024

Introduction

In this blog, we will discuss how to run Kafka in Docker. The first step is to install Docker on your system: visit the Docker website, click the Get Started link, and download Docker Desktop for your operating system.

After installing Docker Desktop, we will create a Docker Compose file with the initial configuration to start single and multiple Kafka servers in Docker.

Docker Compose File

In the docker-compose file, we add all the environment- and volume-related properties needed to start a single Kafka server.

version: "3.8"

services:
  kafka-1:
    image: bitnami/kafka:latest
    ports:
      - "9092:9092"
    environment:
      - KAFKA_CFG_NODE_ID=1
      - KAFKA_KRAFT_CLUSTER_ID=WnLkTHhkQaiJbwP8FClPhw
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka-1:9091
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9090,CONTROLLER://:9091,EXTERNAL://:9092
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka-1:9090,EXTERNAL://${HOSTNAME:-localhost}:9092
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
    volumes:
      - /Users/chaktiwa/kafka/docker-compose/volumes/server-1:/bitnami/kafka

Below are the details of the above-mentioned properties:

KAFKA_CFG_NODE_ID = Unique ID of this node; each server in the cluster must have a different value.
KAFKA_KRAFT_CLUSTER_ID = Shared cluster ID; all servers with the same value belong to one cluster.
KAFKA_CFG_PROCESS_ROLES = This node acts as both controller (which maintains cluster metadata) and broker (which stores topic data).
KAFKA_CFG_CONTROLLER_QUORUM_VOTERS = The set of controller nodes that vote to elect a new leader if the current one goes down.
KAFKA_CFG_LISTENERS = The addresses and ports the server listens on for communication in and out of the Kafka server.

If you want to pass a few properties from an env file, create one and define the values of the variables referenced in the compose file.
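For example, a minimal env file might look like this (the file name environment.env is illustrative; HOSTNAME matches the ${HOSTNAME:-localhost} reference in the advertised listeners above):

```shell
# environment.env — values substituted into docker-compose.yml by Compose.
# If HOSTNAME is not set here (or in the shell), the ${HOSTNAME:-localhost}
# expression in the compose file falls back to "localhost".
HOSTNAME=host.docker.internal
```

This is a config fragment, not a script; Compose reads it when you pass --env-file on the command line.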

Commands

After starting Docker and creating the compose (and optional env) files, you can start Kafka by running this command from the file's location:

docker-compose up

If the file has a different name, pass it as an argument:

docker-compose -f docker-compose1.yml up

If you need to pass environment variables, pass the env file as an argument as well:

docker-compose -f docker-compose.yml --env-file environment.env up 

After the command executes, you will see a new container in Docker Desktop.

Using Docker Compose, you can start multiple services under one Compose project.

Access Kafka CLI using Docker

To access the running Kafka broker via the CLI, open the Exec tab of the Docker container and run the commands there.

For example, you can run any CLI command like this:

./kafka-topics.sh --create --topic docker-kafka-topic --bootstrap-server host.docker.internal:9092

./kafka-topics.sh --list --bootstrap-server host.docker.internal:9092
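To verify the topic end to end, you can also run a console producer and consumer from the same Exec tab. A sketch, assuming the topic created above and a running broker (these scripts sit alongside kafka-topics.sh in the Bitnami image):

```shell
# Produce test messages: type a line per message, then Ctrl+C to exit.
./kafka-console-producer.sh --topic docker-kafka-topic \
  --bootstrap-server host.docker.internal:9092

# Consume everything from the beginning of the topic (Ctrl+C to stop).
./kafka-console-consumer.sh --topic docker-kafka-topic --from-beginning \
  --bootstrap-server host.docker.internal:9092
```

Any line you typed into the producer should appear in the consumer's output.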

Access Kafka CLI from host Machine

  1. Open a terminal and go to the docker-compose file location.
  2. Run the following command; it uses the compose file to execute the Kafka CLI inside the kafka-1 container:

docker-compose exec kafka-1 /opt/bitnami/kafka/bin/kafka-topics.sh --list --bootstrap-server host.docker.internal:9092

  3. If Kafka's CLI scripts are installed locally, you can also run commands directly from the host machine:

./kafka-topics.sh --list --bootstrap-server localhost:9092
docker-kafka-topic

If this command fails with an UnknownHostException, edit the /etc/hosts file (sudo vi /etc/hosts), add the line 127.0.0.1 host.docker.internal, and save the file.

Run Multiple Kafka Server

Copy the kafka-1 service block, paste it below in the docker-compose file, and update the properties (node ID, ports, advertised listeners, and volume path). Then run the start command again, and multiple Kafka servers will start under one Compose project.

version: "3.8"

services:
  kafka-1:
    image: bitnami/kafka:latest
    ports:
      - "9092:9092"
    environment:
      - KAFKA_CFG_NODE_ID=1
      - KAFKA_KRAFT_CLUSTER_ID=WnLkTHhkQaiJbwP8FClPhw
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka-1:9091,2@kafka-2:9091,3@kafka-3:9091
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9090,CONTROLLER://:9091,EXTERNAL://:9092
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka-1:9090,EXTERNAL://${HOSTNAME:-localhost}:9092
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
    volumes:
      - /Users/chaktiwa/kafka/docker-compose/volumes/server-:/bitnami/kafka

  kafka-2:
    image: bitnami/kafka:latest
    ports:
      - "9094:9094"
    environment:
      - KAFKA_CFG_NODE_ID=2
      - KAFKA_KRAFT_CLUSTER_ID=WnLkTHhkQaiJbwP8FClPhw
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka-1:9091,2@kafka-2:9091,3@kafka-3:9091
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9090,CONTROLLER://:9091,EXTERNAL://:9094
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka-2:9090,EXTERNAL://${HOSTNAME:-localhost}:9094
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
    volumes:
      - /Users/chaktiwa/kafka/docker-compose/volumes/server-2:/bitnami/kafka

  kafka-3:
    image: bitnami/kafka:latest
    ports:
      - "9096:9096"
    environment:
      - KAFKA_CFG_NODE_ID=3
      - KAFKA_KRAFT_CLUSTER_ID=WnLkTHhkQaiJbwP8FClPhw
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka-1:9091,2@kafka-2:9091,3@kafka-3:9091
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9090,CONTROLLER://:9091,EXTERNAL://:9096
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka-3:9090,EXTERNAL://${HOSTNAME:-localhost}:9096
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT,PLAINTEXT:PLAINTEXT
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=PLAINTEXT
    volumes:
      - /Users/chaktiwa/kafka/docker-compose/volumes/server-3:/bitnami/kafka

Start the cluster with:

docker-compose -f docker-compose.yml --env-file environment.env up

Now create a topic using the CLI against the kafka-1 server, then list the topics from kafka-2 and kafka-3 to confirm they see the same cluster state.
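For example, a sketch assuming the three-broker compose file above is running (the topic name replicated-topic is illustrative; each command uses the broker's internal PLAINTEXT listener on port 9090, which is reachable inside the Compose network):

```shell
# Create a topic replicated across all three brokers, via kafka-1:
docker-compose exec kafka-1 /opt/bitnami/kafka/bin/kafka-topics.sh \
  --create --topic replicated-topic \
  --partitions 3 --replication-factor 3 \
  --bootstrap-server kafka-1:9090

# Verify the same topic is visible from kafka-2:
docker-compose exec kafka-2 /opt/bitnami/kafka/bin/kafka-topics.sh \
  --describe --topic replicated-topic \
  --bootstrap-server kafka-2:9090
```

With a replication factor of 3, the describe output should show a leader and two replicas for each partition, so the topic survives the loss of any single broker.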

In this tutorial, we covered how to start single and multiple Kafka servers using Docker. For fault tolerance, it is always recommended to run multiple Kafka servers so that no messages are lost if one broker goes down.

Thanks for reading!!
