Setting up Apache Kafka for local development with Docker

Paris Nakita Kejser
Published in DataCraft Backbone
2 min read · Feb 6, 2024

Story updated 26 September 2024 and moved to:

The old version of the article remains below:

Many companies with huge amounts of data need a way to streamline all their ingress data and deliver it to different destinations, known in the Kafka ecosystem as sinks.

In production, I would not recommend self-hosted Kafka; instead, pay your cloud provider to host it for you. On AWS this service is called Amazon Managed Streaming for Apache Kafka (MSK). However, a managed cluster can be hard to test against locally, so it is very helpful to quickly spin up Apache Kafka on your own machine with Docker and develop what you need before deploying it to your staging or production environment.

First, we will create a single-node Kafka setup running in Docker using Docker Compose.
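The original Compose file is not shown in this extract, so here is a minimal sketch of a single-node setup. It assumes the Confluent community images and the port mappings used later in this article (Kafka exposed on localhost:29092); the image tag and ZooKeeper host port 22181 are my assumptions, not values from the original:

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - "22181:2181"

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on:
      - zookeeper
    ports:
      - "29092:29092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: one for containers on the Compose network,
      # one advertised as localhost:29092 for tools on the host machine.
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      # Single-node setup, so internal topics can only have one replica.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

Save it as docker-compose.yml and start everything with `docker compose up -d`.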

When it's done, you can check the status of your Kafka and ZooKeeper setup with these two commands.
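The exact commands are missing from this extract; a common way to check is with `nc`, assuming Kafka is mapped to localhost:29092 (the port used for Offset Explorer later in the article) and ZooKeeper to localhost:22181 (my assumption):

```shell
# check that ZooKeeper is accepting connections on its mapped host port
nc -vz localhost 22181

# check that the Kafka broker is accepting connections on its mapped host port
nc -vz localhost 29092
```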

Both tests should return succeeded if everything is running as we expect.

We can now download Offset Explorer 2 (formerly Kafka Tool), a UI for Kafka, so you can look into your cluster visually and don't need to keep the command line in your head :)

When you have downloaded the tool, you need to set up a connection: in the configuration, go to the Advanced tab and use localhost:29092 as the bootstrap server, and then you are good to go! :D

You can learn how to write your first producer and consumer in one of my other articles.



DevOps engineer, software architect, software developer, and data scientist. I identify as a non-binary person.