A better development experience with Kafka

Bluxmit · Published in Geek Culture · 3 min read · Jan 12, 2022

Apache Kafka has become the de-facto standard open-source distributed streaming platform. It sits at the centre of service meshes, data platforms, streaming analytics and hundreds of other use-cases.

Knowledge of Kafka is a must for nearly every software engineer. At the same time, it is not the easiest piece of technology. Even setting it up locally for the first time can feel frustrating. In most cases you need a docker-compose file that looks something like this:

version: "3"
services:
  zookeeper:
    image: 'wurstmeister/zookeeper:latest'
    ports:
      - '2181:2181'
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    image: 'wurstmeister/kafka:latest'
    ports:
      - '9092:9092'
    environment:
      - KAFKA_BROKER_ID=1
      - KAFKA_LISTENERS=PLAINTEXT://:9092
      - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://127.0.0.1:9092
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper

which immediately scares newcomers with environment variables for configuration and a separate zookeeper service. And then, to play around with Kafka, you'd typically use docker exec to create topics and to send and receive messages:

# Create topic
docker exec -it kafka_kafka_1 kafka-topics.sh --create --bootstrap-server kafka:9092 --topic my-topic
# Create events
docker exec -it kafka_kafka_1 kafka-console-producer.sh --bootstrap-server kafka:9092 --topic my-topic
# Read events
docker exec -it kafka_kafka_1 kafka-console-consumer.sh --bootstrap-server kafka:9092 --topic my-topic --from-beginning

This is not exactly a friendly way for new developers to get started with Kafka!

To make it easier to get started with Kafka, I've created the kafka workspace.

It is just one docker image, with no need for docker-compose. It includes a browser-based terminal, a browser-based version of VS Code and a number of CLI tools that make it easier and more convenient to work with Kafka locally.

Simply execute

docker run --name rwid-1 -d -p 8020-8035:8020-8035 alnoda/kafka-workspace

and you have a single-broker Kafka cluster up and running locally.
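
As a minimal sanity check, you can list the broker metadata from inside the container; this sketch assumes kafkacat is on the PATH in the image (the workspace bundles kcat, as shown later):

# List brokers, topics and partitions to confirm the single broker is up
docker exec rwid-1 kafkacat -b localhost -L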

Open http://localhost:8020/ in your browser for quick access to all the tools.

Open the browser-based VS Code editor from the workspace UI, or go directly to http://localhost:8025/, and connect to the local Kafka cluster using the VS Code Kafka extension. You only need to provide a name for the cluster, which can be anything.

Now you can create topics and consume events from them directly in VS Code using the Kafka extension.

Open the workspace terminal at http://localhost:8026/ and use any of the bundled CLI tools to manage topics, explore the cluster, and produce and consume messages.
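
A quick way to see what the cluster looks like is kcat's metadata listing; the kafka-topics.sh command below is the standard Kafka CLI script, and its presence on the workspace PATH is an assumption:

# Show broker, topic and partition metadata
kafkacat -b localhost -L
# List existing topics (assumes the standard Kafka CLI scripts are on the PATH)
kafka-topics.sh --bootstrap-server localhost:9092 --list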

For example, produce events to the topic “quickstart-events” with kcat

kafkacat -b localhost -t quickstart-events -P

and consume events from this topic back

kafkacat -b localhost -t quickstart-events -C

The workspace has many CLI tools that make it easy to explore more advanced Kafka features such as offsets, partitions and consumer groups. In addition, the workspace has Python installed. All of this makes it very easy to get started with Kafka!
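
As a sketch of what that exploration can look like (the consumer group name my-group is hypothetical, and the kafka-consumer-groups.sh script is assumed to be on the workspace PATH):

# Consume the last 5 events from partition 0 only, then exit
kafkacat -b localhost -C -t quickstart-events -p 0 -o -5 -e
# List consumer groups and inspect their per-partition offsets and lag
kafka-consumer-groups.sh --bootstrap-server localhost:9092 --list
kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group my-group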

Disclaimer: I am the creator of the kafka-workspace image (and of the other workspaces in that repo). I use them for my own development and am happy to share them with the community.
