Exploring Kafka: A Practical Guide with Node.js Implementation

Harsh Gupta
Engineering at Bajaj Health
3 min read · Jan 18, 2024

In our earlier discussion, we delved into the fundamentals of Kafka, exploring its internals and key concepts such as producers, consumers, consumer groups, and partitions. We also examined scenarios highlighting when to leverage Kafka and when to reconsider its use. Continuing that journey, this article provides a hands-on guide to setting up a Kafka cluster on your local system. Additionally, we'll implement Kafka producers and consumers using Node.js.

Installation for Local Development

The quickest and easiest way to get Kafka up and running locally is to install the Conduktor desktop application. It not only sets up a Kafka cluster but also provides a user-friendly graphical interface for interacting with Kafka and observing real-time message streaming. After installation, click the “Start a Local Kafka Cluster” button, give your cluster a name, and start it. This launches both a ZooKeeper server and a Kafka broker.

The GUI is self-explanatory. You can create topics, start producers or consumers side by side, and observe real-time message streaming. I highly recommend experimenting with the GUI to gain a practical understanding.

Node.js Implementation

To kickstart our Node.js application’s interaction with Kafka, we need a client object from the KafkaJS library (the kafkajs npm package).

client.js

const { Kafka } = require("kafkajs");

exports.kafka = new Kafka({
  clientId: "my-app",
  brokers: ["localhost:9092"],
});
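Hardcoding localhost:9092 is fine for local development. If you later point the same scripts at a different cluster, one option is to read the broker list from an environment variable instead. A minimal sketch of that parsing (KAFKA_BROKERS is an assumed variable name, not something KafkaJS defines):

```javascript
// Parse a comma-separated broker list such as "host1:9092, host2:9092",
// falling back to the local broker when the variable is unset.
function parseBrokers(env) {
  return (env || "localhost:9092").split(",").map((b) => b.trim());
}

module.exports = { parseBrokers };
```

In client.js you could then pass `brokers: parseBrokers(process.env.KAFKA_BROKERS)` so the code works unchanged in both environments.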

Next, we ask the Kafka client for an admin client, which we use to connect to the cluster and create topics.

admin.js

const { kafka } = require("./client");

async function init() {
  const admin = kafka.admin();
  console.log("Admin connecting...");
  await admin.connect();
  console.log("Admin Connection Success...");

  console.log("Creating Topic >> purchases");
  await admin.createTopics({
    topics: [
      {
        topic: "purchases",
        numPartitions: 2,
      },
    ],
  });
  console.log("Topic Created Success >> purchases");

  console.log("Disconnecting Admin..");
  await admin.disconnect();
}

init();

Running node admin.js will create a topic named "purchases" in our cluster with 2 partitions, which can be verified in the GUI.

producer.js: This script produces messages to our Kafka cluster. To send the messages below, run node producer.js.

const { kafka } = require("./client");

async function init() {
  const producer = kafka.producer();

  console.log("Connecting Producer");
  await producer.connect();
  console.log("Producer Connected Successfully");

  // Producing messages
  await producer.send({
    topic: "purchases",
    messages: [
      {
        partition: 0,
        key: "grocery",
        value: "mango",
      },
    ],
  });

  await producer.send({
    topic: "purchases",
    messages: [
      {
        partition: 1,
        key: "stationary",
        value: "pen",
      },
    ],
  });

  await producer.send({
    topic: "purchases",
    messages: [
      {
        partition: 1,
        key: "stationary",
        value: "colors",
      },
    ],
  });

  await producer.disconnect();
}

init();
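The producer above pins each message to an explicit partition. In everyday use you typically omit partition and let the partitioner derive one from the key, so messages with the same key always land in the same partition and stay ordered relative to each other. A toy sketch of that idea follows; note that KafkaJS's real default partitioner uses a murmur2 hash, not this simplified one:

```javascript
// Toy illustration of key-based partition routing.
// NOT KafkaJS's actual partitioner (which uses murmur2); it only
// demonstrates the principle: hash(key) modulo the partition count.
function toyPartition(key, numPartitions) {
  let hash = 0;
  for (const ch of key) {
    // Simple rolling hash; >>> 0 keeps the value an unsigned 32-bit int.
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % numPartitions;
}
```

Because the hash is deterministic, `toyPartition("grocery", 2)` always returns the same partition, which is exactly the ordering guarantee keyed messages rely on.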

consumer.js: To consume produced messages, run node consumer.js. Specify a consumer group ID as a command-line argument.

const { kafka } = require("./client");

// Read the consumer group ID from the command line,
// e.g. `node consumer.js group-1`, falling back to a default.
const groupId = process.argv[2] || "some-group-id";

async function init() {
  const consumer = kafka.consumer({ groupId });
  await consumer.connect();

  await consumer.subscribe({ topics: ["purchases"], fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`[${topic}]: PART:${partition}:`, message.value.toString());
    },
  });
}

init();
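Note that message.value arrives as a Buffer, which is why the consumer calls .toString(). When you need structured payloads rather than plain strings, a common pattern is to JSON-encode on the producer side and decode inside eachMessage. A small sketch of that round trip (pure Node.js, no broker required; the helper names are illustrative):

```javascript
// Build a message object the way a producer would send it...
function encodeMessage(key, payload) {
  return { key, value: JSON.stringify(payload) };
}

// ...and decode it the way eachMessage receives it: value is a Buffer.
function decodeValue(valueBuffer) {
  return JSON.parse(valueBuffer.toString());
}

const msg = encodeMessage("grocery", { item: "mango", qty: 3 });
// Kafka delivers the value as bytes; simulate that with Buffer.from:
const received = decodeValue(Buffer.from(msg.value));
// received is { item: "mango", qty: 3 } again
```

The same decode call drops straight into the eachMessage handler above in place of message.value.toString().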

This hands-on guide walked through setting up Kafka locally and implementing producers and consumers with Node.js. Experiment with the provided scripts to further deepen your understanding of Kafka and its functionality.

