Event-Driven Application using Golang, Kafka, and GoFr
In the modern world, the amount of data being produced is increasing exponentially. This data can range from a simple click on a website to complex weather readings from sensors high up in the sky, and our applications need to process this continuous stream of events. So can we build applications that treat these events as the foundation of the system, processing and reacting to them as they arrive? What would be the benefits of such an architecture?
In this article, we will cover the basics and advantages of event-driven architecture and how to build an event-driven application using Go and Kafka. We will be using GoFr, a Go framework that makes it easy to develop these types of applications.
What is Event-Driven Architecture?
Event-driven architecture is a software design paradigm that enables an organization to detect and respond to events in real time, where an event is a change in state or an update. It is a system of loosely coupled microservices that communicate and share information through the production and consumption of events.
For example, consider the design below for a customer referral service.
Here we have decoupled the customer service, which handles all customer-related operations, from the referral service, which handles all referral-program operations; the two communicate through Kafka event streams. This is the true decoupling of producers and consumers:
- Producers do not need to concern themselves with how the events they produce are consumed (so additional consumers can be added without affecting the producers).
- Consumers do not need to concern themselves with how the events they consume are produced.
Decoupling the components of an application enhances its scalability, since each component can be scaled independently across the network. Developers can modify the system by dynamically adding or removing event producers and consumers without changing the logic in any microservice.
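To make this decoupling concrete, here is a toy in-memory sketch (plain Go, not Kafka) of the idea: the producer only publishes to a topic name, and any number of handlers can be attached to that topic without the producer's code ever changing. The `broker` type and service names here are illustrative, not part of GoFr or Kafka.

```go
package main

import "fmt"

// broker is a toy in-memory event bus: topic name -> list of handlers.
type broker struct {
	handlers map[string][]func(event string)
}

func newBroker() *broker {
	return &broker{handlers: map[string][]func(event string){}}
}

// Subscribe registers another handler; the producer is never touched.
func (b *broker) Subscribe(topic string, h func(event string)) {
	b.handlers[topic] = append(b.handlers[topic], h)
}

// Publish fans the event out to whichever handlers happen to be subscribed.
func (b *broker) Publish(topic, event string) {
	for _, h := range b.handlers[topic] {
		h(event)
	}
}

func main() {
	b := newBroker()

	// two independent consumers of the same topic
	b.Subscribe("new-customer", func(e string) { fmt.Println("referral service got:", e) })
	b.Subscribe("new-customer", func(e string) { fmt.Println("email service got:", e) })

	// the producer's code is identical whether there are 0, 1, or N consumers
	b.Publish("new-customer", "Vipul/CMY44S")
}
```

Adding a third subscriber is one more `Subscribe` call; the `Publish` line stays untouched, which is exactly the property Kafka gives us across process boundaries.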
Creating Event-Driven Microservices using Kafka and GoFr
Now let's create a simple example system with two microservices and a container for Apache Kafka.
Make sure you have Docker and Docker Compose installed, then use the following file to spin up Apache Kafka and ZooKeeper with the docker-compose up command.
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - 22181:2181

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - 29092:29092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
Now that Kafka is up and running on localhost:29092, let's build our publisher and subscriber microservices that communicate through it. Imagine a scenario where the customer service publishes an event with the customer details and referral code whenever a new customer signs up on the website. These events are consumed by a referral service, which credits a monetary amount to both the referrer and the referred customer.
We will be using GoFr, a Go microservice framework, to build these two services. GoFr makes it easy to connect to a Kafka instance by just providing a few configuration values such as the broker host and port, topic name, etc. This saves us the time and effort of setting up the Kafka pub/sub by hand. Below are the configs needed to connect your GoFr application to Kafka.
PUBSUB_BACKEND=KAFKA
PUBSUB_BROKER=localhost:29092
CONSUMER_ID=test
These configs go in a .env file inside the /configs folder; GoFr reads this file by default and injects the config values into the application.
For the customer-service publisher, we publish the event whenever a new customer is created. For this, we set up a POST /signup endpoint that takes the customer details and creates a new customer.
package main

import (
	"encoding/json"
	"fmt"

	"gofr.dev/pkg/gofr"
)

type body struct {
	Name  string `json:"name"`
	Email string `json:"email"`
	// more relevant customer details like password, images, etc.
	ReferralCode string `json:"referralCode"`
}

func main() {
	app := gofr.New()

	app.POST("/signup", customerSignup)

	app.Run()
}

func customerSignup(ctx *gofr.Context) (interface{}, error) {
	var req body

	// bind the request body into the body struct
	err := ctx.Bind(&req)
	if err != nil {
		return nil, fmt.Errorf("invalid body")
	}

	// process the customer details and save them in a database

	// once the new customer is created, publish an event containing
	// the customer's name and referral code to the new-customer topic
	b, err := json.Marshal(event{req.Name, req.ReferralCode})
	if err != nil {
		return nil, err
	}

	err = ctx.GetPublisher().Publish(ctx, "new-customer", b)
	if err != nil {
		return nil, err
	}

	return "Success", nil
}

type event struct {
	Name         string `json:"name"`
	ReferralCode string `json:"referralCode"`
}
Run the publisher using go run publisher.go, then use curl to call the /signup endpoint:
curl --location 'localhost:8000/signup' \
--header 'Content-Type: application/json' \
--data-raw '{
"name" : "Vipul",
"email" : "vipul@mail.com",
"referralCode" : "CMY44S"
}'
Now, for the referral service, we use the following code to continuously subscribe to the new-customer topic.
package main

import (
	"gofr.dev/pkg/gofr"
)

func main() {
	app := gofr.New()

	// app.Subscribe's first argument is the topic to subscribe to;
	// the handler function is called continuously for each new event
	app.Subscribe("new-customer", func(ctx *gofr.Context) error {
		var ev event

		err := ctx.Bind(&ev)
		if err != nil {
			return err
		}

		ctx.Logger.Infof("Event received %v", ev)

		return nil
	})

	app.Run()
}

type event struct {
	Name         string `json:"name"`
	ReferralCode string `json:"referralCode"`
}
Now, whenever a new customer signs up, the customer service publishes an event containing the referral information, and the referral service reads those events in the same order. This is also helpful if we introduce a new service, say an order-service, that wants to use the new-customer events: in event-driven architecture the producer is decoupled and solely responsible for producing events, so many consumers can consume those events in the same order without interfering with one another.
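Kafka makes this possible because each consumer tracks its own offset into the same append-only log; reading never removes events. A toy stdlib sketch of that idea (illustrative only, not real Kafka client code):

```go
package main

import "fmt"

// eventLog is an append-only list of events, like a Kafka topic partition.
var eventLog = []string{"customer-1", "customer-2", "customer-3"}

// consumer keeps its own offset; reading never mutates the log,
// so consumers cannot interfere with one another.
type consumer struct {
	name   string
	offset int
}

func (c *consumer) poll() (string, bool) {
	if c.offset >= len(eventLog) {
		return "", false
	}
	e := eventLog[c.offset]
	c.offset++
	return e, true
}

func main() {
	referral := &consumer{name: "referral-service"}
	order := &consumer{name: "order-service"} // added later, reads from the start

	// the referral service has already read ahead...
	for i := 0; i < 2; i++ {
		e, _ := referral.poll()
		fmt.Println(referral.name, "read", e)
	}

	// ...yet the newly added order service still sees every event, in order
	for {
		e, ok := order.poll()
		if !ok {
			break
		}
		fmt.Println(order.name, "read", e)
	}
}
```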
Event-driven architecture is a powerful and useful design pattern that can make our systems more scalable and resilient, capable of handling huge amounts of data!
References:
Setting up Kafka — https://www.baeldung.com/ops/kafka-docker-setup
GoFr documentation — https://gofr.dev
GoFr code — https://github.com/gofr-dev/gofr