Hands-on with Spring Cloud Stream

David ten Hove
Just Eat Takeaway-tech
3 min read · Aug 3, 2020

Introduction

Spring Cloud Stream makes it very easy for developers to create Java applications that communicate through message brokers such as Kafka and RabbitMQ. I could explain to you how it works, but in my experience software developers prefer being shown how to get something done. So let’s go build ourselves a microservice.

Setting up

Before we can start, we need a message broker. For this example, we’re using Apache Kafka. A simple way is to clone the Wurstmeister Kafka repository and run it with docker-compose, like this:

git clone https://github.com/wurstmeister/kafka-docker
cd kafka-docker
docker-compose -f docker-compose-single-broker.yml up

Project base

With Kafka running, head to the Spring Initializr. Keep the defaults but add two dependencies:

  • Cloud Stream
  • Spring for Apache Kafka (Not Apache Kafka Streams)

Generate the zip, unpack it, and open the project with your favorite IDE. You should now have a basic Maven Java project, with a src folder, a pom.xml, and an application class. Let’s make a bean that outputs the current date and time. Next to the application class, create a class named AwesomeConfig.
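Something along these lines will do. The bean’s name, awesomeBean, is what names the output binding, which Spring Cloud Stream maps to a Kafka topic called awesomeBean-out-0 by default, and a Supplier bean is polled roughly once per second out of the box:

package com.example.demo;

import java.time.LocalDateTime;
import java.util.function.Supplier;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AwesomeConfig {

    // Spring Cloud Stream polls this Supplier (every second by default)
    // and publishes each value to the awesomeBean-out-0 destination.
    @Bean
    public Supplier<String> awesomeBean() {
        return () -> LocalDateTime.now().toString();
    }
}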

Running

Now that we’ve finished writing our microservice (yes, that part is already done), start your application. Spring will automatically create the necessary setup in Kafka; we just need to listen to the output of our application. You can use any Kafka client, but I personally like Kafkacat:

kafkacat -b localhost -t awesomeBean-out-0 -qC
2020-07-20T15:58:10.307515
2020-07-20T15:58:11.343487
2020-07-20T15:58:12.346040

If all is well, you should see the local date and time being printed every second. And all you had to do was use the Spring Initializr and define a Bean!

Reading and processing messages

Of course, most services don’t just output messages. Let’s change our application to read messages from one topic, modify them, and publish them to another. To do this, we will replace our Supplier Bean with a Function Bean.
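Keeping the bean name awesomeBean, the config class now looks something like this; with a Function, the bindings become awesomeBean-in-0 and awesomeBean-out-0, and uppercasing is just an easy-to-spot transformation:

package com.example.demo;

import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AwesomeConfig {

    // Reads each message from awesomeBean-in-0, uppercases it,
    // and writes the result to awesomeBean-out-0.
    @Bean
    public Function<String, String> awesomeBean() {
        return String::toUpperCase;
    }
}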

Now re-run the application and fire up an additional Kafkacat to provide input:

kafkacat -b localhost -t awesomeBean-in-0 -P

This will let you enter input using your console. Any input you give should show up in your previous kafkacat console but in all caps.

Congratulations! You now have a microservice which reads from and writes to a message broker, ready to be included in a microservice infrastructure.

Automatic serialization and deserialization

Of course, passing Strings around is child’s play. Let’s define a model class and work with that. Create a class called Movie in the package com.example.demo:
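A plain POJO with a name and an integer rating is enough; something like the sketch below works, and the getters are what Jackson will use later to turn a Movie into JSON:

package com.example.demo;

public class Movie {

    private String name;
    private int rating;

    public Movie() {
    }

    public Movie(String name, int rating) {
        this.name = name;
        this.rating = rating;
    }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getRating() { return rating; }
    public void setRating(int rating) { this.rating = rating; }
}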

Now we’ll make our Bean take the String it receives as input and turn it into a Movie with a random rating.
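In AwesomeConfig, that could look like the sketch below; the rating range of 1 to 10 is an arbitrary choice for this example:

package com.example.demo;

import java.util.Random;
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AwesomeConfig {

    private final Random random = new Random();

    // Takes a movie name from awesomeBean-in-0 and publishes a Movie
    // with a random rating between 1 and 10 to awesomeBean-out-0.
    @Bean
    public Function<String, Movie> awesomeBean() {
        return name -> new Movie(name, random.nextInt(10) + 1);
    }
}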

After making these changes, restart your application. Turning to our kafkacat consoles again, use the second one we opened to enter your favorite movie name and use the first to see how much our new microservice likes it. My output looked like this:

{"name":"Inception","rating":3}
{"name":"Attack of the Killer Tomatoes","rating":9}

Looks like I’ll have to have a stern talk with my computer about what makes a good movie. Anyway, as you can (hopefully) see, Spring converts the Movie objects to JSON, because application/json is the default content type. There’s no need to manually set content types or provide a serialization mechanism.

Conclusion

Building microservices doesn’t have to take long, and it doesn’t have to be complicated. Spring and Spring Cloud Stream take a ton of work out of your hands, so you can focus on implementing your business logic. They provide defaults for pretty much everything, such as connecting to the broker, declaring exchanges and queues, and serializing/deserializing data.

If you’re hungry for more hands-on with Spring Cloud Stream, check out this webinar I co-hosted.

I hope this got you started and I’ll see you around for the next blog post!

Make sure to check out our careers page to discover tech jobs at Takeaway.com!
