Installing and using RabbitMQ with Docker.

Koray Çağlar
4 min read · Oct 27, 2022


Note: This tutorial is part of a larger series. See the main article here:

I will explain how to set up RabbitMQ with Docker. We will walk through a basic example in Python, then implement the part of my project where RabbitMQ is used.

RabbitMQ is a message broker: a middleman between programs that helps deliver messages, logs, and sometimes data. In its most basic form, there is a producer (sender) program and a consumer (receiver) program. The producer sends messages to a queue in RabbitMQ, and the consumer pulls them off in first-in, first-out (queue) order. RabbitMQ has many more concepts and architectures to learn about, but they are out of scope for this article.

Installing RabbitMQ

We will install it using Docker, so make sure you have Docker on your system before beginning.

RabbitMQ and the programs that use it should run on the same Docker network. For tutorial purposes, let’s create a network named “rabbitnet”.

$ docker network create rabbitnet

Run this command to pull the latest RabbitMQ image and run a container with given arguments:

$ docker run -d --rm --network rabbitnet -p 15672:15672 --hostname rabbit-1 --name rabbitmq rabbitmq:latest

Our container runs on the rabbitnet network, publishes port 15672 (RabbitMQ’s management UI port; clients on the same Docker network reach the AMQP port 5672 directly), and is named rabbitmq.

RabbitMQ is now installed. As you can see, it was very easy and straightforward. Next we will build a basic example system.

Sending Message to RabbitMQ

We will create a basic Python program that sends a message to a queue in RabbitMQ, then Dockerize it with a Dockerfile.

Make a folder named ‘sender’.

$ mkdir sender

Inside the sender directory, make a Python file.

$ sudo nano sender.py

Copy the following code into sender.py.
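A minimal sketch of what sender.py can look like, using the pika client library (pika is my assumption; the queue name 'test_queue' and the 'hello' message match the rest of this tutorial, and the host 'rabbitmq' is simply the container name we gave the broker, which resolves on the shared Docker network):

# sender.py -- publish a single message to RabbitMQ, then exit
import pika

# Connect to the broker by its container name on the shared Docker network.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
channel = connection.channel()

# Declare the queue; this is idempotent and creates it if it does not exist yet.
channel.queue_declare(queue="test_queue")

# Publish one message to the default exchange, routed to 'test_queue'.
channel.basic_publish(exchange="", routing_key="test_queue", body="hello")
print(" [x] Sent 'hello'")

connection.close()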

Save and exit the editor. Now we should build a Docker image using a Dockerfile.

Create a Dockerfile using any text editor:

$ sudo nano Dockerfile

Copy this code inside it:
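A minimal Dockerfile sketch for the sender image (the base image tag and installing pika via pip are my assumptions; adjust to your setup):

FROM python:3.10-slim

WORKDIR /app

# pika is the RabbitMQ client library used by sender.py
RUN pip install pika

COPY sender.py .

CMD ["python", "sender.py"]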

Save and exit.

Create a Docker image named ‘sender’:

$ docker build -t sender .

We have our producer program ready. When we run a container from this image, it will send a 'hello' message to our 'test_queue', and then the container will shut down.
Let's create the receiving end.

Pulling a Message from RabbitMQ

Make a ‘receiver’ folder.

$ mkdir receiver

Create a Python file named ‘receiver.py’.

$ sudo nano receiver.py

Copy the following code into the script:
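Again, a minimal sketch using pika, under the same assumptions as the sender (the queue 'test_queue' and the broker container name 'rabbitmq'):

# receiver.py -- wait for messages on 'test_queue' and print them
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
channel = connection.channel()

channel.queue_declare(queue="test_queue")

def callback(ch, method, properties, body):
    # body is raw bytes; decode before printing
    print(f" [x] Received {body.decode()}")

# auto_ack=True acknowledges (and removes) each message as soon as it is delivered.
channel.basic_consume(queue="test_queue", on_message_callback=callback, auto_ack=True)

print(" [*] Waiting for messages.")
channel.start_consuming()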

Create a Dockerfile in the same directory.

$ sudo nano Dockerfile

Copy the following code into the Dockerfile:
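A sketch that mirrors the sender’s Dockerfile, under the same assumptions:

FROM python:3.10-slim

WORKDIR /app

RUN pip install pika

COPY receiver.py .

CMD ["python", "receiver.py"]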

Build a Docker image named ‘receiver’.

$ docker build -t receiver .

Our receiver (consumer) program is ready. Its container will stay up, listening to 'test_queue', and print out each message it receives.

Testing

You can check our images with the 'docker images' command. We will run containers from the images and see if everything works.

Run this command to create the receiver container:

$ docker run -u root -dit --network rabbitnet --name receiver receiver

Our container is running on 'rabbitnet' and is named 'receiver'. It is important that every program using RabbitMQ is on the same Docker network as RabbitMQ itself.

When we run our sender container, it will publish a message and shut down. Let's do it:

$ docker run -u root --network rabbitnet --name sender --rm sender

Now check if our receiver container received it:

$ docker logs receiver

It did! Now you can build more complex systems using this knowledge, which is what I will do next for my personal project.

Concerning the Main Project

I deleted all the containers, images, and networks created above and rebuilt them on my project's existing 'elk_default' Docker network.

I am running a new RabbitMQ container with this command:

$ docker run -d --rm --network elk_default -p 15672:15672 --hostname rabbit-1 --name rabbitmq rabbitmq:latest

The producer will be a Python script running on Airflow, which we haven't built yet. It will send the base64 encoding of a marble image and the prediction made by the machine learning model, meaning two messages will be delivered over two different queues.
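The Airflow task itself comes in a later article, but to illustrate the idea, here is a rough sketch of publishing to two separate queues with pika. The queue names 'base64' and 'prediction' match the receiver below; the payload values are placeholders, not real pipeline output.

# publish_two_queues.py -- illustrative only: send the image and the prediction on separate queues
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
channel = connection.channel()

channel.queue_declare(queue="base64")
channel.queue_declare(queue="prediction")

# Placeholder values standing in for the real pipeline output.
image_b64 = "aGVsbG8="     # hypothetical base64 string of a marble image
prediction = "defective"   # hypothetical model prediction

channel.basic_publish(exchange="", routing_key="base64", body=image_b64)
channel.basic_publish(exchange="", routing_key="prediction", body=prediction)

connection.close()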

The consumer will be a Python script that prints the base64 code of the received image and the prediction received from the machine learning model.

The receiver script is as follows:
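A sketch of what this receiver can look like, consuming from both queues with pika (the broker host 'rabbitmq' is the container name on the 'elk_default' network; everything else follows the earlier example):

# receiver.py -- listen on the 'base64' and 'prediction' queues and print whatever arrives
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
channel = connection.channel()

channel.queue_declare(queue="base64")
channel.queue_declare(queue="prediction")

def callback(ch, method, properties, body):
    # method.routing_key shows which queue the message came from
    print(f" [x] {method.routing_key}: {body.decode()}")

channel.basic_consume(queue="base64", on_message_callback=callback, auto_ack=True)
channel.basic_consume(queue="prediction", on_message_callback=callback, auto_ack=True)

print(" [*] Waiting for messages on 'base64' and 'prediction'.")
channel.start_consuming()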

The Dockerfile is the same as in the example above.

Building the receiver image:

$ docker build -t receiver .

Running the receiver container:

$ docker run -u root -dit --network elk_default --name receiver receiver

The receiver script listens to two different queues, 'base64' and 'prediction', and prints the received messages. The RabbitMQ system is ready to use. In the next article we will install Airflow and build a DAG containing tasks that send messages to our RabbitMQ.
