Setting Up Celery with RabbitMQ in Docker: A Step-by-Step Guide

Mohammed Farmaan
2 min read · Sep 7, 2023



Introduction

In this guide, we’ll walk through the process of setting up Celery, a distributed task queue, with RabbitMQ as the message broker in a Dockerized environment. This setup is useful for handling background tasks in your Python applications. Before we dive into the steps, let’s start by installing Docker, which is a critical component for this setup.

  1. Installing Docker
  2. Launching a RabbitMQ Container
  3. Creating a Python Celery Application
  4. Starting a Celery Worker
  5. Using Celery in Your Code

Step 1: Installing Docker

Before we proceed, ensure you have Docker installed on your system. If you haven’t already installed it, follow these steps:

  1. Download Docker: Visit the official Docker website at https://docs.docker.com/get-docker/.
  2. Choose Your Platform: Download and install Docker for your specific operating system (e.g., macOS, Windows, Linux). Follow the installation instructions provided for your platform.
  3. Verify Installation: After installation, open a terminal and run the following command to verify that Docker is installed and running:
docker --version

You should see the Docker version information displayed.

Now that Docker is installed, we can proceed with setting up Celery and RabbitMQ in our Dockerized environment.

Step 2: Launching a RabbitMQ Container

To get started, we’ll launch a RabbitMQ container using Docker. Run the following command:

docker run -d --name rabbitmq-demo -p 5672:5672 rabbitmq

This command starts a RabbitMQ container named “rabbitmq-demo” in detached mode (-d) and maps port 5672 — RabbitMQ’s default AMQP port — from the container to your host machine.
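Before moving on, you can optionally confirm that the broker came up cleanly. The commands below use standard Docker tooling against the container name chosen above:

```shell
# List running containers; "rabbitmq-demo" should appear with status "Up"
docker ps --filter "name=rabbitmq-demo"

# Tail the broker logs to confirm RabbitMQ finished starting
docker logs rabbitmq-demo
```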

Step 3: Creating a Python Celery Application

Create a Python Celery application. In a Python file (e.g., celery_app.py), define your Celery instance and tasks. Here's a basic example:

from celery import Celery

app = Celery('myapp', broker='pyamqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

In this example, the broker URL pyamqp://guest@localhost// points Celery at the RabbitMQ broker we started in Step 2, using RabbitMQ’s default guest account.

Step 4: Starting a Celery Worker

Start a Celery worker to process tasks by running this command:

celery -A celery_app worker --loglevel=info

Replace “celery_app” with your Python module name where you defined the Celery application.

Step 5: Using Celery in Your Code

You can now use Celery to send and process tasks in your Python code. Here’s an example:

from celery_app import add

result = add.delay(4, 5)
print("Task ID:", result.id)

This code sends the “add” task to the broker, where a running Celery worker picks it up and executes it; delay() returns an AsyncResult immediately rather than waiting for the sum.
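If your Celery app is configured with a result backend (not required for this guide, but needed to read results back), you can also block until the task finishes. A sketch, assuming the worker from Step 4 is running:

```python
from celery_app import add

# Queue the task on the broker; returns an AsyncResult immediately
result = add.delay(4, 5)

# Block until the worker reports the result. This requires a result
# backend on the Celery app; with a broker-only configuration, only
# the task ID and state are available.
print("Sum:", result.get(timeout=10))
```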

Conclusion

Congratulations! You’ve successfully set up Celery with RabbitMQ in a Dockerized environment. This setup allows you to handle background tasks efficiently in your Python applications.

Feel free to customize your Celery configuration and tasks to meet your specific application requirements.
