Automation in Django is a developer's dream. Tedious work such as creating database backups, generating annual KPI reports, or sending bulk email can be made a breeze. Celery, a well-known Python library for delegating tasks, makes this possible.
Celery is a task queue based on distributed message passing. It is used to handle long-running asynchronous tasks. It lets us do things in a distributed way (i.e. run things remotely so that our server doesn't spend too much time on that process). Celery is perfectly suited for tasks that take some time to execute but whose processing should not block our requests. Cases in point are sending emails or SMS messages, making remote API calls, running cron jobs, and preparing and caching values.
Celery uses a message broker to send and receive messages to and from Django, for example RabbitMQ or Redis.
Producer and Consumer Problem
The problem of running asynchronous tasks maps neatly onto the classic producer-consumer problem: producers place jobs in a queue; consumers check the head of the queue for waiting jobs, pick the first one, and execute it.
Naming the components in Celery terms:
- Web (Producers)
- Broker (Queue)
- Worker (Consumer): since workers can also place new tasks in the queue, they can behave as producers too.
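The mapping above can be sketched with Python's standard library. This is a toy illustration of the queue pattern only, not how Celery is implemented internally:

```python
import queue
import threading

# The "broker": an in-memory queue holding pending jobs.
broker = queue.Queue()
results = []

def worker():
    # The "consumer": take jobs off the head of the queue and execute them.
    while True:
        job = broker.get()
        if job is None:  # sentinel: no more work
            break
        results.append(job())

# The "web" process acts as the producer, placing jobs in the queue.
for n in (1, 2, 3):
    broker.put(lambda n=n: n * n)
broker.put(None)

t = threading.Thread(target=worker)
t.start()
t.join()
print(results)  # [1, 4, 9]
```

In Celery the queue lives in an external broker process (Redis or RabbitMQ), so the producer and consumer can run on different machines.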
- First create and activate a virtual environment; I'd recommend using Pipenv.
- Install Django in your virtualenv with pipenv install django.
- Install Celery in your virtualenv: pipenv install celery==3.1.18
To use Celery with your Django project you must first define an instance of the Celery library (called an "app").
If you have a modern Django project layout like:
then the recommended way is to create a new olympia/olympia/celery.py module that defines the Celery instance:
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'olympia.settings')

app = Celery('olympia')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
Then you need to import this app in your olympia/olympia/__init__.py module. This ensures that the app is loaded when Django starts, so that the @shared_task decorator (mentioned later) will use it:
from .celery import app as celery_app

__all__ = ['celery_app']
STEP 3: Install Redis as a Celery “Broker”
- pipenv install redis==2.10.3
Once Redis is up, add the CELERY_BROKER_URL configuration to your settings.py file:
CELERY_BROKER_URL = 'redis://localhost:6379/0'
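A slightly fuller configuration might look like the following. The result backend and serializer settings are optional additions beyond what the post requires, included here as a common starting point:

```python
# settings.py -- Celery configuration. The CELERY_ prefix matches the
# namespace passed to app.config_from_object() in celery.py.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'  # optional: store task results
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
```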
Create a task in your app
Just for testing purposes, let's create a Celery task that generates a number of random User accounts.
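A sketch of such a task, using the module path and success message that appear in the worker log later in the post; the username and password generation details are assumptions:

```python
# mysite/core/tasks.py
import string
from random import choice

from celery import shared_task
from django.contrib.auth.models import User


@shared_task
def create_random_user_accounts(total):
    # Create `total` users with random credentials.
    for _ in range(total):
        username = 'user_{}'.format(
            ''.join(choice(string.ascii_lowercase) for _ in range(10)))
        email = '{}@example.com'.format(username)
        password = ''.join(choice(string.ascii_letters) for _ in range(12))
        User.objects.create_user(username=username, email=email, password=password)
    return '{} random users created with success!'.format(total)
```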
Then let's define a form and a view to process the Celery task:
from django import forms
from django.core.validators import MinValueValidator, MaxValueValidator


class GenerateRandomUserForm(forms.Form):
    total = forms.IntegerField(
        validators=[
            MinValueValidator(50),
            MaxValueValidator(500)
        ]
    )
This form expects a positive integer between 50 and 500.
Then in views.py:
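A sketch of the view, assuming a class-based FormView; the template name, URL name, and message text are illustrative:

```python
# mysite/core/views.py
from django.contrib import messages
from django.urls import reverse_lazy
from django.views.generic.edit import FormView

from .forms import GenerateRandomUserForm
from .tasks import create_random_user_accounts


class GenerateRandomUserView(FormView):
    template_name = 'core/generate_random_users.html'  # assumed template path
    form_class = GenerateRandomUserForm
    success_url = reverse_lazy('users_list')  # placeholder URL name

    def form_valid(self, form):
        total = form.cleaned_data.get('total')
        # .delay() enqueues the task on the broker and returns immediately,
        # which is why the HTTP response is instant.
        create_random_user_accounts.delay(total)
        messages.success(self.request, 'We are generating your random users!')
        return super().form_valid(form)
```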
Starting The Worker Process
Open a new terminal tab, and run the following command to run the Celery server:
celery -A olympia worker -l info
Note: before starting the Celery server, make sure that your broker server is also running, e.g. for Redis: systemctl start redis (the service name may vary by distribution), or for RabbitMQ: systemctl start rabbitmq-server.
Now we can test it. I submitted 500 in my form to create 500 random users.
The response is immediate:
Meanwhile, checking the Celery Worker Process:
[2017-08-20 19:11:17,485: INFO/MainProcess] Received task:
Then after a few seconds, if we refresh the page, the users will be there.
Now, if we check the Celery worker process again, we can see that it completed the execution:
[2019-08-20 19:11:45,721: INFO/ForkPoolWorker-2] Task
mysite.core.tasks.create_random_user_accounts[8799cfbd-deae-41aa-afac-95ed4cc859b0] succeeded in
28.225658523035236s: '500 random users created with success!'
- Do not pass Django model objects to Celery tasks. To avoid cases where the model object has already changed before it is passed to a Celery task, pass the object’s primary key to Celery. You would then, of course, have to use the primary key to get the object from the database before working on it.
- The default Celery scheduler creates some files to store its schedule locally: "celerybeat-schedule.db" and "celerybeat.pid". If you are using a version control system like Git (which you should!), it is a good idea to ignore these files and not add them to your repository, since they are for locally running processes.
- When writing Celery tasks for APIs using Django REST Framework, don't pass complex objects such as database model instances as task parameters without serializing them first; doing so causes several problems.
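The first tip above can be made concrete. A sketch, assuming a hypothetical deactivate_user task and Django's built-in User model:

```python
from celery import shared_task
from django.contrib.auth.models import User


# Bad: deactivate_user.delay(user) would serialize a snapshot of the
# model instance, which may be stale by the time the worker runs it.

@shared_task
def deactivate_user(user_pk):
    # Good: pass only the primary key and fetch a fresh copy inside the task.
    user = User.objects.get(pk=user_pk)
    user.is_active = False
    user.save()

# Caller side:
# deactivate_user.delay(user.pk)
```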
This was just one application of Celery. We can follow the same approach to send emails; we would only need to change the tasks.py file:
# Django imports.
from django.conf import settings
from django.core.mail import send_mail
from django.template.loader import get_template

# Third party imports.
from celery import shared_task


@shared_task
def send_message(subject, context, recipient_list, html_path, text_path):
    txt_ = get_template(text_path).render(context)
    html_ = get_template(html_path).render(context)
    from_email = settings.DEFAULT_FROM_EMAIL
    sent = send_mail(
        subject, txt_, from_email, recipient_list,
        html_message=html_,
    )
    return sent
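Enqueueing the email task is then a one-liner anywhere in the project. The import path, template names, and arguments below are all illustrative:

```python
from mysite.core.tasks import send_message  # import path is an assumption

# Arguments are positional, matching the task signature:
# subject, context, recipient_list, html_path, text_path.
send_message.delay(
    'Welcome!',
    {'first_name': 'Ada'},
    ['ada@example.com'],
    'emails/welcome.html',
    'emails/welcome.txt',
)
```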
In a future post I will cover how we can use Celery Beat for periodic tasks in Django and DRF.
Further reading:
- Celery - Distributed Task Queue (Celery 4.3.0 documentation)
- Asynchronous Tasks With Django and Celery (Real Python)