Django Asynchronous Tasking with Celery and Redis

Taylor Berukoff
Published in Django Unleashed
Mar 21, 2024

Django is a fantastic framework for web development used across the industry. It has many tools and features built in to facilitate the creation of web applications. However, to run tasks asynchronously, you will need to integrate outside tooling. In this case, we will use Celery and Redis to perform asynchronous tasking in Django.

To follow along, you will need to have set up a basic Django project. You can follow my Django Setup Checklist on how to do so if needed. A copy of the code I start from for this article can be found on GitHub.

Table of Contents:

  1. Installing Dependencies
  2. Setting Up Our Application
  3. Configuring Celery

1. Installing Dependencies

To begin, we will install our Python dependencies with the following pip commands.

pip install celery
pip install redis

Celery is a Python library for asynchronous task scheduling and execution, commonly used in web applications for handling background tasks efficiently across multiple worker processes or machines. Celery can be easily installed using only pip.

Redis is a fast, open-source, in-memory data store used for caching, session management, and message queuing.

Redis acts as a message broker for Celery, facilitating communication between the task producer and worker processes. It ensures efficient task distribution by managing the exchange of messages between components, enabling asynchronous task execution in distributed systems.

To use Redis with Django and Celery, it is necessary both to install the Redis client library through pip and to install the Redis server itself on your machine.

Installing Redis through pip provides the Python client library (redis-py) for communication between Celery and Redis, while installing Redis directly on your machine supplies the server infrastructure necessary for storing task queues and results. Both installations are essential for Celery's effective operation.

Instructions for installing Redis onto your machine can be found in the official Redis documentation. In the case of an Ubuntu setup, we will be running the following commands.

curl -fsSL https://packages.redis.io/gpg | sudo gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg

echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] https://packages.redis.io/deb $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/redis.list

sudo apt-get update
sudo apt-get install redis

It’s important to note that, in deployment, you must also install Redis itself on your server. Running pip install on your requirements file will not suffice!

2. Setting up our application

For this we will expand on the already created app “myappname”. You can also create a fresh app from scratch at this point, but be sure to add it to INSTALLED_APPS in your settings.py.

To begin, we will modify our site’s settings.py and urls.py along with our app’s urls.py.

# Within mysitename/settings.py

.
.
.

# CELERY SETTINGS
CELERY_BROKER_URL = 'redis://localhost:6379/0'
# Within mysitename/urls.py

from django.contrib import admin
from django.urls import path, include

urlpatterns = [
    path('admin/', admin.site.urls),
    path('', include('myappname.urls')),
]
# Within myappname/urls.py

from django.urls import path
from . import views

app_name = "myappname"

urlpatterns = [
    path("", views.index, name="index"),
]

This will inform our Celery app of the location of our Redis broker (which our Celery workers will connect to), and create some basic URL patterns to test our new tools. This will give errors until we define our index function within our views.
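
If you also want Celery to store task results in Redis, a result backend can be configured in the same settings block. Note that the celery.py configuration later in the article passes broker and backend explicitly when constructing the app, so this settings-based version is an alternative sketch, assuming the default Redis database 0:

```python
# Within mysitename/settings.py

# CELERY SETTINGS
CELERY_BROKER_URL = 'redis://localhost:6379/0'
# Optional: store task results (state, return values) in Redis as well
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
```

With `app.config_from_object('django.conf:settings', namespace='CELERY')`, any setting prefixed with `CELERY_` is picked up automatically.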

For our example view to start, we will set up as follows

import time
from django.http import HttpResponse

def time_consuming_task(time_consumed=5):
    time.sleep(time_consumed)
    print("Slow process completed!")

def index(request):
    # Do a time consuming task synchronously
    time_consuming_task(5)

    html_content = """
    <h1 style="text-align: center;">This is your site's index. Currently, the index takes a whopping 5 seconds to load!</h1>
    """
    return HttpResponse(html_content)

This will create a simple view that loads some HTML content. However, our time-consuming task takes a whopping five seconds to complete before our page is loaded, causing a significant delay and hampering the user experience.

This is because, by default, our time-consuming task has to finish before our page loads!
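
To see this blocking behavior in isolation, here is a minimal, framework-free sketch (using a 0.2-second sleep instead of 5 so it runs quickly):

```python
import time

def time_consuming_task(time_consumed=0.2):
    # Blocks the calling thread for the full duration
    time.sleep(time_consumed)

start = time.perf_counter()
time_consuming_task()  # runs synchronously; the "view" cannot respond meanwhile
elapsed = time.perf_counter() - start
print(f"Blocked for roughly {elapsed:.1f} seconds")
```

The caller cannot do anything else until the sleep returns, which is exactly what happens to our view.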

[Image: Base index site]

3. Configuring Celery

Celery is fairly straightforward to implement, and allows for quite a bit of versatility. In our case, we will set up a separate celery.py file within our site folder (where our settings.py file is contained). This file will contain the following:

# mysitename/celery.py

# Import for basic systems operations
import os
# Import celery core object
from celery import Celery

# Let Celery know which settings module to use if multiple
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysitename.settings')

# Initialize celery with localhost broker and backend
app = Celery('mysitename', broker="redis://localhost:6379", backend="redis://localhost:6379")
# Configure our app based on django settings
app.config_from_object('django.conf:settings', namespace='CELERY')
# Auto find any tasks in a tasks.py file and associate with Celery
app.autodiscover_tasks()
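
The official Celery documentation for Django also recommends importing the app in mysitename/__init__.py so that it is loaded whenever Django starts. Since this article imports app directly in tasks.py this is optional here, but it is a common addition:

```python
# mysitename/__init__.py

# Ensure the Celery app is loaded whenever Django starts,
# so decorated tasks are registered
from .celery import app as celery_app

__all__ = ('celery_app',)
```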

We can then create a task within a tasks.py file for our app.

# myappname/tasks.py

import time
from mysitename.celery import app

@app.task(bind=True)
def async_time_consuming_task(self, time_consumed=5):
    time.sleep(time_consumed)
    print("Slow process completed in an asynchronous manner!")

It is important to note that in our case we pass bind=True as a parameter. This allows us to access information about the task itself (whether it was successful, whether this run is a retry, etc.) because the task instance is bound to the first argument of our function. If we choose to do this, we need to accept self as the first parameter of our function as well.

If you choose not to set bind to True, then you will not need to include self as a parameter.
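
As a plain-Python sketch of what bind=True is doing (an illustration only, not Celery's actual implementation): the decorator wraps the function in a task object, and that object is injected as the first argument, so the body can read task metadata through self:

```python
class Task:
    def __init__(self, func):
        self.func = func
        self.name = func.__name__
        self.retries = 0  # stand-in for Celery's self.request.retries

    def __call__(self, *args, **kwargs):
        # The task instance itself is injected as the first argument
        return self.func(self, *args, **kwargs)

def task(bind=False):
    def decorator(func):
        return Task(func) if bind else func
    return decorator

@task(bind=True)
def my_task(self, x):
    return f"{self.name} (retry {self.retries}) got {x}"

print(my_task(41))  # my_task (retry 0) got 41
```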

With our task set up, we can delete our previous function within views.py and import our task instead. Your views.py should look as follows.

# myappname/views.py

from django.http import HttpResponse
from .tasks import async_time_consuming_task

def index(request):
    # Queue the time consuming task to run in the background
    async_time_consuming_task.delay(5)

    html_content = """
    <h1 style="text-align: center;">This is your site's index. It now runs immediately and our time consuming task is taken care of in the background!</h1>
    """
    return HttpResponse(html_content)

Note that we now call the function through its new delay method (provided by our task decorator), as in async_time_consuming_task.delay(5).
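
Conceptually, .delay() does not run the function at all: it serializes a message describing the task onto the broker, where a worker later picks it up. Here is a rough stand-in for that producer side, using an in-memory queue in place of Redis (the names are illustrative):

```python
import json
import queue

broker = queue.Queue()  # stand-in for the Redis broker

def delay(task_name, *args):
    # Producer side: enqueue a serialized task message and return immediately
    broker.put(json.dumps({"task": task_name, "args": list(args)}))

delay("async_time_consuming_task", 5)  # returns instantly, nothing has run yet

# Worker side: pull the message off the broker and inspect it
message = json.loads(broker.get())
print(message)
```

This is why the view returns immediately: the expensive work happens later, in the worker process.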

Lastly, we will need to start our Celery worker! Similar to how we need a terminal for the runserver command, we will also need to open a terminal (with the Redis server already running) and run the following command.

celery -A mysitename worker -l info
[Image: Result of the previous command]

This will start a worker which Celery can use to complete our tasks asynchronously. It is important to note that this needs to be running continuously during development and production in order for our async tasking to work properly. Any updates to our tasks or configuration will necessitate a restart of this worker as well.

With that, we can run our server and see the following output without delay.

[Image: Successful screen loading]

We should also see our task completed in the terminal in which we ran our worker.

[Image: Celery worker output]

With that, we have proper asynchronous tasking with Celery and Redis in Django!

Congratulations for making it through! I hope this article was insightful and informative; if it was, then leave some claps and follow. If you have any questions, leave a comment and I will do my best to respond promptly.

You can find the finished code for this project on GitHub. Note that you will still need to install python dependencies and Redis itself!

Take care!
