Django: Task Scheduling with Celery


Automation in Django is a developer’s dream. Tedious work such as creating database backups, reporting annual KPIs, or even sending bulk email can be made a breeze. Through Celery, a well-known Python library for delegating tasks, this kind of automation becomes possible.

Celery uses a message broker to send and receive messages to and from Django; RabbitMQ and Redis are common examples.

Producer and Consumer Problem

The problem of running asynchronous tasks maps neatly onto the classic Producer and Consumer problem: producers place jobs in a queue, and consumers check the head of the queue for waiting jobs, pick the first one, and execute it.

How the components map onto Celery:

  • Web (Producer)
  • Broker (Queue)
  • Worker (Consumer). Since workers can also place new tasks in the queue, they can behave as producers too (see the sketch just below this list).
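
To make the last point concrete, here is a minimal, hypothetical sketch of a task that, while running on a worker, enqueues another task. The task names are made up for illustration and assume the Celery setup described in the steps below:

from celery import shared_task


@shared_task
def process_order(order_id):
    # Main work happens here, on a worker (the consumer)...
    # ...and then the worker places a new task on the queue itself,
    # so at this moment it is also acting as a producer.
    send_receipt.delay(order_id)


@shared_task
def send_receipt(order_id):
    ...  # e.g. render and send a receipt email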

Setup

  • First, create and activate a virtual environment; I’d recommend using Pipenv.
  • Install Django with pipenv install django inside your virtualenv.
  • Install Celery in your virtualenv:
    - pipenv install celery (the namespace='CELERY' configuration used below requires Celery 4.0 or newer, so avoid pinning the old 3.x series)

To use Celery with your Django project, you must first define an instance of the Celery library (called an “app”).

If you have a modern Django project layout like:

- olympia/
  - manage.py
  - olympia/
    - __init__.py
    - settings.py
    - urls.py

then the recommended way is to create a new olympia/olympia/celery.py module that defines the Celery instance:

STEP 1:

File: olympia/olympia/celery.py

code snippet:

import os

from celery import Celery

# Point Celery at the Django settings and read CELERY_-prefixed options from them.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'olympia.settings')

app = Celery('olympia')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()  # find tasks.py modules in every installed app

STEP 2:

Then you need to import this app in your olympia/olympia/__init__.py module. This ensures that the app is loaded when Django starts, so that the @shared_task decorator (mentioned later) will use it:

File: olympia/olympia/__init__.py

from .celery import app as celery_app

__all__ = ['celery_app']

STEP 3: Install Redis as a Celery “Broker”

Celery uses “brokers” to pass messages between a Django Project and the Celery workers. In this tutorial, we will use Redis as the message broker.

  • pipenv install redis==2.10.3

Once Redis is up, add the CELERY_BROKER_URL configuration to your settings.py file. Since we are using Redis as the broker, the URL uses the redis:// scheme (an amqp:// URL would point at a RabbitMQ broker instead):

settings.py

CELERY_BROKER_URL = 'redis://localhost:6379'
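
The post configures nothing beyond the broker URL. If you also want Celery to store task results, a hedged sketch of optional additions could look like this; these are standard settings read through the namespace='CELERY' call above, but they are not part of the original post:

# settings.py -- optional additions beyond what the post itself uses.
# Store task results in Redis as well (only needed if you read task results).
CELERY_RESULT_BACKEND = 'redis://localhost:6379'

# Be explicit about serialization so only JSON payloads are accepted.
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'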

Create a task in your app, in olympia/olympia/tasks.py. Just for testing purposes, let’s create a Celery task that generates a number of random User accounts.

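The original post shows this task only as a screenshot. Based on the task name and return message that appear in the worker logs further down (create_random_user_accounts), a minimal sketch of such a task might look like the following; the use of the built-in User model and get_random_string is an assumption, and the screenshot’s exact code may differ:

# tasks.py (the post places it at olympia/olympia/tasks.py)
import string

from celery import shared_task
from django.contrib.auth.models import User
from django.utils.crypto import get_random_string


@shared_task
def create_random_user_accounts(total):
    # Create `total` throwaway users with random usernames and passwords.
    for i in range(total):
        username = 'user_{}'.format(get_random_string(10, string.ascii_letters))
        email = '{}@example.com'.format(username)
        password = get_random_string(50)
        User.objects.create_user(username=username, email=email, password=password)
    return '{} random users created with success!'.format(total)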

Then let’s define a form and a view to process the Celery task:

from django import forms
from django.core.validators import MinValueValidator, MaxValueValidator


class GenerateRandomUserForm(forms.Form):
    total = forms.IntegerField(
        validators=[
            MinValueValidator(50),
            MaxValueValidator(500)
        ]
    )

This form expects a positive integer between 50 and 500.


Then in views.py

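The post shows the view only as a screenshot as well. A sketch of a class-based view that hands the work off to the task via .delay() could look like this; the template name, success URL name, and message text are assumptions for illustration:

# views.py
from django.contrib import messages
from django.urls import reverse_lazy
from django.views.generic.edit import FormView

from .forms import GenerateRandomUserForm
from .tasks import create_random_user_accounts


class GenerateRandomUserView(FormView):
    template_name = 'generate_random_users.html'  # assumed template name
    form_class = GenerateRandomUserForm
    success_url = reverse_lazy('users_list')  # assumed URL name

    def form_valid(self, form):
        total = form.cleaned_data.get('total')
        # .delay() sends the task to the broker and returns immediately,
        # so the HTTP response does not wait for the users to be created.
        create_random_user_accounts.delay(total)
        messages.success(self.request, 'We are generating your random users! Refresh the page in a moment.')
        return super().form_valid(form)

The key line is create_random_user_accounts.delay(total): instead of calling the function directly, .delay() serializes the call and pushes it onto the broker for a worker to pick up.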

Starting The Worker Process

Open a new terminal tab, and run the following command to run the Celery server:

celery -A olympia worker -l info

Note: before starting the Celery worker, make sure your broker server is also running, e.g. redis-server for Redis, or systemctl start rabbitmq-server for RabbitMQ.

Now we can test it. I submitted 500 in my form to create 500 random users.

The response is immediate, because the actual work is handed off to the Celery worker.


Meanwhile, checking the Celery Worker Process:

[2017-08-20 19:11:17,485: INFO/MainProcess] Received task: mysite.core.tasks.create_random_user_accounts[8799cfbd-deae-41aa-afac-95ed4cc859b0]

Then after a few seconds, if we refresh the page, the users will be there.

Now, if we check the Celery worker process again, we can see it completed the execution:

[2017-08-20 19:11:45,721: INFO/ForkPoolWorker-2] Task mysite.core.tasks.create_random_user_accounts[8799cfbd-deae-41aa-afac-95ed4cc859b0] succeeded in 28.225658523035236s: '500 random users created with success!'

Final Tips

  1. Do not pass Django model objects to Celery tasks. To avoid cases where the model object has already changed by the time the task runs, pass the object’s primary key instead and fetch a fresh copy from the database inside the task (see the sketch after this list).
  2. The default Celery beat scheduler creates files to store its schedule locally, typically celerybeat-schedule.db and celerybeat.pid. If you are using a version control system like Git (which you should!), it is a good idea to ignore these files and not add them to your repository, since they only matter for locally running processes.
  3. When writing Celery tasks for APIs built with Django REST Framework, don’t pass complex objects such as database model instances as task parameters without serializing them first; unserialized model instances bring the same problems described in tip 1.
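
To illustrate the first tip, here is a hedged sketch; the task is made up for this example:

from celery import shared_task
from django.contrib.auth.models import User


@shared_task
def send_welcome_email(user_id):
    # Accept only the primary key and re-fetch the row inside the task,
    # so the worker always sees the object's current state.
    user = User.objects.get(pk=user_id)
    ...  # build and send the email for `user` here


# From a view or signal handler, pass the pk, not the instance:
# send_welcome_email.delay(user.pk)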

Conclusion

This was just one of the applications of Celery. We can follow the same approach to send emails; we would only need to change the tasks.py file:

# Django imports.
from django.conf import settings
from django.core.mail import send_mail
from django.template.loader import get_template

# Third party imports.
from celery import shared_task


@shared_task
def send_message(subject, context, recipient_list, html_path, text_path):
    # Render the plain-text and HTML bodies from their templates.
    txt_ = get_template(text_path).render(context)
    html_ = get_template(html_path).render(context)
    from_email = settings.DEFAULT_FROM_EMAIL
    sent = send_mail(
        subject, txt_, from_email, recipient_list,
        html_message=html_, fail_silently=False,
    )
    return sent
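
Calling the task works the same way as before; for example (the subject, context, recipients, and template paths below are placeholders, not files from the original project):

send_message.delay(
    subject='Welcome!',
    context={'name': 'Alice'},
    recipient_list=['alice@example.com'],
    html_path='emails/welcome.html',
    text_path='emails/welcome.txt',
)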

In the future, I will write another post covering how to use Celery Beat for periodic tasks in Django and DRF.

I’m a Software Engineer (Backend) and a part-time blogger who loves to learn and try new tools & technologies. My corner of the internet: https://sarthakkumar.xyz
