Introduction to Asynchronous Task Queueing Celery

Hemanthhari2000 · Published in featurepreneur · 6 min read · Apr 1, 2021

Understand the concept of an asynchronous task queue, distributed message passing

Introduction

Python is one of the best programming languages out there for web development. It is easy to learn and flexible enough to power many of the web applications on the internet. A task queue (or job queue) plays an important role in a web application: it schedules programs, or jobs, for batch processing, which lets services run in the background without disturbing other processes. A task queue is designed for asynchronous work. So, how can one actually build a task queue? We can, by using Celery.
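Before reaching for Celery, the core idea can be sketched with Python's standard library alone. This is a toy, in-process stand-in for what Celery does across processes and machines; all the names here are made up for illustration:

```python
import queue
import threading

tasks = queue.Queue()

def worker():
    # Pull jobs off the queue and run them in the background.
    while True:
        job, args = tasks.get()
        job(*args)
        tasks.task_done()

# A daemon thread plays the role of a background worker process.
threading.Thread(target=worker, daemon=True).start()

results = []
# Enqueue some work; the caller returns immediately instead of waiting.
tasks.put((results.append, ("job done",)))

tasks.join()  # wait for the queue to drain (only for this demo)
print(results)  # ['job done']
```

Celery follows the same shape, except the queue lives in a message broker and the workers are separate processes, possibly on other machines.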

Overview

Let’s take a look at the contents that we will cover in this article.

  • What is Celery?
  • Why use Celery?
  • Merits of using Celery
  • Prerequisites
  • Installing Celery
  • Implementation of Celery
  • Conclusion

What is Celery?

Celery is an open-source, simple, flexible, distributed system for processing vast amounts of messages, while providing operations with the tools required to maintain such a system. It's a task queue with a focus on real-time processing, while also supporting task scheduling. Celery is written in Python, but the protocol can be implemented in any language. It is cross-platform software, which means you can set up Celery on any operating system.

Why use Celery?

One of the most difficult parts of the Python web application stack is understanding and implementing asynchronous task queues or job queues. Task queues are important for serving users with better functionality. If users are not satisfied, all the effort put into the application goes to waste, so it is better to use a task queue like Celery to avoid such scenarios.

As developers, we want our application to be fast and responsive to every request it gets. Each request ties up a worker process until the response is finished. In a naive application, we would overload our worker processes, which eventually slows the application down. But we don't want that, do we? Of course not, and this is the main reason we use Celery. Celery makes it easy to put tasks in a queue to be executed later. By doing this, we improve our WSGI server's response time and can perform other important work while the task executes.

Merits of using Celery

The following are some of the merits of using Celery.

  • Open-source: As explained above, Celery is open-source software written in Python. This enables small businesses to adopt the technology and even modify the official source code.
  • Broker support: Celery supports multiple message brokers; RabbitMQ and Redis are the recommended ones. Additional brokers such as MongoDB, Amazon SQS, CouchDB, IronMQ, and the Django ORM are supported too.
  • Easy to install: Celery is very easy to install, with no workarounds needed. It is pretty straightforward.
  • Web framework integrations: Celery integrates with multiple web frameworks, including Django, Flask, Pyramid, Pylons, web2py, Tornado, and Tryton.

Prerequisites

You’ll need the following to run Celery successfully.

  • Python (latest version recommended)
  • Redis or RabbitMQ installed as the message broker
  • sqlite3
  • IPython (optional)

Note: In this article, we use Redis as our broker. You can use RabbitMQ instead and supply its broker URL.
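For reference, the two broker URLs follow these shapes (the hosts and ports shown are the defaults; adjust them to your own setup):

```python
# Redis broker URL: redis://host:port/db_number
redis_broker = "redis://localhost:6379/0"

# RabbitMQ broker URL: amqp://user:password@host:port/vhost
# ("guest" is RabbitMQ's default local user)
rabbitmq_broker = "amqp://guest:guest@localhost:5672//"

print(redis_broker)
print(rabbitmq_broker)
```

Whichever broker you pick, only this URL string changes; the rest of the Celery code stays the same.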

Installing Celery

Before installing Celery, install the following using the commands shown:

pip install redis
pip install sqlalchemy

All set. Let’s get started with the installation of Celery. As explained above, installing Celery is the easiest part. Just type this command in your command line or terminal.

pip install celery

That’s it, our installation is complete.

Implementation of Celery

Open up your code editor and create a new Python file called “tasks.py”. Now, import Celery and create an app, passing the broker URL and specifying the result backend. In this case, the broker URL points to Redis and the result backend to SQLite3.

from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost:6379',
    backend='db+sqlite:///db.sqlite3'
)

Now go ahead and create a decorator as follows:

@app.task
def absoluteSub(a, b):
    return abs(a - b)

Now you can see that we have a simple function that returns the absolute value of a subtraction. The function is wrapped in the app.task decorator, which registers it as a Celery task.

Now, let's get back to our terminal and start our Celery worker to see if everything is connected.

celery -A tasks worker --loglevel=info

Note: In the command above, “tasks” is the name of the Python file we created.

The above command sets everything up; the worker's startup banner should show that it is connected to the Redis broker via the URL we gave and that our task is registered.

Next, open a new terminal, start a Python (or IPython) shell, and run the following commands:

from tasks import absoluteSub
result = absoluteSub.delay(10, 20)
result.status
result.get()

The status comes back as ‘SUCCESS’, indicating that the task is done, and the result is 10, as expected. But let us check whether the log of this task landed in the result backend we connected.

We can see that our task was executed, along with the status and task_id that Celery records.
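If the delay/get pattern feels unfamiliar, it can be mimicked in-process with the standard library's concurrent.futures. This is only a rough analogy to clarify the semantics, not how Celery works internally:

```python
from concurrent.futures import ThreadPoolExecutor

def absoluteSub(a, b):
    return abs(a - b)

with ThreadPoolExecutor(max_workers=1) as executor:
    # submit() hands the call to a background worker and returns at once,
    # analogous to absoluteSub.delay(10, 20).
    future = executor.submit(absoluteSub, 10, 20)
    # result() blocks until the work finishes, like result.get().
    value = future.result()

print(value)  # 10
```

The difference with Celery is that the "future" (an AsyncResult) survives in the broker/backend, so a completely separate process can pick up the work and another can fetch the result.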

Now it’s your turn: create some awesome projects that use task queueing with Celery to boost your project's performance.

Conclusion

In this article, we have seen Celery and its functionality. We went through the concept of an asynchronous task queue and its effective use in real-world scenarios. We installed Celery and implemented some task queueing, connecting Celery to a Redis broker and using an SQLite3 database as the backend to store the results and logs that Celery produces. I hope this article was useful to you all. See you in my next article; until then, as always: code, learn, repeat.

Follow for more…
