Using Celery with Flask for asynchronous tasks

Mohith Aakash G · Published in featurepreneur · 3 min read · Jan 15, 2022

Celery

Celery is, at its core, a task queue. It is used to execute work asynchronously, outside the HTTP request-response cycle and outside the context of your application. The general idea is that any resource-consuming tasks your application needs to run can be offloaded to the task queue, leaving your application free to respond to client requests.

Celery components

  1. The Celery client. This is used to issue background jobs. When working with Flask, the client runs with the Flask application.
  2. The Celery workers. These are the processes that run the background jobs. Celery supports local and remote workers, so you can start with a single worker running on the same machine as the Flask server, and later add more workers as the needs of your application grow.
  3. The message broker. The client communicates with the workers through a message queue, and Celery supports several ways to implement these queues. The most commonly used brokers are RabbitMQ and Redis.

Flask and Celery

Install Celery and the Redis client using pip:

pip install celery
pip install redis

You can install Redis according to the download instructions for your operating system.

Now initialise the Celery client in your Flask application:

from flask import Flask
from celery import Celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

CELERY_BROKER_URL tells Celery where the broker service is running. If you run something other than Redis, or have the broker on a different machine, then you will need to change the URL accordingly.
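For example, a Redis instance on another machine or a RabbitMQ broker would use URLs along these lines (the hostnames and credentials below are placeholders, not values from this article):

```python
# Redis on another machine, database 0 (hostname is a placeholder)
app.config['CELERY_BROKER_URL'] = 'redis://redis-host.example.com:6379/0'

# RabbitMQ instead of Redis (user, password and hostname are placeholders)
app.config['CELERY_BROKER_URL'] = 'amqp://user:password@rabbit-host.example.com:5672//'
```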

The CELERY_RESULT_BACKEND option is only necessary if you need Celery to store the status and results of tasks.

Any functions that you want to run as background tasks need to be decorated with the celery.task decorator. For example:

@celery.task
def my_background_task(arg1, arg2):
    # some long running task here
    return result

Then the Flask application can request the execution of this background task as follows:

task = my_background_task.delay(10, 20)

The delay() method is a shortcut to the more powerful apply_async() call. Here is the equivalent call using apply_async():

task = my_background_task.apply_async(args=[10, 20])

When using apply_async(), you can give Celery more detailed instructions about how the background task is to be executed. A useful option is to request that the task executes at some point in the future. For example, this invocation will schedule the task to run in about a minute:

task = my_background_task.apply_async(args=[10, 20], countdown=60)

The return value of delay() and apply_async() is an object that represents the task, and this object can be used to obtain status.

To run

You need to run three processes simultaneously for this. The easiest way is to open three terminal windows.

On the first terminal run Redis.

redis-server

Run the Celery worker on the second terminal. (With Celery 5 and later, the worker subcommand comes after the -A option.)

celery -A app.celery worker --loglevel=info

Run your flask application on the third terminal.

python app.py
