Using Celery to perform long-running tasks in a web application

Hello! In this article we will look at how to include real-time and dynamic content in a web application, whether it is produced on our own server or fetched from the internet.

Serving dynamic, real-time content in a web application differs vastly from serving static files and pages, because there is no guarantee that the required information will be available, or can be generated, within the server’s response cycle for a given HTTP request. It may also happen that the content can be reliably generated, but generating it takes a significant amount of time.

Imagine you are running a web app where users can upload videos, and you store thumbnails of various moments in each video to display later. You could go the straightforward way: accept the HTTP POST request from the user, store the video, and use one of the available tools to extract thumbnails at, say, every one-tenth position in the video. But doing all of that inline, especially if your website has considerable traffic, could take a while and might leave the user waiting many seconds while the request is processed on the server and a reply generated. This is obviously bad for the user experience; nobody wants to stare at a page that takes ages to load. This is the kind of problem we are going to solve in the following sections, using what is called an asynchronous task queue.
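
To make the problem concrete, here is a minimal sketch of that naive, synchronous approach. The upload view and the generate_thumbnails helper are hypothetical stand-ins, with a sleep simulating the slow work:

import time
from flask import Flask, request

app = Flask(__name__)

def generate_thumbnails(path):
    # hypothetical stand-in for real thumbnail extraction;
    # the sleep simulates a slow, CPU- or IO-heavy job
    time.sleep(10)

@app.route('/upload', methods=['POST'])
def upload_video():
    video = request.files['video']
    path = '/tmp/' + video.filename
    video.save(path)
    # the response is not sent until this slow call returns,
    # so the user's browser just sits and waits
    generate_thumbnails(path)
    return 'Upload complete'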

Setting up the environment

In this article we will only deal with a Flask web app, though similar methods apply to a Django or Pyramid project, or indeed to any Python web framework (there are Celery extensions for both Pyramid and Django). It is assumed that you already have a virtual environment running with Flask installed. Otherwise, it can be set up with the following commands on Linux or macOS:

pip3 install virtualenv
virtualenv celery-venv
source celery-venv/bin/activate
pip install Flask celery redis

(The redis package is the Python client library that Celery needs in order to talk to the Redis broker we set up below.)

Celery has three components:

  • the Celery client, which runs alongside our application and is how the application talks to Celery
  • the Celery workers, which asynchronously run the long-running tasks handed off by our application
  • the message broker, which the Celery client uses to keep track of tasks and queue them so that the workers can “consume” them

Optionally, Celery also provides a result backend to keep track of the status and result of each task, but in many cases this feature is not needed (and we won’t need it here); a quick sketch of what it offers follows below.
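
For completeness, here is what a result backend buys you. This is a hedged sketch: assume a Celery task add(x, y) has been defined and a result backend has been configured:

result = add.apply_async(args=[2, 3])  # apply_async returns an AsyncResult handle

print(result.id)               # unique task id, can be stored for later lookup
print(result.state)            # e.g. 'PENDING', 'STARTED', 'SUCCESS'
print(result.get(timeout=10))  # blocks until the worker returns 5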

RabbitMQ is the most popular message broker for Celery, followed by Redis. In this example we will use Redis, an in-memory NoSQL key-value store, to serve as the message broker.
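
Whichever broker you choose, Celery is pointed at it with a URL. The two formats look like this (the hosts, ports, and credentials below are the common defaults; adjust them to your setup):

# Redis as broker: redis://host:port/db_number
CELERY_BROKER_URL = 'redis://localhost:6379/0'

# RabbitMQ as broker: amqp://user:password@host:port/vhost
# CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'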

Redis can be downloaded, built, and installed on Linux with the following commands:

wget http://download.redis.io/redis-stable.tar.gz
tar xvzf redis-stable.tar.gz
cd redis-stable
make
sudo make install

The Redis server can be started with its default configuration by the following command:

redis-server

For more serious applications you may want to daemonize the redis-server process and have it start automatically on system boot (see the Redis documentation for details), but for our purposes we will simply open a separate terminal and keep the redis-server process running there.
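
You can verify that the server is up by pinging it from yet another terminal:

redis-cli ping

If it answers PONG, Redis is ready to accept connections.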

An Example Application

To understand asynchronous task queues and Celery, we will create a simple Flask application that lets users receive an email reminder at a specified time in the future, containing a message of their choice. So let’s get started with writing some actual code!

Change to your project directory and open up a file named ‘app.py’:

import os
import smtplib
from flask import Flask, request, render_template
from celery import Celery  # import the Celery class

app = Flask(__name__)

# Celery configurations
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'  # configured but unused here

# create a celery object, telling it where the message broker lives
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])

@celery.task
def send_email(email_address, subject, body=None):
    if not body:
        body = subject
        subject = 'No Subject'
    user = os.environ['MAIL_ADDRESS']
    password = os.environ['MAIL_PASSWORD']
    sent_from = user
    to = [email_address]
    email_text = """\
From: %s
To: %s
Subject: %s

%s
""" % (sent_from, ", ".join(to), subject, body)
    # NOTE: the SMTP host below is an assumption (the original left it blank);
    # 'smtp.gmail.com' works for Gmail accounts, substitute your provider's host
    server = smtplib.SMTP_SSL('smtp.gmail.com', 465)
    server.login(user, password)
    server.sendmail(sent_from, to, email_text)
    server.quit()

@app.route('/', methods=['GET', 'POST'])
def index():
    if request.method == 'GET':
        return render_template('index.html')
    email = request.form['email']
    # queue the task instead of sending the email inside this request
    send_email.apply_async(
        args=[email, 'Reminder from Flask-Celery Test App',
              request.form['msg_body']],
        countdown=int(request.form['minutes']) * 60)
    return 'Reminder set'

if __name__ == '__main__':
    app.run(debug=True)

What Happened Here

To use Celery in our application we create an instance of the Celery class, imported from the celery package we installed earlier. To instantiate the Celery object we need to provide it with some configuration, such as the message broker to connect to and, if used, the result backend. Since we don’t need a result backend in this app, we only provide Celery with the broker information: the URL that our Redis server is bound to. We use Flask’s configuration mechanism to store the Celery settings as well, since it keeps all of our configuration in one place, easy to view and change.

Next we define our email-sending function, which is just the regular function you would write to send an email. The only difference is that we wrap it with the task decorator, which registers the function as one of the tasks Celery can perform when called for. The decorator is conveniently available as an attribute of the celery instance we created.

Then we define our view as we normally would in a Flask app; the difference lies in how we call the email-sending function. Because send_email is wrapped with Celery’s task decorator, it gains a few extra methods that let us run it asynchronously. The apply_async method is used when we want the function to run asynchronously rather than immediately, in the same thread, as it would without Celery. What apply_async does is enqueue the task (i.e., the execution of the send_email function) onto Celery’s task queue; whenever a Celery worker is free, it dequeues the task and executes it. The arguments to the function are given as a list via the args parameter, and the countdown argument tells Celery not to execute the task as soon as a worker is free, but to defer it for the given number of seconds and only then hand it to the next free worker. That is all it takes to incorporate Celery’s asynchronous task queue mechanism into our Flask code.
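
Two related conveniences are worth knowing about: delay() is a shorthand for apply_async() when you have no extra options to pass, and apply_async() also accepts an eta argument when you would rather specify an absolute time instead of a relative countdown. A quick sketch (the email address is a placeholder):

from datetime import datetime, timedelta, timezone

# shorthand, equivalent to send_email.apply_async(args=[...]) with no options
send_email.delay('user@example.com', 'Hello from Celery')

# absolute scheduling: run the task at a specific moment (UTC) instead of
# "this many seconds from now"
send_email.apply_async(
    args=['user@example.com', 'Reminder from Flask-Celery Test App'],
    eta=datetime.now(timezone.utc) + timedelta(minutes=5))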

Next create a ‘templates’ folder and inside that create a file ‘index.html’:

<!DOCTYPE html>
<title>Flask-Celery Test App</title>
<form action="/" method="post">
    <input type="email" name="email" placeholder="Enter your email address"><br>
    <textarea name="msg_body" placeholder="Enter the reminder message to be sent"
              rows="4" cols="50"></textarea><br>
    <input type="text" name="minutes" placeholder="Enter time in minutes from now to send the reminder"><br>
    <input type="submit" value="Set reminder">
</form>

Running the app

Keep the redis-server process running on one terminal. Start the celery worker process in another terminal by running the following command:

celery -A app.celery worker --loglevel=info
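
If tasks start piling up, you can raise the number of worker processes with Celery’s --concurrency option (the value 4 below is arbitrary):

celery -A app.celery worker --loglevel=info --concurrency=4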

Now run the Flask app in a third terminal:

python app.py

Make sure that the Celery worker process has access to the required environment variables (the email address and password) by exporting them in the worker’s terminal before starting the process.
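
For example (the values below are placeholders; use your own account’s credentials, such as an app password if you use Gmail):

export MAIL_ADDRESS='you@example.com'
export MAIL_PASSWORD='your-app-password'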

That’s it! You now have a simple web app that can perform long-running or intensive tasks in the background using an asynchronous task queue, driven and controlled by your web app.