Getting Started with Celery: A Comprehensive Guide

Celery is a powerful, open-source, asynchronous task queue based on distributed message passing. It focuses on real-time operation but supports scheduling as well. It is typically paired with web frameworks like Django or Flask to handle time-consuming tasks outside of the web request/response cycle.

At its core, Celery is designed to run tasks in the background, making it ideal for use cases where you need to offload work from your web server so it can run efficiently elsewhere. Celery is highly flexible and integrates with several message brokers, including RabbitMQ and Redis.

Common Use Cases

Let’s go over some common use cases for Celery:

1. Sending Emails

Sending emails is a time-consuming task that can slow down your server’s response time if processed synchronously (waiting for the email to be sent before returning a response). Celery allows you to send emails in the background, improving the user experience by making your application more responsive.

2. Generating Reports

Generating large reports can take a significant amount of time, from fetching all the required data to transforming it into a visually effective report. Using Celery, you can delegate report generation to the background and notify users once the report is ready to download (or send it via email), as in the sketch below.
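
In the spirit of the simulated email task later in this guide, here is a minimal sketch of what such a task could look like. The ten-second sleep is an arbitrary stand-in for the real data fetching and rendering:

from celery import Celery

celery = Celery('reports', broker='redis://localhost:6379/0')

@celery.task
def generate_report(user_id):
    # Stand-in for fetching the data and rendering the report
    import time
    time.sleep(10)
    return f'Report for user {user_id} is ready'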

Code Examples

Let’s dive into some code and see how Celery works. We will implement a simple Flask application for this example.

1. Install Celery and Redis

First, you need to install Celery and the Python client for your message broker (for this example we will be using Redis):

pip install celery redis Flask

(Optional) Run Redis using Docker

Since you will need a Redis instance to start hacking with Celery, a good option is to spin it up using Docker. If you already have Docker installed, simply run the following command:

docker run -d --name redis -p 6379:6379 redis

This command will:

  • Download the Redis image if it’s not already present on your system.
  • Run the Redis server in detached mode (-d), mapping the default Redis port 6379 from the container to your host machine.
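
If you want to confirm the container is reachable, you can ping Redis through it:

docker exec -it redis redis-cli ping
# Expected reply: PONG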

2. Create a Flask Application

Let’s create a simple Flask application in app.py:

from flask import Flask, request, jsonify
from celery import Celery

app = Flask(__name__)

# Configure Celery (the result backend is needed so /status can read task state)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost:6379/0'

celery = Celery(
    app.name,
    broker=app.config['CELERY_BROKER_URL'],
    backend=app.config['CELERY_RESULT_BACKEND'],
)
celery.conf.update(app.config)

@celery.task
def send_email(recipient):
    # Simulate email sending
    import time
    time.sleep(5)
    return f'Email sent to {recipient}'

@app.route('/send-email', methods=['POST'])
def send_email_route():
    data = request.get_json()
    recipient = data['recipient']
    task = send_email.delay(recipient)
    return jsonify({'task_id': task.id}), 202

@app.route('/status/<task_id>')
def task_status(task_id):
    task = send_email.AsyncResult(task_id)
    if task.state == 'PENDING':
        response = {'state': task.state, 'status': 'Pending...'}
    elif task.state != 'FAILURE':
        response = {'state': task.state, 'result': task.result}
    else:
        response = {'state': task.state, 'status': str(task.info)}
    return jsonify(response)

if __name__ == '__main__':
    app.run(debug=True)

3. Run Flask and Celery Worker

Run your Flask application:

python app.py

In a new terminal, start the Celery worker:

celery -A app.celery worker --loglevel=info

Let’s describe the whole flow:

  1. A client makes a POST request to /send-email with a JSON body containing the recipient’s email address.
  2. The Flask application processes the request and enqueues a Celery task.
  3. The Celery worker receives the task and performs the (simulated) email sending.
  4. An extra endpoint, /status/<task_id>, allows the client to check the status of the task, as shown below.
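
To try the flow end to end, you can exercise both endpoints with curl (assuming Flask’s default port 5000; the recipient address and task ID below are placeholders):

curl -X POST http://localhost:5000/send-email \
  -H 'Content-Type: application/json' \
  -d '{"recipient": "user@example.com"}'
# => {"task_id": "..."}

curl http://localhost:5000/status/<task_id>
# => e.g. {"state": "SUCCESS", "result": "Email sent to user@example.com"}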

Benefits of Using Celery

The above example is an oversimplification of what you can do with Celery, but even so, it illustrates a whole list of benefits:

Asynchronous Processing

Celery allows tasks to be executed asynchronously, which can greatly enhance the responsiveness of your server application. In a real-world scenario where many customers trigger time-consuming operations, moving those operations to the background keeps your app from drowning in long-running requests and protects the experience of your customers.
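
Beyond .delay(), Celery’s apply_async gives you control over how and when a task runs; the 60-second delay below is just an illustration:

# Enqueue the same task, but don't execute it until 60 seconds from now
send_email.apply_async(args=('user@example.com',), countdown=60)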

Scalability

Celery is designed to be highly scalable, allowing you to distribute tasks across multiple workers (even on different machines). This lets you horizontally scale your worker pool for more demanding tasks without scaling your application server as a whole.
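
For example, you can give a single worker more concurrent processes, or start extra named workers (on the same or different machines) against the same broker; the worker name below is illustrative:

# One worker with 8 concurrent processes
celery -A app.celery worker --loglevel=info --concurrency=8

# A second, separately named worker pointing at the same broker
celery -A app.celery worker --loglevel=info -n worker2@%h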

Scheduling

In addition to real-time tasks, Celery supports scheduling through its beat scheduler, making it easy to run periodic tasks.
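
As a minimal sketch, a periodic task can be registered in app.py through Celery’s beat_schedule setting; the daily 07:00 schedule and recipient address are arbitrary choices for illustration:

from celery.schedules import crontab

celery.conf.beat_schedule = {
    'daily-report-email': {
        'task': 'app.send_email',  # the task defined earlier in app.py
        'schedule': crontab(hour=7, minute=0),  # every day at 07:00
        'args': ('reports@example.com',),
    },
}

Run the scheduler in a separate terminal alongside the worker:

celery -A app.celery beat --loglevel=info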

Challenges of Using Celery

There is no silver bullet in technology, so we can’t ignore the intrinsic challenges of adding a new tool to your stack.

Complexity

Setting up and managing a distributed task queue adds complexity to your application: you now operate a broker, one or more workers, and the deployment and monitoring around them.

Error Handling

Proper error handling and monitoring are essential to ensure that tasks are not failing silently. Celery supports retry logic and several error-handling mechanisms, but it’s up to you to use them correctly; a common pattern is sketched below.
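
For instance, a bound task can re-enqueue itself when it fails, using Celery’s built-in retry support; the three-attempt limit and exponential back-off below are arbitrary illustrative choices:

@celery.task(bind=True, max_retries=3)
def send_email_reliably(self, recipient):
    try:
        # Stand-in for a call to an email provider that may fail transiently
        import time
        time.sleep(5)
        return f'Email sent to {recipient}'
    except Exception as exc:
        # Wait 10s, 20s, 40s between attempts, then give up after max_retries
        raise self.retry(exc=exc, countdown=2 ** self.request.retries * 10)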

Dependency

Celery relies on an external message broker (Redis, RabbitMQ, etc.), so it adds an infrastructure dependency to your project that you must deploy and maintain.

Conclusion

Celery is an invaluable tool for any Python developer looking to handle asynchronous tasks and improve the performance of their applications. While it does introduce some complexity, the benefits of using Celery often outweigh the challenges, especially for applications that require efficient task management.

By following this guide, you should now have a basic understanding of how to set up and use Celery in your Python projects. Whether you’re sending emails, generating reports, or handling any other background tasks, Celery can help you manage these tasks efficiently and effectively.
