Celery and RabbitMQ in Django, and Monitoring with Flower: Just a Couple of Steps to Get Async Working
In my six years of coding experience, Django is without a doubt the best framework I have ever worked with. In this post we will build a simple Django application that runs async tasks in the background using Celery and RabbitMQ.
What do we need?
Python 3.6.8, RabbitMQ 3.6.10
Django==2.2.3, flower==0.9.3, celery==4.3.0
Install RabbitMQ:
apt-get install rabbitmq-server
The RabbitMQ service starts automatically upon installation. If you need to start the server manually, run the following command on the command line:
sudo service rabbitmq-server start
After installing RabbitMQ, install the Python packages with pip:
pip install Django
pip install celery
Then create a Django project with a simple app. We create a djangocelery project containing an app named app; you can clone the djangocelery project from my GitHub.
djangocelery/
|-- app/
|   |-- migrations/
|   |-- __init__.py
|   |-- admin.py
|   |-- apps.py
|   |-- models.py
|   |-- tests.py
|   +-- views.py
|-- djangocelery/
|   |-- __init__.py
|   |-- settings.py
|   |-- urls.py
|   +-- wsgi.py
|-- manage.py
+-- requirements.txt
First, add the CELERY_BROKER_URL configuration to the settings.py file:
CELERY_BROKER_URL = 'amqp://localhost'
Add Celery to your Django Project
Create a file named celery.py adjacent to your `settings.py` file:
|-- djangocelery/
|   |-- __init__.py
|   |-- celery.py
|   |-- settings.py
|   |-- urls.py
|   +-- wsgi.py
This file will contain the Celery configuration for our project. Add the following code to the `celery.py` file:
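The embedded snippet is missing here; this is the standard Celery application module for a Django project named djangocelery, following Celery's "First steps with Django" pattern:

```python
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djangocelery.settings')

app = Celery('djangocelery')

# Read all CELERY_-prefixed settings (like CELERY_BROKER_URL) from settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in all applications listed in INSTALLED_APPS.
app.autodiscover_tasks()
```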
The code above creates a Celery application instance for our project. The last line instructs Celery to auto-discover asynchronous tasks in all of the applications listed under `INSTALLED_APPS`.
Celery will look for definitions of asynchronous tasks within a file named `tasks.py` file in each of the application directories.
Edit the __init__.py
Now edit the __init__.py file of the project package, at this path:
|-- djangocelery/| |-- __init__.py
Open it, copy the following code into the file, and save it:
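The snippet itself is missing here; this is the standard __init__.py content from Celery's Django integration guide:

```python
# Import the Celery app when Django starts so that @shared_task
# decorators across the project can find it.
from .celery import app as celery_app

__all__ = ('celery_app',)
```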
This makes sure our Celery app is imported every time Django starts.
Creating First Celery Task
We can create a file named tasks.py inside a Django app and put all our Celery tasks into this file. The Celery app we created in the project package will collect all tasks defined across the Django apps listed in the INSTALLED_APPS configuration.
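The task code is not shown above; here is a minimal sketch consistent with this walkthrough. The name `celery_task` and the roughly 30-second runtime come from the article; the sleep body and return string are illustrative:

```python
# app/tasks.py
from time import sleep

from celery import shared_task


@shared_task
def celery_task():
    # Simulate a long-running job (the article's task runs for about 30 seconds).
    sleep(30)
    return 'celery task finished successfully'
```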
Instead of calling celery_task directly, I'm calling celery_task.delay(). This way we instruct Celery to execute the function in the background; Django keeps processing the view celery_view and returns a response to the user immediately.
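The view code is also missing above; a sketch of what such a view might look like, assuming a plain-text response body (the names `celery_view`, `celery_task`, and the /celerytask/ route are taken from the article):

```python
# app/views.py
from django.http import HttpResponse

from .tasks import celery_task


def celery_view(request):
    # Enqueue the task in RabbitMQ; Django returns without waiting for it.
    celery_task.delay()
    return HttpResponse('Celery task submitted.')
```

To match the URL used later in the article, the project's urls.py would map `path('celerytask/', celery_view)` to this view.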
But before you try it, check the next section to learn how to start the Celery worker process.
Starting The Worker Process
Open a new terminal tab on the project path, and run the following command:
celery -A djangocelery worker -l info
The result is something like this:
Then run the Django project and open http://127.0.0.1:8000/celerytask/. The page loads and sends the task to Celery; if we check the Celery worker process again after a few seconds, we can see that it received the task:
After 30 seconds, the task function finishes and returns a success string:
Monitoring Celery Workers
There is a handy web-based tool called Flower that can be used for monitoring and administering Celery clusters. Flower provides detailed statistics of task progress and history, and it also shows other task details such as the arguments passed, start time, and runtime.
pip install flower
Once installed, open a new command line in the project directory and start the worker:
celery -A djangocelery worker -l info
Then open another command line in the project path and run this command:
celery -A djangocelery flower
The details can then be viewed by visiting http://localhost:5555/dashboard in your browser:
In addition, you can delete all pending tasks with:
celery -A djangocelery purge
That's it. Done.