Flask + Celery = how to.

Stefano Frassetto
4 min read · Mar 6, 2019

Handle background tasks without pain.

TL;DR

Life’s too short to wait for long-running tasks in your requests. Flask is simple, and Celery seems just right to fit the need of having background jobs process some uploaded data, send emails or bake cakes while letting users continue their wild ride on your web app.

On the Flask side, the docs look pretty clear, and they even have an encouraging bg-tasks Celery section. Moreover, as Celery states, framework integration with external libraries is not even needed.

You start small and everything looks pretty neat: you’ve created your app instance, made a Celery app with it and written some tasks to call in your route handlers. Things are going great: your app is growing, and you’ve decided to embrace Flask’s application factory approach to gain more flexibility, but you’re not too sure how to keep Celery nice and clean inside your app. Moreover, you’ll want to isolate all your task definitions in a sub-folder, so you can import them in your views, blueprints, flask-restful Resources or anywhere else you may need them.

The problem, though, is that if you stick to the old pattern it becomes impossible to import your celery instance inside other modules, now that it lives inside your create_app() function. Working around this can get daunting, as you’re very likely to run into circular imports.

Fortunately, Flask documentation’s pretty clear on how to deal with factories and extensions:

It’s preferable to create your extensions and app factories so that the extension object does not initially get bound to the application.

What this suggests is that one should:

  1. Write a function taking both the extension and app instances to perform some desired initialization;
  2. Instantiate the extension in a separate file (e.g. “factory.py”);
  3. Make an instance of the celery app and import it in our factory module to call the initializing function implemented in the first step.

In our case this means splitting our make_celery() function into two different ones: the first creating a Celery app instance, and the second performing the operations needed to bind that exact instance to the Flask app.

Your starting point may look something like this, or any variation of it:
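Something along these lines, say (a minimal sketch of the classic pattern from the Flask docs; the make_file task, the route shape and the Redis broker URL are assumptions based on the examples later in this post):

from flask import Flask
from celery import Celery

def make_celery(app):
    # create the Celery instance and bind it to the Flask app in one shot
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            # run every task inside the Flask application context
            with app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery

app = Flask(__name__)
app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
celery = make_celery(app)

@celery.task
def make_file(fname, content):
    with open(fname, 'w') as f:
        f.write(content)

@app.route('/<fname>/<content>')
def index(fname, content):
    make_file(fname, content)
    return 'Done!'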

sensible-browser http://localhost:5000/ & FLASK_APP=app flask run

Let’s refactor it to make the celery instance accessible from other modules.

First off, let’s split our make_celery() function and create a celery app instance:
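Something like this, assuming the package is called app (the worker log further down shows tasks registered as app.tasks.make_file, so the name seems safe, though the exact layout is a guess):

# app/__init__.py
from celery import Celery

def make_celery(app_name=__name__):
    # first half of the old make_celery(): just create the instance,
    # no Flask app needed at this point
    return Celery(app_name, broker='redis://localhost:6379/0')

celery = make_celery()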

Can you see where this is heading? We’re now able to freely import our celery instance into other modules, and we have a function to initialize that instance together with our Flask app configuration, which we’ll do after moving the create_app() function to its own factory module:
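A possible factory.py, with init_celery() acting as the second half of the old make_celery() (file and function names here are assumptions):

# factory.py
from flask import Flask
from app import celery

def init_celery(celery, app):
    # second half: bind the already-created Celery instance to the Flask app
    celery.conf.update(app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            # run every task inside the Flask application context
            with app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask

def create_app(app_name='app'):
    app = Flask(app_name)
    app.config['CELERY_BROKER_URL'] = 'redis://localhost:6379/0'
    init_celery(celery, app)
    return app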

With everything in place, we can now conveniently create a Python script to run our Flask app:
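For instance (again, just a sketch):

# run.py
from factory import create_app

app = create_app()

if __name__ == '__main__':
    app.run(debug=True)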

Et voilà, we’re now free to import our celery app wherever we want, and deal with a more flexible app structure.

For example, we could create a tasks module to store our tasks:
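Sketched under the same assumed layout:

# app/tasks.py
from app import celery

@celery.task
def make_file(fname, content):
    # write the given content to a file named fname
    with open(fname, 'w') as f:
        f.write(content)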

This lets us import the created tasks in other modules too. Let’s use it in our app module:
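For instance with a blueprint (the module and blueprint names are made up; the blueprint would also be registered in create_app() with app.register_blueprint(bp)):

# app/views.py
from flask import Blueprint
from app.tasks import make_file

bp = Blueprint('main', __name__)

@bp.route('/<fname>/<content>')
def create_file(fname, content):
    make_file(fname, content)  # called directly: runs inside the request, for now
    return f'Created {fname}'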

Run python run.py, head to http://localhost:5000/foo.txt/bar and let it create your file.

Alright, we cheated a little bit here. Celery is not actually running our task yet; it’s being executed directly by the request handler instead.

Workers

To plug in a Celery worker, we first must start a broker. This is pretty easy if you have Docker installed on your system (we publish port 6379 so our app can reach Redis):

docker run --name some-redis -d -p 6379:6379 redis

First, let our task be queued by applying the .delay() method to it:
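In the view, that’s a one-line change (same hypothetical handler as above):

@bp.route('/<fname>/<content>')
def create_file(fname, content):
    make_file.delay(fname, content)  # enqueue on the broker instead of running inline
    return f'Queued {fname}'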

We’ll also need a little script to start the worker:
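A minimal sketch; the worker only needs a module exposing the bound celery instance:

# celery_worker.py
from app import celery
from factory import create_app

import app.tasks  # ensure task modules are imported so the worker registers them

flask_app = create_app()  # create_app() calls init_celery(), binding celery to the app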

We’re ready to roll:

python run.py

and from another terminal window:

celery worker -A celery_worker.celery --loglevel=info --pool=solo

Now head to http://localhost:5000/flask_celery_howto.txt/it-works!

A new file flask_celery_howto.txt will be created, but this time it will be queued and executed as a background job by Celery.

You can confirm this by looking at your worker’s output:

[2019-03-06 11:58:55,700: INFO/ForkPoolWorker-1] Task app.tasks.make_file[66accf66-a677-47cc-a3ee-c16e54b8cedf] succeeded in 0.003727149000042118s: None

I know what you’re thinking now: How can I monitor my background tasks?

Ever heard of flower?

Flower is a web based tool for monitoring and administrating Celery clusters

Setting it up is a piece of cake:

pip install flower
celery -A celery_worker.celery flower
# Visit me at http://localhost:5555
https://flower.readthedocs.io/en/latest/

Full repo for this tutorial here.

Enjoy!
