Amit Kumar
Jul 14 · 5 min read

Week 5 | GSoC’19 | CCExtractor Development

Image Source: https://ccextractor.org

Concurrency, Parallelism, Threads, Processes, Async and Sync 🤔

Plant trees, it will surely help nature to survive!
Message: Plant trees, source: www.earth.com

Video processing takes a long time for even a single video, which means that when a user sends a request to process one, the client has to wait for the response. The problem arises when the response time exceeds the connection timeout. You also don’t want your users handcuffed to a progress bar that ends in a failed client connection. So it’s better to ask the user to grab a coffee and come back in a few minutes, and to solve this issue while they take a sip. Since Django is not built to work asynchronously, you have to find a solution for this yourself. The first picture that comes to mind is Celery, the long-time champion for executing tasks asynchronously. You can think of async tasks as the classical producer/consumer problem: the producer places a job in a queue, and the consumer checks the head of the queue for waiting jobs, picks one, and executes it.
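The producer/consumer idea above can be sketched with just the standard library, before any Celery is involved. The names here (process_video, consumer) are illustrative stand-ins, not part of the real project:

```python
# Minimal producer/consumer sketch: the producer enqueues jobs and
# returns immediately; a worker thread drains the queue in FIFO order.
import queue
import threading

jobs = queue.Queue()
results = []

def process_video(job):
    # Stand-in for the expensive video-processing step.
    return "processed " + job

def consumer():
    while True:
        job = jobs.get()      # blocks until a job is available
        if job is None:       # sentinel value: stop the worker
            break
        results.append(process_video(job))

worker = threading.Thread(target=consumer)
worker.start()

# Producer side: enqueue jobs and carry on without waiting.
for name in ["clip1.mp4", "clip2.mp4"]:
    jobs.put(name)

jobs.put(None)  # tell the worker to exit
worker.join()
print(results)
```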

When you come across Celery you will notice the term broker; conceptually, a broker is simply a queue. RabbitMQ, Redis and SQS are a few examples. Since our user is short-tempered, has zero patience, and has almost finished his first cup of coffee, we can’t wait any longer to add Celery to our Django app. Reference links with more detailed information are provided at the end of the post. I followed Celery’s documentation to integrate it with Django and chose Redis as the broker. I was happy until I discovered that everything worked well except the main purpose, i.e. the inference part. The Celery community on IRC is not very active, and even after asking for help almost everywhere I was unable to find the real reason behind the issue. Then I checked the Celery FAQ, where I found this entry:
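For context, wiring Celery into a Django project with a Redis broker looks roughly like this (a sketch following the official Celery–Django guide; the project name Rekognition matches the worker command shown below, but the task body and paths are placeholders):

```python
# Rekognition/celery.py -- Celery app bootstrap for the Django project
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "Rekognition.settings")

app = Celery("Rekognition", broker="redis://localhost:6379/0")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()  # find tasks.py modules in installed apps

# somewhere in an app's tasks.py
@app.task
def process_video(path):
    # the long-running inference would go here
    return "done: " + path

# from a view: enqueue the job and return to the user immediately
# process_video.delay("/media/uploads/video.mp4")
```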

Why is Task.delay/apply*/the worker just hanging?

Answer: There’s a bug in some AMQP clients that’ll make it hang if it’s not able to authenticate the current user, the password doesn’t match or the user doesn’t have access to the virtual host specified. Be sure to check your broker logs (for RabbitMQ that’s /var/log/rabbitmq/rabbit.log on most systems), it usually contains a message describing the reason

My approach was to segment a video into n chunks and then run inference on them in parallel with multiprocessing, leveraging all available CPU cores. But except for the inference, every other function call worked just fine. Why was that?

Here is the reason:

Python multiprocessing cannot be used to pass a TensorFlow session into a multiprocessing pool, because the session is not serializable, and it may also be managing GPU memory and other process-local state.

But before I knew this fact, I kept trying to make Celery work. Finally, this command did the trick:

celery -A Rekognition worker -l info --without-gossip --without-mingle --without-heartbeat -O fair --pool=solo

The most important argument here is --pool=solo. Let me briefly explain Celery execution pools.

Celery does not process tasks by itself; it spawns child processes (or threads) to do so, and these are known as the execution pool. Celery supports four types of execution pools, of which I will discuss prefork and solo in particular. The prefork pool implementation is based on Python’s multiprocessing package: it creates child processes to run the tasks, and this is exactly where my TensorFlow model was hanging. The solo pool, on the other hand, is neither thread-based nor process-based; it runs inline inside the worker process itself, which makes it faster. That alone does not solve the problem, though, since a solo worker executes tasks sequentially. One way to exploit it is to start multiple workers, each with a solo pool. Experimenting, I found that this reduces processing time compared with both a single pool and passing the whole video instead of n chunks. Multiple solo pools do come with high memory usage, so this part of the project is still experimental.
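Starting several solo-pool workers can be done with celery multi, which forwards worker options to each instance. A sketch, reusing the flags from the command above (the worker names w1–w4 are arbitrary):

```shell
# Start four independent solo-pool workers, roughly one per CPU core,
# so video chunks can still be processed in parallel even though each
# worker runs its tasks inline.
celery multi start w1 w2 w3 w4 -A Rekognition -l info --pool=solo \
    --without-gossip --without-mingle --without-heartbeat
```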

Then something struck me: how is the Tornado web framework’s asynchronous system designed?

What I found is that almost all asynchronous Python frameworks use Python’s standard library asyncio underneath. So the 😃wow moment had arrived. But before jumping in, let me tell you a bit more about async in Python. Asynchronous programming was introduced into Python in version 3.4. It arrived quite late, in part because of a fundamental Python concept, the Global Interpreter Lock (GIL). In simple words, the GIL is a mutex (or lock) that allows only one thread at a time to hold control of the Python interpreter.

“The event loop is the core of every asyncio application. Event loops run asynchronous tasks and callbacks, perform network IO operations, and run subprocesses.”

You can add tasks to this loop, and it executes them in FIFO fashion.

How to implement this event loop?

import asyncio

# To define an async function (a coroutine):
async def tasks():
    await asyncio.sleep(0)  # placeholder for real async work

loop = asyncio.get_event_loop()
loop.run_until_complete(tasks())

So I implemented one thread for each async method and finally solved a small yet significant issue in the asynchronous execution of the video-processing part. Now the user is no longer handcuffed: he or she can start processing a video and leave the desk to have another cup of coffee, or queue up another video.
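The thread-plus-event-loop idea can be sketched like this: run an asyncio loop in a daemon thread, and hand coroutines to it with run_coroutine_threadsafe so the request handler returns immediately. The names process_video and submit are illustrative, not the project’s real API:

```python
# Run an event loop in a background thread; the caller schedules work
# on it and gets a concurrent.futures.Future back right away.
import asyncio
import threading

loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def process_video(path):
    await asyncio.sleep(0.1)  # stand-in for the real processing
    return "done: " + path

def submit(path):
    # Thread-safe handoff into the loop running in the other thread.
    return asyncio.run_coroutine_threadsafe(process_video(path), loop)

fut = submit("video.mp4")  # returns immediately, processing continues
print(fut.result())        # blocks only here, for demonstration
```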

Thank you for reading :)

LinkedIn GitHub Twitter

References:

https://realpython.com/

https://docs.python.org/3/library/

Amit Kumar

Written by

GSoC’19 || Software Engineer
