Understanding “some” System Design Concepts And Implementing a Celery-Based Python Application

Akash Srivastava
Camping with python
7 min read · Jun 4, 2020

Hey, hope you are doing well.

Working and managing things in parallel and asynchronously is quite a necessity nowadays. Consumers want speed and everything in real time, so companies run thousands of servers for the single purpose of providing a better customer experience. This very demand creates many engineering challenges, which form a major part of system design.

So, if you have an engineering mindset and want to learn more about modern-day computing and some system design concepts, then let’s get started.

Disclaimer: I am sharing my own experience in backend development. I may be wrong at some points; if so, please let me know.

Problem

If you are a beginner in Django or Node.js or any other server-based library/framework, you have probably been running your application on localhost. Okay, but let us assume you have already deployed your code to a server.

Scenario: You have developed a website/app using an advanced photo-sharing technology. Your website/app is becoming a big hit, and the audience is rolling over from Instagram to your platform.

Yay!!! You have achieved your goal. Well, not quite yet: requests to the server are growing exponentially, and the server is processing them at a load average above 1.0. This creates a bottleneck. The server is unable to process the huge volume of requests, causing slower response times and unsatisfactory results.

So, what can you do now? The first thing on your mind may be “SCALING UP”, but of which type:

  1. Horizontal Scaling
  2. Vertical Scaling

Horizontal Scaling

Horizontal scaling simply means having more workers do the job rather than a single worker: that is, running more servers with the same code, all connected to the database system.

We will cover the database server based scaling in upcoming articles.

Vertical Scaling

Vertical scaling simply means making your worker bigger and stronger: that is, upgrading the current server with higher computing power and more RAM. The best-known example is Stackoverflow.com.

Now, which one will you choose?

I think if I were in your position, I would go for the horizontal option (but with slightly bigger servers) and scale my servers based on traffic and demographics.
For example, I would place my main servers in the USA and India because I am getting more traffic from these countries.

Now, how do we assign requests to the servers, since we have multiple of them? This role of assigning requests is played by the load balancer.

Load Balancer

As the name suggests, the load balancer balances the load on the servers by managing the requests sent to them. The load balancer allocates requests using a technique called consistent hashing.
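
As a rough sketch of the idea (not how any particular load balancer actually implements it), consistent hashing places servers and request keys on the same hash ring and routes each request clockwise to the nearest server point. The server names and replica count below are made up for illustration:

```python
import bisect
import hashlib

def _point(key: str) -> int:
    # Map any string to a position on the hash ring.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class ConsistentHashRing:
    """Toy consistent-hash ring with virtual nodes per server."""

    def __init__(self, servers, replicas=100):
        self.replicas = replicas
        self._ring = []    # sorted hash positions
        self._owner = {}   # hash position -> server name
        for server in servers:
            self.add(server)

    def add(self, server):
        # Each server gets `replicas` points so the load spreads evenly.
        for i in range(self.replicas):
            p = _point(f"{server}#{i}")
            bisect.insort(self._ring, p)
            self._owner[p] = server

    def route(self, request_key):
        # Walk clockwise to the first server point at or after the key.
        idx = bisect.bisect(self._ring, _point(request_key)) % len(self._ring)
        return self._owner[self._ring[idx]]

ring = ConsistentHashRing(["us-east-1", "india-1", "india-2"])
print(ring.route("user-42"))  # the same key always routes to the same server
```

The point of the ring (versus plain `hash(key) % n`) is that adding or removing a server only remaps the keys adjacent to its points, not almost every key.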

Reference: https://www.youtube.com/watch?v=K0Ta65OqQkY (highly recommended)

Hurray! Now your server-side system is ready to serve. You have launched your application to the world, and you are currently serving thousands of requests.

New Scenario: Release 1.0 was a success, and you have hired developers for further development of the project.
For release 2.0, multiple subroutines and parallel asynchronous processes are being developed: for example, email messaging, a notification system, model training, etc.

Now, the question is: should these subroutines live on the main request-handling servers, or should we run them as microservices on separate servers, serving only when required?

Again, if I were in your position, I would go for a microservices-based system, because it makes the system distributed, asynchronous, and faster. Even if a developer pushes a bad commit to a microservice codebase and causes a production error, the main system will still be fine.

Distributed Systems

By having multiple servers, we have already covered some aspects of distributed systems. But in the bigger picture, a distributed system is a system constructed from many different microservices.

How does it work?

So, imagine you have two different codebases on two different servers, and you want to trigger a function on server number 2 without importing it.

How can you do that? There are multiple ways, such as using an API call or simply broadcasting a message to the servers to start a particular process.

In API-based systems we need API endpoints associated with each server, which are quite simple and elegant to build. Node.js is popularly known for this.

We can also use a type of middleware called a messaging queue (message broker). The servers subscribe to the messaging queue, and using this technique we can send a message to start a process on another server.

Messaging Queue

A messaging queue is an FCFS (First Come First Serve) queue system that bridges the communication gap between servers in different places. It also works like broadcasting a message, so any free server listening for messages can take up the job.

Some popular messaging queues: RabbitMQ, Redis, Amazon SQS

Process:

  1. The main server pushes the request onto the messaging queue.
  2. The messaging queue accepts the request and stores it, like appending data to a queue data structure.
  3. When a microservice server takes up the job and completes it successfully, the messaging queue removes the entry with a success message.
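
The three steps above can be sketched with a toy in-memory FCFS queue; real brokers like RabbitMQ add persistence, acknowledgements, and networking on top of this idea. The class and message shape here are invented purely for illustration:

```python
from collections import deque

class MessageQueue:
    """Toy in-memory FCFS queue illustrating the three steps above."""

    def __init__(self):
        self._jobs = deque()

    def publish(self, message):
        # Steps 1-2: the main server pushes a request, and the queue stores it.
        self._jobs.append(message)

    def reserve(self):
        # Step 3 (first half): a free worker peeks at the oldest job.
        return self._jobs[0] if self._jobs else None

    def ack(self):
        # Step 3 (second half): on success, the job is removed from the queue.
        return self._jobs.popleft()

mq = MessageQueue()
mq.publish({"task": "send_email", "to": "user@example.com"})
job = mq.reserve()   # a worker picks up the job
mq.ack()             # the worker reports success; the job leaves the queue
```
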
Distributed Systems of some sort

And now you can say your backend system is complete and ready to serve.


Celery

“Celery communicates via messages, usually using a broker to mediate between clients and workers. To initiate a task the client adds a message to the queue, the broker then delivers that message to a worker.

A Celery system can consist of multiple workers and brokers, giving way to high availability and horizontal scaling.

Celery is written in Python, but the protocol can be implemented in any language. In addition to Python, there’s node-celery and node-celery-ts for Node.js, and a PHP client.

Language interoperability can also be achieved exposing an HTTP endpoint and having a task that requests it (webhooks).”

This section is copy-pasted from the Celery documentation :)

Python-Based Distributed Systems Using Celery

Now you know the concepts behind messaging queues, servers, and microservices. Let us build an example application in which one Python script communicates with a RabbitMQ server and starts a process defined in another Python script.

Idea

Python Script 1: It takes a value from the user and calls the function (present in the other script) with the value as an argument, using the RabbitMQ server.

Python Script 2: When it picks up the task from the RabbitMQ server, it starts the process and prints the argument 5 times.

I know it’s a very simple program, but the main takeaway is the ideology applied. We will develop a more advanced Python app using Django and Celery in the future.

Let's deploy a Messaging Queue

We will be using a RabbitMQ server on Heroku, so let's deploy one.

Step 1: Create a Heroku App


Step 2: Find and install the RabbitMQ server add-on


Step 3: Click the link


Step 4: Copy the link


This link will be used in the Python scripts to connect to the message broker.

Let’s Setup

Dependencies:

celery==3.1.0
(stable on Windows as well; the latest versions do not support Windows)

Let’s code

Python Script 2: “The Hidden”

celeryScript2.py

In this script, we first initialize a Celery object using the broker link. Then we write the method to be executed on call, wrapped in the “task” decorator with the name “printSentence”. This name can be used to call the task from the other side.

To check whether the system is working, start the Celery worker using the following command, and you are most likely to see this output.

celery -A celeryScript2 worker -l info

Python Script 1: “The Front”

celeryScript1.py

Here we initialize the Celery object using the broker link and then take input from the user. After that, we broadcast the task to the messaging queue with the task’s name and arguments.

Results

Now execute the scripts. If everything works perfectly, you will see the output in the Celery worker’s command window.

Hence, we have successfully implemented a distributed system using Celery and Python.

I hope you find it interesting enough to excite your engineering instincts.

I hope this tutorial will help you a lot in your future projects. Share this article with your colleagues and friends.

Thanks for being here with me till the end. Please comment below about the article, your comments mean a lot to me.

You can follow me on Twitter, GitHub, Medium, LinkedIn.

Don’t forget to follow Camping with python.

If you have any doubts regarding this tutorial or have any other issue or you want to suggest something, you can comment below.

The next article will be up soon until then keep practising.
