Non-blocking, event-driven model of Node JS explained using real-world analogies

Swagnik Dutta · Published in The Startup · Aug 29, 2019 · 5 min read

When I started with Node JS, most of the definitions I came across used keywords like asynchronous, non-blocking, event-driven, and single-threaded, which made little sense to me back then.

I was curious to know why there was this sudden hype over Node JS, or what exactly it had brought to the table that had been missing all this time.

Googling the above keywords didn't quite answer my question. The keyword definitions spoke of different systems, paradigms, and architectural differences; what I was trying to understand, however, was how they all fit together to make Node JS what it is.

In order to get the complete picture, I felt it was essential to understand the problem Node JS is focused on solving. Taking that as our starting point, let's first try to understand the motivation. Once we're done understanding the why, we'll look into the how.

The problem

This can be best understood by mimicking a real-world system.

Imagine you're in a cafe and there is a long queue of people waiting to place an order. However, the cashier who accepts your order is the only person working in the cafe. So he takes your order and your money, and then proceeds to prepare it in the kitchen. Once your order has been prepared, he serves it to you and proceeds to accept the next order.

Note that while he was preparing your order, he was unable to take the order of the next person in line. You were blocking the cashier from attending to any other customer. As the line gets longer, the delay increases, resulting in many frustrated customers.

The above is an example of a blocking system. One might think of adding more cashiers; however, that would require additional resources (salaries for the new cashiers and more space). Needless to say, this solution is far from scalable, since at any time the number of people in the queue can be far greater than the number of cashiers.

We can draw a parallel from the cafe model above to explain the working of a thread-based model. In a traditional thread-based model, when the server receives a connection, it holds the connection open until it has performed the request, which can be anything from a page request to a costly transaction like writing something to a database. For however long it takes to perform the request, the web server stays blocked on that I/O operation. To scale this type of server, additional copies of the server are required, and each copy needs a separate operating system thread.
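To make "blocked on that I/O operation" concrete, here is a minimal sketch of my own (not taken from any particular framework) of a request handler that reads a file synchronously; the file name data.txt is made up. While the synchronous read runs, the thread serving this request can do nothing else, just like the lone cashier disappearing into the kitchen:

```typescript
// A blocking request handler: the synchronous read holds the thread
// hostage until the whole file has been read from disk.
import { readFileSync } from "node:fs";
import { createServer } from "node:http";

const server = createServer((req, res) => {
  // Nothing else can run on this thread until readFileSync returns.
  const data = readFileSync("data.txt", "utf8"); // hypothetical file
  res.end(data);
});

server.listen(3000);
```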

Thus multi-threaded systems were born out of the necessity to scale these blocking systems. Here, every client request is managed by a separate thread. These systems have been running fine for years; however, they are resource-intensive: as client requests increase, more threads are created, and threads require memory and other resources.

Referring to the cafe model, imagine what would happen if a new cashier were hired for every new order. The cafe would shut down in no time after incurring heavy losses.

How Node JS solves it

Let's see what we can do to help our angry, frustrated customers who are still standing in line.

We make a change in our coffee shop model by bringing in a chef whose one and only task is to prepare the orders. In the new model, the cashier accepts your order and assigns the responsibility of preparing it to the chef. Meanwhile, you're given a token number and requested to wait while your order is being prepared.

Note that this time you're not blocking the next person in line from placing their order.

When your order is ready, the chef calls out your token number. You collect your order and enjoy your food while the chef proceeds to prepare the next order.

The customers are now happier and less frustrated since they no longer need to wait in the line just to place an order. Once the order is placed, they can take a seat, relax and probably check their Instagram stories.

What Node JS has done is implement a non-blocking system like the one above in its single-threaded, event-driven model. At the heart of this model lies a component known as the event loop, which continuously listens for incoming requests while running indefinitely on a single thread. It is never blocked. When a request comes in, the event loop assigns it to another entity (a thread from the internal thread pool) to perform the actual operation. On completion of the operation, the internal thread sends the response back to the event loop, which in turn sends the response back to the client via callbacks.

In our coffee shop, the callback is the part where the chef calls out your token number and you go and collect your order.
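In code, the token number is a callback you hand over when placing the order. Below is a minimal sketch of my own (the file name menu.txt is hypothetical) using Node's asynchronous fs.readFile: the call returns immediately, and the callback runs later, once the event loop is notified that the read has finished.

```typescript
// Placing an order without blocking: readFile hands the work off and
// returns immediately; the callback is the "token number" being called.
import { readFile } from "node:fs";

console.log("Placing the order...");

readFile("menu.txt", "utf8", (err, data) => {
  // Runs later, once the read has finished (the chef calls your number).
  if (err) {
    console.error("Order failed:", err.message);
    return;
  }
  console.log("Order ready:", data);
});

console.log("Next customer, please!"); // printed before the file is read
```

Running this prints "Next customer, please!" before "Order ready", because the read is delegated and the single thread moves straight on to the next task.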

In this model, the web server does not have to wait for the completion of read or write operations of previous requests. Its only task is to keep listening for incoming requests, which makes the model highly efficient and scalable.
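As a rough sketch of that listening loop (again my own example, with the same hypothetical data.txt), the blocking handler from earlier becomes non-blocking once it delegates the read and returns right away; the event loop is then free to accept the next connection while the file is being read:

```typescript
// The same server as before, now non-blocking: the handler delegates the
// read and returns immediately, so the event loop keeps accepting requests.
import { readFile } from "node:fs";
import { createServer } from "node:http";

const server = createServer((req, res) => {
  readFile("data.txt", "utf8", (err, data) => {
    if (err) {
      res.statusCode = 500;
      res.end("Could not prepare your order");
      return;
    }
    res.end(data); // the "chef" serves the result when it is ready
  });
});

server.listen(3000, () => console.log("Cafe open on port 3000"));
```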

If we want to speed things up in our cafe, we can add more chefs in the kitchen, but adding another cashier is not needed since the current cashier is no longer overburdened with work.

Conclusion

In this article I've focused solely on drawing parallels with real-world systems to explain the non-blocking behaviour of Node JS. Further details regarding thread assignment and management are out of scope for this article, but they can be found in the second source listed under references.

References

  1. Azat Mardan, "You Don't Know Node". https://node.university/p/you-dont-know-node
  2. Rambabu Posa, "Node JS Architecture: Single Threaded Event Loop". https://www.journaldev.com/7462/node-js-architecture-single-threaded-event-loop
