Slimming down my NodeJS service response time by 44% using worker threads

Dat Nguyen
Destination AARhus-TechBlog
7 min read · Apr 5, 2022
Response time before and after implementation of worker threads

When you look at the graph, you probably get the same feeling as when you see people’s body transformation pictures and think: “What a nice achievement!”

Do you feel inspired? In this article, I will tell you the story of how my NodeJS service lost 44% of its weight, making it much faster. The great news is that your service probably can too. Sit down and let me share its transformation journey!

How it all started

My NodeJS service had been gaining weight over the Christmas holidays. Weight, in this case, means response time: heavy parsing, indexing, and transformation of large strings and objects of up to several megabytes, where the size of the data sets was reflected linearly in the response time. So I decided it was time for my NodeJS service to start the fad diet called “NodeJS worker threads”, combined with a serious workout program.

Slimming down the response time has benefits in many situations. You want a fast response, for instance, when a user visits a website or when developers call your service endpoint! People today are in a hurry, and even the slightest delay in a request gives a poor user experience.

When should I use worker_threads?

NodeJS runs a single-threaded, event-driven concurrency model that performs non-blocking I/O operations, but from NodeJS version 12 you can use the stable built-in multithreading feature called worker_threads. That is relevant if you are like me and want to slim down the response time: one way to do it is to move your CPU-intensive tasks into worker threads. CPU-intensive tasks could be:

  1. Complex computation
  2. Parsing large strings
  3. Indexing and transforming big objects

How does it work?

The full code can be found here: https://github.com/Daterry/nodejs-workers-example

To show you the beauty of worker threads, let’s take a simple ExpressJS service that can do some pushups!

Example 1: Simple endpoint that does pushups

Our workout is purely based on pushups! Example 1 has an endpoint called /do-pushups-alone that takes a query parameter called reps. Each pushup repetition takes 1 sec.
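The embedded snippets are not reproduced in this text version, so here is a minimal sketch of what such an endpoint could look like (assuming Express; doPushups() simply busy-waits one second per rep to simulate CPU-intensive work, and everything except the route and query parameter is illustrative):

import express from 'express';

// Blocks the event loop for roughly one second per rep to simulate CPU-intensive work.
const doPushups = (reps: number): number => {
  for (let done = 0; done < reps; done++) {
    const start = Date.now();
    while (Date.now() - start < 1000) {
      // busy-wait on purpose: this is CPU-bound work
    }
  }
  return reps;
};

const app = express();

app.get('/do-pushups-alone', (req, res) => {
  console.time('total workout duration');
  const reps = Number(req.query.reps);
  console.time(`I did ${reps} reps in`);
  doPushups(reps);
  console.timeEnd(`I did ${reps} reps in`);
  console.timeEnd('total workout duration');
  res.send(`I did ${reps} pushups!`);
});

app.listen(3000, () =>
  console.log('Express is starting the workout at http://localhost:3000')
);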

Calling the endpoint: http://localhost:3000/do-pushups-alone?reps=10 logs the following:

Express is starting the workout at http://localhost:3000
I did 10 reps in: 10.001s
total workout duration: 10.001s

Let’s say you were to do 50 pushups; it would take around 50 sec! Now imagine calling in a gym buddy (let’s call him a worker_thread) so we can do the 50 pushups together. That is what happens in Example 2 below.

Try to call our endpoint in Example 2 with: http://localhost:3000/do-pushups-together?reps=50

Example 2: Endpoint that does pushups with a friend

I included some extra logging, and this is what we get when we call our endpoint:

I have done 1 reps
My gym buddy has done 1 reps
...
I have done 25 reps
I did 25 reps in: 25.177s
My gym buddy has done 25 reps
My gym buddy did 25 reps in: 25.176s
total workout duration: 25.982s

The total workout duration is cut roughly in half! I (the main thread) called in a gym buddy, split the 50 pushups into two workloads, and we did 25 each.

Let us dig a bit deeper into the code in Example 3. createWorker() creates a new Worker that takes two arguments: the first is the file path to the code you want to execute in your worker thread, and the second is an options object through which you can send data to your worker thread.

Example 3: Creating a worker thread
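A minimal sketch of such a createWorker() (the exact option shape and paths are assumptions based on the description above):

import path from 'path';
import { Worker } from 'worker_threads';

// Spins up a worker thread. The first argument is the file to run (a plain JS gateway,
// see Example 4); workerData in the second argument carries data to the worker thread,
// here the path of the TypeScript file we actually want to execute.
const createWorker = (): Worker =>
  new Worker(path.resolve(__dirname, 'workerGateway.js'), {
    workerData: { path: './pushupWorker.ts' },
  });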

However, you might wonder why my file path points at /workerGateway.js, and that’s because I use a little trick in Example 4 to make worker threads work with TypeScript. Usually, if your file path points at a TypeScript file, your worker will throw an error.

Using the workerData from createWorker(), I can send the path of the TypeScript code I want to execute in my worker thread, which in this case is pushupWorker.ts.

Example 4: Make worker threads work with Typescript
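A common way to build such a gateway (a minimal sketch, assuming ts-node is installed; the file stays plain JavaScript so Node can load it directly):

// workerGateway.js
const path = require('path');
const { workerData } = require('worker_threads');

// Register ts-node so the TypeScript worker file is compiled on the fly,
// then load the actual worker code whose path arrived via workerData.
require('ts-node').register();
require(path.resolve(__dirname, workerData.path));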

Now that our gym buddy is all set, he knows that we will do pushups, but he doesn’t know how many. Backtracking to our endpoint in Example 5, we want to split the 50 pushups into two working sets so that I (the main thread) can do 25 and tell my gym buddy to do the other 25. To tell my gym buddy, we will use sendPushupsToWorker().

Example 5: Splitting the total reps into two sets and sharing them between the main and worker thread
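Building on the sketches above (createWorker(), doPushups(), and the sendPushupsToWorker() shown in the next example), the endpoint handler could look roughly like this; the exact splitting logic is an assumption:

app.get('/do-pushups-together', async (req, res) => {
  console.time('total workout duration');
  const reps = Number(req.query.reps);

  const worker = createWorker();

  // The gym buddy (worker thread) starts on his half first...
  const buddyPromise = sendPushupsToWorker(worker, Math.floor(reps / 2));
  // ...while I (the main thread) grind through my half at the same time.
  const myReps = doPushups(Math.ceil(reps / 2));

  const buddyReps = await buddyPromise;
  console.timeEnd('total workout duration');
  res.send(`Together we did ${myReps + buddyReps} pushups!`);
});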

sendPushupsToWorker() in Example 6 communicates how many reps of pushups our worker thread should do. It uses worker.postMessage(), which is the main communication channel between the main and worker thread; sending a message through this channel emits a message event that both the main and worker thread can listen for.

Example 6: Communication from and to the worker thread
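A minimal sketch of such a helper, wrapping the message round trip in a promise (the message shape is an assumption):

import { Worker } from 'worker_threads';

// Tells the worker thread how many reps to do and resolves once it reports back.
const sendPushupsToWorker = (worker: Worker, reps: number): Promise<number> =>
  new Promise((resolve, reject) => {
    // The worker answers on the same channel with the number of reps it completed.
    worker.once('message', (repsDone: number) => resolve(repsDone));
    worker.once('error', reject);
    worker.postMessage({ reps });
  });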

Our worker thread is now created and set up with pushupWorker.ts. In Example 7, the worker thread uses the parentPort to listen for messages from the main thread. Everything that happens within parentPort.on() is executed on the worker thread, and this is where we want to execute one of our working sets with 25 reps of pushups. When done, we will message back to the main thread by parentPort.postMessage() to resolve the promise from sendPushupsToWorker().

Example 7: Communication from and to the main thread
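A minimal sketch of pushupWorker.ts along those lines (the import path for doPushups() is illustrative):

import { parentPort } from 'worker_threads';
import { doPushups } from './doPushups'; // the same blocking helper as on the main thread

// Everything inside this listener runs on the worker thread.
parentPort?.on('message', ({ reps }: { reps: number }) => {
  console.time(`My gym buddy did ${reps} reps in`);
  const repsDone = doPushups(reps);
  console.timeEnd(`My gym buddy did ${reps} reps in`);
  // Report back so the main thread can resolve the promise from sendPushupsToWorker().
  parentPort?.postMessage(repsDone);
});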

We were able to do 50 pushups. You might wonder why the total workout duration was 25.982s while my gym buddy and I each did our 25 repetitions in about 25.177s. If you think about it, what do you need to do before you can start your workout? Yes, you have to get changed. That is the equivalent of a worker thread’s start-up time, which counts towards the total workout duration. But hey, my gym buddy can get changed faster than he can do one pushup.

Should you apply this technique to your existing NodeJS service?

You might already have an existing NodeJS service and want to apply worker threads to it. The easiest way to get started is to get an overview of which part of your code is CPU-intensive and measure the execution time.

So how do we identify which part of the code is “too heavy” in the service? Firstly, pick a request endpoint with a high response time. Secondly, follow the request pipeline and set up start and end marks from NodeJS perf_hooks to measure different segments of your code. Thirdly, trigger the request pipeline with data sets of different sizes to see the impact on the response time.
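A minimal sketch of such a measurement with perf_hooks (parseLargePayload() and the payload are placeholders for your own suspected hot spot):

import { performance, PerformanceObserver } from 'perf_hooks';

// Print every measurement as it is recorded.
const obs = new PerformanceObserver((list) => {
  list.getEntries().forEach((entry) =>
    console.log(`${entry.name}: ${entry.duration.toFixed(1)} ms`)
  );
});
obs.observe({ entryTypes: ['measure'] });

// Placeholder for the segment you suspect is CPU-intensive.
const parseLargePayload = (payload: string) => JSON.parse(payload);

performance.mark('parse-start');
parseLargePayload(JSON.stringify({ reps: 50 }));
performance.mark('parse-end');
performance.measure('parse large payload', 'parse-start', 'parse-end');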

Now that you have an overview of the CPU-intensive code that slows down your response time, you should assess whether worker threads are the way to go or whether a refactor of the code can improve the response time. This is difficult to answer, but you can ask yourself: can this particular CPU-intensive code run asynchronously while other code executes? Or does the program depend solely on this code finishing before it can continue its execution? If the answer is yes to the first question and no to the second, you should consider implementing worker threads.

You are adopting the fad diet into your NodeJS service

Once you have decided to implement worker threads, the techniques below can simplify the implementation:

  • Extract your CPU-intensive code for decoupling: This makes it easier to manage what code is being executed in a worker thread, and it is similar to what we did with our doPushups() task.
  • Multithreading strategies: Executing the extracted and decoupled CPU-intensive code can be done in parallel, concurrently, or in a combination, depending on your data processing pipeline. In our example, we used parallel execution: we split our data set (the total pushups) and ran the same task in parallel to improve the processing time. Concurrent execution runs two different tasks on your data set simultaneously, for instance if I did doPushups() and my gym buddy did doPullups(). A combination of these two strategies will also work if your data processing pipeline allows it.
  • Use of a worker pool to manage and reuse your workers: When using more than one worker, we want to avoid the overhead of creating workers on every request. A great library for handling this is https://www.npmjs.com/package/node-worker-threads-pool (see the sketch after this list).
  • Setup workers: You saw earlier that it takes some time for a worker to get ready after creation. The start-up time of many workers would likely defeat the purpose of implementing workers in the first place. To circumvent this disadvantage, you could start the worker threads on application start-up and populate your worker pool with them. When you need a worker to execute a task, they are all changed and ready to fulfill their duty immediately. However, this will prolong the application’s start-up time, and you should assess whether that is feasible for your solution.
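Putting the last two points together, here is a rough sketch of a pre-started pool using the node-worker-threads-pool library mentioned above (the pool size, file path, and message shape are assumptions, so check the library’s documentation for the exact API):

import { StaticPool } from 'node-worker-threads-pool';

// Created once at application start-up, so the workers are already "changed" when requests arrive.
const pool = new StaticPool({
  size: 4,                    // number of pre-started workers
  task: './pushupWorker.js',  // compiled worker file; the path is illustrative
});

// Somewhere in a request handler: borrow a worker, run the task, and let the pool reuse it afterwards.
const doPushupsInPool = async (reps: number): Promise<number> => pool.exec({ reps });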

Conclusion

A transformation is always hard to start. What is the best diet and workout program? In this article, I took you through a pushup workout and a diet suggestion that might be a good fit for the anatomy of your NodeJS service. We only did pushups, but we managed to cut the time in half with our “gym buddy”, which in reality was our inner will to lose weight. Jokes aside, there are many techniques to improve the response time of a NodeJS service, and equally many ways to lose weight. Depending on the performance issue of your specific service, my transformation journey might solve it or inspire you to a more feasible solution. That is why I want you to assess and think about your code and whether NodeJS worker threads are the best “diet”.

This diet was surely a great fit for my NodeJS service and I hope you got inspired by the journey!

👋 Hey there! I’m Dat Nguyen, and I am a developer with a passion for lifting. I work in a team that creates a cool headless websites platform for BankData’s customers, and when I do awesome stuff, I write about it. You can contact me on LinkedIn if you have any questions, comments, or anything in between!

