Effective limited parallel execution in JavaScript

Let’s say you need to save 10,000 books into a books service. The service’s API only allows us to save 100 books per request. My first idea would be to split the books into chunks of 100 and fire all the requests in parallel.
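A minimal sketch of that idea (here `saveBatch` is a hypothetical stand-in for the real API call to the books service):

```javascript
// Split an array into chunks of at most `size` items.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Fire every batch at once. `saveBatch` stands in for the real API call,
// e.g. a POST to the books service with up to 100 books in the body.
async function saveAllAtOnce(books, saveBatch) {
  // 10,000 books → 100 requests, all in flight at the same time.
  await Promise.all(chunk(books, 100).map((batch) => saveBatch(batch)));
}
```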

The downside is that this creates a spike in the load on the books service. The service will either struggle to process such a spike or be over-provisioned to handle loads like that, and over-provisioning means paying for computing power that sits idle most of the time.

So we want to flatten the spike a bit. To do that we will use what’s called limited parallel execution. Basically, we will send the batches of books in parallel, but no more than 10 at a time.
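One way to sketch this (the `chunk` helper splits the array; `saveBatch` is again a hypothetical stand-in for the API call) is to send the chunks in waves of 10 with `Promise.all`:

```javascript
// Split an array into chunks of at most `size` items.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Send the batches in waves: at most `concurrency` requests in flight.
async function saveInWaves(books, saveBatch, { chunkSize = 100, concurrency = 10 } = {}) {
  const batches = chunk(books, chunkSize);
  for (let i = 0; i < batches.length; i += concurrency) {
    // Each wave waits for its slowest request before the next wave starts.
    await Promise.all(batches.slice(i, i + concurrency).map((batch) => saveBatch(batch)));
  }
}
```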

You probably already noticed that this approach has one issue: on each iteration, we have to wait for the slowest request in the batch to complete before the next batch can start.


Let’s solve that as well with a promise pool. A promise pool limits the maximum number of functions running in parallel and starts the next one as soon as any promise settles. I’m going to use the @supercharge/promise-pool package, but there are many alternatives.
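With the package, the call looks roughly like `PromisePool.withConcurrency(10).for(batches).process(saveBatch)`. To show what the pool actually does under the hood, here is a minimal hand-rolled sketch of the same idea:

```javascript
// Minimal promise pool: runs `worker` over `items` with at most `limit`
// tasks in flight. A new task starts as soon as any running one settles,
// so we never wait for the slowest task in a fixed wave.
async function promisePool(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;

  // Each runner pulls the next item as soon as it finishes the current one.
  // JavaScript is single-threaded, so `next++` between awaits is race-free.
  async function runner() {
    while (next < items.length) {
      const index = next++;
      results[index] = await worker(items[index]);
    }
  }

  // Start `limit` runners sharing the same queue of items.
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, runner));
  return results;
}
```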

You should add proper error handling, though :)
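For instance (a sketch, with a hypothetical `saveBatch`), you could wrap each save so that one failed batch doesn’t abort the whole run, and collect the failures for a retry pass:

```javascript
// Wrap a single batch save: never throws, returns a tagged result instead.
// Failed batches can be collected and retried after the first pass.
async function saveBatchSafely(saveBatch, batch) {
  try {
    return { ok: true, value: await saveBatch(batch) };
  } catch (error) {
    return { ok: false, batch, error };
  }
}
```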

As you can see, we no longer wait for the slowest function to complete; a new execution starts as soon as any one of them finishes.


In this particular example, the total execution time was reduced by a third.

Of course, there is no limit to perfection. What if the books service has a rate limit? What if there is more than one producer? Your next step then would be setting up a queue like RabbitMQ, Apache Kafka, or Amazon SQS. But that’s another topic.
