Async Concurrency in JavaScript

Jason Amador
Oct 4 · 2 min read

I was recently tasked with building a microservice that needed to make thousands of HTTP requests and process the responses. To create a simple mock of the situation, I’m going to use a fake request function that accepts an arbitrary value and returns a Promise that resolves to an object containing the value and the randomly chosen delay:

const {promisify} = require('util')
const sleep = promisify(setTimeout)
const request = async (data) => {
  let time = Math.random() * 1000
  await sleep(time)
  return {data, time}
}

I had a bunch of records that contained parameter data for the requests, so the first thing I did was query for those records. For this example, we will use an array of integers to mock the records. My first instinct (inexperienced as I was) was to iterate over the array of records, make each HTTP request and push the result to a new array:

async function main() {
  const records = Array.from(new Array(10)).map((e, i) => i)
  let responses = []
  console.time('Inline')
  for (let i = 0; i < records.length; i++) {
    let response = await request(records[i])
    responses.push(response)
  }
  console.log(JSON.stringify(responses, null, 2))
  console.timeEnd('Inline')
}

This works fine, but we have to wait for each response before making the next request, meaning ten requests could take up to 10 seconds to execute. The results would look like this:

[
  {
    "data": 0,
    "time": 48.95140139293264
  },
  {
    "data": 1,
    "time": 351.42969859007377
  },
  ...
]
Inline: 5210.4460449ms

5 seconds is a really long time. Now let’s look at how we can perform these requests concurrently. We can push each of the request calls into an array and use Promise.all() to wait for them all to resolve:

async function main() {
  const records = Array.from(new Array(10)).map((e, i) => i)
  console.time('Concurrent')
  let promises = []
  for (let i = 0; i < records.length; i++) {
    promises.push(request(records[i]))
  }
  let responses = await Promise.all(promises)
  console.log(JSON.stringify(responses, null, 2))
  console.timeEnd('Concurrent')
}

Voila, the total time now depends only on the slowest request rather than the sum of all of them, close to a records.length-fold speedup in the best case! The results look something like this:

[
  {
    "data": 0,
    "time": 160.08417354131944
  },
  {
    "data": 1,
    "time": 560.08495847237463
  },
  ...,
  {
    "data": 9,
    "time": 223.39482395749209
  }
]
Concurrent: 560.08495847237463ms

Using this non-blocking method, responses receives the results of all of the Promises as soon as the last one settles, so the total execution time is just a few milliseconds greater than the slowest of the HTTP requests. This same technique can be used for any collection of asynchronous operations.

Just for fun, let’s do the same thing in one line using Array.map:

let responses = await Promise.all(records.map(record => request(record)))
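One caveat worth knowing before pointing this at thousands of real requests: Promise.all rejects as soon as any single promise rejects, discarding the successful results. If you want the outcome of every request regardless of failures, Promise.allSettled (available since Node 12.9) is a close alternative; a minimal sketch:

```javascript
// Promise.allSettled never rejects; each entry reports either
// {status: 'fulfilled', value} or {status: 'rejected', reason}.
async function settleAll() {
  const results = await Promise.allSettled([
    Promise.resolve('ok'),
    Promise.reject(new Error('boom')),
  ])
  return results.map(r => r.status) // ['fulfilled', 'rejected']
}
```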

I hope this helps, happy coding.

JavaScript in Plain English

Learn the web's most important programming language.
