JavaScript: smooth UI & heavy-lifting functions

Jeremie van der Sande
7 min read · Feb 10, 2018


If you’re a JavaScript developer or enthusiast, you must have heard it a thousand times by now: JavaScript in the browser is not multi-threaded. If you’ve never heard it, well, there you go:

JavaScript in the browser is not multi-threaded.

This has several implications, but the one we are going to try and tackle here is the fact that a function doing some heavy-lifting can (and will!) completely block your website while it is running.

This is because not only is JavaScript single-threaded, it also shares its thread with the rest of the mechanisms running your web page: the DOM renderer, the CSS engine…

While your big, fancy JavaScript function is running its magic, nothing else can happen. Go ahead and try running an infinite loop on your web site:

while (true) { }

You will see that the complete website becomes unresponsive: your :hover effects will never apply; you will become unable to click on any link or button on the page; in fact, if you keep clicking on the page, chances are your OS will show you the “Browser not responding” popup.

While the infinite loop is an extreme example, the same will happen if a classic function takes a lot of time to compute.

The problem to solve

I recently stumbled across a problem due to the single-threaded nature of JavaScript at work. We had a function treating a String, and for technical reasons it had to run on the client side. But the String to treat could be anywhere from a few bytes to several megabytes, and on an average computer, the function took nearly one second per megabyte to finish. Of course, there was no way we could ship a function that could freeze our website for several seconds!

The solution

With server-side execution unavailable, I needed to find a way to work on the client side without freezing the complete UI. If, in the meantime, I could also find a way to actually track the progress of the function, I could even turn this headache into a better UX than before! And fortunately, the answer was right there, in my .babelrc file: Promise.

Side note about Service Workers: An experimental API called Service Worker could have allowed running the function on a dedicated thread. But since 1) this is not the intended usage of Service Workers, and 2) I didn’t have the time to wait for it to be implemented, I had to work with what was available.

Getting there

Callback, asynchronous and multi-threaded

One common misconception I encountered in JavaScript is that functions with a callback are asynchronous, thus don’t block the thread.

This misconception comes from the fact that Web APIs run on a different thread, and they happen to work with callbacks. But a callback does not have to be asynchronous (though if it’s not, it is arguably poorly coded), and a function with a callback does not run on a separate thread, except for Web API calls.

Let’s take a look at the following code:
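
For the sake of illustration, assume myFunctionWithCallback simply adds its two arguments and hands the result to its callback (a minimal sketch):

function myFunctionWithCallback(a, b, callback) {
  // The callback is invoked immediately, on the same thread
  callback(a + b)
}

myFunctionWithCallback(1, 2, (result) => console.log(result))
console.log('Waiting for the result...')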

myFunctionWithCallback takes a callback, so it is asynchronous and doesn’t block the thread!

Well… No. Running this code will produce the following output:

3
Waiting for the result...

But… What happened? There is a callback and all!

There sure is, but the callback is called directly, inside the function. Therefore it’s just like calling any function from within another: synchronous, linear, single-threaded.

Now, the following modification will make the callback asynchronous:
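
For example, keeping the same function but deferring the callback through a zero-millisecond setTimeout (again a minimal sketch):

function myFunctionWithCallback(a, b, callback) {
  // The callback is now deferred through the Web API timer
  setTimeout(() => callback(a + b), 0)
}

myFunctionWithCallback(1, 2, (result) => console.log(result))
console.log('Waiting for the result...')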

Wrapping our function in a setTimeout (which is a Web API call, by the way) will result in an asynchronous execution of our code. Executing it, we get the following, expected output:

Waiting for the result...
3

Ah, so now it’s running on a separate thread since setTimeout is a Web API call!

Unfortunately, not quite. The only part of it running on a separate thread is the timer counting down for the setTimeout. The content of the setTimeout, however, will run on the same thread as the rest of the JavaScript. Its execution will only be deferred until the timeout fires and a free slot becomes available on the thread. Here, the timeout fires right away (after 0 milliseconds), but we still need to wait until the thread is free. And that happens right after our Waiting for the result... message is printed.

So far, we’ve only moved things around in the execution sequence; we still haven’t allowed the rest of the website to keep running smoothly, since we’re still hogging the thread. Everything inside the setTimeout will run in one go.

Because of the single-threaded nature of JavaScript, we will never be able to reduce the time needed on the thread to compute our function. But what we can do is split our execution into small chunks, and defer their execution just like we did with the setTimeout.

Back to our example

In our example, the code to handle is the following:

const myTreatedString = treatmentFunction(myString)

The treatmentFunction has one property that makes this method possible: it does not need the complete string to run. That means, the following code will produce the exact same result:

const firstSlice = myString.slice(0, myString.length / 2)
const secondSlice = myString.slice(myString.length / 2)
const myTreatedString = treatmentFunction(firstSlice) +
                        treatmentFunction(secondSlice)

Now, as we have seen above, our goal will be to run each slice in the callback of a Web API, so that it executes only when a slot becomes available on the thread.

We will therefore create a function that should do the following:

  1. Cut the input in small slices
  2. Put each slice in a Web API callback
  3. Gather all the sliced results
  4. Call a callback when all the slices are done with the sliced results stitched back together

To be sure to leave some time for other parts of the website to run, we need to register the second slice only after the first slice has finished, and so on. We also need a Web API that will allow us to defer the execution of a slice while leaving some time for the UI to update.

Our Web API: requestAnimationFrame

requestAnimationFrame works just like setTimeout, except we don’t give it a time to wait. Instead, it puts the content back in the event loop right before the UI is updated. This way, we will get the following chain of events:

Treat a slice -> Update the UI -> ... -> Treat slice -> Update the UI -> ... -> Treat a slice -> Update the UI...

We are sure to let plenty of time for the UI to update between each slice. However, keep in mind that requestAnimationFrame will execute your function just before the next UI update, no matter how many functions have been registered. That means that if you register three slices in a row, they will all get executed before the next UI update.
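
A quick illustration of that last point (treatSlice and the slices are placeholders): all three callbacks below run back-to-back before the same repaint, so registering them in a row buys us nothing.

requestAnimationFrame(() => treatSlice(slice1))
requestAnimationFrame(() => treatSlice(slice2))
requestAnimationFrame(() => treatSlice(slice3))
// All three are executed before the next UI update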

Chaining the slices: Promises

We need to wait for a slice to finish before we can register the next one through requestAnimationFrame, but we would also want to prepare all the slices, register the first one and be done with it.

Now, the callback of the first slice could launch the next one, until we reach the last slice which would instead call the final callback.

Or, we can work with what ES6 has given us: Promise!

Since Promises are chainable, making sure our second slice executes only after our first slice finishes is as easy as:

executeSlice1().then(executeSlice2)

If we have an unknown number of slices, we can simply do the following:

const slices = getSlices()
let promise = Promise.resolve(null) // Head of chain
for (const slice of slices) {
  promise = promise.then(() => executeSlice(slice))
}
promise.then(() => console.log('All slices done'))

Now all we need to do is create our executeSlice function so that it returns a Promise which resolves our treated slice:

function executeSlice(slice) {
  return new Promise((resolve) => {
    requestAnimationFrame(() => {
      resolve(treatmentFunction(slice))
    })
  })
}

Running this code, we will actually treat the complete string, while letting the UI update between each slice! It’s just a matter of choosing a slice size small enough to let the UI update smoothly, and large enough that the whole treatment doesn’t take forever to finish.
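
For reference, getSlices can be as simple as cutting the string into fixed-size chunks. Here is one possible sketch, where SLICE_SIZE is an arbitrary value you would tune for your own treatmentFunction:

const SLICE_SIZE = 100000 // characters per slice; tune for your use case

function getSlices() {
  const slices = []
  for (let i = 0; i < myString.length; i += SLICE_SIZE) {
    slices.push(myString.slice(i, i + SLICE_SIZE))
  }
  return slices
}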

Getting the result

While the above code does treat the complete string, it has one (big!) caveat: we are not getting the final result! We have all these result slices, which we dismiss right away and never save anywhere.

The solution is to modify our executeSlice function so that it takes the accumulated result as a parameter and returns the updated accumulation, much like an Array.reduce callback would do.

function executeSlice(slice, accumulator) {
  return new Promise((resolve) => {
    requestAnimationFrame(() => {
      resolve(accumulator + treatmentFunction(slice))
    })
  })
}

We then need to alter our chaining for loop a bit to take that new parameter into account:

const slices = getSlices()
let promise = Promise.resolve('') // Head of chain: start our accumulator
for (const slice of slices) {
  promise = promise.then(acc => executeSlice(slice, acc))
}
promise.then((result) => console.log('All slices done: ' + result))

And that’s it!

We now have a clean way of running our heavy function without freezing the UI. Furthermore, since we know the number of slices in advance, we could update a counter of finished slices inside the requestAnimationFrame callback which would provide a simple and accurate progress indicator.
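
For instance, keeping a counter next to the chain and calling a hypothetical updateProgressBar UI helper each time a slice finishes:

let finishedSlices = 0 // shared counter, lives next to the Promise chain

function executeSlice(slice, accumulator) {
  return new Promise((resolve) => {
    requestAnimationFrame(() => {
      finishedSlices += 1
      updateProgressBar(finishedSlices / slices.length) // hypothetical UI helper
      resolve(accumulator + treatmentFunction(slice))
    })
  })
}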

Wrapping up

Before closing this article, I’d like to share a generic implementation that I’ve written, which makes it easy to transform such a function into a chunked one.
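
In spirit, it is a small wrapper that takes the treatment function, a way to split its input and a way to merge partial results, and returns a chunked, Promise-based version of it. A rough sketch (makeChunked, its parameters and the 100 000-character split below are illustrative, not the exact published code):

function makeChunked(fn, split, merge, initialValue) {
  return function chunkedFn(input, onProgress) {
    const slices = split(input)
    let done = 0
    let promise = Promise.resolve(initialValue) // Head of chain
    for (const slice of slices) {
      promise = promise.then(acc => new Promise((resolve) => {
        requestAnimationFrame(() => {
          done += 1
          if (onProgress) onProgress(done / slices.length)
          resolve(merge(acc, fn(slice)))
        })
      }))
    }
    return promise
  }
}

// Example usage with our string treatment
const treatChunked = makeChunked(
  treatmentFunction,
  str => str.match(/[\s\S]{1,100000}/g) || [''], // cut into 100k-char slices
  (acc, part) => acc + part,
  ''
)
treatChunked(myString, progress => console.log(Math.round(progress * 100) + '%'))
  .then(result => console.log('All slices done: ' + result))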

Conclusion

I hope this article can help people struggling with UI freezes due to heavy JavaScript, and that it highlights the strength of the Promise object.

This journey was really interesting for me, and I hope my new job will allow me thousands of new discoveries like this one. If so, I’ll be sure to share it there with everyone!
