Retrofitting Node/NPM Packages with Async/Await.


I really, really love the upcoming Async/Await keywords. If you haven’t heard of them, Async/Await are proposed Javascript additions that allow for beautiful, expressive handling of asynchronous tasks.

The original mechanism for dealing with asynchronous code is the callback. Enough has been said about the perils of callbacks. Then, Promises came along, and made the world a whole lot better, but they also introduced a new API that deviates from what many consider idiomatic Javascript.

The Async/Await proposal allows us to write code that appears to be synchronous, but is actually non-blocking.

When designing your own utilities, or working with modern promise-based packages, it’s easy to take advantage of this new tool. But how about when you’re working with a callback-based package?

Today I’ll be showing you how to retrofit built-in Node modules and external NPM packages to work with Async/Await. We’ll also make a couple stops along the way, to compare and contrast different asynchronous techniques.

All aboard!

Example problem: File manipulation

I’m working on a side-project that allows users to upload photos to be displayed on an LED matrix. In other words, I’m letting the internet decide what my wall art should look like.

Right now, I’m working on the ‘web’ part, which is a Node/React app that allows users to upload a photo. It needs to turn a regular, large image into a matrix of pixels, like so:

I am aware that this is probably a horrible, horrible idea :)

The client-side app is a very simple React app. The real business happens on the server. The steps are as follows:

  • Read the file into a Buffer.
  • Crop and resize the image to be 32x16 (the size of the LED matrix).
  • Save the resized image to disk (for reference).
  • Extract the RGB colour data for the 512 pixels in the image.
  • Send it to the client for previewing.

To accomplish this, I need to rely on:

  • fs, the built-in Node module for working with the file system
  • imagemagick-native, an NPM package that wraps ImageMagick. ImageMagick is a command-line utility for manipulating images, and it’s absolutely fantastic (I’ve used it before, for ColourMatch!).

Original Sin: Callbacks from Hell

Here’s what the route looks like, using the traditional “callback” style of asynchronous management:
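Roughly, it’s shaped like this. (This is a sketch rather than the exact gist: the Express wiring, the multer-style req.file.path, the output path, and the imagemagick-native options are illustrative assumptions.)

const express = require('express');
const fs = require('fs');
const imagemagick = require('imagemagick-native');

const app = express();

app.post('/process-image', (req, res) => {
  // 1. Read the uploaded file into a Buffer.
  fs.readFile(req.file.path, (readErr, buffer) => {
    if (readErr) return res.status(500).json({ error: readErr });

    // 2. Crop and resize the image down to 32x16, for the LED matrix.
    imagemagick.convert({
      srcData: buffer,
      width: 32,
      height: 16,
      resizeStyle: 'fill',
      format: 'PNG',
    }, (convertErr, resizedBuffer) => {
      if (convertErr) return res.status(500).json({ error: convertErr });

      // 3. Save the resized image to disk, for reference.
      fs.writeFile('./images/resized.png', resizedBuffer, (writeErr) => {
        if (writeErr) return res.status(500).json({ error: writeErr });

        // 4. Extract the RGB data for the 512 pixels.
        //    (getConstPixels is synchronous; the options shown are assumptions.)
        const pixels = imagemagick.getConstPixels({
          srcData: resizedBuffer,
          x: 0,
          y: 0,
          columns: 32,
          rows: 16,
        });

        // 5. Send the pixel data to the client for previewing.
        res.json({ pixels });
      });
    });
  });
});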

This solution works, and it’s not even that bad. The biggest issues I see with it are:

  • No unified error-catching; each step’s failure needs to be handled separately.
  • A lot of visual clutter. It’s doing something pretty straightforward, but it’s obfuscated by all the callback noise.
  • These problems will be compounded i̶f̶ when the requirements grow, and the route becomes more complex.

Let’s see how we can make it better…

A Promise Worth Keeping

Our first order of business is to convert each of these methods to support promises. Async/await is built upon functions that return promises, so we need to build some!

I’m gonna assume some familiarity with what promises are and how they work; if this is unfamiliar or rusty to you, MDN has a great article.

Let’s start with our filesystem methods, readFile and writeFile.
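A first pass might look something like this (a sketch; the readFilePromise and writeFilePromise helper names are mine):

const fs = require('fs');

// Wrap fs.readFile: resolve with the buffer, or reject with the error.
const readFilePromise = (path) => new Promise((resolve, reject) => {
  fs.readFile(path, (err, buffer) => {
    if (err) return reject(err);
    resolve(buffer);
  });
});

// Same idea for fs.writeFile: resolve once the data has been written.
const writeFilePromise = (path, data) => new Promise((resolve, reject) => {
  fs.writeFile(path, data, (err) => {
    if (err) return reject(err);
    resolve(data);
  });
});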

When we invoke one of these functions, it returns a promise that has been built to mimic the original, callback-based fs methods.

The original methods follow Node’s errback style: the callback is handed two arguments, first the error, and then the data itself. We can take advantage of this signature, and resolve/reject based on the “truthiness” of the err argument. If there’s an error, reject with it; otherwise, resolve with the buffer.

You may have noticed that this solution is not very DRY. The two functions are incredibly similar. Let’s generalize it:
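One way to write it (a sketch of the idea):

// Take any errback-style function, and return a curried substitute
// that returns a promise instead.
const wrapWithPromise = (fn) => (...args) => new Promise((resolve, reject) => {
  fn(...args, (err, result) => {
    if (err) return reject(err);
    resolve(result);
  });
});

// Our substitute functions become one-liners:
const readFilePromise = wrapWithPromise(fs.readFile);
const writeFilePromise = wrapWithPromise(fs.writeFile);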

We’ve created a new wrapWithPromise helper function, which we can use to create new promises based on any errback functions we have!

This function is pretty dense, so let’s examine it in more detail.

We’ve set it up to be curried; we want to be able to create substitute functions (so, a readFilePromise instead of readFile), and to do that, we need to supply the function — readFile — before we know what the arguments will be (the path to the file).

So we create a function that returns the substitute function, and that function can be invoked with our “real” arguments.

Confused?
Currying (and first-class functions in general) takes a while to get comfortable with. Don’t worry if you’re having a hard time following :)

We’re using the ES6 rest operator to collect the arguments into an array. This is an important step, because the two functions we’re wrapping don’t have the same arity.

Next, we return a promise. The promise only has one job, and that’s to invoke the wrapped function with the supplied arguments. We’re using the ES6 spread operator to “unpack” them.

Because we’re working on the assumption that any wrapped function accepts an errback callback, the final statement doesn’t change; we reject when an error is provided, otherwise we resolve with the returned result.

External Packages often work the same way

We’ve tackled the fs module’s methods, but what about imagemagick-native, our NPM package to do image processing?

Happily, because it follows the same errback signature, this is a piece of cake.
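Assuming we lean on imagemagick-native’s convert method (which accepts the same errback-style callback), the wrapper is a one-liner:

const imagemagick = require('imagemagick-native');

// convert follows the (err, result) callback signature, so it wraps cleanly.
const resizeAndConvert = wrapWithPromise(imagemagick.convert);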

The world’s shortest Gist.
What if the module was different?
This technique can be applied to packages of all kinds, even if they don’t follow the errback style. Here’s an example of a wrapper function handling the legacy jQuery style of separate success/failure callbacks.
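It might look something like this (a sketch, assuming the wrapped function takes an options object with success and error callbacks):

// Wrap a function that expects separate success/error callbacks
// (the legacy jQuery.ajax style) so it returns a promise instead.
const wrapJQueryStyleWithPromise = (fn) => (options) => new Promise((resolve, reject) => {
  fn(Object.assign({}, options, {
    success: (result) => resolve(result),
    error: (err) => reject(err),
  }));
});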

Vanilla Promises as Control Flow

If we wanted to, we could handle the route with these assembled promises, and call it a day. Here’s how that might look:
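Something like this (again a sketch: saveToDisk and readPixelsFromImage stand in for promise wrappers built the same way as the helpers above, and req.file.path is a multer-style assumption):

app.post('/process-image', (req, res) => {
  readFilePromise(req.file.path)
    .then((buffer) => resizeAndConvert({
      srcData: buffer,
      width: 32,
      height: 16,
      format: 'PNG',
    }))
    .then((resizedBuffer) => (
      // Wrap saveToDisk in its own little chain, so the resized buffer
      // (and not writeFile's result) gets passed down the waterfall.
      saveToDisk('./images/resized.png', resizedBuffer)
        .then(() => resizedBuffer)
    ))
    .then((resizedBuffer) => readPixelsFromImage(resizedBuffer))
    .then((pixels) => res.json({ pixels }))
    .catch((err) => res.status(500).json({ error: err }));
});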

Ok, so we’re definitely moving in the right direction. We have much less nesting, a single ‘catch’ statement, and a fairly legible route. There are still some problems, though:

  • There’s a whole API to learn, and the details can get pretty hairy. It feels more like a third-party library than native, idiomatic Javascript.
  • Control flow can be tricky. Because then chains are waterfalls (a method’s output becomes the next method’s input), it can be hard to “preserve” data through the chain. 
     
    For example, I need to retain the buffer created by resizeAndConvert for the next two stages in the chain: saveToDisk has to write it to disk, but then the original buffer needs to be passed onwards, to be used by readPixelsFromImage.
     
    The most elegant way I’ve found is to wrap saveToDisk in its own little promise chain, to forward the right data. This works, but it’s a little too implicit for my tastes.
  • For an error to be caught by the exception handling, it has to be explicitly rejected. If your code blows up in unanticipated ways, it fails silently. I spent 10 minutes debugging, only to realize that I had a typo in a variable name.

The Promised Land: Async/Await

A Sink, Await.

In my opinion, vanilla promises aren’t an ideal solution, but they’re still a tremendous step forward for Javascript development. When used as the backbone for the generator-like Async/Await, they truly shine.

While this is a proposed feature for a future version of Javascript, you can use it today! You’ll need a pretty souped-up Babel setup though. This StackOverflow answer covers it pretty well.

The rules for Async/Await are very simple. Functions that utilize this fancy technique need to be augmented with the async keyword:

async function doThings() { ... }

Steps in the function that need to be treated as asynchronous need to be augmented with the await keyword:

async function doThings() {
  const thing1 = await fetchThingPromise();
  const thing2 = await fetchOtherThingPromise(thing1);
}

It allows us to write truly asynchronous, non-blocking code as if it were synchronous. We can have our cake and eat it too.

Drumroll, please…

At long last, here’s what the route looks like now:
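Again a sketch, using the same assumed helpers as before:

app.post('/process-image', async (req, res) => {
  try {
    const buffer = await readFilePromise(req.file.path);

    const resizedBuffer = await resizeAndConvert({
      srcData: buffer,
      width: 32,
      height: 16,
      format: 'PNG',
    });

    // The resized buffer is still in scope; no chain gymnastics required.
    await saveToDisk('./images/resized.png', resizedBuffer);

    const pixels = await readPixelsFromImage(resizedBuffer);
    res.json({ pixels });
  } catch (err) {
    res.status(500).json({ error: err });
  }
});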

There you have it! Vanilla, idiomatic Javascript, doing async without any complex constructs, like it’s no big deal.

In addition to how delightfully readable it is, there are a couple other benefits:

  • Native try/catch error handling! We don’t have to worry about anticipating failure details at every step.
  • No special API to learn (beyond the two new keywords). There’s no struggling to figure out how to pass data around in a chain.

Javascript continues to improve as a language. Now that the TC-39 committee is committing to yearly updates, I can’t wait to see how Javascript evolves! Thanks to the hard work of the community, these future features can be used today.

Thanks for reading :)

I’m Josh, a full-stack web developer at Breather
You can follow me on Twitter; I tweet about Javascript and cats, mostly.
I’m also on GitHub, although there are far fewer cats there.