Abusing ES6 Promises

Daniel Lundin
Oct 10, 2017


At first glance, promises look like just another way to handle callbacks. After working with them for a while, I’ve realized the primitives they provide make great building blocks for higher-level abstractions.

Here I list some of my favourite tricks with promises. Use them with caution!

Broken promises

ES6 promises cannot be cancelled, but we can simulate cancellation by treating a rejection with a specific value as a cancelled promise. The technique is to race the promise against another promise that we control:
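A minimal sketch of the idea (the names `cancellable` and the `CANCELLED` sentinel are my own, not from the original embed):

```javascript
// A rejection with this sentinel is interpreted as a cancellation.
const CANCELLED = Symbol('cancelled');

function cancellable(promise) {
  let cancel;
  const canceller = new Promise((resolve, reject) => {
    cancel = () => reject(CANCELLED);
  });
  // Whichever settles first wins: the real promise or our canceller.
  return { promise: Promise.race([promise, canceller]), cancel };
}

// Usage: cancel a slow operation and handle it as a distinct rejection.
const slow = new Promise(resolve => setTimeout(resolve, 1000, 'done'));
const { promise, cancel } = cancellable(slow);
promise.catch(reason => {
  if (reason === CANCELLED) console.log('operation cancelled');
});
cancel();
```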

For a more elaborate example of this, see my wrapper around redux-pack, redux-packa.

Note: This is not a real cancellation; the underlying operation will still eventually resolve or reject, we just ignore the result.

Promises with timeouts

Another use of Promise.race is setting timeouts for promises. Again, the promise is not actually cancelled; we just ignore the result:
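A sketch of the technique, assuming a helper name `withTimeout` of my own choosing:

```javascript
// Race the promise against a timer that rejects after `ms` milliseconds.
function withTimeout(promise, ms) {
  const timeout = new Promise((resolve, reject) =>
    setTimeout(() => reject(new Error(`Timeout after ${ms}ms`)), ms)
  );
  return Promise.race([promise, timeout]);
}

// Usage: fail fast if an operation takes longer than two seconds.
// withTimeout(fetch('/slow-endpoint'), 2000)
//   .then(handleResponse)
//   .catch(handleErrorOrTimeout);
```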

Sequential execution

Promise.all can be used to run a series of promises in parallel. But what if we wanted to run them in sequence?
It turns out promises work quite well together with functional constructs like Array.reduce. However, since promises are eager (execution starts as soon as the promise constructor is invoked), we need to wrap the creation of each promise in a factory function:
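A sketch of the reduce-based chain; `sequence` is my naming. Each element is a factory function, so no work starts before the chain reaches it:

```javascript
// Fold the factories into one promise chain, starting from a resolved promise.
function sequence(factories) {
  return factories.reduce(
    (chain, factory) => chain.then(factory),
    Promise.resolve()
  );
}

// Each factory receives the previous step's resolved value:
sequence([
  () => Promise.resolve(1),
  one => Promise.resolve(one + 1),
  two => Promise.resolve(two * 10),
]).then(result => console.log(result)); // logs 20
```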

This is pretty neat, sort of like a poor man’s async/await. The resolved value from one operation will be passed on to the next.

What if we wanted it to work like `Promise.all` where each resolved value is returned in an array?

By initializing the reduce operation with an empty array and letting each operation concat its result, the outermost .then will receive an array of all the results:
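A sketch under the same factory convention; `sequenceAll` is an assumed name:

```javascript
// Like the sequential chain above, but each step appends its result
// to an accumulator array, starting from Promise.resolve([]).
function sequenceAll(factories) {
  return factories.reduce(
    (chain, factory) =>
      chain.then(results => factory().then(result => results.concat(result))),
    Promise.resolve([])
  );
}

sequenceAll([
  () => Promise.resolve('a'),
  () => Promise.resolve('b'),
  () => Promise.resolve('c'),
]).then(results => console.log(results)); // logs [ 'a', 'b', 'c' ]
```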

Promises as gatekeepers

The last trick uses promises to throttle work. Imagine we have an API endpoint that does some memory-intensive work. Running more than one of these operations concurrently risks crashing the process:

By using a promise to queue the work, we can make sure that new requests to the endpoint won’t start the memory-intensive task until the previous one has completed:
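A sketch of the single-queue version; `promiseQueue`, `enqueue`, and `memoryIntensiveTask` are assumed names, not from the original embed:

```javascript
// The tail of the chain; every new task is appended behind it.
let promiseQueue = Promise.resolve();

function enqueue(task) {
  const result = promiseQueue.then(() => task());
  // Keep the queue alive even if a task rejects.
  promiseQueue = result.catch(() => {});
  return result;
}

// Usage inside a request handler: each call waits for the previous task.
// app.post('/heavy', (req, res) => {
//   enqueue(() => memoryIntensiveTask(req.body))
//     .then(result => res.json(result))
//     .catch(err => res.status(500).send(err.message));
// });
```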

This can of course be generalized further to an array of `promiseQueues` that allows a number of tasks to be processed concurrently:

Here `completionPromise` acts as a gatekeeper, ensuring that only one task at a time is allowed to wait for a free slot in the `promiseQueues` array.
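A sketch of the generalization; `createLimiter`, `queues`, and `gate` (playing the role of the completion promise) are my own names. `Promise.race` finds a free slot, and the gate ensures only one caller at a time races for one:

```javascript
function createLimiter(concurrency) {
  // Each queue promise resolves with its own slot index when that slot is free.
  const queues = Array.from({ length: concurrency }, (_, i) =>
    Promise.resolve(i)
  );
  let gate = Promise.resolve();

  return function run(task) {
    let started;
    const claimed = gate
      .then(() => Promise.race(queues))
      .then(slot => {
        started = Promise.resolve().then(task);
        // Free the slot when the task settles, success or failure.
        queues[slot] = started.then(() => slot, () => slot);
      });
    // The next caller may race for a slot only after this one claims its own.
    gate = claimed;
    return claimed.then(() => started);
  };
}
```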

This little utility is available as an npm module here.

Happy hacking!
