Use Generators for Asynchronous Control Flow

Rafael Mosias
3 min read · Aug 16, 2020


Sometimes we want to execute a series of asynchronous tasks one after another, with each task starting only once the previous one has finished. In this article, I will walk you through one of the primary use cases for generators: asynchronous control flow.

What are generators and how do they work?

If you're already familiar with how generators work, you can skip this section.

Generators are special functions that can be run, paused and resumed at different stages of their execution, thanks to the special keyword yield.

Here is a simple example of an ES6 generator:

function* myGenerator() {
  yield 'first';
  let input = yield 'second';
  yield input;
}

// Getting the generator object
let gen = myGenerator();
// Launching the generator
console.log(gen.next()); // { value: 'first', done: false }
// First resume (no value passed)
console.log(gen.next()); // { value: 'second', done: false }
// Second resume (a value is passed)
console.log(gen.next('third')); // { value: 'third', done: false }
// Last call (no more yield to run)
console.log(gen.next()); // { value: undefined, done: true }

So, what just happened here?

  • We declared a generator function using the special syntax function* myfunction() {}.
  • Calling this function returns a generator object. This object exposes a next method used to resume the generator function from where it last paused.
  • The generator function does not start executing until the first gen.next() call.
  • Each time gen.next() is called, the generator function resumes and runs until the next yield. The call returns an object containing the yielded value and a done flag telling whether the generator function has finished (see the short example right after this list).
  • You may also have noticed that we can pass data back into the generator function by giving an argument to gen.next().
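
As a small aside (this part is not in the example above): because next() reports { value, done }, a generator object is also iterable, so a for...of loop can drive it for you. It calls next() with no argument, which is why the input yield receives undefined here.

// for...of calls next() repeatedly and stops once done becomes true
for (const value of myGenerator()) {
  console.log(value); // 'first', then 'second', then undefined
}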

Asynchronous control flow

Let's say that we want to generate a PDF file on the backend and then use the generated file to send an email. Our code using promises might look like this:

function generatePdf(pdfId) {
  return fetch(`/generate/${pdfId}/pdf`, {
    method: 'get'
  }).then(response => response.json());
}

function sendEmail(urlPdf, email) {
  return fetch('/email', {
    method: 'post',
    body: JSON.stringify({
      pdf: urlPdf,
      email: email
    })
  }).then(response => response.json());
}

// pdfId comes from elsewhere in the application
generatePdf(pdfId)
  .then(result => sendEmail(result.urlPdf, result.email))
  .then(responseMessage => {
    console.log(responseMessage);
  })
  .catch(err => {
    console.log(err);
  });

We can do better and make this look more synchronous using generators. Here is what we get:

import { takeLatest } from 'redux-saga/effects';

function* watchEvents() {
  yield takeLatest('LOAD', load);
}

function* load(action) {
  try {
    // The dispatched action is passed to the saga; yielding a promise
    // pauses the saga until that promise resolves
    const result = yield generatePdf(action.pdfId);
    yield sendEmail(result.urlPdf, result.email);
  } catch (error) {
    console.log(error);
  }
}
In the above example I used Redux-Saga, an alternative side-effect model for Redux applications. It handles all the asynchronous control flow of a Redux application in a central place using… guess what, 😄 generators.
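
For completeness, here is roughly how a saga like watchEvents gets wired into a Redux store (a sketch: rootReducer is a placeholder for your application's reducer, and the dispatched action shape is just an example):

import { createStore, applyMiddleware } from 'redux';
import createSagaMiddleware from 'redux-saga';

// Create the saga middleware and attach it to the store
const sagaMiddleware = createSagaMiddleware();
const store = createStore(rootReducer, applyMiddleware(sagaMiddleware));

// Start the watcher saga: every dispatched 'LOAD' action now runs load()
sagaMiddleware.run(watchEvents);

// Somewhere in the app
store.dispatch({ type: 'LOAD', pdfId: 42 });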

Runners like this work by letting the generator yield a promise (or a call to a function that returns one). When that promise resolves, the runner calls the generator's next() method with the result, resuming execution; when it rejects, the error is thrown back into the generator, which is why the try/catch above works.
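
To make that concrete, here is a minimal sketch of such a runner. It is not Redux-Saga's actual implementation, just the core idea: resolve whatever the generator yields, feed the result back in with next(), and report rejections with throw().

// Minimal co-routine runner (a sketch, not Redux-Saga itself)
function run(generatorFn, ...args) {
  const gen = generatorFn(...args);

  function step(result) {
    if (result.done) return Promise.resolve(result.value);
    return Promise.resolve(result.value).then(
      value => step(gen.next(value)),  // resume with the resolved value
      error => step(gen.throw(error))  // throw the rejection inside the generator
    );
  }

  return step(gen.next());
}

// Usage: drive the load saga above to completion
run(load, { type: 'LOAD', pdfId: 42 });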

With this in hand, you are now empowered to write cleaner, simpler, more maintainable asynchronous JavaScript!
