Introduction to JavaScript generators: A beginner’s guide

Fabian Terh
Jul 15, 2019


Recently, I’ve had to pick up the basics of JavaScript generators to better understand how Redux Sagas work (a topic for another day). I decided to piece together the bits and pieces of information I could gather from various websites, and condense them into a single article, which I hope would be both accessible yet rigorous enough to serve as a beginner’s working guide to generators.

Introduction

Generators were introduced to JavaScript in ES6. Generator functions are similar to regular functions, except that they can be paused and resumed. Generators are also very closely related to iterators, in the sense that a generator object is an iterator.

In JavaScript, a function normally cannot be paused or stopped once it is invoked. (Yes, an async function pauses while waiting on an await statement, but async functions were only introduced in ES2017. Also, async functions are built on generators anyway.) A normal function only completes when it returns or throws an error.

function foo() {
  console.log('Starting');
  const x = 42;
  console.log(x);
  console.log('Stop me if you can');
  console.log('But you cannot');
}

In contrast, generators allow execution to be paused at arbitrary breakpoints, and resumed from the same point.
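As a contrast to foo above, here is a sketch of a generator version (the name pausableFoo is made up for illustration); each yield is a breakpoint where the caller decides when, or whether, to continue:

```javascript
// Unlike foo, this function can be paused at every yield.
function * pausableFoo() {
  console.log('Starting');
  yield; // pause point 1
  const x = 42;
  console.log(x);
  yield; // pause point 2
  console.log('You stopped me, and resumed me');
}

const gen = pausableFoo();
gen.next(); // runs until the first yield, then pauses
gen.next(); // resumes, runs until the second yield
gen.next(); // resumes and runs to completion
```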

Generators and Iterators

From MDN:

In JavaScript an iterator is an object which defines a sequence and potentially a return value upon its termination. More specifically an iterator is any object which implements the Iterator protocol by having a next() method which returns an object with two properties: value, the next value in the sequence; and done, which is true if the last value in the sequence has already been consumed. If value is present alongside done, it is the iterator’s return value.

Therefore, iterators are essentially:

  1. Objects that define sequences…
  2. …that have a next() method…
  3. …which returns an object with 2 properties: value and done
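A minimal hand-rolled example of the protocol, assuming a hypothetical makeRangeIterator helper:

```javascript
// A hand-written iterator over a half-open range, following the
// iterator protocol: next() returns { value, done }.
function makeRangeIterator(start, end) {
  let current = start;
  return {
    next() {
      return current < end
        ? { value: current++, done: false }
        : { value: undefined, done: true };
    }
  };
}

const it = makeRangeIterator(1, 3);
it.next(); // { value: 1, done: false }
it.next(); // { value: 2, done: false }
it.next(); // { value: undefined, done: true }
```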

Do you require generators to create iterators? Nope. In fact, you could already create an infinite Fibonacci sequence using closures pre-ES6, as seen in this example:

var fibonacci = {
  next: (function () {
    var pre = 0, cur = 1;
    return function () {
      var tmp = pre; // declare tmp locally to avoid an implicit global
      pre = cur;
      cur += tmp;
      return cur;
    };
  })()
};
fibonacci.next(); // 1
fibonacci.next(); // 2
fibonacci.next(); // 3
fibonacci.next(); // 5
fibonacci.next(); // 8

I’ll again quote MDN on benefits of generators:

While custom iterators are a useful tool, their creation requires careful programming due to the need to explicitly maintain their internal state. Generator functions provide a powerful alternative: they allow you to define an iterative algorithm by writing a single function whose execution is not continuous.

In other words, it is simpler to create iterators using generators (no closures required!), which means less potential for errors.

The relationship between generators and iterators is simply that generator functions return generator objects which are iterators.
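To see the difference concretely, here is a sketch of the same infinite Fibonacci sequence written as a generator; the state that the closure version had to capture manually now lives in plain local variables that persist across yields:

```javascript
// Same sequence as the closure-based fibonacci, but written as
// a generator: no closure or manual state bookkeeping required.
function * fibonacci() {
  let pre = 0, cur = 1;
  while (true) {
    [pre, cur] = [cur, pre + cur];
    yield cur;
  }
}

const fib = fibonacci();
fib.next().value; // 1
fib.next().value; // 2
fib.next().value; // 3
fib.next().value; // 5
```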

Syntax

Generator functions are created using the function * syntax, and paused using the yield keyword.

Calling a generator function initially does not execute any of its code; instead, it returns a generator object. Values are consumed by calling the generator’s next() method, which executes code until it encounters the yield keyword, upon which it pauses, until next() is called again.

function * makeGen() {
  yield 'Hello';
  yield 'World';
}
const g = makeGen(); // g is a generator
g.next(); // { value: 'Hello', done: false }
g.next(); // { value: 'World', done: false }
g.next(); // { value: undefined, done: true }
...

Calling g.next() repeatedly after our last statement above will simply keep returning the same object: { value: undefined, done: true }.

yield pauses execution

You may notice something peculiar with the code snippet above. The second next() call yields an object with the property done: false, not done: true.

Shouldn’t the done property be true instead, since we are executing the last statement in the generator function? Well, no. When a yield statement is encountered, the value after it (‘World’ in this case) is yielded, and execution pauses. Therefore, the second next() call pauses on the second yield statement, so execution is not yet complete. Execution is only complete (and done: true) when it resumes after the second yield statement and finds no more code to run.

Think of a next() call as telling the program to run until the next yield statement (assuming it exists), yield a value, and pause. The program would not know that there is nothing after that yield statement until it resumes execution, and it can only resume execution with another next() call.

yield vs return

In the above example, we use yield to pass values to outside of the generator. We can also use return (as in a normal function); however, using return terminates execution and sets done: true.

function * makeGen() {
  yield 'Hello';
  return 'Bye';
  yield 'World'; // unreachable
}
const g = makeGen(); // g is a generator
g.next(); // { value: 'Hello', done: false }
g.next(); // { value: 'Bye', done: true }
g.next(); // { value: undefined, done: true }
...

Because execution does not pause on a return statement, and by definition there cannot be any further code execution after a return statement, done is set to true.
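One consequence worth knowing: consumers that follow the iterator protocol, such as for...of loops and the spread operator, stop as soon as done is true, so a generator’s return value never shows up in the iterated results:

```javascript
function * makeGen() {
  yield 'Hello';
  return 'Bye';  // terminates the generator; comes with done: true
  yield 'World'; // unreachable
}

// Spread (like for...of) stops at done: true, so the return
// value 'Bye' is discarded.
[...makeGen()]; // ['Hello']
```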

yield: 2-way communication

So far, we’ve been using yield to pass values outside of the generator (and also pausing its execution).

However, yield is in fact a 2-way street, and allows values to be passed into the generator function as well.

function * makeGen() {
  const foo = yield 'Hello world';
  console.log(foo);
}
const g = makeGen();
g.next(1); // { value: 'Hello world', done: false }
g.next(2); // logs 2, yields { value: undefined, done: true }

Wait a second. Shouldn’t 1 be logged to console, not 2? I found this part conceptually counter-intuitive at first, as I expected the assignment foo = 1. After all, we passed “1” into the next() method call that yielded Hello world, right?

It turns out that’s not how it works. The value passed into the first next(...) call is discarded. There isn’t really a deeper why, except that this is how the ES6 specification defines it. You can read more about it in this article (with examples).

I like to rationalize the execution of the program like this:

  • On the first next() call, it runs until it encounters yield 'Hello world', upon which yields { value: 'Hello world', done: false } and pauses. That’s all. As you can see, any value passed into the first next() call is unused (and thus discarded).
  • When next(...) is called again, execution resumes. In this case, execution entails assigning some value (determined by the yield statement) to the constant foo. Therefore, our second call of next(2) assigns foo = 2. The program doesn’t stop there though — it runs until it encounters the next yield, or a return statement. In this case, there are no more yields, so it logs 2 and returns undefined, with done: true.
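The direction of the assignments becomes clear in a slightly longer sketch (echoTwice is a made-up name), where each value passed into next() lands in the yield expression the generator was paused on:

```javascript
function * echoTwice() {
  const a = yield 'first'; // a receives the argument of the 2nd next()
  const b = yield a * 2;   // b receives the argument of the 3rd next()
  return a + b;
}

const g = echoTwice();
g.next();   // { value: 'first', done: false } -- its argument would be discarded
g.next(10); // a = 10, yields { value: 20, done: false }
g.next(5);  // b = 5, returns { value: 15, done: true }
```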

Going async with generators

Since yield is a 2-way channel that allows information to flow in both directions, it allows us to use generators in very cool ways. So far, we’ve been primarily using yield to pass values outside of the generator. But we can also take advantage of the 2-way nature of yield to write asynchronous functions in a synchronous manner.

Using the concepts above, we could create a rudimentary function that resembles synchronous code but really executes asynchronous functions:

function request(url) {
  fetch(url).then(res => {
    it.next(res); // Resume iterator execution
  });
}
function * main() {
  const rawResponse = yield request('https://some-url.com');
  const returnValue = synchronouslyProcess(rawResponse);
  console.log(returnValue);
}
const it = main();
it.next(); // Remember, any value passed to the first next() call is discarded

Here’s how it works. First, we declare a request function and main generator function. Next, we create an iterator it by calling main(). Then, we call it.next() to start things off.

On the first line of function * main(), execution pauses at yield request('https://some-url.com'). request() implicitly returns undefined, so we are effectively yielding undefined, but it doesn’t matter — we’re not making use of that yield value anyway.

When the fetch() call in the request() function completes, it calls it.next(res), which does 2 things:

  1. It resumes execution; and
  2. It passes res into the generator function, which is assigned to rawResponse

Finally, the rest of main() completes synchronously.

This is a very rudimentary setup that should bear some resemblance to promises. For a much more detailed exposition on yield and asynchronicity, check out this article.
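The hand-wired pairing of request and it above generalizes into a small runner that resumes the generator whenever a yielded promise settles. This is essentially the pattern that async/await automates; run is a hypothetical helper name, and the sketch omits error handling:

```javascript
// A minimal generator runner: each yielded promise's resolved
// value is fed back into the generator via next(), until done.
function run(genFn) {
  const it = genFn();
  function step(prev) {
    const { value, done } = it.next(prev);
    if (done) return Promise.resolve(value);
    return Promise.resolve(value).then(step);
  }
  return step();
}

// Usage sketch: yield any promise; its resolved value comes back.
run(function * () {
  const a = yield Promise.resolve(1);
  const b = yield Promise.resolve(2);
  console.log(a + b); // 3
});
```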

Generators are single-use

You can’t reuse generators, but you can create new generators from your generator function.

function * makeGen() {
  yield 42;
}
const g1 = makeGen();
const g2 = makeGen();
g1.next(); // { value: 42, done: false }
g1.next(); // { value: undefined, done: true }
g1.next(); // No way to reset this!
g2.next(); // { value: 42, done: false }
...
const g3 = makeGen(); // Create a new generator
g3.next(); // { value: 42, done: false }

Infinite sequences

Iterators represent sequences, kind of like how arrays do. So, we should be able to express all iterators as arrays, right?

Well, no. Arrays require eager allocation upon creation, while iterators are consumed lazily. Arrays are eager because creating an array of n elements requires all n elements to be created/calculated first so they can be stored in the array. On the contrary, iterators are lazy because the next value in the sequence is only created/calculated when it is consumed.

Therefore, an array representing an infinite sequence is physically impossible (we would need infinite memory to store infinite items!), whereas an iterator can easily represent (not store) that sequence.

Let’s create an infinite sequence of numbers from 1 to positive infinity. Unlike an array, this does not require infinite memory, because each value in the sequence is only lazily computed when it is consumed.

function * makeInfiniteSequence() {
  var curr = 1;
  while (true) {
    yield curr;
    curr += 1;
  }
}
const is = makeInfiniteSequence();
is.next(); // { value: 1, done: false }
is.next(); // { value: 2, done: false }
is.next(); // { value: 3, done: false }
... // It will never end

Fun fact: this is similar to Python generator expressions vs list comprehensions. While the two produce the same values, generator expressions offer memory advantages because values are evaluated lazily, while list comprehensions evaluate values eagerly and create the entire list at once.
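An infinite iterator only becomes practical with a helper that consumes a finite prefix of it. Here is a sketch of one such helper, take (a made-up name), itself written as a generator:

```javascript
// Lazily yields at most n values from any iterable, so infinite
// sequences can be consumed safely.
function * take(n, iterable) {
  for (const value of iterable) {
    if (n-- <= 0) return;
    yield value;
  }
}

function * makeInfiniteSequence() {
  let curr = 1;
  while (true) yield curr++;
}

[...take(3, makeInfiniteSequence())]; // [1, 2, 3]
```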

Conclusion

This article has been really fun to write, and I’ve learned a lot throughout. However, there are probably still a ton of crazy, awesome things you can accomplish with generators that this article does not even scratch the surface of. It’s not meant to, though — it’s intended only as a working guide for beginners. So go ahead and dive deep!
