ADVANCED PYTHON PROGRAMMING

Next Generation

This time, we evolve once again—on to generators, from basic iteration to full-fledged coroutines.

Dan Gittik
10 min read · Apr 20, 2020


The next stop after functions (and beyond, and internals 1 and 2) is generators: functions with state. The premise is very simple—you write a regular function with yield statements instead of return statements, and when you run it, it pauses each time it yields a value; you can then resume it from that exact point, harvesting the fruit of its gradual execution. The use-cases, however, are many—and fascinating, so let’s get to it.

Yielding Control

Generators are often presented as a sophisticated way to encapsulate an iteration, even though they’re much more than that. Even so, let’s start with that:
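
Say we have a little generator like this one (a minimal sketch; the names gen and g are mine, and I’ll reuse them in the snippets below):

    def gen():
        yield 1
        yield 2
        yield 3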

When we call this generator function, it doesn’t immediately execute. Instead, it returns a generator object:
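
    >>> g = gen()
    >>> g
    <generator object gen at 0x...>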

To get this generator goin’, we have to use the built-in next function:
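
    >>> next(g)
    1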

It starts executing, gets to its first yield statement, which evaluates to 1—and that’s what we’re getting, with the generator frozen in time at that moment. When we’re ready, we can revive it:
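
    >>> next(g)
    2
    >>> next(g)
    3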

Until, finally—after the generator’s exhausted its execution—it’ll start complaining:
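
    >>> next(g)
    Traceback (most recent call last):
      ...
    StopIteration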

The cool part about it—and the reason generators are so often associated with iterators—is that this behavior is suspiciously similar to the for loop protocol. When I say protocol, it makes it sound menacing; what I mean is just “how things happen”. You know how if statements automatically convert their arguments to booleans, so…
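
    x = 'hello'
    if x:
        print('x is truthy')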

Is actually like:
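
    x = 'hello'
    if bool(x):
        print('x is truthy')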

Similarly, for loops take their arguments, convert them to an iterator using the built-in iter function, and keep invoking next on that iterator—running an iteration with each result, which gets assigned to the loop variable(s) declared in the for statement. Eventually, a StopIteration exception is raised, which the for loop silently catches, and stops. In fact, we can emulate this for loop…
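
    items = [1, 2, 3]    # any iterable will do
    for item in items:
        print(item)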

…with this while loop:
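
    items = [1, 2, 3]
    iterator = iter(items)
    while True:
        try:
            item = next(iterator)
        except StopIteration:
            break
        print(item)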

Back to the topic: generators conveniently fit the next/StopIteration paradigm—but they also have this curious quality:
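
    >>> g = gen()
    >>> iter(g) is g    # iter just hands the generator back
    True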

In other words, they don’t care for iter; it doesn’t do nothin’ to them, so they can be used in for loops as-is:
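
    >>> g = gen()
    >>> for x in g:
    ...     print(x)
    ...
    1
    2
    3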

Or even:
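
    >>> for x in gen():
    ...     print(x)
    ...
    1
    2
    3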

Do note that if you keep a reference to a generator object—once it’s exhausted, it’ll keep raising StopIteration on every subsequent next, so it’ll result in rather short for loops:
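
    >>> g = gen()
    >>> list(g)
    [1, 2, 3]
    >>> for x in g:    # g is already exhausted, so nothing gets printed
    ...     print(x)
    ...
    >>>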

That’s also the reason we can’t index generators, or ask about their length—they don’t actually describe a collection of items, but rather a process to generate said items—which is ongoing, and as such, unindexable and immeasurable:
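
    >>> g = gen()
    >>> g[0]
    Traceback (most recent call last):
      ...
    TypeError: 'generator' object is not subscriptable
    >>> len(g)
    Traceback (most recent call last):
      ...
    TypeError: object of type 'generator' has no len()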

To pull that off, you’d have to convert the generator to a list—which effectively executes it to exhaustion, and gathers all its items in a more accessible data structure:
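
    >>> items = list(gen())
    >>> items
    [1, 2, 3]
    >>> items[0], len(items)
    (1, 3)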

Reaping the Generated Benefits

Generators are great for a great many reasons. First of all, their memory footprint is almost nonexistent, seeing as they’re a moment frozen in time, all zen-like, ready to generate the next item, et c’est tout; no past, no future. Compare this:
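
For instance (an illustrative comparison of my own; the exact numbers depend on your platform):

    import sys

    squares = [n ** 2 for n in range(1_000_000)]    # a million integers, allocated up front
    print(sys.getsizeof(squares))                   # several megabytes, just for the list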

To this:
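
    import sys

    def squares():
        for n in range(1_000_000):
            yield n ** 2

    print(sys.getsizeof(squares()))    # a few hundred bytes, no matter how many items it yields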

As a quick sidenote, back in Python 2.7, range used to return a list—meaning it had to allocate everything before the first item was even used. This led to the introduction of the rather generative xrange, and in Python 3—they figured there were a lot of backwards-incompatible changes anyway—it replaced range outright, so range is now lazy too (strictly speaking, it returns a range object, a lazy sequence rather than a generator, but the memory story is the same):
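
    >>> r = range(1_000_000)
    >>> r
    range(0, 1000000)
    >>> len(r), r[-1], 999_999 in r    # lazy, yet it still supports len, indexing and membership
    (1000000, 999999, True)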

A bunch of other stuff that should’ve been generators was converted to such in Python 3, introducing all sorts of fun breakages. But even in Python 2.7—generators were there, and they were used when it was clear an incremental solution was in order. Consider os.walk: here’s some code that prints the directory containing the password.txt file:
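
    import os

    # walk the filesystem from the root (the starting point here is my assumption; adjust as needed)
    for path, dirs, files in os.walk('/'):
        if 'password.txt' in files:
            print(path)
            break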

Traversing something of a recursive, nest-y nature such as a filesystem was an obvious candidate for a “stunted” process, which yields the next directory’s contents (separated into subdirectories and files), rather than loading the entire thing into memory at once.

Counting to Infinity

Speaking of obvious candidates for generators—some things are just impossible with regular data structures or functions. Consider this:
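
    def count():
        n = 0
        while True:
            yield n
            n += 1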

This guy’s gonna keep spinning forever—just yielding one integer after the other, until the stars go out and the universe grows silent. It’s actually quite handy for a counter:
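
    >>> counter = count()
    >>> next(counter)
    0
    >>> next(counter)
    1
    >>> next(counter)
    2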

But if an infinite loop is what your heart desires, there you have it:
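
    for n in count():
        print(n)    # this never ends; Ctrl+C is your only way out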

Silly use-cases aside, this can be used to produce an actually useful infinite stream, such as a pseudo-random generator. Whenever I talk about pseudo-random generators, I make it a point to mention the Blum-Blum-Shub algorithm, just because it has an amazing name, and I have a great time imagining the Carrollesque tea party in which Blum, Blum and Shub came up with it:
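
Here’s a minimal sketch of it as an infinite bit generator (the parameters are toy values of my own choosing; a real implementation would use large primes p and q, both congruent to 3 modulo 4):

    def blum_blum_shub(seed, p=11, q=23):
        m = p * q              # p and q should be large primes congruent to 3 mod 4
        x = seed % m           # the seed should be coprime to m
        while True:
            x = x * x % m      # square, take the modulus, repeat forever
            yield x % 2        # emit the least significant bit

    bits = blum_blum_shub(3)
    print([next(bits) for _ in range(8)])    # [1, 1, 0, 0, 1, 0, 1, 0]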

If you really want to do list-like things to your iterators, there’s a standard module for that (itertools), so give it a look; its islice, for instance, slices any iterator lazily, even an infinite one:
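
    import itertools

    naturals = itertools.count()                   # itertools even has its own infinite counter
    print(list(itertools.islice(naturals, 10)))    # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]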

Some More Nifty Features

What else is there to say about generators? Well, they can be nested—
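
    def inner():
        yield 1
        yield 2

    def outer():
        for x in inner():    # delegate to the inner generator, item by item
            yield x
        yield 3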

But more elegantly so, using the yield from statement:
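
    def outer():
        yield from inner()    # the same delegation, in a single line
        yield 3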

And they even work in comprehensions, if you use round parentheses. Now you know who stole the tuple’s comprehension syntax:
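
    >>> squares = (n ** 2 for n in range(3))
    >>> squares
    <generator object <genexpr> at 0x...>
    >>> list(squares)
    [0, 1, 4]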

This can come in handy if you have to do a loop to arrive at some result, but would like to break as soon as you do to avoid unnecessary calculations. You can do so in a comprehension, but using a list actually goes ahead and finishes the entire loop before giving you the result you need. For example, to find the first integer whose square is greater than 1000, you could do:
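
    >>> [n for n in range(1000) if n ** 2 > 1000][0]
    32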

But then it’d go ahead and calculate all the squares from 33 to 999, too. This, on the other hand:
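
    >>> next((n for n in range(1000) if n ** 2 > 1000))
    32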

Only ever generates integers up to the first match, and since nobody bothers to resume it, it doesn’t bother to compute the rest; this is sometimes called lazy evaluation, and generators are accused of being lazy functions.

This usage is in fact so common, it even has some syntactic sugar of its own—you can drop the double parentheses for better readability:
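
    >>> next(n for n in range(1000) if n ** 2 > 1000)
    32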

next gets a tad annoying if the iteration is empty. If I tried to guess that such a number would be within the first 30, and limited my range thusly…
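
    >>> next(n for n in range(30) if n ** 2 > 1000)
    Traceback (most recent call last):
      ...
    StopIteration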

…I’d get an ugly error for trying to pull a value out of what is, effectively, an empty generator. To that end, next takes an extra argument—a default value, in case the iterator it’s consuming ends up a disappointment:
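
    >>> next(n for n in range(30) if n ** 2 > 1000, -1)
      File "<stdin>", line 1
    SyntaxError: Generator expression must be parenthesized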

The bad news is, I lied: the syntactic sugar for more elegant generator comprehensions only works if the comprehension is the only argument to a function; otherwise, it has to be parenthesized explicitly. The good news is, I didn’t lie: next does take an extra argument for a default value:
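
    >>> next((n for n in range(30) if n ** 2 > 1000), -1)    # -1 being an arbitrary default of my choosing
    -1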

Oh Coroutines

The truth is, generators are much more interesting than that. Think about it this way: standard functions (sometimes called “subroutines”) have a single entry point—that is, they always start at the beginning, right underneath the signature—and can have multiple exit points—that is, return statements. Generators, on the other hand—in this context, called coroutines—also have multiple entry points.

Sure, they start at the beginning—but then they execute until the first yield statement, pause—and the next time you run them, they resume execution from that point. “Sure,” you might say, “but that’s not really an entry point—when I call a function, I can communicate some arguments to it, which parametrizes its execution, and is the entire point. Surely you can’t pass in arguments when you resume a generator.” Well:
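
    def coro():          # a toy coroutine; the name is mine
        x = yield 1
        print(x)
        y = yield 2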

This takes a while to digest, but it’s actually quite simple—yield statements are in fact expressions, whose value is, wait for it…

No, seriously, wait for it. When a coroutine hits a yield statement, it pauses—so while the yield’s argument is returned, that expression’s value is still pending. Only when you resume the execution does it resolve:
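
    >>> c = coro()
    >>> next(c)
    1
    >>> next(c)    # resuming with plain next makes the paused yield evaluate to None
    None
    2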

That’s rather disappointing; None is not a very interesting value at all. However, the real reason we got it is that we called next, which simply resumes the generator, without passing in anything. It’s good for writing iterators, but for a proper coroutine you’d have to use send:
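
    >>> c = coro()
    >>> next(c)
    1
    >>> c.send('a')
    a
    2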

The first time we’re running the coroutine, there’s nothing to communicate—any initial arguments are passed in via the parameters, like in a normal function, so we might as well use next. The second time, however, we send in 'a', which gets assigned to x and printed, right before we get back 2. Should we send in something else:
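
    >>> c.send('b')
    Traceback (most recent call last):
      ...
    StopIteration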

That assigns 'b' to y, then falls right off the edge of the generator, resulting in its end and a StopIteration exception.

Exceptions and Return Values

Coroutines also work with exceptions; any yield statement can resume with an error—as if, during the evaluation of its value, an exception was raised. That’d let us use more sophisticated flow control in our coroutines:
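
A sketch of what that might look like, using the generator’s throw method (the resilient coroutine here is a toy example of my own):

    def resilient():
        while True:
            try:
                x = yield
                print('got', x)
            except ValueError:
                print('oops, bad value')

    r = resilient()
    next(r)                # prime it, running up to the first yield
    r.send(1)              # prints: got 1
    r.throw(ValueError)    # prints: oops, bad value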

Similarly, when a coroutine does end—it can do so implicitly, silently falling off the function’s edge; or explicitly, with a return statement, which even supports a return value. It still raises a StopIteration, because that’s what ending a generator means—but the exception carries with it the return value, like so:
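
    def generous():    # another toy example of mine
        yield 1
        return 2       # a generator's return value

    >>> g = generous()
    >>> next(g)
    1
    >>> next(g)
    Traceback (most recent call last):
      ...
    StopIteration: 2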

Conclusion

Generators are immensely cool. With a simple, additional keyword—yield—Python has managed to introduce an entirely new programming paradigm. Not only can it be used for efficient iterators and comprehensions, or to tackle tasks of an infinite nature—it’s actually a full-fledged coroutine, with multiple entry and exit points, to which we can communicate arguments, and even exceptions, at any time. We’ll encounter generators again later on, as they resurface in some pretty advanced topics, like context managers and asynchronous programming. Until then, I yield my time.

The Advanced Python Programming series includes the following articles:

  1. A Value by Any Other Name
  2. To Be, or Not to Be
  3. Loopin’ Around
  4. Functions at Last
  5. To Functions, and Beyond!
  6. Function Internals 1
  7. Function Internals 2
  8. Next Generation
  9. Objects — Objects Everywhere
  10. Objects Incarnate
  11. Meddling with Primal Forces
  12. Descriptors Aplenty
  13. Death and Taxes
  14. Metaphysics
  15. The Ones that Got Away
  16. International Trade
