Via UCLA’s Young Research Library collection

Why I love National Novel Generation Month

My favorites from 2017

Every November, a hundred or so people write computer programs that attempt to generate “novels.” NaNoGenMo is a tongue-in-cheek competition that, like its inspiration, National Novel Writing Month, has no prizes or rankings. There are only two rules: your program must output at least 50,000 words, and you must publish its source code.

NaNoGenMo inspires news stories in the vein of "Can AIs Replace Authors?" and while I appreciate the attention those pieces give to individual entries, they tend to miss the spirit of the whole endeavor.

These are tributes to human creativity, not attempts to subvert or replace it. The stories are often deliberately bad or unreadable. They are a chance to publicly write lousy throwaway code in a field that can be judgmental about what constitutes “real programming.” I often encourage new developers to participate—creativity and a sense of humor are more important than raw skill.

My entry for 2017, A Physical Book (live demo)

NaNoGenMo is, for me, an opportunity to think of something, code it up in a few days, and release it to a small community of like-minded weirdos. Last year I wrote The Days Left Forebodings and Water in a white-hot rage in the days after the US election. This year was a little better — I had fun again, I made words fall down — but I especially appreciate the projects that touched on social issues.

Some of the 2017 entries that I found most striking (in alphabetical order):

B-9 Indifference

Eoin Noble’s second NaNoGenMo work (after Captain’s Log) makes smart use of Markov chains, a good corpus, and pitch-perfect visual design to make a genuinely entertaining read:

B-9 Indifference
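The Markov-chain technique behind entries like this can be sketched in a few lines. This is an illustrative toy, not Noble's code, and the corpus here is invented:

```python
# A minimal word-level Markov chain: record which word follows each
# word (or n-gram) in the corpus, then walk those transitions randomly.
import random
from collections import defaultdict

def build_chain(text, order=1):
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=0):
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length):
        choices = chain.get(tuple(out[-len(key):]))
        if not choices:  # dead end: no word ever followed this n-gram
            break
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = "the ship sailed on and the ship sank and the crew sang"
print(generate(build_chain(corpus), length=10))
```

Raising `order` makes the output more coherent but more plagiaristic, which is the central trade-off of Markov prose.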

Citizens[]

Cameron Edmond’s simulation of a fictional immigration bureau only admits candidates who resemble existing residents or have enough money to buy their way in. After 22,612 simulated petitioners are generated, only 21 are admitted:

Citizens[] full output (large PDF)

Emic Automata

Mark Rickerby made this entry. I don’t understand it, but I like it.

First Lines

Janelle Shane couldn’t find enough first sentences for her entry, so she crowd-sourced them. She wrote a fun writeup of the project, which serves as a good introduction to neural-network-driven writing.

The network produced some genuinely good and original first sentences:

I am not a story to be read.
The first day of the world was a mistake.
I am a preacher of merry days!
We’re going to be a book in the year 1071.

Also a good candidate for “2017: The Novel”:

It was a dumb afternoon of the world.

The project also serves as a warning about the dangers of unsanitized input data:

140,000 words of output available here. Unfortunately, due to a prank in the input data that I didn’t catch till after I trained the neural network, 37,000 of them are the word “sand”.

On the other hand, that is also extremely NaNoGenMo.

Hard West Turn

Nick Montfort’s entry is neither fun nor playful. It mixes factual information scraped from Wikipedia entries on mass shootings in the United States with handwritten narrative. Like many great NaNoGenMo works, it is effective because it reads like it was written by something broken and wrong, about something that is both broken and wrong:

The man had regrets; he was trying to see if, through thinking on matters, he could develop something other than regrets. Together. Sometimes. For that reason many countries only allow soldiers. Nose was bleeding heavily, nose was bleeding heavily. The nine millimeter is often referred to as a nine. Officially, officially, officially, officially, officially, officially, officially, officially.

HUMAN.DOC

Ranjit Bhatnagar always makes wonderful things — even his last-minute entries are great. I especially like NaNoGenMo books that change behavior throughout their length.

An homage to Tom Phillips’ A Humument (1973) using the same source material

The Infinite Fight Scene

Filip Hráček took his procedural adventure game Insignificant Little Vermin and let the game’s initial battle run forever to produce The Infinite Fight Scene. I enjoyed playing Vermin in this year’s Interactive Fiction Competition and as a former engineering manager I applaud Hráček’s pragmatic reuse of existing code.

Briana swings Orcthorn at the kobold. He tries to dodge but is out of balance. Briana cuts across his neck. He drops to his knees, and keels over.

Oldschool Dungeon Crawler GameBook

This super-polished entry from delacannon generates a playable if nonsensical hybrid of a game manual and a Choose-Your-Own-Adventure book:

Visit the web version to get a new randomized game each time.

Pride, Prejudice

Picking an entry by project organizer hugovk is almost like cheating, but I love this concept so much:

The problem isn’t generating over 50,000 words. The problem is existing books are too long. Pride and Prejudice is 130,000 words, Moby Dick is 215,136 words (or 215,136 meows). And we all know 50,000 is the gold standard for a novel!

Hugo applies a series of transformations — contracting phrases throughout, removing redundant phrases, and deleting honorifics — then summarizes the remaining sentences. The novel now begins ’Tis a truth universally acknowledged, which I think is unquestionably punchier.
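A pipeline like that can be sketched as a list of substitution rules applied in order. These patterns are illustrative stand-ins, not the entry's actual rules:

```python
# A toy sentence-shortening pipeline in the spirit described above:
# each rule is a (pattern, replacement) pair applied in sequence.
import re

RULES = [
    (r"\bIt is\b", "'Tis"),          # apply a contraction
    (r"\b(Mr|Mrs|Miss)\.?\s+", ""),  # delete honorifics
    (r"\bvery\s+", ""),              # drop a redundant intensifier
]

def shorten(sentence):
    for pattern, replacement in RULES:
        sentence = re.sub(pattern, replacement, sentence)
    return sentence

print(shorten("It is a truth universally acknowledged, "
              "that Mr. Darcy is very proud."))
# → 'Tis a truth universally acknowledged, that Darcy is proud.
```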

The Program Which Generates This Book

Self-describing books are practically a sub-genre of NaNoGenMo — Read Code Aloud is another example from this year — but Martin O’Leary’s entry is particularly elegant and, in his words, mind-numbingly verbose:

Offset 94
The computer loads a reference to the local variable named s and places it on top of the stack. The computer places the literal string 'The `if` statement ends here.\n\n' on top of the stack. The computer takes the top value from the stack and (in place) adds the second from top value from the stack to it, placing the result on top of the stack. The computer takes the top value from the stack and stores it in the local variable named s.
The computer loads a reference to the local variable named s and places it on top of the stack. The computer exits the current function, returning the top value on the stack.

Because of the nature of Python interpreters, the output will vary based on the exact hardware and software of the computer that runs it. O’Leary has a Patreon which is well worth supporting.
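The underlying idea — walking a program's own bytecode and emitting a sentence per instruction — can be sketched with the standard library's dis module. The sentence templates here are invented, not O'Leary's:

```python
# Narrate a function's bytecode: one English sentence per instruction,
# looked up by opcode name. Unhandled opcodes are silently skipped.
import dis

TEMPLATES = {
    "LOAD_FAST": "The computer loads the local variable named {arg} "
                 "and places it on top of the stack.",
    "LOAD_CONST": "The computer places the literal {arg!r} "
                  "on top of the stack.",
    "RETURN_VALUE": "The computer exits the current function, "
                    "returning the top value on the stack.",
}

def narrate(func):
    lines = []
    for ins in dis.get_instructions(func):
        template = TEMPLATES.get(ins.opname)
        if template:
            lines.append(template.format(arg=ins.argval))
    return "\n".join(lines)

def example(s):
    return s

print(narrate(example))
```

Because the bytecode a given source line compiles to differs between Python versions, the narration itself varies with the interpreter, which is part of the charm.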

Sestina Generator

It is possible to accidentally learn something from NaNoGenMo; now I know about 12th-century Occitan poetry. I appreciate forms of literature that require diagrams to explain:

From Wikipedia’s Sestina article

Alisha Ukani’s Sestina generator creates a poem based on occupations and common household objects:

The mobile home installer cured a vase
The paving equipment operator tested a tomato
The training and development specialist muddled a water
The hostler sprayed a fridge
The apparel patternmaker relaxed a sofa
The shaper pleased a sandglass
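The sestina form underneath is genuinely algorithmic: each stanza reuses the previous stanza's six end-words in the order 6-1-5-2-4-3 (the retrogradatio cruciata from that Wikipedia diagram). A toy sketch — not Ukani's code — using the end-words from the excerpt above:

```python
# Generate the end-word schedule for a sestina: apply the 6-1-5-2-4-3
# rotation to the previous stanza's end-words, stanza after stanza.
ORDER = [5, 0, 4, 1, 3, 2]  # zero-based indices of the 6-1-5-2-4-3 pattern

def sestina_end_words(words, stanzas=6):
    out = [list(words)]
    for _ in range(stanzas - 1):
        prev = out[-1]
        out.append([prev[i] for i in ORDER])
    return out

for stanza in sestina_end_words(
        ["vase", "tomato", "water", "fridge", "sofa", "sandglass"]):
    print(stanza)
```

The rotation is a 6-cycle, so a seventh stanza would land back on the original order — which is why the classical form stops at six stanzas plus a short envoi.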

I can’t tell whether it’s a serendipitous bug or a side-effect of the algorithm stretched out over 50,000 words, but the full version of the poem goes on for thousands of lines about needles and bonesaws:

The industrial machinery mechanic terrified a needle and a bonesaw
The psychology teacher settled a needle and a bonesaw
The elevator repairer sprouted a needle and a bonesaw
The segmental paver forced a needle and a bonesaw
The economist confessed a needle and a bonesaw
The magistrate judge bowed a needle and a bonesaw

Tillman, Victor Lima, KOD

Many NaNoGenMo projects rely on substituting words based on their semantic relationships to each other. An early favorite of mine, Twide and Twejudice by Michelle Fuller, replaced 19th-century dialogue with slang from Twitter.

Kevan Davis goes a different direction and presents three books in which every sentence is replaced by its closest match, by edit distance, from an unrelated corpus, stripping meaning entirely. It works so well because he chose interesting corpora.

The Tempest with each sentence replaced by the closest match from International Code of Maritime Signals, 1969:

Re-enter SEBASTIAN, ANTONIO, and GONZALO
Welcome! what is your course? Shall I take you in tow? Have
you a doctor?
SEBASTIAN
I cannot be refloated by any means now available!

Romeo and Juliet replaced by a 2008 Geocities page on internet slang:

ROMEO
Are we having any fun yet?
JULIET
Anything that turns you on baby.
ROMEO
On the other other hand; The more I know, the less I
understand.
JULIET
Thanks for the thought (or) tip.
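The closest-match substitution can be sketched with a classic Levenshtein distance. This is an illustrative toy, not Davis's code, and the tiny "corpus" below is drawn from the excerpt above:

```python
# For each input sentence, pick the corpus sentence at the smallest
# edit (Levenshtein) distance, computed by dynamic programming.
def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def closest(sentence, corpus):
    return min(corpus, key=lambda c: edit_distance(sentence, c))

signals = [
    "I cannot be refloated by any means now available.",
    "Shall I take you in tow?",
    "Have you a doctor?",
]

print(closest("Hast thou a physician aboard?", signals))
```

Because the metric is purely character-level, the matches are semantically arbitrary, which is exactly what strips the meaning out.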

William Shakespeare Summarizes Everything

I am a sucker for visual entries and classic book covers, so J.R. Ladd’s is right up my alley:

More examples and a full 50,000 word page linked from the entry.

White to Play and Win

In the grand tradition of other books that take the fun out of puzzle-solving, Greg Kennedy’s entry narrates a mate-in-two chess problem:

White Rook e1e2
Harold considered moving his Rook from e1 to e2. Maude may answer by playing her Rook at a2 to a3.
Black Rook a2a3
Harold might respond with his King from h1 to g1. Harold might counter using his Rook from e2 to e3. Possibly, he may answer using his Rook from e2 to e4. Harold might answer by playing his Rook from e2 to e5 and putting Black in Check.
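The narration step itself is simple template-filling over the engine's move list. A minimal sketch under invented names and templates, not Kennedy's code:

```python
# Turn a coordinate-notation chess move into a sentence of narration.
def narrate_move(player, piece, move):
    src, dst = move[:2], move[2:]
    return f"{player} considered moving his {piece} from {src} to {dst}."

print(narrate_move("Harold", "Rook", "e1e2"))
# → Harold considered moving his Rook from e1 to e2.
```

The hard part of the real entry is upstream: enumerating every legal reply in the mate-in-two tree, which is what balloons the word count.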

These were just my favorites; you can find more than 100 other 2017 projects, both completed works and sketches of ideas, plus hundreds more from other years at the main NaNoGenMo repository.

References and other writeups