How to Write With an Artificial Intelligence

Creative Writing 1010101

Jamie Brew
HuggingFace
5 min read · Nov 26, 2019


Text-generating neural networks like OpenAI’s GPT-2 often raise questions about the dangers of fake text: Can a machine write text that’s convincingly, deceptively human?

As a comedy writer, I’m more interested in the opposite question: Can a machine produce words that no human would ever write? Can it help me write things that I would never write?

Write With Transformer is a web app that lets you write in collaboration with a text-generating neural network. It’s a demo for Transformers, a state-of-the-art software library developed and maintained by Hugging Face.

This post covers the basics of the app, a few strategies for using it as a writer and some more advanced controls.

Basic controls

https://transformer.huggingface.co/doc/gpt2-large

Write With Transformer is a normal text editor with one twist: At any time, you can appeal to GPT-2 for suggestions.

To produce a suggestion, the model considers all possible next words, chooses one of them, considers all possible next words after that, and repeats until it runs out of time. It does this in three different places at once, which is how it arrives at three different suggestions.
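If you're curious what a request like that looks like in code, here's a rough sketch using the Transformers library. It's illustrative only; the app's actual server code and settings may differ.

```python
# Rough sketch: ask GPT-2 for three continuations of a prompt with the
# Transformers library. Illustrative only; the app's real settings may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2-large")
model = AutoModelForCausalLM.from_pretrained("gpt2-large")

prompt = "Why did the chicken cross the road?"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample three continuations: pick a next token, append it, repeat.
outputs = model.generate(
    **inputs,
    do_sample=True,
    num_return_sequences=3,
    max_new_tokens=40,
    pad_token_id=tokenizer.eos_token_id,
)

prompt_length = inputs["input_ids"].shape[1]
for i, sequence in enumerate(outputs, start=1):
    continuation = tokenizer.decode(sequence[prompt_length:], skip_special_tokens=True)
    print(f"Suggestion {i}: {continuation}")
```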

You can read more about what’s going on under the hood here.

For now, here are the main predictive text commands to know:

  • Press Tab to ask the neural network for three suggestions to continue what you have written so far.
  • Keep pressing Tab to request three more suggestions, as many times as you like.
  • Use the arrow keys and Enter, or click, to select one of the suggestions.

Writing methods

Here are just a few approaches you can take to using Write With Transformer.

1. Blind devotion

To remove yourself from the equation and see what the neural net might generate “on its own”, you can decide from the start that you’ll always take the first suggestion. If you start from a blank page, the first few words can be disorienting, like falling asleep and waking up in a random corner of the internet…

2. Branching path

Limit yourself to the three options supplied by the app, letting it tell you a choose-your-own-adventure tale about whatever the internet had on its mind when the training data was collected…

3. Tag team

Prompt the machine with a thought, then let its response prompt you. Go back and forth as cowriters, or warring Wikipedia editors…

4. Rewrites

Bring in familiar text from somewhere else, delete the end of it and see how Transformer would have completed it…

Note: I got curious about the second option, which seems to be the start of a full-scale FAQ about chickens. So I opened the app again and kept going. You can read the FAQ here.

5. Continuing lists

Transformers are great at picking up patterns in series of items. This makes it especially fun to prompt them with incomplete lists.

Try prompting with the start of a horizontal list (a series of items running across one line), or the start of a vertical list (one item per line).
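Here's a rough sketch of the same idea outside the app, using the Transformers pipeline API. The prompt is just an illustration; any consistent list pattern will do.

```python
# Sketch: hand GPT-2 an incomplete vertical list and let it continue the pattern.
# The prompt is only an illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Things every chicken needs:\n"
    "1. A safe coop\n"
    "2. Fresh water\n"
    "3."
)

result = generator(prompt, max_new_tokens=30, do_sample=True)
print(result[0]["generated_text"])
```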

6. Freeform

The repetitive structure of lists lends itself to transformer writing. The same applies to any kind of writing with a recognizable, consistent structure. Try interviews, step-by-step instructions, or invent your own new format and see what patterns the neural net picks up.

Advanced settings

You can adjust four settings in the bottom left corner of the app, controlling Model size, Top-p, Temperature and Max time.

Let’s look at each of these in turn.

Model size

Larger models have more parameters, which roughly means they can remember patterns from their training set in greater detail. As a result, larger models offer suggestions that are more specifically related to the prompt.

Suggestions from larger models are also shorter. This is because the models run slower, so within the time window set by Max time (see below), they can generate fewer words.
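In terms of the library, choosing a model size roughly amounts to choosing one of the published GPT-2 checkpoints. The mapping below is an assumption on my part, not something the app spells out:

```python
# The GPT-2 checkpoints on the Hugging Face hub, smallest to largest.
# More parameters means finer-grained memory of the training data, but slower generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoints = ["gpt2", "gpt2-medium", "gpt2-large", "gpt2-xl"]  # roughly 124M, 355M, 774M and 1.5B parameters

model_name = checkpoints[2]  # "gpt2-large", the checkpoint linked at the top of this post
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
print(f"{model_name}: {model.num_parameters():,} parameters")
```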

Temperature

The most poetically named parameter, temperature controls how adventurous the algorithm is with its word choices. Turning the temperature up makes suggestions wilder and less predictable.

In practice, a typical continuation at low temperature sticks to safe, predictable phrasing, while one at high temperature can wander somewhere much stranger from the same prompt.

Top-p

This setting controls how broad a range of continuations is considered. Set it high to consider all possible continuations; set it low to consider only the most likely ones. The overall effect is similar to temperature, but more subtle.

Max time

This controls how long the suggestions are. The model always generates as many words as it has time for, so to ask for just a few words, set the maximum time to a low value. For longer suggestion blocks, choose a small model size and a high maximum time.
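Temperature, Top-p and Max time all have close counterparts among the generation arguments in the Transformers library. Here's a hedged sketch of how they might map; the values are illustrative, not the app's defaults.

```python
# Sketch: the app's Temperature, Top-p and Max time controls correspond to
# sampling arguments of generate(). Values here are illustrative, not the app's defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tokenizer("Why did the chicken cross the road?", return_tensors="pt")

outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.7,    # lower = safer, more predictable word choices
    top_p=0.9,          # sample only from the most likely words covering 90% of the probability
    max_time=1.0,       # stop generating after roughly one second
    max_new_tokens=60,  # a hard cap on length, alongside the time limit
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```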

Sharing your writing

Write With Transformer has three built-in sharing mechanisms.

1. Screenshot

For short paragraphs, this button exports your document to an image, with Transformer-written text rendered in bold.

2. Save and publish

Ideal for longer documents. This option gives you links that let you return to editing a document later, or share it with friends, who can read it or edit it further themselves.

For example: Here’s the Chicken FAQ I created from the document that started “Why did the chicken cross the road?”

3. Duplicate and edit

Starting from a shared document (like the Chicken FAQ), click the Duplicate & Edit button to do just that: create a copy that you can edit with whatever human-machine balance you choose.

Please, duplicate the FAQ and help me learn more about the chicken.

