After working on and using Project December over the past few months, I've collected some insights.

GPT-2 and GPT-3 were initially described as text generators. You give them a prompt and let them predict the next word over and over, until they spit out several paragraphs of text. Ovid’s Unicorn, The Universe is a Glitch, and lots of other noteworthy examples were produced exactly that way: an initial prompt, followed by long-form generation. And when we study those examples, even the best ones, we can spot a few seams showing. Impressive, but definitely not…
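To make that word-by-word loop concrete, here is a minimal sketch using the freely downloadable GPT-2 through Hugging Face's transformers library. This is an illustration of the generation technique, not Project December's actual code; the model choice, prompt, and sampling details here are all my own assumptions.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Illustrative only: this uses GPT-2 because it is freely downloadable;
# Project December itself sits on top of other models behind an API.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "In a shocking finding, scientists discovered a herd of unicorns"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# The core loop: predict a next token, append it, and repeat,
# so each prediction is conditioned on everything generated so far.
for _ in range(50):
    with torch.no_grad():
        logits = model(input_ids).logits
    probs = torch.softmax(logits[0, -1], dim=-1)       # distribution over the next token
    next_id = torch.multinomial(probs, num_samples=1)  # sample rather than take the argmax
    input_ids = torch.cat([input_ids, next_id.unsqueeze(0)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

Sampling (rather than always taking the most likely word) is what gives these long-form generations their variety; run the loop twice on the same prompt and you get two different continuations.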
