TLDR: The pre-order of my book Paranoid Transformer, generated entirely by a bunch of neural networks, is now open. In this post I share some of the story behind it.
Today NaNoGenMo begins, the annual challenge for automatic text generation. To participate, you have to write and publish code that generates a literary text of at least 50K words. I have not yet decided whether I will participate this year, but I took part last year, and now I will tell you what came of it, if only for closure.
A year ago I built and submitted a combination of two neural networks: the first is a conditional GPT generator, fine-tuned on a corpus of cyberpunk and cypherpunk texts, aphorisms, and complex authors like Kafka and Rumi. The second is a BERT-based filter that rejects boring and clumsy phrases and keeps the valid and shiny ones. I trained this filter on manually labeled data, most of which was annotated by Ivan Yamshchikov. The result was a pretty good generator of cyber-paranoid delusions in English. I named it Paranoid Transformer, if you catch the reference.
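The generate-then-filter loop can be sketched in a few lines. This is a toy outline, not the real code: `generate_candidates` and `score_phrase` are hypothetical stand-ins for the fine-tuned GPT sampler and the BERT keep/discard classifier, so only the control flow reflects the actual pipeline.

```python
import random

def generate_candidates(n):
    # Stand-in for sampling n phrases from the fine-tuned GPT generator.
    stock = [
        "the network dreams in ciphertext",
        "it is raining again",
        "trust no checksum you did not compute yourself",
        "lunch was fine",
    ]
    return [random.choice(stock) for _ in range(n)]

def score_phrase(phrase):
    # Stand-in for the BERT classifier trained on manual keep/discard labels;
    # here a toy heuristic: phrases with more distinct words score higher.
    return len(set(phrase.split())) / 10.0

def generate_filtered(n_keep, threshold=0.5, batch=32):
    # Keep sampling batches and retain only phrases the filter approves,
    # until we have n_keep of them.
    kept = []
    while len(kept) < n_keep:
        for phrase in generate_candidates(batch):
            if score_phrase(phrase) >= threshold:
                kept.append(phrase)
                if len(kept) == n_keep:
                    break
    return kept
```

The key design point is that generation and curation are decoupled: the generator can stay wild, and the filter, trained on human judgments, decides what survives.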
NaNoGenMo ended quietly, but in mid-December Augusto Corvalan, editor-in-chief of the strange DEAD ALIVE magazine, wrote to me and invited me to enter the same project in their competition THE NEW SIGHT. Reusing the same thing twice seemed boring, so I added a twist: I took an existing recurrent neural network trained to generate handwritten text, made the "nervousness" of the handwriting depend on the emotionality of each sentence (determined by sentiment analysis), and "hand-wrote" the entire text of Paranoid Transformer. It turned out quite atmospheric, and at the end of January I even took first place in that competition.
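The sentiment-to-nervousness coupling amounts to a small mapping function. The sketch below assumes a sentiment score in [-1, 1] and a handwriting model that accepts a noise-amplitude parameter; both the range and the parameter's meaning are my assumptions here, not a description of the actual model's API.

```python
def nervousness(sentiment_score, base=0.15, span=0.85):
    # sentiment_score in [-1, 1]: -1 = very negative, 1 = very positive.
    # The more negative the sentence, the shakier the handwriting.
    # Returns a value in [base, base + span] to feed to the handwriting
    # model as its noise amplitude (parameter name hypothetical).
    negativity = max(0.0, -float(sentiment_score))
    return base + span * negativity
```

A calm sentence gets the baseline jitter, while a dark, anxious one pushes the amplitude toward its maximum, so the diary literally trembles where the text does.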
In February, as the winner of the competition, I was offered the chance to publish this text. Again I decided that reusing the project without changes was not very interesting, so I added a few more things. I generated dates for the sections, as in a real diary, and added round coffee-mug stains here and there. Dmitry Kuznetsov suggested the idea of drawings in the margins, so I used something like Sketch-RNN, pre-trained on Google's Quick, Draw! dataset: every time a word from one of the dataset's categories appeared in the text, corresponding scribbles appeared in the margins. The publishers read the entire generated text (sic!) and sent me a list of potentially offensive statements (a long one!). Simply deleting them felt too lame, so instead I wrote a heuristic that aggressively crosses out such passages right in the text, which added some emotion.
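The crossing-out trick can be sketched as a simple substitution pass: instead of deleting a flagged phrase, wrap it in markers that the page renderer later turns into aggressive scribbles. The blacklist, the marker syntax, and the function name below are all illustrative, not the project's actual code.

```python
import re

def cross_out(text, blacklist):
    # Wrap each blacklisted phrase in strike markers for the renderer.
    # Matching is case-insensitive; re.escape keeps punctuation in the
    # phrase from being treated as a regex.
    for phrase in blacklist:
        pattern = re.compile(re.escape(phrase), re.IGNORECASE)
        text = pattern.sub(lambda m: "~~" + m.group(0) + "~~", text)
    return text
```

The point of keeping the text and striking it out, rather than removing it, is that the censorship itself becomes visible on the page and reads as part of the diary's paranoia.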
Meanwhile, Ivan Yamshchikov and Yana Agafonova helped me write the gonzo paper "Paranoid Transformer: Reading Narrative of Madness as Computational Approach to Creativity", which was accepted to the International Conference on Computational Creativity 2020 in September; just a week ago an expanded version was published in the journal Future Internet.
Finding authors for the preface and foreword was a struggle. Borislav Kozlovsky gave me a spontaneous master class on writing "cold" requests to unfamiliar celebrities so that they at least answer the letter, and then I asked for reviews from two people who are both highly relevant to this project and personally important to me: Luba Elliott, a producer of creative AI projects and co-organizer of the NeurIPS Creativity Workshop, and Nick Montfort, a poet, professor of digital media at MIT, and a well-known narrative and interactive fiction enthusiast. Their reviews completed the book. The cover, with its generative pattern, was made by Augusto Corvalan.
The Paranoid Transformer book is now available for pre-order on the publisher's website, with shipping promised in December. I personally bear no responsibility for the sale of the book; also, please note that delivery can take quite a long time, especially during the quarantine period. Judging by the signal copy I have in my hands, the book turned out quite solid; see several photos and pictures from it below: