Build Your Own Quotes Generator

Teach machine to make quotes!

Hervind Philipe
Data Folks Indonesia
4 min read · Feb 3, 2019


Photo by Émile Perron on Unsplash

Famous people like Ralph Waldo Emerson, Oscar Wilde, Anne Frank, Walt Disney, Victor Hugo, and Barack Obama went through many ups and downs and learned from numerous gurus in their lives. What they said or wrote is based on life experience and deep thinking, combined with personal creativity. The question is: could we borrow the wisdom of these authors just by reading their famous quotes? Let's make a machine learning program!

Most people are other people. Their thoughts are someone else’s opinions, their lives a mimicry, their passions a quotation — Oscar Wilde

In this article, we will build a machine that can generate its own quotes just by reading existing ones, using a Recurrent Neural Network (RNN). Many thanks to Alvations for providing the quotes dataset on Kaggle.

The dataset consists of 36,165 quotes with 878,450 words from 2,297 famous people (authors, singers, politicians, athletes, scientists, etc.).
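
As a minimal sketch, loading the dataset could look like this. Note that the CSV filename and column name are my assumptions, not taken from the original repo; adjust them to match the actual Kaggle download:

```python
import pandas as pd

# Load the quotes from the Kaggle dataset. The filename "quotes.csv"
# and the column name "quote" are assumptions; adjust to the download.
df = pd.read_csv("quotes.csv")
quotes = df["quote"].dropna().tolist()

print(len(quotes), "quotes loaded")
```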

What a wonderful dataset!

How the machine eats the data

In this case, I use character-level learning. Basically, the model takes a sequence of characters and learns to predict the next character.

We simply tell it that after 'Love peopl' comes the character 'e', and so on.
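
Here is a minimal sketch of how these input-target pairs could be built with a sliding window (the variable names are mine, not from the original repo):

```python
import numpy as np

SEQ_LEN = 10  # length of each input sequence of characters

# `quotes` is the list of quote strings loaded earlier.
text = " ".join(quotes)
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Slide a window of SEQ_LEN characters over the text; the character
# right after each window is the training target.
sequences, targets = [], []
for i in range(len(text) - SEQ_LEN):
    sequences.append(text[i:i + SEQ_LEN])
    targets.append(text[i + SEQ_LEN])

# One-hot encode: X has shape (samples, SEQ_LEN, vocab), y (samples, vocab).
X = np.zeros((len(sequences), SEQ_LEN, len(chars)), dtype=np.float32)
y = np.zeros((len(sequences), len(chars)), dtype=np.float32)
for i, seq in enumerate(sequences):
    for t, c in enumerate(seq):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[targets[i]]] = 1.0
```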

Well, it might sound ridiculous that a proper sentence, let alone a quote, could come out of such inputs, but keep going and see the magic happen.

How the network learns

From a bunch of inputs defined as above, we want the machine to generalize to sequences it has never seen. Each input has 10 characters and 1 target character at the end, so we will use a many-to-one RNN (the other architectures are well explained by Karpathy) with Long Short-Term Memory (LSTM) cells.
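
A minimal many-to-one model in Keras might look like the following; the layer sizes and training settings are illustrative assumptions, not the exact configuration from the original repo:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Many-to-one: read SEQ_LEN one-hot characters, emit one probability
# distribution over the vocabulary for the next character.
model = Sequential([
    LSTM(128, input_shape=(SEQ_LEN, len(chars))),  # final hidden state only
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=20)
```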

An LSTM cell consists of several components:

LSTM cell, from the Deeplearning.ai course

Basically, the LSTM cell is favored in sequence models because it combines the current input (say, the 7th character) with processed information from the previous inputs (the 1st through 6th characters) in an intuitive way.

When the processed information and the current input are combined in the cell, the cell decides how much of the past information should be forgotten (in the forget gate), how much of the new information should be used (in the update gate), and computes the output of the cell itself (in the output gate), which is carried over to the next input.
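
In the notation of the Deeplearning.ai course (a standard formulation, not taken from this article), the three gates and the cell update can be written as:

```latex
\begin{aligned}
\Gamma_f &= \sigma\left(W_f\,[a^{\langle t-1\rangle}, x^{\langle t\rangle}] + b_f\right) && \text{(forget gate)} \\
\Gamma_u &= \sigma\left(W_u\,[a^{\langle t-1\rangle}, x^{\langle t\rangle}] + b_u\right) && \text{(update gate)} \\
\Gamma_o &= \sigma\left(W_o\,[a^{\langle t-1\rangle}, x^{\langle t\rangle}] + b_o\right) && \text{(output gate)} \\
\tilde{c}^{\langle t\rangle} &= \tanh\left(W_c\,[a^{\langle t-1\rangle}, x^{\langle t\rangle}] + b_c\right) \\
c^{\langle t\rangle} &= \Gamma_u \odot \tilde{c}^{\langle t\rangle} + \Gamma_f \odot c^{\langle t-1\rangle} \\
a^{\langle t\rangle} &= \Gamma_o \odot \tanh\left(c^{\langle t\rangle}\right)
\end{aligned}
```

Here a⟨t⟩ is the hidden state passed to the next step, c⟨t⟩ is the cell memory, and ⊙ is elementwise multiplication.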

If you are interested in the details of LSTM, Olah's blog is a good reference, or go straight to the original papers: Hochreiter (1991) [German] and Bengio, et al. (1994).

Now speak up!

After the network is trained, it is time to make it produce a full sentence (an inspirational one, hopefully). The steps are as follows, with a code sketch of the loop after the list:

  1. Start with 10 initial characters as our seed sequence; one or two randomly picked words will do.
  2. Feed the last 10 characters of the sequence into the RNN to get the probabilities of the candidate next characters.
  3. Sample the next character, weighted by those probabilities.
  4. Repeat steps 2 and 3 until we reach the end of a sentence.
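
Here is a sketch of that sampling loop, reusing the `model` and `char_to_idx` from earlier (the seed string is arbitrary, as long as it is SEQ_LEN characters):

```python
idx_to_char = {i: c for c, i in char_to_idx.items()}

def generate(seed, max_len=200):
    """Grow a sentence character by character from a SEQ_LEN-char seed."""
    generated = seed
    while len(generated) < max_len:
        # Step 2: encode the last SEQ_LEN characters, get next-char probabilities.
        x = np.zeros((1, SEQ_LEN, len(chars)), dtype=np.float32)
        for t, c in enumerate(generated[-SEQ_LEN:]):
            x[0, t, char_to_idx[c]] = 1.0
        probs = model.predict(x, verbose=0)[0]
        probs = probs / probs.sum()  # guard against floating-point rounding
        # Step 3: sample the next character, weighted by the probabilities.
        next_char = idx_to_char[np.random.choice(len(chars), p=probs)]
        generated += next_char
        # Step 4: stop at the end of a sentence.
        if next_char in ".!?":
            break
    return generated

print(generate("Love is a "))  # seed must be SEQ_LEN characters long
```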

Here are the results from an untrained network and a trained network.

I know the quote is kinda absurd

You can find the full code on my Github repo.

Poetry in Bahasa Indonesia

With a similar concept, we can generate poetry as well. This is my Github repo for generating Bahasa Indonesia poetry; it also includes a small dataset of famous Indonesian poems.

What can be improved

There are many improvements you can make to the quote generator:

  • Use word-level instead of character-level learning.
  • Use embeddings for the characters instead of one-hot vectors.
  • Use a larger network; a bidirectional LSTM stacked on another LSTM sounds cool! (See the sketch after this list.)
  • Implement beam search in the generation step.
  • Personalize quotes to the current mood (?).
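
As a hedged sketch of the embedding and larger-network ideas above (the layer sizes are illustrative, not tuned, and this is my illustration rather than the original repo's code):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

# Character embedding plus a bidirectional LSTM stacked on a plain LSTM.
# With an Embedding layer, inputs become integer character indices
# (shape: samples x SEQ_LEN) rather than one-hot vectors, and targets
# become integer indices to match the sparse loss below.
bigger = Sequential([
    Embedding(input_dim=len(chars), output_dim=32, input_length=SEQ_LEN),
    Bidirectional(LSTM(128, return_sequences=True)),
    LSTM(128),
    Dense(len(chars), activation="softmax"),
])
bigger.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
```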

References:

  • Karpathy, A. (2015). The Unreasonable Effectiveness of Recurrent Neural Networks.
  • Olah, C. (2015). Understanding LSTM Networks.
  • Hochreiter, S. (1991). Untersuchungen zu dynamischen neuronalen Netzen. Diploma thesis, TU München. [German]
  • Bengio, Y., Simard, P., & Frasconi, P. (1994). Learning Long-Term Dependencies with Gradient Descent is Difficult. IEEE Transactions on Neural Networks.

Cheers!
