Programming a Chatbot

Chatbots are very common nowadays. Visit almost any company website and a bubble appears in the corner of the page offering help with your questions. These bubbles are a form of chatbot. Many companies use them to personalize customer service and improve the support they can provide to customers.

But how do these chatbots work? What exactly do they entail?

These chatbots are built on a branch of artificial intelligence called natural language processing (NLP): using data sets to train a program to respond to certain kinds of questions or input.

Here are 6 basic steps to program a chatbot:

1. Upload the Data set

The first step to programming a chatbot is uploading a data set. Here is the link to the data set that I used: Chatbot Intents Dataset. You may also edit the data set to change the output for any input you would like.
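For reference, an intents file for this kind of bot is a JSON object holding a list of intents, each with a tag, some example patterns, and possible responses. The field names below match what the later code reads ('tag', 'patterns', 'responses', plus optional 'context_set'/'context_filter' used by the chat loop); the entries themselves are made-up examples, not the actual data set:

```python
import json

# Hypothetical minimal intents file -- the real data set linked above
# has many more intents, but the shape is the same.
intents_data = {
    "intents": [
        {
            "tag": "greeting",
            "patterns": ["Hi", "Hello", "Good morning"],
            "responses": ["Hello!", "Hi there."]
        },
        {
            "tag": "goodbye",
            "patterns": ["Bye", "See you later"],
            "responses": ["Goodbye!"]
        }
    ]
}

with open('intents.json', 'w') as f:
    json.dump(intents_data, f, indent=2)

# Load it back the same way the tutorial code does.
with open('intents.json') as f:
    intents = json.load(f)['intents']

print([intent['tag'] for intent in intents])  # ['greeting', 'goodbye']
```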

from google.colab import files
uploaded = files.upload()  # upload intents.json from your machine

import json

# Load the intents and pretty-print them to confirm the upload worked.
with open('intents.json') as file:
    intents = json.load(file, strict = False)
intents = intents['intents']
print(json.dumps(intents, indent = 2))

2. Import Libraries

This is where you import the libraries (the packages you would install with pip) that you will need in order to run the code on Google Colab, or whichever Python environment you use.

import tflearn
import random
import pickle
import numpy as np
import tensorflow as tf

3. Natural Language Processing

This is the section where NLP comes into play. NLP is a branch of AI that analyzes human language and “translates” it into a form code can work with. Two NLP concepts are particularly important for programming a chatbot: stemming and tokenization.

Stemming is the process of reducing words to a simpler base form so that the chatbot can treat related words the same way. For example, the word “loving” can be reduced to “love” and keep the same general meaning. This matters for chatbots because different people phrase the same request in widely varying ways.

Tokenization is the process of splitting a sentence into individual ‘tokens’, usually words. This helps the program treat each word of a sentence as a separate unit and makes it easy to work with individual words within a huge data set.

import nltk
nltk.download('all')

from nltk.stem.snowball import SnowballStemmer
stemmer = SnowballStemmer('english')

retrain_model = True

if retrain_model:
    all_words = []
    all_tags = []
    intent_patterns = []
    intent_tags = []

    # Tokenize every pattern and remember which tag it belongs to.
    for intent in intents:
        for pattern in intent['patterns']:
            words = nltk.word_tokenize(pattern)
            all_words.extend(words)
            intent_patterns.append(words)
            intent_tags.append(intent['tag'])
        all_tags.append(intent['tag'])

    # Stem, lowercase, and deduplicate the vocabulary.
    all_words = [stemmer.stem(word.lower()) for word in all_words]
    all_words = sorted(list(set(all_words)))
    all_tags = sorted(all_tags)

    x_train = []
    y_train = []
    y_empty = [0 for i in range(len(all_tags))]

    # Turn each pattern into a bag-of-words vector and a one-hot tag vector.
    for index, intent in enumerate(intent_patterns):
        bag_of_words = []
        intent_words = [stemmer.stem(word.lower()) for word in intent]
        for word in all_words:
            if word in intent_words:
                bag_of_words.append(1)
            else:
                bag_of_words.append(0)
        one_hot_encode_y = y_empty[:]
        one_hot_encode_y[all_tags.index(intent_tags[index])] = 1
        x_train.append(bag_of_words)
        y_train.append(one_hot_encode_y)

    x_train = np.array(x_train)
    y_train = np.array(y_train)

    # Cache the processed data so retraining is optional next time.
    with open('training_data.pickle', 'wb') as f:
        pickle.dump((all_words, all_tags, x_train, y_train), f)
else:
    with open('training_data.pickle', 'rb') as f:
        all_words, all_tags, x_train, y_train = pickle.load(f)
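To see what the training arrays actually hold, here is the same bag-of-words and one-hot encoding applied to a tiny made-up vocabulary (the words and tags are hypothetical, not from the real data set):

```python
import numpy as np

all_words = ['good', 'hello', 'morn']   # sorted, stemmed vocabulary
all_tags = ['goodbye', 'greeting']      # sorted intent tags

# Encode the tokenized, stemmed pattern ['hello', 'morn'] for tag 'greeting'.
pattern_words = ['hello', 'morn']
bag = [1 if word in pattern_words else 0 for word in all_words]

one_hot = [0] * len(all_tags)
one_hot[all_tags.index('greeting')] = 1

print(np.array(bag))      # [0 1 1]  -> which vocabulary words appear
print(np.array(one_hot))  # [0 1]    -> which tag this pattern belongs to
```

Every row of x_train is a vector like bag, and every row of y_train is a vector like one_hot.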

4. Deep Learning

This is where you create the neural network that trains on the data. At this step, your program runs through a series of epochs over the data set, measuring its accuracy and improving itself on each pass.

tf.reset_default_graph()  # on TensorFlow 2.x: tf.compat.v1.reset_default_graph()

# Two hidden layers of 8 neurons each, then a softmax output over the tags.
neural_net = tflearn.input_data(shape = [None, len(x_train[0])])
neural_net = tflearn.fully_connected(neural_net, 8)
neural_net = tflearn.fully_connected(neural_net, 8)
neural_net = tflearn.fully_connected(neural_net, len(y_train[0]), activation = 'softmax')
neural_net = tflearn.regression(neural_net)

model = tflearn.DNN(neural_net)

if retrain_model:
    model.fit(x_train, y_train, n_epoch = 2000, batch_size = 8, show_metric = True)
    model.save('model.tfl')
else:
    model.load('./model.tfl')
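The softmax layer at the end is what turns the network's raw output scores into the probability vector the chat loop thresholds against. As a quick standalone illustration of what softmax computes (plain NumPy here, not tflearn):

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability, then normalize
    # the exponentials so the outputs sum to 1 like probabilities.
    exps = np.exp(scores - np.max(scores))
    return exps / exps.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs.round(3))               # [0.659 0.242 0.099]
print(round(float(probs.sum()), 6)) # 1.0
```

The highest raw score always gets the highest probability, which is why the chat loop can use argmax to pick the predicted tag.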

5. Create the Chatbot

This is where you program how the chatbot responds to whatever intent the model predicts. In this case, the bot only answers if the neural network is more than 80% certain of the predicted intent, and it then picks one of that intent's responses at random. Otherwise it falls back to a default response.
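The 80% rule maps to a simple argmax-plus-threshold check. Here is that logic isolated on its own (the tag names and probability vectors are made up for illustration):

```python
import numpy as np

all_tags = ['goodbye', 'greeting', 'thanks']

def pick_tag(probabilities, threshold=0.8):
    # Return the most likely tag only when the model is confident enough;
    # None signals "fall back to a default response".
    index = int(np.argmax(probabilities))
    if probabilities[index] > threshold:
        return all_tags[index]
    return None

print(pick_tag([0.05, 0.90, 0.05]))  # greeting
print(pick_tag([0.40, 0.35, 0.25]))  # None -> default response
```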

def text_to_bag(text, all_words):
    # Convert raw user text into the same bag-of-words format the model trained on.
    bag_of_words = [0 for i in range(len(all_words))]
    text_words = nltk.word_tokenize(text)
    text_words = [stemmer.stem(word.lower()) for word in text_words]
    for word in text_words:
        if word in all_words:
            bag_of_words[all_words.index(word)] = 1
    return np.array(bag_of_words)

def chat():
    #Starting message
    print("Enter a message to talk to the bot [type quit to exit].")
    context_state = None
    default_responses = [
        'Sorry, Im not sure I know what you mean! You could try rephrasing that or saying something else!',
        'You confuse me human. Lets talk about something else.',
        'Im not sure what that means and I dont really care. Lets talk about something else',
        'I dont understand that! Try rephrasing or saying something else.'
    ]
    while True:
        user_chat = str(input('You: '))
        if user_chat.lower() == 'quit':
            break

        user_chat_bag = text_to_bag(user_chat, all_words)
        response = model.predict([user_chat_bag])[0]
        response_index = np.argmax(response)
        response_tag = all_tags[response_index]

        # Only answer when the model is more than 80% confident.
        if response[response_index] > 0.8:
            for intent in intents:
                if intent['tag'] == response_tag:
                    if 'context_filter' not in intent or intent['context_filter'] == context_state:
                        possible_responses = intent['responses']
                        if 'context_set' in intent:
                            context_state = intent['context_set']
                        else:
                            context_state = None
                        print(random.choice(possible_responses))
                    else:
                        print(random.choice(default_responses))
        else:
            print(random.choice(default_responses))

6. Chatting!

This is where you may begin the conversation with the chatbot! Here is an example of one of the first conversations I had with my chatbot. The chatbot in this case is particularly mean because it is based on Skynet from the Terminator; its personality comes entirely from the data set you choose. Chatbots can be programmed to help with services in many industries, provide better customer service, and more!

I based this project on:

Here’s a link to my git:

My name is Alyssa Gould, and I’m passionate about the intersection between Artificial Intelligence and Second Language Acquisition!

I’m excited to use my skills to find interesting opportunities this summer. If you have advice on interesting summer opportunities in NLP, I would love to connect with a quick 15 minute conversation!

Feel free to contact me at for questions or anything!


