How to build an FAQ Chatbot with API.AI using Node.js — and PHP

In Universal Chat, we’ve implemented a chatbot that anyone can train effortlessly. The logic is simple: just give it your questions and answers, and the bot will reply to any customer who asks something similar.

In this article, I’ll show you how we did it so you can do the same.

Choice of framework

To start, I tested many of the Machine Learning frameworks out there: Wit.AI, Microsoft Cognitive Services, and I had a peek at LUIS. For me, API.AI turned out to be the right choice. It was totally free with a good response rate, it had a great user interface for development, and it was easy to set up, get started with, and expand.

I’m not going to walk through basic development, since the existing documentation is already great. There is just one shortcoming in the Node.js SDK: none of the methods that involve a developer token have been implemented. So I created my own fork which solves this problem. Feel free to clone and use it.

Unique problems — and solutions

When you build a service that’s going to be used by many users, you want to make sure the information does not overlap. For instance, one user might add a question such as “Which ML framework is the best?” and set one answer, while another user might set a completely different one. The bot should understand which user is being queried and return the respective answer.

To solve this, we used context variables: by defining a context, we could keep the answers for each user and their clients on a separate thread.
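As a sketch of the idea (the “user-&lt;id&gt;” naming scheme below is an illustrative assumption, not necessarily what Universal Chat uses): derive a deterministic context name from each user’s id and pass it with every query, so API.AI only matches intents created under that user’s context.

```javascript
// Sketch: derive a per-user context object for API.AI queries.
// The 'user-<id>' naming scheme is an illustrative assumption.
function contextFor(userId) {
  return { name: 'user-' + userId, lifespan: 5 };
}

// Every query for this user carries the same context, so API.AI
// only matches intents created under that context.
const queryOptions = {
  sessionId: 'session-42', // hypothetical session id
  contexts: [contextFor('alice')]
};
console.log(queryOptions.contexts[0].name); // "user-alice"
```

Because the name is derived from the user id, two users can store the exact same question with different answers without ever colliding.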

Letting users create their own FAQ is pretty smart, and turning it into a conversational agent is just a matter of adding new intents and defining the bot’s responses. You can also enrich the experience by importing ready-made templates from the library, which give you a head start in your niche.

To create new intents, you’ll need the GitHub fork I mentioned earlier.

Just one more issue before we start: the chatbot will not be accurate all the time, so we need a way to fall back when that happens.

In Universal Chat, we handle this in two ways:

  1. For each answer, we allow visitors to escalate to a live operator. When visitors do that, we also note the question they asked so that the correct answer can be added to the database later.
  2. Sometimes the bot cannot find an answer at all; in that case it returns an “input.unknown” result. That’s another place to watch for missing questions and create them.
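The two fallbacks above can be sketched as a single routing function. The shape of `result` mirrors `response.result` from the API.AI SDK; the return value is a hypothetical shape of our own, not part of any library:

```javascript
// Sketch: route an API.AI result to an answer, or fall back.
// `result` mirrors the shape of response.result from the SDK.
function routeResult(result, question) {
  if (result.action === 'input.unknown' || !result.fulfillment.speech) {
    // No confident answer: escalate to a live operator and log the
    // question so the right answer can be added to the database later.
    return { escalate: true, logQuestion: question };
  }
  return { escalate: false, answer: result.fulfillment.speech };
}

const known = { action: 'faq.answer', fulfillment: { speech: 'Use contexts.' } };
const unknown = { action: 'input.unknown', fulfillment: { speech: '' } };
console.log(routeResult(known, 'q1').answer);     // "Use contexts."
console.log(routeResult(unknown, 'q2').escalate); // true
```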

So, now we have all the ingredients:

  • API.AI as the Machine Learning framework
  • A Node.js fork that can correctly create new intents
  • A roadmap on how to build it

Let’s get started. I won’t go into details of issues you can find elsewhere on the web, but I’ll show you the two main points we discussed above: calling within a context and creating intents.

Calling within a context

We’ll create this method for querying:

// app is an apiai instance: const app = require('apiai')('<client access token>');
function ask(text, options) {
  let apiaiRequest = app.textRequest(text, options);
  apiaiRequest.on('response', (response) => { console.log(response); });
  apiaiRequest.on('error', (error) => { console.log(error); });
  apiaiRequest.end(); // nothing is sent until end() is called
}

Then, you can query it like this:

ask('some question', {
  sessionId: 'some unique id for a session',
  contexts: [{ name: 'user 1' }]
});

That’s all; just replace the console.log calls with something more useful. On GitHub, I’ve shown how you can rewrite the above using promises for a cleaner alternative.
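As a sketch of that promise-based rewrite (assuming only that the request object emits 'response'/'error' events and is sent with .end(), as in the apiai SDK; the exact code in the fork may differ):

```javascript
// Sketch: wrap an apiai-style request (an event emitter with .end())
// in a promise, resolving on 'response' and rejecting on 'error'.
function send(request) {
  return new Promise((resolve, reject) => {
    request.on('response', resolve);
    request.on('error', reject);
    request.end(); // the SDK sends nothing until end() is called
  });
}

// Against the real SDK this would read:
//   send(app.textRequest('some question', options))
//     .then((response) => console.log(response.result));
```

The listeners are attached before end() is called, so no event can be missed even if the request completes synchronously.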

Creating new intents

To create a new intent, use the POST method. Again, all of this is well documented, so let’s get straight to the new stuff:

function createIntent(options) {
  return new Promise((resolve, reject) => {
    let request = app.intentPostRequest(options);
    request.on('response', (response) => resolve(response));
    request.on('error', (error) => reject(error));
    request.end();
  });
}

Now, call it with something like this:

var opts = {
  "name": "<the new question>",
  "auto": true,
  "templates": ["<the new question>"],
  "contexts": ["<the context>"],
  "userSays": [
    { "data": [{ "text": "<the new question>" }], "isTemplate": false, "count": 0 }
  ],
  "responses": [{ "speech": "<the answer>" }]
};

Important things to note:

  • Don’t forget auto: true — or you will not benefit from the magic of machine learning
  • You’ve probably noticed that I have used “the new question” several times — that’s OK and it works
  • You’ll also notice that we included the context here, which guarantees that the questions and answers are kept and served separately for each client
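Putting those notes together, a small helper can assemble the payload so every question-answer pair carries its owner’s context. `buildIntent` is a hypothetical name of our own, not part of the SDK:

```javascript
// Sketch: build the intent payload for one question/answer pair.
// buildIntent is a hypothetical helper, not part of the SDK.
function buildIntent(question, answer, context) {
  return {
    name: question,
    auto: true,               // enable ML matching for this intent
    templates: [question],
    contexts: [context],      // keeps this pair scoped to one client
    userSays: [
      { data: [{ text: question }], isTemplate: false, count: 0 }
    ],
    responses: [{ speech: answer }]
  };
}

const opts = buildIntent('What is your refund policy?',
                         '30 days, no questions asked.',
                         'user-alice');
console.log(opts.auto);                // true
console.log(opts.responses[0].speech); // "30 days, no questions asked."
```

The result can be passed straight to createIntent(opts).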

Bonus: creating intents with the PHP SDK

In case you’ve set up a website to manage these things, you can use the PHP SDK. Again, the original did not support creating intents either, so we added that in this fork. There is a self-explanatory sample showing how to do that in the repository, so I won’t bother you with it here.


Modern Machine Learning frameworks make it very easy to set up conversational agents; you just witnessed one built in less than 4 minutes. Just be sure to introduce proper fallbacks to save the bot’s skin when harder queries hit it.