Developing a Customer Support Chatbot with Botfuel QNA in 45 minutes

Kevin Adda
Published in Botfuel
7 min read · Dec 1, 2017

During Microsoft Experience ’17 in Paris, I set myself a challenge: teach an audience of developers how to build a customer support chatbot in 45 minutes, and still leave time for some questions at the end of the session. The use case is pretty simple: create a FAQ chatbot for a well-known driver platform.

To make it possible, I decided to use:

  • Botfuel QnA, our FAQ chatbot building interface, for its simplicity and its real-time performance assessment;
  • Microsoft Bot SDK, for its tooling and debugging interface. However, the same approach would work with any bot building framework of your choice.

Let’s jump right into it.

Bootstrap a chatbot with BotBuilder

Before we start, make sure you are able to open a terminal on your computer, create a folder, and navigate to it. If you are not comfortable with this, take a few minutes to find and read some material on the subject, specific to your OS.

Initialize a new node project

You will need to have Node.js installed before going any further. If you don’t, you will find setup instructions here.

In your command line and from your freshly created folder:

npm init

Create a server with restify

In your command line:

npm install --save restify

In its simplest form, a chatbot is an API with only one endpoint.

Create a new file called app.js. In this file, copy and paste:

const restify = require('restify');

// Create a server
const server = restify.createServer();
server.listen(3978, () => {
  console.log('%s listening to %s', server.name, server.url);
});

// Create route /api/messages to communicate with your chat interface
server.post('/api/messages', (req, res, next) => {
  res.send('OK');
  return next();
});

This code is all we need to create our bot’s endpoint. Let’s test it!

In your terminal, run it with the command:

node app.js

In another terminal, try to call our endpoint using cURL:

curl -X POST localhost:3978/api/messages

NB: you may need to install cURL to do this test.

Install and use Botbuilder

Botbuilder provides specifications for our endpoint. In your command line:

npm install --save botbuilder

We need to instantiate a connector to chat apps and a bot. In app.js, add:

const builder = require('botbuilder');

// Instantiate the BotBuilder connector and bot
// both parameters can be left undefined when testing with emulator
const connector = new builder.ChatConnector({
  appId: process.env.MICROSOFT_APP_ID,
  appPassword: process.env.MICROSOFT_APP_PASSWORD,
});

const bot = new builder.UniversalBot(connector);

The connector should be plugged into our endpoint to listen to incoming messages. In app.js, replace the previous route definition with:

server.post('/api/messages', connector.listen());

Let’s test our endpoint again with cURL:

curl -X POST localhost:3978/api/messages --data '{}'

You should see an error: this is intended. BotBuilder implements a protocol that our naive call does not respect. We’ll use the Bot Framework Emulator as a chat interface; it will take care of calling our endpoint with the right specifications.

Install the Bot Framework Emulator, launch it, and enter your endpoint URL (i.e. localhost:3978/api/messages) in the top bar.

Create a minimal dialog

NB: As the demo took place in Paris, the example bot is in French. However, Botfuel apps support English, French and Spanish.

BotBuilder bots rely on dialogs to answer calls. There can be one or many dialogs. In app.js, let’s create our root dialog:

bot.dialog('/', (session) => session.send('Bonjour!'));

You can test it on Bot Framework Emulator: whatever the message, the bot will answer the same.

Add a first level of complexity

Of course we don’t want our bot to give the same answer every time. In a single dialog context, our bot should answer based on the user message.

In app.js, replace our dialog with:

const greetings = ['bjr', 'bonjour', 'bonsoir', 'hello', 'coucou', 'enchante'];
const goodbyes = ['au revoir', 'bye', 'adieu', 'ciao'];

const isGreetings = text => {
  return greetings.some(word => word === text.toLowerCase());
};

const isGoodbye = text => {
  return goodbyes.some(word => word === text.toLowerCase());
};

bot.dialog('/', session => {
  if (isGreetings(session.message.text)) {
    session.send('Bonjour!');
  } else if (isGoodbye(session.message.text)) {
    session.send('Au revoir!');
  } else {
    session.send('Je ne comprends pas ...');
  }
});

Our bot is now able to distinguish greetings from goodbyes, based on the above arrays and simple matching tests. Test it!
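One limitation of this exact-match approach: apart from lowercasing, it ignores accents and punctuation, so a user typing “Enchanté !” will not match the entry “enchante”. A minimal sketch of a fix, with a hypothetical normalize helper (not part of the tutorial’s code) that strips diacritics and punctuation before matching:

```javascript
// Hypothetical helper: normalize user input before matching,
// so "Enchanté !" still matches the entry "enchante".
const normalize = text =>
  text
    .toLowerCase()
    .normalize('NFD')                 // split accented letters into base + diacritic
    .replace(/[\u0300-\u036f]/g, '')  // drop the diacritics
    .replace(/[^a-z\s]/g, '')         // drop punctuation and digits
    .trim();

const greetings = ['bjr', 'bonjour', 'bonsoir', 'hello', 'coucou', 'enchante'];
const isGreetings = text => greetings.some(word => word === normalize(text));

console.log(isGreetings('Enchanté !')); // true
```

This keeps the matching logic unchanged and only cleans the input; the corpus arrays can then stay in plain unaccented lowercase.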

Add Botfuel QnA service to your Bot

So our bot can already say hello and goodbye based on user messages. Integrating a FAQ is a bit different: user questions show more variation in vocabulary and syntax, and our predefined questions may be more similar to one another than greetings and goodbyes are. In our use case, the terms “driver” or “car” alone will not be enough to pick an answer: defining arrays and matching tests as above becomes a far more complex task.

Botfuel QnA does this job for you. Based on predefined questions and answers, as well as word corpuses (see below for more details), it builds a function that reads the user message and responds with the most probable answer.

Create a QnA bot

  • Create an account on Botfuel’s platform.
  • Create an app and select a language: in the rest of the article, the examples are in French, so choose French if you want to use the provided sentences.
  • On the app dashboard, scroll down to services, and click on the Botfuel QnA button.
  • On Botfuel QnA interface, add some questions together with their answers.

Here are a few questions and answers specific to our use case. Note that multiple formulations of the same question (or intent) can be added, while the answer is unique.

For a better experience with Node.js, we implemented a light SDK for Botfuel QnA API. To install it from your command line:

npm i --save botfuel-qna-sdk

In app.js, instantiate it with your credentials:

const QnA = require('botfuel-qna-sdk');

// Replace these placeholders with your own credentials
const appId = '<YOUR_APP_ID>';
const appKey = '<YOUR_APP_KEY>';

const QnAClient = new QnA({ appId, appKey });

NB: you will find your app credentials in the app dashboard.

With the SDK, calling Botfuel QnA classification API takes three lines of code. In app.js, add this simple method:

const answerWithQnA = (session, sentence) => {
  QnAClient.getMatchingQnas({ sentence }).then((response) => {
    session.send(response[0].answer);
  });
};

Notice that response is an array, and we are only considering the first element.
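Judging from how the response is used in the rest of this article, each element appears to carry at least a questions array (the stored formulations) and an answer string; the shape and contents below are an assumption for illustration, not the documented API schema:

```javascript
// Assumed shape of a Botfuel QnA response, inferred from how
// response[0].answer and answer.questions[0] are used in this article.
const mockResponse = [
  {
    questions: ['Comment devenir chauffeur ?'],
    answer: 'Inscrivez-vous sur la plateforme et fournissez vos documents.',
  },
];

// The array is assumed ordered by confidence, so the first element
// is the most probable match; an empty array means no match at all.
const bestAnswer = response => (response.length ? response[0].answer : null);

console.log(bestAnswer(mockResponse));
console.log(bestAnswer([])); // null
```

Guarding against an empty array this way avoids a crash when no predefined question matches, a case we handle more gracefully later in the article.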

Finally, update the dialog by adding this call. In your app.js, replace the dialog with:

bot.dialog('/', session => {
  if (isGreetings(session.message.text)) {
    session.send('Bonjour!');
  } else if (isGoodbye(session.message.text)) {
    session.send('Au revoir!');
  } else {
    answerWithQnA(session, session.message.text);
  }
});

We have our first version of a FAQ chatbot! From the QnA interface, you can add questions to see how the bot behaves.

In real-world applications, predefined questions are numerous and user questions may not be precise enough to single out one of them. Furthermore, spelling mistakes and user-specific vocabulary affect our bot’s performance. Let’s see how to tackle these issues.

From simple to real-world applications

Handling ambiguous questions

Remember that we considered only the first element of the Botfuel QnA response array. Botfuel QnA returns up to three intents when the probability gap between them is fairly low (the thresholds are learned, but you are free to apply your own threshold a posteriori). In most cases, you will want to rely on the user for clarification.

We can do this by creating another dialog. First, update your call to Botfuel QnA to initiate this new dialog when multiple responses are received. In app.js, replace answerWithQnA with:

const answerWithQnA = (session, sentence, callback) => {
  QnAClient.getMatchingQnas({ sentence }).then(response => {
    let message;
    if (response.length > 1) {
      session.replaceDialog('ambiguousQuestion', response);
    } else {
      if (response.length === 1) {
        message = response[0].answer;
      } else {
        message = 'Je ne sais pas quoi dire ... Réessayez?';
      }
      callback ? callback(message) : session.send(message);
    }
  });
};

In app.js, add the ambiguous question dialog:

bot.dialog('ambiguousQuestion', (session, response) => {
  if (response) {
    const msg = new builder.Message(session)
      .text('Je ne suis pas sûr de comprendre, précisez:')
      .suggestedActions(
        builder.SuggestedActions.create(
          session,
          response.map(answer =>
            builder.CardAction.postBack(
              session,
              `Vous avez demandé: "${answer.questions[0]}"\n\n -- \n\n${answer.answer}`,
              answer.questions[0]
            )
          )
        )
      );
    session.send(msg);
  } else {
    session.send(session.message.text);
    session.endConversation();
  }
});

This dialog is invoked every time Botfuel QnA returns multiple intents. It offers a choice to the user and answers according to the user’s choice, before ending the conversation.

Integrate spellchecking within your bot

In real-world situations, a simple typo in a discriminative word means your user won’t get an answer. For instance, in our use case, the word chauffeur matches questions that will not be matched if it is spelled chaufeur.

Botfuel offers NLP services including spell checking. As with Botfuel QnA, a light SDK is available to call the service.

To install the SDK from your command line:

npm install --save botfuel-nlp-sdk

In app.js, instantiate it with:

const { Spellchecking } = require('botfuel-nlp-sdk');

const spellchecker = new Spellchecking({ appId, appKey });

Then, replace answerWithQnA with:

const answerWithQnA = (session, sentence, callback) => {
  spellchecker.compute({ sentence, key: 'FR_1' }).then(spellchecked => {
    QnAClient.getMatchingQnas({
      sentence: spellchecked.correctSentence,
    }).then(response => {
      let message;
      if (response.length === 1) {
        message = response[0].answer;
      } else if (response.length > 1) {
        session.replaceDialog('ambiguousQuestion', response);
      } else {
        message = 'Je ne sais pas quoi dire ... Réessayez?';
      }
      if (!(response.length > 1)) {
        callback ? callback(message) : session.send(message);
      }
    });
  });
};

That’s it. You can test it by typing chaufeur again.

Use corpuses

Corpuses let you emphasize specific words and manually add synonyms for terms.

In our use case, the word chauffeur is both important and should be paired with synonyms such as conductor or even operator. Botfuel QnA already does some work on synonyms by using word embeddings, but this is usually not sufficient to discriminate one intent from another.
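How Botfuel applies corpuses internally isn’t shown here, but the idea can be illustrated with a hypothetical client-side sketch (not Botfuel’s implementation) that maps each synonym to a canonical term before classification:

```javascript
// Hypothetical illustration of the corpus idea: rewrite synonyms
// to a canonical term so all formulations hit the same intent.
const corpus = {
  chauffeur: ['conducteur', 'conductor', 'operator'],
};

const canonicalize = sentence => {
  let result = sentence.toLowerCase();
  for (const [canonical, synonyms] of Object.entries(corpus)) {
    for (const synonym of synonyms) {
      // \b works here because the synonyms are plain ASCII words
      result = result.replace(new RegExp(`\\b${synonym}\\b`, 'g'), canonical);
    }
  }
  return result;
};

console.log(canonicalize('Comment contacter mon conducteur ?'));
// "comment contacter mon chauffeur ?"
```

In Botfuel QnA itself this is configured through the interface rather than in code; the sketch only shows why collapsing synonyms onto one discriminative term helps the classifier.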

Conclusion

In less than an hour, we have created a chatbot, connected it to a connector service that supports most popular channels, built a knowledge base for our bot on Botfuel QnA that generates an intent classification model, plugged it into our bot, and fixed a number of caveats that appear in real-world scenarios.

Try Botfuel QnA for free
