ConverCon — Build Your Own Chatbot 🤖 (part 2 of 2)

Mary-Jane McBride
6 min read · Sep 5, 2018


This is part two of a two-part walkthrough — find part one here and the list of prerequisites here.

If you've been following along, you'll have completed the LUIS app for our ConverCon chatbot. In this part, we'll learn how to integrate LUIS into a chatbot to give it natural language understanding.

Setup

In our repo, open app.js. Notice the two imports we’re using:

var builder = require('botbuilder');
var restify = require('restify');
  • botbuilder — Microsoft's open-source SDK for programmatically building chatbots that can easily be deployed to a variety of channels, such as Facebook Messenger, Skype, or Slack
  • restify — a Node.js framework for building RESTful web services

The first thing we'll do is install these packages. If you're using Visual Studio Code, you can right-click the file and select 'Open in Terminal'. This will open a small terminal window within VS Code.

Run the following commands in the terminal to install botbuilder and restify, along with nodemon as a global tool that restarts the bot whenever you save a change:

npm install --save botbuilder
npm install --save restify
npm install -g nodemon
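With everything installed, you can start the bot from the same terminal at any point (app.js is the entry file in our repo):

nodemon app.js

As we'll see below, the bot listens locally on http://localhost:3978/api/messages, which is the endpoint to give the Bot Framework Emulator when you want to chat with it on your own machine.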

The next part of the code on the page is the ChatConnector — this is what enables your chatbot to communicate with the aforementioned channels. When you register a chatbot on Azure, you'll receive an App ID and App Password — this won't be covered in this workshop, but you can find instructions in Microsoft's Docs.

var connector = new builder.ChatConnector({
    appId: '',
    appPassword: ''
});

This project uses restify to create a RESTful service, but you could use another framework, such as Express. Here we set the port to 3978 and create a new route to listen for messages.

var server = restify.createServer();
server.listen(3978, function () {
    console.log('%s listening to %s', server.name, server.url);
});
server.post('/api/messages', connector.listen());

Next, we declare a new 'bot' of type 'UniversalBot' and set it to use MemoryBotStorage, which will manage session state for your users.

var inMemoryStorage = new builder.MemoryBotStorage();
var bot = new builder.UniversalBot(connector).set('storage', inMemoryStorage);

Integrating LUIS

In part one, we published a LUIS app and received an endpoint — let's add that endpoint to our chatbot so it can start understanding us!

Under the previous code, add the following snippet with the link to your endpoint. The onEnabled callback only enables the recognizer when the dialog stack is empty, so LUIS is used to route new requests rather than interrupting a dialog that's already in progress.

var recognizer = new builder.LuisRecognizer('[ENDPOINT URL HERE]')
    .onEnabled(function (context, callback) {
        var enabled = context.dialogStack().length == 0;
        callback(null, enabled);
    });
bot.recognizer(recognizer);

Notice that I've already added a few dialogs to our chatbot — check out the greeting dialog now.

bot.dialog('Greeting', function (session) {
    session.send('Hello, I\'m your virtual guide to ConverCon. Ask me anything!');
    session.endDialog();
}).triggerAction({
    matches: [/^hello*$|^hi*$/i]
});
Your chatbot can only respond to the keywords ‘hello’ or ‘hi’

It’s a simple greeting dialog that sends a message when the user uses the words ‘Hello’ or ‘Hi’. This is obviously very limited — what if I want to say ‘Hey’? Our chatbot won’t know how to respond. If you used the pre-made LUIS app in part one, then you already have a ‘Greeting’ intent that we can use here instead. In the triggerAction section of the code, replace the regex with a simple string ‘Greeting’. This will tell your chatbot to use this dialog if the top scoring intent is ‘Greeting’.

.triggerAction({
    matches: 'Greeting'
});

Now your chatbot can understand all kinds of greetings!

Now your chatbot can respond to a variety of greetings, including ones it’s never seen before

Challenge: Can you add the intents for the ‘About’ and ‘Sponsors’ dialogs?

Responding to the ‘About’ intent
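If you'd like a starting point for the challenge, a minimal 'About' dialog might look something like this (the reply text is just a placeholder, and it assumes your LUIS app has the 'About' intent from part one):

bot.dialog('About', function (session) {
    // Placeholder reply - swap in your own description of the conference
    session.endDialog('ConverCon is a conference all about conversational interfaces.');
}).triggerAction({
    matches: 'About'
});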

Creating a New Dialog

We'll create a new dialog for the 'Schedule' intent:

bot.dialog('Schedule', function (session) {
}).triggerAction({
    matches: 'Schedule'
});

This dialog doesn't do anything yet, but let's break down what's going on. The first parameter, 'Schedule', is the name of the dialog — it needs to be a unique name. The second parameter is a function (it can also be an array of functions — these are the steps of the dialog) that receives session, which manages the conversation with the user. Lastly, we've included a reference to the 'Schedule' intent in our LUIS app as the triggerAction.

But let’s give our dialog some meat… Add this snippet of code within the function of the ‘Schedule’ dialog.

var card = new builder.HeroCard(session)
    .title('ConverCon 2018 Schedule')
    .text('Here\'s a link to our full schedule...')
    .buttons([
        builder.CardAction.openUrl(session, 'https://www.convercon.ie/schedule/', 'See schedule')
    ]);
var msg = new builder.Message(session).addAttachment(card);
session.endDialog(msg);

This might seem like a lot, but I'll break it down. Botbuilder uses HeroCards to display media and add more interactive components, like buttons. Line 1 creates a HeroCard, line 2 gives the message a title, and line 3 gives it a body of text. The subsequent three lines add a CardAction (a button) that opens a URL when clicked — in this case the schedule page of the ConverCon website. Finally, we create a Message object, add the HeroCard as an attachment, and call session.endDialog, which sends the message and ends the dialog.

There are other types of CardActions, which you can read about in more detail here.

Our chatbot now has a button to send users to the conference’s schedule
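As a quick sketch of two of those other CardActions (these buttons aren't part of the workshop repo), imBack posts its value into the conversation as if the user had typed it, while postBack sends the value to the bot without showing it in the chat:

var quickReplies = new builder.HeroCard(session)
    .title('What else would you like to know?')
    .buttons([
        // imBack: the value appears in the chat as though the user typed it
        builder.CardAction.imBack(session, 'Buy tickets', 'Buy tickets'),
        // postBack: the value is sent to the bot but hidden from the conversation
        builder.CardAction.postBack(session, 'show sponsors', 'Sponsors')
    ]);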

Waterfall Dialogs

Remember when I said that a dialog can have an array of functions, one for each step of the conversation? That’s called a Waterfall Dialog.

bot.dialog('BuyTickets', [
    function (session, args) {
    },
    function (session, args) {
    }
]).triggerAction({
    matches: 'BuyTicket'
});

Copying the same bare dialog structure as 'Schedule', this time we pass an array of two functions instead of a single function.

Introducing the args parameter!

In the first step of a dialog, args holds the result from LUIS: the matched intent and any entities it found. In the steps that follow, args holds the user's response to the previous step, so the second function of the 'BuyTickets' dialog will receive whatever the user types after the first step prompts them.

In the first function, copy the following code…

var noTickets = builder.EntityRecognizer.findEntity(args.intent.entities, 'builtin.number');
if (noTickets) {
    session.endDialog('Ordering you %s tickets', noTickets.entity);
} else {
    builder.Prompts.text(session, 'How many tickets would you like?');
}

This time, we're using the EntityRecognizer object to pull out any entities that LUIS identified in the utterance. Remember in part one when we labelled the numbers in the 'BuyTicket' utterances?

Because the number entity is a built-in entity, we have to refer to it as builtin.number; custom entities don't require this prefix.
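For example, looking up a custom entity works the same way, just without the prefix (the 'TicketType' entity below is hypothetical and isn't in our LUIS app):

// 'TicketType' is a hypothetical custom entity, so no 'builtin.' prefix is needed
var ticketType = builder.EntityRecognizer.findEntity(args.intent.entities, 'TicketType');
if (ticketType) {
    session.send('You asked about %s tickets', ticketType.entity);
}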

And for times when the user doesn't specify the number of tickets they'd like (so the returned entity is null), we add a Prompt asking the user how many tickets they want. Otherwise, we end the dialog with a message. (At this point you would connect your chatbot to a service that handles the payment — check out Adaptive Cards to see how you could add rich cards and create forms.)

Our chatbot can pick out important entities from messages sent by the user
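As a side note (this isn't part of the workshop code), botbuilder also has typed prompts. Prompts.number keeps re-asking until the user replies with something numeric, which suits a ticket count nicely:

// Alternative to Prompts.text: only accepts numeric replies,
// so args.response in the next step is guaranteed to be a number
builder.Prompts.number(session, 'How many tickets would you like?');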

This leads into the next function, or conversation step, which will receive the response from the Prompt.

Copy the following code into the second function in this dialog…

session.endDialog('Ordering you %s tickets', args.response);

Using the args parameter, we can retrieve the response from the user.

Our chatbot now recognises when there’s missing information and can ask for it
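Putting the pieces together, the finished 'BuyTickets' waterfall looks like this:

bot.dialog('BuyTickets', [
    function (session, args) {
        // Check whether LUIS already spotted a number in the user's message
        var noTickets = builder.EntityRecognizer.findEntity(args.intent.entities, 'builtin.number');
        if (noTickets) {
            session.endDialog('Ordering you %s tickets', noTickets.entity);
        } else {
            builder.Prompts.text(session, 'How many tickets would you like?');
        }
    },
    function (session, args) {
        // args.response holds the user's answer to the prompt above
        session.endDialog('Ordering you %s tickets', args.response);
    }
]).triggerAction({
    matches: 'BuyTicket'
});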

Further Reading

And that’s it! You’ve developed a chatbot that can act as a virtual guide to the ConverCon conference in no time!

Thanks for reading!


Mary-Jane McBride

A Software Engineer from Belfast, Northern Ireland - Twitter: @a_crafty_coder