The ‘What’-‘When’ design pattern for Salesforce Einstein Bots

Sebastiaan de Man (Lightning-seb) · Sep 11, 2020 · 3 min read

Using Entity Extraction to fine-tune Intent Recognition in chatbots

The fun thing about using an Enterprise Chatbot way outside its comfort zone is that it makes you think about design patterns for conversations a lot more. Take for example the innocent question ‘What are we going to do tomorrow?’ It’s easy enough to throw a bunch of utterances at it and make an Intent of it, but then comes the related question ‘When are we going to go to the zoo?’, and its cousin ‘Are we going to the toy store tomorrow?’

In a sales executive scenario these are equivalent to ‘Which customers do I see tomorrow?’, ‘Do I have a meeting scheduled with Acme soon?’, and ‘Can I go to the car dealer tomorrow?’

From an NLU perspective they are very close, but the answers in the dialog (fulfillment, for Dialogflow admins) are different.

If we look at what is being asked, there are two Entities we can extract in our Salesforce Einstein bot: one is the Date (when) and the other is the Activity (what).

What are we going to do tomorrow (when)?

When are we going to the zoo (what)?

Are we going to the toy store (what) tomorrow (when)?

As you can see, in the first two examples one Entity is filled in while the other is left empty, and in the last one both are filled in! That gives us an option to make a decision in the Bot Builder!
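To see the pattern at a glance, here is a minimal sketch (plain Python, purely illustrative; in Einstein Bots the extraction itself is configured declaratively, and the slot names what and when are just the variable names from the examples above) of what the NLU hands us for each utterance:

```python
# Illustrative only: Einstein entity extraction is configured in Setup,
# not written in code. This just models the three example utterances.
extraction_examples = {
    "What are we going to do tomorrow?":       {"what": None,        "when": "tomorrow"},
    "When are we going to the zoo?":           {"what": "zoo",       "when": None},
    "Are we going to the toy store tomorrow?": {"what": "toy store", "when": "tomorrow"},
}

for utterance, slots in extraction_examples.items():
    print(f"{utterance!r} -> {slots}")
```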

A cool feature that is not very well publicized is that Entity Extraction happens before the dialog is called, which means we can make decisions about where to send the conversation before we try to fill the variables. Have a look at this example:

We decide on routing before we ask any questions

We fill the what and when variables using entity extraction, but in our dialog design we make decisions based on their content before we even get to the question!

So this means we can have a single Intent (Activities) and branch our conversation out for all scenarios.
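In pseudocode, the branching rule boils down to checking which variables arrived pre-filled. This is a hedged sketch rather than Einstein Bot syntax (the branch names are made up); in the Bot Builder you would express the same thing with rule steps that test whether each variable is set:

```python
from typing import Optional

def route_activity_dialog(what: Optional[str], when: Optional[str]) -> str:
    """Pick a branch of the single 'Activities' Intent based on which
    entities the NLU pre-filled. Branch names are hypothetical."""
    if what and when:
        # 'Are we going to the toy store tomorrow?' -> confirm yes/no
        return "confirm_activity_on_date"
    if when:
        # 'What are we going to do tomorrow?' -> list that day's plans
        return "list_activities_for_date"
    if what:
        # 'When are we going to the zoo?' -> look up the planned date
        return "find_date_for_activity"
    # Neither extracted: fall back to asking what the user means
    return "ask_clarifying_question"
```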

Pretty cool! No more confused Intents, and (with some extra variable checking and messages) we get the following in the map:

And finally we get a conversation where the kids can ask about activities, ask when they happen, and suggest new ones as well:

Yeah, I copied the bot to make it easier for English readers. I failed.

Using this design pattern we fine-tune our Intent recognition with Entity Extraction: basically, we shape the conversation by what is (and is not) being said.
