Chatbot Design System: Part 1. Introduction and Knowing your Tools

Daan Luttik
Chatbot Design Systems

--

This blog is the first of a series on systems for chatbot design. It is not about design choices like “should my chatbot be serious or funny?” or “should my chatbot act human, or clearly be a bot?”. There is already a lot of material available on those topics, and different sources give conflicting advice. Instead, I will focus on systems that help you with:

  • Implementing your design.
  • Spotting weaknesses in your design.
  • Comparing the implementation with the design.
  • Updating the design based on user interaction.

Proper design of websites requires tools like Figma and Adobe XD to transfer the design to engineers in a concrete manner. Proper websites also require analytics to check if the design actually works in an intended way. Systems to support, validate, and improve a design are even more important for chatbots since chatbots are less established than websites and design principles aren’t set in stone yet.

Design systems to support and validate design!

Chatbot design is a field where the people behind the software, the data, and the content need to work in tight conjunction. As a data scientist and software engineer, I love to think about these systems and how they can help reach goals and facilitate creativity. But I do need the people behind the content (in my case, marketers) to help sculpt sentences (and sometimes flows), and they, in turn, need me to guide their creative power.

This article is an introduction. In the next one, we will start answering the questions above.

Step zero: Know your tools

First, it is important to know how your platform works. When you start to create systems that support your design process, you need to be able to think about the concepts of your tools as building blocks; you cannot afford to be distracted by questions about the particulars of those concepts.

Don’t get distracted by the specifics of a tool!

I did get distracted by the particulars of how some concepts work, and some of the answers weren’t even present in the documentation. I use Dialogflow, but even if you don’t, you can still benefit from the systems I will share and the discussion I hope to spark. I also suggest you skim the questions that distracted me and think about which questions you need answered before you start designing, so that you don’t get distracted during the process.

Does the input context of an intent merely prevent the intent from being called without that context, or does it also confer priority on the intent with the “best fitting” input context for the given state?

In my testing, I found that priority is given to the intent with the best-fitting context. Additionally, the priority resulting from context overrules the manually set ‘intent priority’ (the dot next to the intent name).

However, there is an exception to this rule: Dialogflow will prefer any matching intent to a fallback intent, even if the context of this fallback intent perfectly matches the given context.
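The rules observed above can be sketched as a small simulation. To be clear: this is not Dialogflow’s actual implementation or API, just a toy model of the behaviour my testing suggested, with made-up intent and context names.

```python
# Hypothetical simulation of the observed matching rules -- not Dialogflow code.
from dataclasses import dataclass, field

@dataclass
class Intent:
    name: str
    input_contexts: set = field(default_factory=set)
    priority: int = 500000          # stand-in for the manually set intent priority
    is_fallback: bool = False

def pick_intent(candidates, active_contexts):
    # 1. Only intents whose input contexts are all active are eligible.
    eligible = [i for i in candidates if i.input_contexts <= active_contexts]
    if not eligible:
        return None
    # 2. Any matching regular intent beats a fallback intent.
    regular = [i for i in eligible if not i.is_fallback]
    pool = regular or eligible
    # 3. Context fit (number of matching input contexts) overrules manual priority.
    return max(pool, key=lambda i: (len(i.input_contexts), i.priority))

intents = [
    Intent("generic_help"),
    Intent("order_help", {"ordering"}, priority=250000),
    Intent("ordering_fallback", {"ordering"}, is_fallback=True),
]
print(pick_intent(intents, {"ordering"}).name)  # order_help
```

With the “ordering” context active, `order_help` wins despite its lower manual priority, and the fallback loses even though its context matches perfectly.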

How are parameters handled by contexts? Are they handed to the next context as well?

Parameters are not automatically passed on to the next context. However, rather than maintaining a large set of messy contexts, you can manually transfer a parameter from one context to the next.

For instance, say you want to move parameter x from input context A to output context B. In the parameter section of the intent, add a new parameter with the notation #A.x (essentially “get parameter x from context A”) in the value field; the ‘new’ parameter that takes this value will then be forwarded from the input context to the output context.
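The mechanics of that trick can be mimicked in a few lines. Again, this is only a toy model of the #context.parameter notation described above, with illustrative names (A, B, x), not Dialogflow’s API.

```python
# Toy model of transferring a parameter between contexts via '#A.x' notation.
def resolve_value(value, active_contexts):
    """Resolve a '#context.parameter' reference against the active contexts."""
    if value.startswith("#"):
        ctx, param = value[1:].split(".", 1)
        return active_contexts.get(ctx, {}).get(param)
    return value  # a literal value, used as-is

def apply_intent(active_contexts, intent_params, output_context):
    """Fill the intent's parameters and attach them to its output context."""
    resolved = {name: resolve_value(v, active_contexts)
                for name, v in intent_params.items()}
    return {**active_contexts, output_context: resolved}

contexts = {"A": {"x": "5 km"}}
contexts = apply_intent(contexts, {"x": "#A.x"}, output_context="B")
print(contexts["B"]["x"])  # 5 km
```

The parameter x now lives in context B as well, without the intent having to know its value ahead of time.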

Should we opt for one big intent and fill empty parameters with prompts, or should we have a separate intent for every question?

Parameter prompts (see the example below) do have a clear-cut use case. However, for a more complex list of questions, you will find that both approaches can produce an equivalent result. The issue is that Dialogflow’s documentation does not address this question at all.

Example: if you ask “How far are you willing to travel?”, you might expect an answer like “5 km”, containing a number and a unit. If the user answers only “5”, you can ask for the unit with a parameter prompt. You shouldn’t design a chatbot where these are two separate questions.

My suggestion: by default, split questions into multiple intents where possible; you will know when this is wrong (e.g. the clear-cut use case mentioned above). Multiple intents give you more control and guide your thinking better (e.g. about how to handle an unexpected answer).
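The parameter-prompt pattern for the distance example can be sketched as follows. The prompts and the entity parsing here are simplified stand-ins I made up for illustration, not Dialogflow’s slot-filling machinery.

```python
# Minimal sketch of the parameter-prompt pattern for "How far will you travel?".
import re

def parse_distance(text):
    """Extract (number, unit) from an answer like '5 km'; either may be None."""
    number = re.search(r"\d+(\.\d+)?", text)
    unit = re.search(r"\b(km|mi|miles?|kilometers?)\b", text)
    return (number.group() if number else None,
            unit.group() if unit else None)

def distance_intent(answer, ask):
    """One intent with prompts: re-ask only for whichever slot is missing."""
    number, unit = parse_distance(answer)
    while number is None:
        number, _ = parse_distance(ask("How far is that?"))
    while unit is None:
        _, unit = parse_distance(ask("In which unit, km or miles?"))
    return number, unit

# Simulated user who answered "5" and supplies the unit when prompted.
print(distance_intent("5", ask=lambda prompt: "km"))  # ('5', 'km')
```

Both slots belong to one intent here because “5” and “km” are halves of a single answer; a genuinely separate question (say, the travel destination) would, per the suggestion above, get its own intent.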

Finally

Now that we have these particulars out of the way, we can start with the real innovation. Go to the next blog here, and follow the publication for updates. And if you have ideas of your own, please join in on the discussion.
