Lessons learned while designing a Chatbot
There is a lot of material available on how to design a Conversational User Interface, or CUI for short. This story is about finding answers when all that available material doesn't help.
I recently had the chance to design a conversational user interface in the form of a chatbot. At first, I was super excited: finally, a chance to actually apply my theoretical knowledge in practice. The difference between theory and practice became clearer and clearer once I started working on the chatbot. In this story, I would like to share some of my key learnings from designing it.
The client approached us with a brief to design a CUI for a chat-based digital assistant, to be integrated into the existing companion app for their physical product in order to enhance the customer experience. (I can't reveal much more; #NDA.) And so the journey of learning to design a chatbot began.
First Stop: Lesson 1
Chatbot vs. Voice-Based Assistant
Even though chatbots and voice-based assistants both fall under the field of Conversational User Interfaces (CUI), they use different modalities to communicate. Most voice-based assistants take a voice-first approach and give little or no visual feedback, whereas chatbots are primarily text-based and hence mainly visual. This difference changes the whole communication flow and experience. Think of ordering a pizza over the phone versus online: the process is the same, but the way you confirm the selected pizza and the price of the order changes.
This difference definitely came in handy while designing conversations for the bot. We were at liberty to use media elements such as videos, audio clips, and web links, as well as graphical interface elements like form controls and lists. This also enabled us to avoid asking repetitive questions by simply showing users actionable options. While it was easy to use graphical interface elements, we had to make sure their use didn't break the standard chat experience. One of the benefits of a CUI is that users already know how to converse and don't have to learn new interaction patterns, and that benefit had to be preserved.
One more benefit of chatbots is that they readily offer a history of the conversation. Users have access to all previous messages, so they can retrace the conversation if they need to, which is difficult in a voice-only interaction. It also lets users pick up a previous conversation and finish a task they left halfway, which is not possible with voice assistants.
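To make this idea a bit more concrete, here is a minimal sketch in TypeScript of how a chat message model can carry both plain text and richer, actionable elements while the transcript stays an ordered, replayable history. It is purely illustrative; none of these type or field names come from our actual project.

```typescript
// Illustrative only: a message can be plain text, a media element, or a set
// of actionable options (quick replies), all living in the same transcript.
type MessageContent =
  | { kind: "text"; text: string }
  | { kind: "media"; mediaType: "video" | "audio" | "link"; url: string }
  | { kind: "options"; prompt: string; options: string[] };

interface ChatMessage {
  id: string;
  sender: "bot" | "user";
  sentAt: Date;
  content: MessageContent;
}

// Because the whole conversation is just an ordered list of messages, users
// can scroll back through it, and an unfinished task can be resumed later by
// reloading the same history.
const transcript: ChatMessage[] = [
  {
    id: "1",
    sender: "bot",
    sentAt: new Date(),
    content: { kind: "text", text: "Hi! Ready to set up your device?" },
  },
  {
    id: "2",
    sender: "bot",
    sentAt: new Date(),
    content: {
      kind: "options",
      prompt: "Where would you like to start?",
      options: ["Unboxing", "Connecting to Wi-Fi", "Troubleshooting"],
    },
  },
];

// Rendering stays a normal chat list; rich elements are just another bubble type.
for (const message of transcript) {
  console.log(`${message.sender}: ${JSON.stringify(message.content)}`);
}
```

The point of keeping rich elements inside ordinary messages, rather than as separate screens, is exactly the one above: the interaction still reads as a conversation, and the history remains intact.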
Second Stop: Lesson 2
Messenger Bot vs. Custom Bot
As a new way to achieve customer centricity, more and more businesses are introducing chatbots for popular messenger platforms like Facebook Messenger, Slack, WhatsApp, Telegram, and Skype. These bots are faster to implement, have a wider customer reach, and work really well in most cases. Of course, not every business has the same requirements. Our client, for example, needed a custom bot for their existing app. Designing a custom chatbot meant we didn't have to abide by the standard templates provided by most messenger platforms. That meant freedom in design, but at the same time most prototyping platforms offer little support for custom design elements.
Prototyping the chatbot was one of the biggest challenges we faced in this project. At first, it was all fun and games. We started by writing the script and acting it out. It was really enjoyable, and in the end we had a good base for building a more detailed conversation. We then made a paper prototype, which helped us refine the conversation and decide which kinds of graphical elements to use in the chatbot.
Trouble started when it was time to go digital. Even though conversational user interfaces are a trending topic right now, it is really surprising how little freedom we have when prototyping them.
Several tools provide templates for common bot platforms like Facebook Messenger, Slack, and WhatsApp. Since we were designing a custom bot, these templates were not useful to us. We tried several state-of-the-art tools, including Botmock, Botpreview, Botsociety, and Landbot.io. While every tool could simulate a conversation, most lacked a proper on-device testing feature and support for creating custom graphic design elements. On-device testing was crucial for us because we wanted to test the conversation with real users, and since ours was a custom bot, some of its design elements differed from those of the standard bot platforms.
Of all the tools we tried, we found Botsociety to fit our requirements most closely, and they provided good support. Still, we had to change our design in order to build and test the prototype. Not having a way to test the design concept exactly as it was conceptualized was disappointing. I still wonder whether implementing certain graphical elements would have made the experience better or worse.
Third Stop: Lesson 3
Tone of Voice: Great Job 👍 vs. Good Job
When our client approached us, they already had a persona for the chatbot in mind. They wanted the digital assistant to be perceived as a friendly coach, one who would pat you on the shoulder after you successfully completed a task. So we created the tone of voice with that coach persona in mind. Following it, our bot used emojis and positive feedback to show appreciation. And then we tested our prototype.
In our first round of testing, we found that while a few test participants received the emojis well, most of them didn't like them. Not all participants liked getting positive feedback from a bot.
“I know, I did good. I don’t need a machine to tell me I did a great job.”
— Test person A
“Oh, that’s too American!”
— Test person B
(We tested our chatbot with 8 participants in Berlin, as our client was a German company and we too are based in Berlin, and the tone of voice was perceived as "very American".)
In our second user test, we used a more neutral tone of voice, removing most of the emojis and providing more subtle positive feedback. It was well received by all the test participants. No one missed the emojis, and everyone still found the conversation friendly.
What we learned from the testing is that, depending on their cultural background, people may have different levels of acceptance for the way a digital assistant talks. A good approach is to A/B test different tones of voice and see which one is better accepted by the target audience.
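As a rough sketch of what such an A/B test can look like in practice (TypeScript, with entirely made-up copy and names, not taken from our project), you can keep two tone variants for each bot message, assign each user to one variant deterministically, and later compare how each group rates the conversation:

```typescript
// Illustrative sketch: two tone-of-voice variants for the same bot message.
const taskCompletedCopy = {
  enthusiastic: "Great job! 👍 You nailed it!",
  neutral: "Task completed. You can move on to the next step.",
};

type Tone = keyof typeof taskCompletedCopy;

// Assign each user to a tone deterministically (a simple hash of the user id),
// so they always see the same variant across sessions.
function toneFor(userId: string): Tone {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "enthusiastic" : "neutral";
}

// Usage: pick the copy for this user and record which variant they saw, so
// satisfaction ratings can later be compared per variant.
const userId = "user-42";
const tone = toneFor(userId);
console.log(`[variant=${tone}] ${taskCompletedCopy[tone]}`);
```

The deterministic assignment matters: a user who sees the enthusiastic coach in one session and the neutral one in the next would make the comparison between groups meaningless.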
Also, GIFs and emojis are not as universally accepted as we would like to believe. A lot of the material we came across mentioned that using emojis makes the conversation more natural and fluid, but I would rather say: test it first. It may not feel as natural to users as we think, and it could lead to annoyance on the users' part.
To Conclude
Of course, we learned many more lessons while working on this project. But to conclude, I would say that, as in every customer-centric design process, testing is crucial when designing a CUI. Test as early as possible, refine, and test again. Testing is what assures the desired customer experience.
Work in a team. Build conversations with teammates. Get feedback from others. All these things help enhance the conversation.
Be aware that most of the material available online consists of best practices and guidelines for designing a CUI. Follow them, but also depart from them when it makes sense.
Be very careful when using emojis and GIFs. Not everyone perceives them in the same way, and cultural differences may lead to miscommunication and frustration.
Here are links to some helpful articles to get started:
Cheat sheet for Chatbot design
On Voice and Tone for the chatbots and voice interfaces: