Conversation Design for a Financial Voice Assistant Built with Artificial Intelligence

Using A.I. to ask about personal finance

When interacting with a chatbot, more often than not people expect the bot to acknowledge that it’s a bot, not a human, and they don’t want the chatbot trying too hard to act human. However, from the research we did at Clinc, when people talk to a financial A.I. that actually understands context instead of analyzing keywords, they do expect the conversation to feel more like talking to a human.


When we talk about A.I., what are we really talking about?

Similar to Siri, Cortana, and Ok Google, A.I. in this article refers to products or APIs that use machine learning, natural language processing, voice recognition, neuroscience, and UX design to understand voice input and context, and to turn data insights into better decisions.

The goal of this article is to design conversations with a financial A.I. that allow people to talk in unconstrained, unbounded ways instead of phrasing questions in restricted forms.

JARVIS (Slashgear)

Why do people want to use A.I.? Why don’t people want to use A.I.?

In the innovation adoption curve, early adopters, the early majority, and the late majority make up about 81.5% of the total population in developed countries. Most people don’t know what they want until they are exposed to innovations and people around them start adopting new technology. At the beginning, it was a challenge for our design team to dig deeper into user needs, as conversation with a financial A.I. is still novel to many people. But we soon realized we were designing for people instead of users, and people share many similarities. The challenge shifted to discovering the fundamental values shared among people.

Reasons why people want to use AI:

  • People want to save time, increase productivity, and have fun interactions.
  • People want to buy technology for the future of their children.
  • People want to feel smart and in control, and to build their public image by adopting advanced technology.
  • People are curious about AI and want to try it.

Reasons why people don’t want to use AI:

  • When the A.I. doesn’t understand what people are saying, interacting with it turns into a waste of time.
  • People have to phrase questions in certain ways but don’t know how.
  • People have concerns about privacy and security, especially when it comes to their finances and health care.
  • People don’t trust the technology, as they think it changes too fast and lacks stability.

How do people want to converse with a financial A.I.?

Hearing

Gender

A significant number of people prefer an A.I. with a female voice. In our surveys, 90% of participants preferred a female voice, 30% preferred a male voice, and 20% did not really care about gender. Digging deeper, two participants indicated that if the A.I. had a gender, a male A.I. would give them a bad impression because of the movie ‘Terminator’.

Default Value of Voice Reply

Voice replies containing personal info should be off by default. 90% of participants who took the survey keep their phone either muted or below half volume; 50% of participants prefer voice reply off by default, 30% don’t care about the default, and 20% prefer it on. In personal spaces, people don’t care whether voice reply is on or off by default, while in more public spaces like cafes, people prefer to mute the voice reply when searching for personal information like finances and locations, but have no significant preference for voice replies about web info like food recommendations or weather.

Moreover, technology should be accessible to all people, including people who are deaf or hard of hearing. Therefore, when designing a voice assistant, always give users options to display text and adjust the volume on their devices.


Speaking

People pause, rephrase, and think about questions when talking to A.I. all the time. The future of A.I. should allow people to talk in unbounded and unconstrained ways. There is demand for research on how people speak with A.I.: how fast or slow (in seconds, within a 95% confidence interval) people speak, how people alter their speaking speed when thinking about questions to ask or when speaking in public versus private contexts, how often people correct their questions, and how long people normally talk to an A.I.

Therefore, we invited five users and videotaped them as they talked to our A.I. product, Finie. Contrary to our assumption that the likelihood of rephrasing questions depends on how well people speak the language (e.g., that native English speakers would rephrase less often than non-native speakers), we found that personality and education affect how likely people are to rephrase. Thus it’s important for an A.I. to analyze the whole sentence and context instead of identifying keywords. For example, as of today, if I ask Siri:

‘The temperature is so hot in Chicago today, how is the weather in Ann Arbor?’

Siri should return the weather in Ann Arbor instead of Chicago.

It’s important for AI to analyze the whole sentence and context instead of identifying keywords.
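As a toy illustration of the difference, here is a minimal sketch (hypothetical code, not Siri’s or Clinc’s actual pipeline) contrasting naive keyword matching with resolving the city that actually appears inside the question clause:

```python
import re

def keyword_match(utterance, cities):
    """Naive keyword approach: return the first known city mentioned
    anywhere in the utterance, regardless of sentence structure."""
    for city in cities:
        if city in utterance:
            return city
    return None

def context_match(utterance, cities):
    """Context-aware approach: resolve the city inside the question
    clause ('how is the weather in X') rather than the first city
    mentioned, falling back to keywords if no clause is found."""
    m = re.search(r"weather in ([A-Z][\w ]+)", utterance)
    if m:
        candidate = m.group(1).strip()
        if candidate in cities:
            return candidate
    return keyword_match(utterance, cities)

cities = ["Chicago", "Ann Arbor"]
q = "The temperature is so hot in Chicago today, how is the weather in Ann Arbor?"
print(keyword_match(q, cities))   # → Chicago (the keyword bot answers the wrong city)
print(context_match(q, cities))   # → Ann Arbor (the context-aware bot answers the question)
```

A real system would use a trained language model rather than a regular expression, but the contrast is the point: the keyword matcher latches onto the first city it sees, while the context-aware version answers the question that was actually asked.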

Seeing

Color Scheme

From our user interviews, 8 out of 10 people chose light blue or pure red when they thought of A.I. We invited 10 participants and asked them:

‘When we talk about A.I., which movie characters do you think of?’

There were many interesting answers: Jarvis, Her, HAL 9000 from 2001: A Space Odyssey, Sonny, Cortana from Halo, GLaDOS from Portal, and Pat from Smart House. When participants were asked to choose shapes (circles/rectangles/triangles) and colors to represent A.I., 8 out of 10 chose Jarvis Red, 6 out of 10 chose Cortana Blue, and 7 out of 10 picked a circle.

Jarvis Red

Gestures and Thumb Zones

Mapping gestures to current user behaviors can reduce the learning curve when people interact with new technology. For example, people can hold the screen to select an object, then double-tap or spread to zoom in, slide right to drill into details (as in Siri), and slide down to exit (as in Elevate). When interacting with an A.I., it’s important to offer gesture control in addition to voice control, in case voice control fails.

Meanwhile, the Samsung Galaxy S6 allows users to type with only one hand within the natural thumb zone; in a competitive market, small improvements like this can significantly enlarge the user base. For A.I. design, it will impress users if they can complete tasks smoothly while totally hands-free or using only one hand.

Moreover, similar to the back button on the Samsung Galaxy S7, it is convenient for people to wake the A.I. through hardware buttons as well as voice. Admittedly, the challenge with voice wake-up is that when people say ‘Ok Google’, nearby Android phones will all be triggered.


Practices for designing conversation with a financial A.I., not a chatbot

1. Categorize User Group

Categorize the large user groups when digging deep into ways of talking: people who are deaf, the elderly, people who are drunk, people who have a cold, etc. When designing technology for people, give them options to adjust font size, volume, notifications, and colors.

2. Onboarding for Conversation

When creating onboarding to teach first-time and returning users how to ask questions, there are three common approaches:

  1. splash screens showing what the application does;
  2. list of questions that people can ask on welcome screen;
  3. step-by-step guides of sample questions.

Then we tested with users and found that people always skipped the splash pages, except one tester, a designer, who appreciated the nice graphics on them. It is also very easy for users to forget what questions they can ask; they kept trying to phrase their questions in certain ways until we instructed them to ask more freely.

Things I learnt from designing onboarding for conversation:

  • Test with users early
  • People always skip onboarding, try to mess with the A.I., and then ask for help
  • Create personalized questions that cater to people’s needs
  • Design for the moment

3. Conversational UI

Conversation with an A.I. is one-to-one or many-to-one: one person or a group of people interacting with the artificial intelligence. We are accustomed to conversational UIs where incoming messages appear on the left and our messages appear on the right.

Things I learnt from designing conversational UI for A.I.:

  • No need to show conversation names in the navigation;
  • No need to mark who has seen the message, as many-to-one conversation UIs do;
  • We can remove the avatar and the send/receive timestamps;
  • Consider the whole context of the conversation (device, time, location, tone, and question length), then respond appropriately;
  • Leverage more gestures for people to drill in and out.

4. Out-of-scope questions

WIP prototype

Similar to a 404 page, when the A.I. can’t understand what people are asking, it’s important to find the balance in design where the A.I. genuinely tries to answer the question instead of pretending to be smart.

Things I learnt from designing out-of-scope questions:

  • Provide relevant questions that guide users back on track
  • Explain why the A.I. broke
  • Tell users how they can fix the breakdown themselves
  • Ask users to wait and try again later
  • Let users contact the company to fix it

5. Recognition vs Recall

Instead of making people think about what questions they should ask, the A.I. can also notify users about things they should pay attention to. For example, Finie is an A.I. for personal finance, so when a large transaction exceeds the normal amount, Finie notifies the user instead of waiting for the user to check their transactions.
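A minimal sketch of that idea (illustrative only; Finie’s actual detection logic is not public): flag incoming transactions that exceed the user’s typical range, here defined as the mean plus two standard deviations of recent history:

```python
from statistics import mean, stdev

def unusual_transactions(history, new_transactions, n_sigma=2.0):
    """Return the new transactions worth proactively notifying about:
    those larger than mean + n_sigma * stdev of past amounts."""
    threshold = mean(history) + n_sigma * stdev(history)
    return [t for t in new_transactions if t > threshold]

history = [12.50, 8.99, 45.00, 23.10, 9.75, 31.40]   # recent charges
incoming = [14.25, 250.00]
for amount in unusual_transactions(history, incoming):
    print(f"Heads up: a ${amount:.2f} charge is larger than usual.")
```

The threshold rule and the two-sigma cutoff are my own assumptions for illustration; the point is the recognition-over-recall pattern: surface the anomaly instead of waiting for the user to think to ask about it.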

Things I learnt from recognition and recall:

  • Phrase notifications concisely and straight to the point.
  • Customize notifications based on location and time. For example, a notification on Friday evening can be more informal than one on Monday morning, and after 12 a.m. notifications should use a more polite and gentle tone.
  • Recommend next questions when people finish their first conversation with AI.
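The time- and day-based tone rules above could be sketched like this (the specific cutoffs and tone labels are my own assumptions, not product rules):

```python
from datetime import datetime

def notification_tone(now):
    """Pick a notification tone from the time of day and day of week.
    weekday(): Monday == 0 ... Friday == 4."""
    if 0 <= now.hour < 6:                       # after midnight: polite and gentle
        return "gentle"
    if now.weekday() == 4 and now.hour >= 17:   # Friday evening: more informal
        return "informal"
    if now.weekday() == 0 and now.hour < 12:    # Monday morning: keep it formal
        return "formal"
    return "neutral"

print(notification_tone(datetime(2017, 6, 2, 19, 0)))  # a Friday at 7pm → informal
```

In practice the tone table would be driven by user research (and location, per the bullet above), but even a lookup this simple keeps a Friday-night alert from sounding like a Monday-morning memo.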

6. Personal and natural ways of conversation

Talking to an A.I. is different from talking to a human, as there is less visual feedback indicating whether the machine understood the question. People also constantly pause to think about questions, rephrase, repeat, and go back and forth in conversation. Thus it is a challenge for A.I. to learn and adapt to the diverse ways people have conversations.


I am a product designer working on a financial voice assistant built with conversational A.I. I write about things I learnt when designing conversational A.I. on an awesome A.I. team. See Twitter for updates or read more about financial A.I. on Forbes.