What it is, how it can combine with conversation design, and where it will go next.
Multimodal design is the art and science of creating user interfaces that combine different modalities across multiple touch points. For example, you could use your voice as an input mechanism and see the output on a graphical user interface, such as a tablet. Instead of relying on just one modality, multimodal design includes all types of input and output.
Input and output
Let’s start with the basics — input and output modalities.
Think of input modality as any method of interacting with a system. This could be through your mouse, keyboard or touchpad. Other options include speech, gestures and even physiological events such as an increase in your heart rate. Input modality allows users to provide information to a system, ideally in the most efficient and effortless way possible.
An example of a fairly straightforward input modality is a Voice User Interface (VUI), such as Google Assistant or Siri. It’s simple — all you need to do is ask. This ask is commonly referred to as the intent.
You would then expect the system to respond with information or an action. The response can come via a Graphical User Interface (GUI), audio output or even AI-enabled speech, such as from a smart home device.
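To make the intent-to-response flow concrete, here is a minimal sketch of how a voice assistant might map a transcribed request (the intent) to an output. The intent names and keyword matching are illustrative assumptions, not how Google Assistant or Siri actually work.

```python
def recognise_intent(utterance: str) -> str:
    """Naively match keywords in the transcribed speech to an intent."""
    text = utterance.lower()
    if "weather" in text:
        return "get_weather"
    if "light" in text:
        return "toggle_lights"
    return "unknown"

def respond(intent: str) -> str:
    """Produce the output the user hears or sees for each intent."""
    responses = {
        "get_weather": "It's 22°C and sunny today.",
        "toggle_lights": "Okay, toggling the lights.",
        "unknown": "Sorry, I didn't catch that.",
    }
    return responses[intent]

print(respond(recognise_intent("What's the weather like?")))
```

Real assistants replace the keyword matching with machine-learned intent classification, but the input-to-output shape is the same.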
Think like a designer by thinking like an economist
Economists often ask these three questions:
- What goods or services are we going to produce?
- Who are we producing these goods or services for?
- How are we producing these goods or services?
These are very broad questions, and deliberately so: they leave plenty of room for creative, practical solutions.
We can use this economist mindset in design thinking by asking those same questions but in a design context:
- What is viable from a business perspective?
- What is valuable from a user’s perspective?
- What is feasible from a development perspective?
Imagine — we want to help the visually impaired access our banking services. Some creative solutions may include voice and braille tablets, or a physical button that can slow down or speed up the system’s speech as it reads the display aloud to the user. We can explore modalities beyond GUIs to help our users get the best possible banking experience, while making sure they’re viable for the bank and can be feasibly developed.
Smart living: an example of simple multimodal design
You may produce a lot of input, whether explicitly or implicitly, which can influence your experience. Take, for example, your location.
Imagine you arrive home after a long day at work. Your location (input) informs your smart home system, which starts a sequence of automated actions to welcome you home, like turning on the lights and playing music on the radio (output).
Next, you could use a remote control or simply ask the AI-enabled assistant to lower the temperature of your air-conditioning (input). It does exactly as you ask and lets you know (output).
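The sequence above can be sketched as a simple rule table mapping input modalities to output actions. The device names and event handling here are illustrative assumptions, not a real smart-home API.

```python
def handle_input(modality: str, value: str) -> list[str]:
    """Map an input event (location, voice, etc.) to output actions."""
    actions = []
    if modality == "location" and value == "home":
        # Implicit input: arriving home triggers a welcome sequence.
        actions += ["turn on lights", "play radio"]
    elif modality == "voice" and "lower the temperature" in value:
        # Explicit input: a spoken request, confirmed back to the user.
        actions += ["lower aircon temperature", "confirm via speech"]
    return actions

print(handle_input("location", "home"))
print(handle_input("voice", "please lower the temperature"))
```

Note that the same output pipeline serves both an implicit modality (location) and an explicit one (voice), which is what makes the experience feel seamless.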
As designers, we should consider how we can make life easier for our users by incorporating and automating such actions through different modalities in a way that feels intuitive and easy. That way, we can design simple but seamless experiences for our users.
Health management: an example of complex multimodal design
It’s not just the smart home or retail industries that can leverage multimodal design. Others, like healthcare, could benefit from designing multimodal experiences.
We’re now able to capture even more input from the user, such as their stress levels, heart rate, sleep cycles, water intake and more. Application Programming Interfaces (APIs) can be connected to analyse these inputs, inform the user of their health and provide advice or recommendations (outputs). Services such as Fitbit could evaluate your health through your heart rate (input) and tell you to slow down when your heart rate is increasing too quickly by beeping or vibrating (output).
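The wearable example can be sketched as a loop that analyses successive heart-rate readings (input) and emits a vibration alert (output) when the rate climbs too quickly. The threshold value is an assumption for illustration, not a clinical figure.

```python
RISE_THRESHOLD = 15  # assumed max healthy jump in bpm between readings

def check_heart_rate(readings: list[int]) -> list[str]:
    """Return an output event for each reading that rises too fast."""
    alerts = []
    for previous, current in zip(readings, readings[1:]):
        if current - previous > RISE_THRESHOLD:
            alerts.append(f"vibrate: heart rate jumped to {current} bpm")
    return alerts

print(check_heart_rate([72, 75, 95, 97]))
```

Even this toy version shows why the analysis step matters: the output depends not on any single reading but on the pattern across readings.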
Even though the modalities mentioned in this example are fairly limited, there’s a lot happening behind the scenes, as the input needs to be analysed before it produces an output. It can become even more complex when a user starts taking home remedies and medications, which may affect their physiological inputs and, therefore, the output.
Conversation design
Conversation design is arguably the most human way we can interact with a system. After all, we grew up communicating with others through our native languages.
While conversation design primarily focuses on text and speech interactions, it shouldn’t be limited to them. For example, it can apply to chatbots, which should be able to use dynamic graphics to provide the best possible response.
Even though user experience (UX) traditionally focuses on the visual experience, we should not underestimate the impact that conversation design can have. That’s because speaking to a human or a capable AI-assisted bot can help a user achieve an action more quickly, without having to understand an interface or navigate their way through the information architecture of a system. All you’d need to do is ask.
Conversation design will pioneer a new way for us as humans to interact with systems. And in an ideal future, you would be able to enjoy better user experiences and services based on input you’ve provided actively and passively before, without having to ask.
Making predictions based on history
We’ve discussed explicit input modalities such as speech and keyboard. But the analysis of a user’s past behaviour can be another equally important input.
Banks are able to identify which profiles may carry higher risks than others, based on the customer’s history. While this information isn’t strictly classified as a modality, it can be vital in helping banks provide the financial support that their customers need. That way, banks can better help their customers reach their financial goals or understand which customers may need more tailored financial assistance.
Even though such a use case may be more educational, it’s a perfect fit for multimodal design, where we aim to use the right channels to reach our customers and create the best possible banking experience for them.
What’s next for multimodal design
Chatbots and VUIs, along with other technologies, will greatly impact the future of user experiences. As we develop and refine these technologies, we’ll be able to improve the way we interpret and fulfil our users’ intents. And in the near future, systems will be able to recognise intents even without the user explicitly stating them.
Therefore, it’s essential for designers to think outside the (chat)box, and explore design solutions that go beyond a screen.
Frederik Goossens is the Lead Design Manager for Conversational Banking. He leads a design team in Hong Kong and mainland China as part of our wider “Digital as a Channel” team. You can find him on Medium here: https://medium.com/@frederikg
Disclaimer: No reliance should be placed on the content. The views in this article are the author’s own and may not represent HSBC.
© Copyright. The Hongkong and Shanghai Banking Corporation Limited 2020. All rights reserved