In Your (Inter)face

Designing interfaces that allow us to engage with technology in more natural ways

Hannes Van de Velde
Shift ‘17
6 min read · Dec 20, 2017


Humans and computers unfortunately don’t speak the same language, or at least not yet. We need interfaces to communicate with each other. Interfaces capture our input, translate it for the computer, let the computer perform the requested actions, and feed the results of those actions back to us in a human-readable format.

Adapting to the limits of technology

For the past four decades, we’ve been using the Graphical User Interface as the main paradigm for interacting with computers. But moving a mouse or our fingers across a glass screen, clicking and navigating our way through lists and menus, fundamentally doesn’t come naturally to us.

Our natural way of gaining information, completing tasks, and fulfilling needs is to engage all of our senses in a multi-dimensional environment. We use our eyes to visually observe the world around us, our ears to capture sounds and listen to other people, our voice to express ourselves, and our hands and bodies to feel and handle objects.

Technology is a human-made tool. So as technology and its interfaces evolve, we should steer that evolution so that technology adapts to us, rather than us adapting to technology.

Moving beyond the screen

A series of recent technological advances are allowing us to interact with technology in more natural ways.

Image source — https://i.ytimg.com/vi/xWH8IkOnNjM/maxresdefault.jpg

Technologies like augmented and virtual reality, voice commands and conversational interfaces, gestural controls and haptic feedback allow us to extend our interfaces beyond the screen and facilitate more natural and intuitive interactions.

Leveraging those emerging technologies, we can start building interfaces that are closer to our natural behaviours by engaging more of our senses.

Seeing

Augmented reality allows us to add a digital interface layer on top of the real world, taking away the metaphorical abstraction of purely graphical interfaces. You perform actions and tasks by looking at the real world, with digital information added on top of it, rather than looking at purely digital representations.

Speaking

Virtual assistants like Siri and Google Assistant have already made us familiar with using our voice to interact with technology. Devices like the Amazon Echo, Google Home and Apple HomePod, along with countless chatbots, radically do away with the visual interface and use voice both as input (people talking to the device) and output (the device talking to people).

Feeling

3D Touch and other forms of haptic feedback can recreate the sense of physically touching something through subtle forces, vibrations or motions. It’s this technology that allowed Apple to remove the physical home button on the iPhone X, while still giving you the illusion that you’re pressing that same physical button you were used to. So even though the interface you’re controlling is virtual, you will physically feel natural feedback.

Being

Virtual reality can immerse your whole mind, body and all of your senses into the virtual experience. It is your entire presence and movement that controls your actions. By walking around in the room, looking at things or touching things with hand controllers, you control the natural interface simply by being in it.

Devices adapting to our needs

In an article he wrote in 2011, Bill Gates said the following:

“Until now, we have always had to adapt to the limits of technology and conform the way we work with computers to a set of arbitrary conventions and procedures.

With natural user interfaces (NUI), computing devices will adapt to our needs and preferences for the first time and humans will begin to use technology in whatever way is most comfortable and natural for us.”

With that, he captures the essential advantage of natural user interfaces (NUI) over previous interfaces: natural interfaces exploit the intrinsic skills we have acquired through a lifetime of living in the ‘real world’. By doing so, they reduce the learning curve and cognitive load and minimise distraction. Ultimately, NUIs allow us to obtain digital information or complete tasks more quickly and more seamlessly.

Getting started with NUIs

We’re already seeing great examples of digital products whose interfaces are designed in more natural ways.

A prototype we made for Telenet, where we used AR to help people install their Digicorder

One example: augmented reality is changing the way we assemble and repair things. With technologies like Google Glass, the HoloLens or even your iPhone, instructions can now be projected on top of the actual object we are assembling or repairing, allowing us to focus on the task at hand instead of constantly having to shift our attention between the object and the instruction manual.

While NUIs may one day be seen as one of the radical shifts in how we interact with technology, making your current product or interface more natural should be a gradual process rather than a radical one. Simply adopting one of the enabling technologies, like voice, won’t make your interface natural. The technology is a means to an end, not the goal itself.

To build interfaces that genuinely feel more natural to use, we can follow a set of guidelines and best practices.

  1. Aim for instant expertise
    For an interface to feel natural, it has to match a user’s context and skills. If our digital interfaces can leverage our intrinsic skills, we save users the trouble of having to learn the specific UI of our product. The only learning curve that remains is discovering which of their existing skills to apply to control the interface.
  2. Allow for progressive learning
    Beyond instant expertise, natural interfaces should enhance our ability to express and perceive complex ideas and perform complex tasks. However natural our basic interactions may be, as the tasks we complete using technology become more complex, we will have to provide a clear learning path for users. Just as we learned how to walk before learning how to run, we will have to break down complex digital tasks into a set of easier subtasks. Experienced users should be able to perform a complex task in one go (“Hey Siri, I’d like to order a medium pepperoni pizza from PizzaCompany with ASAP delivery to my home address”), while more novice users can gradually work their way through the different subtasks to accomplish the same goal (“Hey Siri, I’d like to order a pizza.” — “Here is a list of companies you can order pizza from: …” — “I’d like to order from PizzaCompany” …)
  3. Provide direct & contextual interaction
    When interacting with the physical world, we’re used to direct and contextual feedback. NUIs require the same direct correlation between action (by the user) and reaction (by the interface). Interacting with a virtual object while receiving immediate haptic feedback can create the natural sensation of physically touching something. Turn-by-turn navigation starts to feel a lot more natural when the interface is projected onto the real world through augmented reality rather than onto a virtual map. And when we have a spoken conversation with a computer, we expect it to answer with the same directness as a human if we want that conversation to feel natural. Besides direct interaction, NUIs should always keep the user’s context in mind. No interface can be natural in all contexts or to all users. Having a spoken conversation with a computer might feel natural when you’re alone, but could feel unnatural in a crowded environment. So while many of the building blocks of natural interfaces can feel natural in some contexts, they might feel very out of place if they don’t match the user’s expertise or context.
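The expert-versus-novice pizza conversation above can be sketched as a small slot-filling loop, a common pattern behind conversational interfaces. This is a minimal illustration, not a real assistant API: the slot names, prompts, and helper functions are all hypothetical. An expert utterance fills every slot at once, so no follow-up is needed; a novice gets prompted for each missing slot in turn.

```python
# Hypothetical slot-filling sketch of "progressive learning":
# one shared goal (a pizza order), reachable in one expert utterance
# or through a sequence of simpler follow-up questions.

REQUIRED_SLOTS = ["vendor", "item", "size", "delivery"]

# Follow-up question for each slot (illustrative wording).
PROMPTS = {
    "vendor": "Which company would you like to order from?",
    "item": "What would you like to order?",
    "size": "Which size?",
    "delivery": "When and where should it be delivered?",
}

def merge_slots(known, extracted):
    """Fold slot values extracted from the latest utterance into the dialog state."""
    merged = dict(known)
    merged.update({k: v for k, v in extracted.items() if v is not None})
    return merged

def next_prompt(known):
    """Return the question for the first missing slot, or None when the task is complete."""
    for slot in REQUIRED_SLOTS:
        if slot not in known:
            return PROMPTS[slot]
    return None  # all slots filled: the order can be placed in one go
```

An expert's single utterance yields a state with all four slots, so `next_prompt` returns `None` immediately; a novice who only says "I'd like to order a pizza" fills just the `item` slot and is walked through the remaining questions one at a time, which is exactly the gradual learning path the guideline describes.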

When applied well, NUIs make interacting with technology fundamentally easier, allowing us to spend less time navigating interfaces and more time actually getting things done.

And if we succeed, maybe one day humans and computers will speak the same language.

Originally published at inthepocket.mobi.

Hannes Van de Velde
Director of Product Design @itpocket — Ghent, Belgium