Zero UI, User Experience Design, and How Context Matters

Ariana Ocampo
Published in This is Design
6 min read · Feb 23, 2018

Last year, I attended STHLM Tech Fest. While I sat in on many interesting panels there, one still stands out most vividly half a year down the line: the Future of Living, which featured speakers presenting their ideas for what is known as the smart city. The smart city is a vision of future living that takes our current understanding of the Internet of Things (IoT) and dials it up to the maximum — an entirely integrated, city-wide system that assists us in our daily tasks from the home, to the commute, to the office, and back again. It’s a vision in which self-driving cars come to pick us up without having to be called, in which our work is transferred from the backseat of the car into the office without us having to save and upload it, and in which our groceries are delivered to our doors exactly when we need them, without us having to place an order.

When talking about the smart city, we’re talking about designing a system that goes far beyond us interacting with an interface on a screen — a system that responds to our voices, our gestures, our looks, and even our thoughts. In tech circles, this new design paradigm is called Zero UI. It refers to designing systems and products that not only meet our needs, but that learn how to anticipate and predict them, through the use of things such as machine learning and artificial intelligence.

While the kind of widespread, advanced systems described by the panelists at STHLM Tech Fest seems like a far-away future utopia, cities around the world are already investing in smart-city infrastructure. In fact, many of the building blocks needed to implement it are already around today — and they are gaining popularity by the second. For example, one of the most frequently discussed cases of Zero UI is voice interfaces, with smart home speakers such as Amazon Echo, Google Home and Apple’s HomePod currently dominating conversation within the tech space.

Other alternatives to traditional touch screens, such as haptic and ambient interfaces, have been around for a while. One of the most talked-about trends of the past couple of years — smart wearables — is poised to change how we use natural human movement to interact with the technology around us. Before that, we had the Microsoft Kinect, which, while commercially unsuccessful as a gaming peripheral, gained widespread use within the fields of marketing and academic research.

As for ambient interfaces, companies such as Flic have been enabling their users to build their own small-scale automated systems using a combination of Bluetooth smart buttons and a mobile phone.

Today, Zero UI is being implemented across the board in useful, efficient ways that present many advantages over traditional touch screens. Voice commands not only provide an easier, quicker way for users to access information and services on demand, but they are also more accessible to people with disabilities, such as visual impairment. At the same time, there are examples of alternative interfaces that either don’t work as advertised or don’t live up to the hype around them. Besides the aforementioned Kinect, a notable example of the latter is the now notorious Google Glass. Google Glass was released to the public in 2014 and was widely criticised over issues around privacy and safety. However, it was announced in 2017 that the product would be re-released as an Enterprise Edition, presumably in reaction to the many use cases of Augmented Reality by then established in sectors such as manufacturing, health care and aviation.

The context in which technology emerges matters. Innovation and progress come not only as a result of us developing the technology to achieve them, but also because we, as a species, change our expectations of what technology is supposed to do. In an era of the IoT, where previously unconnected people, objects and places are now linked together, we want more than individual products and services that respond to our commands in an expected manner. We want fully integrated systems working together in harmony, creating a seamless, effortless and indeed entirely frictionless experience. As complicated as designing for screens can be, moving towards Zero UI means that the use cases we design for become more complicated still, and require us to take into account more complex ideas around human thought, emotion and behaviour.

Andy Goodman, who coined the term Zero UI, has said that the phrase does not actually refer to a future in which we have literally no visual user interfaces — rather, it was meant as a way to provoke designers, who are traditionally more concerned with how things look than with how they work. Even in my own field of User Experience Design, much of what we talk about focuses on what is happening on the screen or in the device itself. We discuss common design patterns, best practices, haptic feedback and other tools and tricks that help us design products in the right way.

However, as touched upon earlier, technology — and indeed design — is only as good as the context in which it is used. We should not only be concerned with designing products in the right way, but with designing the right products for the right situation — and that applies to all products, regardless of how cutting-edge they are. As bona fide technophiles, we risk falling into our own preconceived notions of what innovative technology is and falling victim to the hype around it, unless we are able to look past the product itself and into how the product is used. After all, our job is to design experiences. That’s why we’re called User Experience Designers.

As an industry, we are currently having amazing conversations around Zero UI, the future of integrated systems and what that means for being a designer. But I would like to argue that we should already be designing by taking more complex ideas around human thought, emotion and behaviour into account. We might be moving into an era of alternative user interfaces which, by their very nature, demand that we take a more holistic approach to design, but our devices have never existed in a vacuum. They are already a part of a larger whole because they are a part of our lives. And it is that larger context, including all of the complicated human motivations that go along with it, that we should be taking into consideration regardless of what interface we design for — whether it be for screens, voice, or something that hasn’t even been invented yet.

— Ariana Ocampo, UX/UI Designer

Here at Glance, we wholeheartedly believe in embracing a holistic approach to product design. If this sounds like the right thing for you, please contact us to discuss how we can help you not only build your product right, but help you build the right product — for the right context.


UX Strategist, #GeekGirl and lover of superheroes. I like ethical technology, elegant solutions and social justice.