Exploring the Future of UI

Greg Cavanagh
Published in ELSE
Jul 7, 2016

With the pace of technology advancement showing no signs of abating, what does the future of UI look like? How will it affect the job prospects of experience designers, and will we ever be able to stand on a station platform without reaching for our phones?

Last month our experience design team gathered at the Prospect of Whitby to explore these burning issues.

So first, let’s take a step back. What do we mean by UI? As experience designers we tend to think of it in terms of the interfaces we design, but in reality any interaction or communication with a machine is a user interface: whether we turn on the washing machine by hand, via an app or by voice, it is all UI.

Our general consensus is that we’re still struggling with the adoption of voice activation. However convenient it may be, we are encumbered by our need for privacy and still feel discomfort at speaking aloud to inanimate objects, particularly in public. Yet there’s no denying that voice will continue to get much bigger, and for now at least, for specific tasks in the home we can see the relevance of Amazon Alexa, freeing you from tapping on a device display; there’s an interesting review here of how this can work in practice.


Beyond voice or tapping on a glass screen, how will interaction develop? For us, the most likely direction is simplifying interactions to make tasks as easy and intuitive as possible.

We see advances in more sensory and three-dimensional technology as interesting developments.

Less than three years ago we saw MIT’s Media Lab unveil displays made from atoms rather than pixels, allowing users to ‘touch’ from hundreds of miles away. How long until we see this technology become commonplace?

We like the idea of gestures and movements creating shortcuts, harking right back to the 80s idea of turning on the lights or turning up the volume by clapping.

We’d like to know how algorithms will develop to monitor our behaviour and trigger actions: when I sit down on the sofa and cross my legs, do I want the music on? Can algorithms learn from our behaviour to predict our needs? A toy sketch of that idea follows.
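As a purely illustrative sketch of what “learning from our behaviour” could mean, here is a toy frequency-based predictor in TypeScript. The context and action labels, the thresholds and the whole approach are hypothetical; a real system would need far richer sensing and modelling.

```typescript
// Toy sketch: count which action usually follows a context and suggest it.
// Context and action labels ("sofa:evening", "music:on") are hypothetical.

type Context = string;
type Action = string;

class HabitPredictor {
  // counts[context][action] = how many times the action followed the context
  private counts = new Map<Context, Map<Action, number>>();

  // Record that an action was performed shortly after a context occurred.
  observe(context: Context, action: Action): void {
    const actions = this.counts.get(context) ?? new Map<Action, number>();
    actions.set(action, (actions.get(action) ?? 0) + 1);
    this.counts.set(context, actions);
  }

  // Suggest an action once it has followed the context often enough.
  predict(context: Context, minShare = 0.8, minSamples = 5): Action | null {
    const actions = this.counts.get(context);
    if (!actions) return null;
    const entries = Array.from(actions.entries());
    const total = entries.reduce((sum, [, n]) => sum + n, 0);
    if (total < minSamples) return null;
    const hit = entries.find(([, n]) => n / total >= minShare);
    return hit ? hit[0] : null;
  }
}

// After enough evenings of "sit on the sofa, then put the music on"...
const predictor = new HabitPredictor();
for (let i = 0; i < 6; i++) predictor.observe("sofa:evening", "music:on");
console.log(predictor.predict("sofa:evening")); // "music:on"
```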

But what about inappropriate technology? Some things have needlessly defaulted to an app, such as opening your car door, when arguably the car key is still the most efficient tool for the job.

Touch is a great advancement, but remember mobile phones with buttons? When you could write a message without looking at your phone because you could feel the keys?

Haptic feedback, the ability for devices to give users tactile feedback (by means of vibration, motion or other forces), will likely play a big part in future touch technology, by adding a layer of interaction that got lost with the introduction of touch screens.
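As a small example of how a touch interface might add that haptic layer back, the sketch below uses the browser Vibration API (navigator.vibrate). Support is limited to some mobile browsers, and the pattern values and the `button.key` selector are just illustrative choices, not a recommendation from the original piece.

```typescript
// Sketch: give a short "key press" buzz when an on-screen key is tapped.
// navigator.vibrate is ignored (or absent) on devices without a vibration motor.

function hapticTap(pattern: number | number[] = 15): void {
  if (typeof navigator !== "undefined" && "vibrate" in navigator) {
    navigator.vibrate(pattern); // milliseconds of vibration (or an on/off pattern)
  }
}

// Attach a brief tick to every on-screen key, restoring a hint of physical feel.
document.querySelectorAll<HTMLButtonElement>("button.key").forEach((key) => {
  key.addEventListener("pointerdown", () => hapticTap(15));
});
```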

The lapel clip camera ‘Get Narrative’ is an interesting development. It takes images every 30 seconds throughout the day, then filters a selection of those with the best lighting and subject matter for you to approve.
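To make “filters a selection of those with the best lighting” concrete, here is a hypothetical sketch of that kind of scoring pass. The Photo fields, weights and cut-off are invented for illustration and are not Narrative’s actual criteria.

```typescript
// Hypothetical "best shots" filter: score each photo and keep the top few.
// The fields and weights below are invented, not Narrative's real algorithm.

interface Photo {
  id: string;
  brightness: number; // 0..1, from exposure analysis
  sharpness: number;  // 0..1, e.g. edge contrast
  hasFaces: boolean;  // crude stand-in for "subject matter"
}

function pickBest(photos: Photo[], keep: number): Photo[] {
  const score = (p: Photo) =>
    0.4 * p.brightness + 0.4 * p.sharpness + (p.hasFaces ? 0.2 : 0);
  return [...photos].sort((a, b) => score(b) - score(a)).slice(0, keep);
}
```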


Here’s the rub: ‘best’ as decided by an algorithm. Taking a photograph is an expression of yourself. You choose the moment, the light, the subject, all because of how you feel in that moment. Can a machine recreate that? We think not, because art needs imperfection.

Ed Rex, founder of Jukedeck, disagrees; you can see his recent TED talk describing how machines CAN be creative.

So what about our jobs? What does the future look like?

We agreed we need to continue to design the experience but not get fixated on the touchpoint.

We must continue to think about the bigger picture — the emotional experience and responses rather than the interface.

We become conditioned by the technology we adopt; it’s too easy to look at a screen. Try waiting for a train without looking at your phone, avoiding eye contact with everyone else. We are never allowed to be bored. But we must still think beyond the screen, towards the wider move of technology connecting people: the likes of Uber, AirBnB and Tinder all connect people in the real world.

We’d love to hear your thoughts. Join the discussion with us on Twitter using #whatelse, and check back for next month’s topic.

Originally published at ELSE.
