Why the Future Has No Room for UI Designers

Alborz Heydaryan
5 min read · Dec 30, 2015


In a few years the interface of most applications will be redundant. Let me tell you why.

There are apps that you use to create stuff, then there are apps like games that are immersive experiences. Pretty much everything else is about consuming information. Reading the news, getting notifications, receiving messages, and getting updates on your loved ones.

Productivity apps and immersive experiences will go through their fair share of changes with new technologies on the horizon; VR and AR (virtual and augmented reality) will both leave their mark on them. But the apps that will change the most are the consumption ones.

Everything will be contextual and real time in the future

Right now, if you want to interact with a person who has left you a message, you see the notification, you unlock your phone, and then you reply either by typing or by dictation. If you want directions to a place, you unlock the phone, type the address into the maps app, and you’re on your way. If you want to read the news, you open your favourite news app and all the curated stories are there waiting for you. But there’s an inefficiency involved in all of this. You’re interacting with a small screen that’s locked behind a passcode for your safety and privacy, and it usually fails to give you what you really want at the right time.

Imagine

You wake up in the morning and your phone senses that you’re awake. So it says: “Good morning! It looks like it’s going to rain today. You might wanna bring an umbrella with you to work.”

The phone knows that you walk to work every day, and it knows that since you walk, you’ll need an umbrella.

Then it says: “Your mom wants to have dinner with you. It looks like your Wednesday afternoon is open, and there’s a nice sushi restaurant near your work that you would both enjoy. Do you want me to make a reservation for you?”

To which you’d say: “Sure! And remind me that day to wear the shirt she got me for Christmas!”

Next, the phone would ask if you wanna hear the news headlines. Once you confirm, it starts reading the headlines it thinks you might be interested in. When an article catches your attention, you can ask your phone to read it in full.

Notice something strange about those interactions? To do all of that, you didn’t need to touch your phone once. You were simply interacting with a smart device that communicates with voice. This isn’t a crazy utopian dream; many pieces of the puzzle are already in place. Apple is working on proactive suggestions that it will show you on your lock screen and on the search page of your home screen. (You know, the one you sometimes see when you swipe left unintentionally.) Google has been doing a better job of this with their Google Now cards. Since Google knows more about you and doesn’t care one bit about your privacy, it’s much easier for them to get more and more accurate with their suggestions. Other pieces, like conversational AI and accurate dictation, are getting there as well.

What’s missing and what’s next?

What’s missing is a piece of hardware. An earpiece, to be more precise. One that you put on every morning and through which all those voice interactions happen. Google already has Google Glass, which is perfect for this. It has an earphone, a microphone, and it’s connected to Google’s servers, which know you better than your mom does. On top of all that, it also has a tiny screen! So when your high school classmate gets married and uploads a million photos to Facebook, the device tells you: “David got married, do you wanna see some pictures?” to which you’d answer: “Show me a couple, and like a few good ones on my behalf.” Apple, on the other hand, hasn’t shown what their version of the future looks like. But there are rumours of an AirPod circulating, supposedly a wireless version of the EarPods. With other rumours suggesting that Apple might ditch the 3.5mm jack and include wireless headphones with every iPhone, it looks like they’re also preparing for this incredibly awesome future. If only there were a section in the privacy settings on the iPhone that said: “Treat my information as if I were an Android user”, then we would see more accurate and actually relevant information shown on iPhones.

Design doesn’t matter anymore!

In this future, the look and feel of your apps doesn’t matter anymore. Heck, they might not even HAVE looks, or feels. You’d basically be writing algorithms that send strings of text to be read to the user through the earpiece, and that’s your app.

“When the user wakes up, and it’s going to rain that day, and he walks to work, remind him to take his umbrella”

That’s gonna be an app. It’s not going to have a user interface, and the user experience is all uniform through Siri or whatever Google decides to call its assistant later. (Google Now just doesn’t do it for me. “Hey, ask Google Now if it’s raining tomorrow”… nah… they’re gonna change it.)
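Just to make that concrete, here’s a minimal sketch in Python of what such a screenless “app” could boil down to: a contextual rule that produces a string for the assistant to read out. The `Context` fields, the `umbrella_reminder` rule, and the idea of the platform polling rules each morning are all my own assumptions for illustration, not any real API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    """Hypothetical snapshot of what the device knows about the user right now."""
    user_awake: bool      # did the phone sense the user waking up?
    rain_forecast: bool   # does today's forecast call for rain?
    walks_to_work: bool   # learned habit: the user walks to work

def umbrella_reminder(ctx: Context) -> Optional[str]:
    """The entire 'app': a rule that may return a line for the assistant to speak."""
    if ctx.user_awake and ctx.rain_forecast and ctx.walks_to_work:
        return ("Good morning! It looks like it's going to rain today. "
                "You might wanna bring an umbrella with you to work.")
    return None  # stay silent: no screen, no UI, nothing to render

if __name__ == "__main__":
    # Imagined usage: the assistant platform evaluates registered rules each morning.
    today = Context(user_awake=True, rain_forecast=True, walks_to_work=True)
    message = umbrella_reminder(today)
    if message:
        print(message)  # in the real thing, this string would be read aloud
```

All the “design” left in that sketch lives in the conditions and the wording of the string.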

But I’m a designer! Am I doomed?

Not really. Just make sure that in the next couple of years you’re working for a company that focuses on productivity apps, and if you can’t find one, I hear BlackBerry is always hiring people who specialize in obsolete technologies, so you’ll have a place there.

What about us UX guys?

Well, start thinking about the context of your apps and just let go of Dribbble already! No one cares about those pesky animations now, and no one’s going to be seeing them in a couple of years. Just let it go. Sure, it looks cool, but… just… let it go… Instead, focus your attention and energy on contextual experiences. Every time you think, “Damn, for SMART phones, these things are pretty stupid,” think of an app that can solve that. Because the future isn’t about smart PHONES, it’s about smart EXPERIENCES. Delivered to the users via AirPods.
