No UI is the future UI

Jun 13, 2018 · 5 min read

No more cursors or icons. No more screens. Death to UI; all of it.

Like TFT monitors, UI is getting outdated (Photo credit: Simson Petrol)

Wait… What does that even mean? How is that possible? Did I read that correctly? Yes, you did. Now let me briefly explain what I mean by this. I’ll admit the title is a little farcical, but there is an element of truth to it. Currently, I don’t think it’s possible to have no user interface at all when communicating with technology; that would be absolutely absurd. However, we are close.

What is a UI anyway?

UI stands for user interface. It is how we interact with most of our tech. A simple example is when you switch on your phone or laptop. The things you see on-screen are part of the UI.

No UI is insanity. Why would we do that?

Hold on a minute. What I meant when I stated ‘no UI is the future UI’ is actually a sleek, super-minimal, completely bare-bones user interface. Just imagine a future smartphone with a screen that literally only lights up and displays sound waves when in use; it would essentially be a slate with the functionality of Alexa, Siri, Google Assistant or Cortana.

Eventually, tech will be invisible and fluid. Talking to your devices will be as natural as talking to another person.

Humans are said to be an intelligent and advanced species not only because we can reason, but also because of how we can communicate through language. Language is essential to humanity, and it can be argued that “language stands alone as the greatest accomplishment of man and it is language, sequentially, that fostered a myriad of cultural products.”[1] Imagine us trumping our greatest accomplishment by being able to communicate with technology as we do among ourselves.

Currently we communicate with technology by ceaselessly pointing and clicking with cursors, or bashing on touchscreens with our violent, incessant tapping. It’s not just our wrists doing the manual labour but our eyes too. If you’re a relatively normal person (like myself, obviously) then you’ve probably noticed that you spend close to two-thirds of your day (~16 hours) in front of some sort of screen, staring at some sort of interface. A quick Google search on carpal tunnel syndrome, digital eye strain or computer eye strain will show that our bad habits have a detrimental effect on human health. I think it’s about time we gave our hands and eyes some well-deserved rest.

No user interface exists when we speak to people, so why must we have one with our tech? As natural language processing (NLP) improves over the years and becomes more human-like, I believe there will be, and should be, less reliance on screens and user interfaces in general. Eventually tech will become invisible and fluid. Talking to your devices will be as normal as talking to another person.

To realise a future where this is possible, certain conditions must be met, namely:

  1. NLP research must be intensified — with emphasis on voice/speech to text and sentiment analysis
  2. Cloud computing must remain secure, cheap and accessible – processing power must continue to increase while price to compute drops
  3. Data science, machine learning and artificial intelligence algorithms, techniques, methods and API’s must be more efficient, accurate and precise
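To make the first condition a little more concrete, here is a toy sketch of the kind of intent matching a voice assistant needs, written as a naive keyword matcher in plain Python. This is purely illustrative and assumes nothing about any real product’s API; the intent names and keyword sets are hypothetical stand-ins for what a far more capable NLP pipeline would learn from data.

```python
# Hypothetical toy intent matcher: maps a spoken utterance (already
# converted to text) to an intent by keyword overlap. A real system
# would use trained NLP models, not hand-written keyword sets.

INTENTS = {
    "get_time": {"time", "clock"},
    "spending_report": {"spent", "spend", "money", "budget"},
    "place_order": {"order", "buy"},
}

def classify(utterance: str) -> str:
    """Return the intent whose keyword set best overlaps the utterance."""
    words = set(utterance.lower().replace("?", "").split())
    scores = {intent: len(words & kws) for intent, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"
```

The gap between this sketch and genuinely understanding colloquial speech is exactly why the intensified NLP research the list calls for matters.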

Fortunately, we live at a time when large companies are pumping billions into researching and improving NLP. On one hand, the cost of cloud and high-performance computing is relatively low, continues to drop, and is accessible to almost anybody, anywhere, anytime, as long as they have a half-decent internet connection. On the other, machine learning and artificial intelligence techniques and methods have not changed much over the last few decades, which might suggest stagnation. However, the increase in compute power over the same period means that these algorithms can be trained and executed at an accelerated rate.

What does a UI-less system look like?

Imagine holding a smooth little grey rectangle, about the size of an iPod classic, in your hand. There are no physical buttons or distinguishable screen on the device, and all the corners are curved. “How does one operate such a gadget?” is what you’re probably thinking, but the gimmick is you don’t need to think at all. Be natural; just talk.

“What time is it?” you ask. The device lights up and you see a sound wave illustrated on it. “12:44am,” it replies, before briefly displaying the time and going into standby mode again. You throw another question at the weird device thingy: “How much money have I spent on food this week?” It lights up again and responds, “£215.80. You are getting close to your weekly limit.” Now, I admit that so far I haven’t mentioned anything exciting here. You can perform these voice commands today with Alexa, Siri, Google Assistant or Cortana.

As with most technology, it gets better when combined with other types of tech. Let me demonstrate. You’ve just wrapped up your morning run and are on the walk home. The device has noticed the change in your pulse and speed, and can see that you are on your way home.

“Have a good workout?” it asks you as it syncs the exercise data to the cloud

“You’re making good progress on your goals this month” it adds

“I’m happy to hear that”

“Based on the contents of your fridge you could have an omelette, French toast or a green shake for breakfast”

“I think we should go with the green shake today”

“Excellent choice. I’ve noticed that you are running low on milk, should I order more from the usual store?”

“Ermm… Yeah, actually, chuck some potatoes, chips & prawn cocktail crisps in there too”

“I’ve placed the order; the food should arrive an hour after you get back from work. By the way, your car is 92% charged and it’s Jasper’s birthday today — don’t forget to call him”

The tech combined above is NLP, geolocation and machine learning, all facilitated through the cloud. Notice that this is different from what we have today, as this device initiates conversation and is able to understand colloquial language. It might not think like a human, but it can communicate like one.
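The “device speaks first” behaviour described above can be sketched as simple rules over sensor readings. Everything here is a hypothetical illustration, assuming made-up sensor fields (pulse, speed, a heading-home flag) standing in for the ML models and geolocation services the scenario imagines; no real assistant exposes such an API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of a proactive assistant: it opens a
# conversation only when the sensed context clearly changes,
# e.g. a workout appears to have just ended.

@dataclass
class SensorSnapshot:
    pulse_bpm: int        # from a heart-rate sensor
    speed_kmh: float      # from GPS / accelerometer
    heading_home: bool    # geolocation: route points toward home

def proactive_prompt(now: SensorSnapshot, prev: SensorSnapshot) -> Optional[str]:
    """Decide whether the device should initiate conversation."""
    # Pulse dropping from an elevated level and speed falling to a
    # walking pace, while heading home, suggests a run just finished.
    if (prev.pulse_bpm > 140 and now.pulse_bpm < 110
            and now.speed_kmh < 6 and now.heading_home):
        return "Have a good workout?"
    return None
```

In practice these hand-written thresholds would be replaced by learned models, but the structure is the same: fuse sensor streams, detect a context change, then speak.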

Imagine us trumping our greatest accomplishment by being able to communicate with technology as we do among ourselves

The device recognises your voice, is hooked up to the cloud and thus has access to the accounts that you give it permission to use. The beauty is that everything is in sync: you have secure access to everything in the palm of your hand, all controlled through your voice.

Now, I know I said we should move away from UI, but combining this tech with AR (augmented reality) to give us more information about what we can already see would be absolutely magical. That is, only adding a few essential layers to give us better insight; that is what technology is for, after all.

Perhaps moving away from user interfaces is absurd but I think it is something we should definitely look into. Getting technology to talk like humans is one of the hardest challenges in artificial intelligence but maybe one day, just one day, we will exceed our limitations and it will become a reality.


  1. Palmer, K. A. (2009). Understanding Human Language: An In-Depth Exploration of the Human Facility for Language. Inquiries Journal/Student Pulse [Online], 1.


Written by


(Futurist). Data Science. Smart Cities. IoT.
