Natural Interfaces

Ilya Belikin
Dec 16, 2018 · 4 min read


“ …somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And, a man on a bicycle, a human on a bicycle, blew the condor away, completely off the top of the charts.

What a computer is to me is it’s the most remarkable tool that we’ve ever come up with, and it’s the equivalent of a bicycle for our minds.” — Steve Jobs

I love this quote. It is exactly how I understand personal computing. It is dear to my heart to see the emergence of efficient symbiotic intelligence.

The type of intelligence we desperately need to survive the transition out of our natural ecosystem. The ecosystem we have already exploited and nearly destroyed.

Ironically, the future of computing is in our biological past. Until we develop a direct brain-to-machine interface, we need to design for our natural senses and faculties. Thus the rise of a mega-trend: everyone in technology is chasing natural UIs.

The iPad is arguably the closest we have come so far. It is the only device that can fully leverage multi-touch: a user can put it on a desk or a knee, touch, hold, tap a few more targets (without releasing the main one), scroll, drag around, make a bunch of gestures, release, and finish a complex operation elegantly and directly with just their fingers.

It is a recent innovation, and it is at the very beginning of the adoption curve. Many people still have no idea you can drag a bunch of photos on iPad and drop them into any other app, or that you can take a layer out of Procreate and drop it into another Procreate file.
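For developers, accepting such a drop takes surprisingly little code. Here is a minimal sketch of my own (not from Apple's sample code, and the class and property names are placeholders) showing how a view controller could accept photos dragged in from another app using the standard UIDropInteraction API:

```swift
import UIKit

// Illustrative sketch only: accept images dragged in from another app on iPad.
class DropViewController: UIViewController, UIDropInteractionDelegate {
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        view.addSubview(imageView)
        // Register the view as a drop target.
        view.addInteraction(UIDropInteraction(delegate: self))
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         canHandle session: UIDropSession) -> Bool {
        // Only accept sessions that carry images.
        return session.canLoadObjects(ofClass: UIImage.self)
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         sessionDidUpdate session: UIDropSession) -> UIDropProposal {
        // Copy the dragged content into this app.
        return UIDropProposal(operation: .copy)
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         performDrop session: UIDropSession) {
        session.loadObjects(ofClass: UIImage.self) { items in
            if let image = items.first as? UIImage {
                self.imageView.image = image
            }
        }
    }
}
```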

And we also just got the Pencil — the bicycle for the mind long before the personal computer; the consequential stick that gave us the miracles of reality modelling and symbolic abstraction on a surface. And an inviting surface the iPad is. Apple mastered fluid design and created a natural yet effortless environment where a tiny force applied by a user can create a dramatic and beautiful consequence. It is a natural interface, but it is a wonderful one as well.

The Apple Pencil added handwriting and drawing to iPad and forever changed how I conduct interviews and meetings.

iPad is the next big thing in computing not only because of its outstanding performance, but because of the natural, intuitive, deeply personal ways of interaction that are layered together.

It starts with touch, which just got an upgrade with OS-level fluid controls. With drag and drop and consistent, polished interactions, it is the best multi-touch computer on the market.

It continues with fluid and responsive Pencil input. The second-generation Pencil has a place on the device; it is easier to carry and to grab when you need it fast. Man, I do not need to do this anymore:

First generation Apple Pencil in my mouth

Software support for the Pencil is still not OS-wide. The Pencil is great, but you cannot mark up your thoughts right on top of this article in Medium without taking a screenshot first. I guess we will get there eventually.

“Let people make a mark the moment Apple Pencil touches the screen. The experience of putting Apple Pencil to screen should mirror the experience of putting a classic pencil to paper. Don’t require people to tap a button or enter a special mode before using Apple Pencil.” — Apple Human Interface Guidelines
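As a rough sketch of what that guidance looks like in code (my illustration, not Apple's implementation; the SketchView name is made up), a view can simply start a stroke the moment a Pencil touch lands, with no button tap or mode switch in between:

```swift
import UIKit

// Illustrative sketch: begin drawing on the first Pencil contact, no special mode.
class SketchView: UIView {
    private var currentStroke = UIBezierPath()
    private var strokes: [UIBezierPath] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.type == .pencil else { return }
        // The first Pencil contact starts a mark immediately.
        currentStroke = UIBezierPath()
        currentStroke.move(to: touch.location(in: self))
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.type == .pencil else { return }
        currentStroke.addLine(to: touch.location(in: self))
        setNeedsDisplay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.type == .pencil else { return }
        strokes.append(currentStroke)
        currentStroke = UIBezierPath()
        setNeedsDisplay()
    }

    override func draw(_ rect: CGRect) {
        UIColor.black.setStroke()
        for stroke in strokes + [currentStroke] {
            stroke.lineWidth = 2
            stroke.stroke()
        }
    }
}
```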

Next, with Face ID and Animoji we got the first glimpse of what tracking facial features can do. We are on our way to gaze-based interactions everywhere, not only for iPhone X notifications.

Voice is still lagging. I have already written about voice interactions in depth: they will transform productivity and replace keyboard shortcuts.

There is no doubt the iPad will catch up and leapfrog on voice. The on-device Neural Engine will enable the next level of voice interactions. Maybe with AirPods at first. But we will get there, and you will be mumbling to your computer (for productivity's sake, not just as a lousy habit).

I cannot see how we could get to smell yet… but by and large, the iPad is the manifestation of natural UIs: multi-touch + gestures + pencil + gaze + voice. The revolution in human-computer interaction is happening. It is exciting.

What next?

— Clap for this article anywhere from 0 to 51 times.

— Read the whole series: Just say it, The case for voice interactions, The case against voice interactions, The iPad default.

— Are you in Hong Kong? Join us at the UXD Meetup. Meet members of the Posit network: UX design professionals, instructors, and students.

Follow me on Twitter and say hi.

Thank you!


Ilya Belikin

Founder of Posit network of design practice for good. Hong Kong.