Making AR as Easy to Use as Your iPhone

Dovid Schick
TapWithUs
4 min read · Jun 22, 2022
Steve Jobs announcing the iPhone

When Steve Jobs introduced the iPhone in 2007, he defined a “Revolutionary UI” as an “interplay of hardware and software”. The hardware he was referring to was the touchscreen, and the software, of course, was iOS. The clever marrying of these two elements produced a user experience that was uniquely suited for mobile devices, and has enabled the mobile revolution that we have been enjoying these past 15 years.

Similarly, when Doug Engelbart demonstrated the first Graphical User Interface (GUI) in 1968, he also introduced the mouse, a revolutionary piece of hardware that made it operable. We cannot imagine a GUI without a mouse (or its equivalent), and the mouse had only limited utility without a GUI. His ability to imagine an entirely new experience unlocked the era of modern computing. (Though, once again, it took Steve Jobs to realize the vision.)

The situation today in AR and VR feels very much like the early days of smartphones before the launch of iOS — it was clear that those phones could be wonderfully useful, but the process of operating them was slow, imprecise, and frustrating. The promise of 3D computing is becoming more and more evident, but getting anything done is still slow and clunky.

The industry today is focused on hand and finger tracking, which is moving to replace handheld controllers in VR and has already been used in high-end AR systems for the past few years. But while getting rid of these controllers and freeing up your hands is an essential step forward, the user experience with hand tracking is, if anything, worse than with physical controllers.

The limitations of hand tracking lie not with the technology, which has improved tremendously, but with the human aspects of the experience. Controlling a virtual device with your hands in free space is both visually and physically fatiguing. It is also difficult to imagine that consumers would be willing to use hand tracking in public places, where the overt gestures would probably be socially unacceptable.

Computer image of hands with tracking marks on fingers
(Source: UploadVR)

The average user performs about 3,000 gestures (taps and swipes) on their mobile device per day, and heavy users can exceed 6,000. While a touchscreen gesture requires minimal movement and force, hand gestures require large travel distances during which the arm must be extended. And while operations on a mouse, keyboard, or touchscreen provide tactile feedback, hand tracking relies entirely on visual feedback. The result is that hand gestures tend to be slow and require much more visual attention, as well as physical effort.

To compensate for this, UX designers tend to make screen objects large and generously spaced. This does reduce error rates in hand gestures, but it obscures the user's field of view, which defeats the purpose of immersive displays. As a result, AR and VR UIs tend to look like oversized desktop systems.

Woman wearing AR goggles watching many computer screens floating in air
(Source: Rokid)

For AR and VR to find an audience beyond gaming and niche enterprise uses, and to be widely adopted as the next step in the evolution of computing, a completely different user experience must be imagined.

The Tap System is a new approach to input and control in immersive computing. It combines novel input hardware with a novel GUI. On the hardware side, the Tap Band is a wrist wearable that can detect tap gestures. Each time the user touches a surface, the system detects which fingers have made contact, and sends the corresponding command to the headset.

The Tap Band wrist wearable controller
(Source: Tap Systems, Inc.)

Unlike hand tracking, tapping is inherently tactile and eyes-free: users do not have to see their hands to know which fingers they are tapping with. Tapping is also extremely fast. In typing applications, users have exceeded 70 words per minute with one hand, which corresponds to almost six taps per second.

On the GUI side, a small set of simple gestures controls the UI across any context. The main menu always displays five icons; each icon can be selected by tapping the corresponding finger. Scrolling is done by tapping the two rightmost or two leftmost fingers. The global command for 'ENTER' is tapping all five fingers, and the global command for 'BACK' is tapping the three middle fingers.
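The gesture vocabulary above can be pictured as a simple lookup from finger combinations to commands. The sketch below is purely illustrative, not Tap Systems' actual API: the bitmask encoding (thumb as the lowest bit through pinky as the highest), the treatment of thumb+index as the "two leftmost" fingers, and the function and menu names are all assumptions made for the example.

```python
# Illustrative decoding of the global tap-gesture vocabulary.
# Finger encoding is an assumption: one bit per finger, thumb = bit 0.
THUMB, INDEX, MIDDLE, RING, PINKY = (1 << i for i in range(5))

GLOBAL_GESTURES = {
    THUMB | INDEX | MIDDLE | RING | PINKY: "ENTER",  # all five fingers
    INDEX | MIDDLE | RING: "BACK",                   # three middle fingers
    RING | PINKY: "SCROLL_RIGHT",                    # two rightmost fingers
    THUMB | INDEX: "SCROLL_LEFT",                    # two leftmost fingers
}

def decode_tap(fingers: int, menu: list) -> str:
    """Map a detected tap (bitmask of touching fingers) to a UI command."""
    if fingers in GLOBAL_GESTURES:
        return GLOBAL_GESTURES[fingers]
    # A single-finger tap selects the corresponding main-menu icon.
    if fingers and fingers & (fingers - 1) == 0:  # exactly one bit set
        return f"SELECT:{menu[fingers.bit_length() - 1]}"
    return "UNRECOGNIZED"
```

For example, with a menu of five icon names, tapping only the middle finger would select the third icon, while tapping all five fingers would issue ENTER regardless of context.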

Tap can also be used as a keyboard for alphanumeric entry. Any letter of the alphabet can be entered with a single tap of one or more fingers, with punctuation and special characters accomplished with multi-taps.
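In code, this chorded text entry amounts to decoding each tap (a set of fingers struck together) into a character. The chord table below is invented for illustration; Tap's actual letter assignments differ, and the finger names and function are assumptions for the sketch.

```python
# Illustrative chord-to-letter decoding for tap-based text entry.
# This mapping is hypothetical, not Tap's real alphabet.
CHORD_TO_LETTER = {
    frozenset({"thumb"}): "a",
    frozenset({"index"}): "e",
    frozenset({"thumb", "index"}): "n",
    frozenset({"index", "middle"}): "t",
}

def taps_to_text(taps) -> str:
    """Decode a sequence of taps (each a set of finger names) into text."""
    return "".join(CHORD_TO_LETTER.get(frozenset(tap), "?") for tap in taps)
```

Because every character is a single simultaneous chord rather than a spatial target to aim at, recognition needs only contact detection, which is what makes eyes-free entry possible.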

The result is a completely new user experience for AR and VR that is fast, accurate, intuitive, low-fatigue, eyes-free, and tactile. In short, it takes the elements that make mobile and desktop interactions so comfortable and brings them into the immersive world.

An earlier version of the Tap System — the finger wearable Tap Strap — has been on the market since 2018, and has over 35,000 users — primarily for keyboard applications. We will be accepting pre-orders of the Tap Band in October, with first units scheduled to ship in December.




At Tap Systems, we are making immersive technology as easy to use as your iPhone.