Software You Can Feel

The haptic use-case matures

Michael Trapani
ThatsLogical
2 min read · Sep 19, 2016

--

The iPhone 7 is one of several devices in recent years to include a haptic engine. This handy little component makes the iPhone’s new home button, which does not actually click, feel like it does by emitting a short pulse when the button is pressed. Think of it as a more versatile, programmable version of the vibration motor inside a phone.

The iPhone’s new haptic engine (or in Apple marketing speak, “Taptic Engine”) does more than simulate the sensation of a click when you press the home button. It responds to animations, taps, drags, and other gestures on the phone’s screen itself. And because the Taptic Engine is accessible to third-party app developers, it has the potential to usher in a new era of user interfaces — apps you can feel.
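On iOS 10, that developer access comes through UIKit’s UIFeedbackGenerator classes. As a rough sketch of what this looks like in practice (the view controller and method names below are hypothetical; only the UIKit generator APIs are real), a third-party app might trigger haptics like this:

```swift
import UIKit

// Hypothetical view controller; UIImpactFeedbackGenerator and
// UINotificationFeedbackGenerator are the real iOS 10 APIs that
// expose the Taptic Engine to third-party apps.
class PhotoViewController: UIViewController {

    // Fire a crisp physical "tap" when the user likes a photo.
    func likePhoto() {
        let generator = UIImpactFeedbackGenerator(style: .medium)
        generator.prepare()          // wake the Taptic Engine to reduce latency
        generator.impactOccurred()   // emit the haptic pulse
    }

    // Distinct success/error sensations for a task's outcome.
    func uploadFinished(success: Bool) {
        let generator = UINotificationFeedbackGenerator()
        generator.notificationOccurred(success ? .success : .error)
    }
}
```

Because these generators run on the device’s physical hardware, the code only produces feedback on a Taptic Engine–equipped iPhone; on older hardware the calls simply do nothing.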

Haptic feedback will play an essential role in the rise of virtual reality (VR) and augmented reality (AR) as developers continue to add senses beyond just sight and sound. An experience can only be so immersive if you cannot reach out and touch anything in the environment around you. It makes sense that VR and AR manufacturers started with sight and sound, but touch seems like the next frontier.

The potential is massive — from VR gloves that press back on your hands as you pick things up, to running your fingers over raised braille on a tablet’s display, to full-body pressure-applying suits like those imagined in Ernest Cline’s sci-fi novel, Ready Player One.

Haptic feedback on our devices today will help developers experiment with new use-cases and lay the foundations for an increasingly realistic virtual world.

I can’t wait to see what they do with it.

Thanks for reading. Follow me on Medium or Twitter @thatslogical for more space stuff, as well as other logical thoughts.


Product marketer, designer, public speaker. Product Marketing @IBM.