Touchy-feely app design

Sergey Germanovich
Mar 9, 2017

Apple recently introduced the Taptic Engine, a technology that transfers tactile feedback from the device to the user. The old vibration motor couldn’t produce the wide variety of subtle haptic “messages” needed, from a gentle reminder to a critical alert, so Apple built a new engine that can “communicate” with the user at a much finer level of sensation.

Unfortunately, not many developers and designers use this feature in their applications. My guess is that this is because the Taptic Engine is a completely new and unique way to express an app’s reactions to user actions, and it’s hard to imagine exactly where it should be incorporated. This article aims to help you understand it and start adding it to your own projects.

Examples from Apple

Many users are already familiar with the Taptic Engine. On the iPhone it gives additional feedback on user actions; on the Apple Watch it’s used mostly for notifications (new messages, low battery, etc.).

Here are a few standard examples:

  • pull-to-refresh produces a light haptic click once the scroll view has been dragged far enough;
  • turning a switch control on gives a stronger click than turning it off;
  • a “not allowed” alert (for example, when you can’t delete something or an error occurs) has its own kind of haptic feedback;
  • 3D Touch presents views in different ways (peek and pop) and fires Taptic Engine signals of different strengths, so you can feel and recognise each mode (a quick “glance” preview, a medium-sized card, or the full view).

Designers got the same opportunity to push their interfaces toward tactility when iOS 10 arrived and developers got access to the Taptic Engine API (unfortunately, only on iPhone 7 / 7 Plus at the time). Just imagine! It lets you give an application new kinds of interactions, add original character with real emotion, or build extra accessibility features (for example, for vision-impaired users).
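For reference, here is a minimal sketch of how that API looks in code, assuming the UIKit feedback generators introduced in iOS 10 (UIImpactFeedbackGenerator, UISelectionFeedbackGenerator, UINotificationFeedbackGenerator):

```swift
import UIKit

// Impact feedback: physical "collisions", like a control snapping into place.
let impact = UIImpactFeedbackGenerator(style: .light)   // .light, .medium, .heavy
impact.prepare()            // warms up the Taptic Engine to reduce latency
impact.impactOccurred()

// Selection feedback: moving through discrete values, e.g. a picker wheel.
let selection = UISelectionFeedbackGenerator()
selection.selectionChanged()

// Notification feedback: the outcome of a task.
let notification = UINotificationFeedbackGenerator()
notification.notificationOccurred(.error)               // .success, .warning, .error
```

Calling prepare() shortly before triggering feedback keeps the engine ready, so the tap stays in sync with the on-screen animation.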

Problem

When graphic designers work on a mobile application, they don’t deal only with static or animated pictures that display content; they usually (I hope) examine and explore their projects by hand, in particular controls such as buttons and switches, on real devices in order to come up with more convenient solutions.

But how can a designer choose the right haptic “click” without being able to touch and feel it first?

It should also be taken into account that the designer then needs to hand over the parameters of the chosen click to a programmer.

Solution

I realised that a designer would want a palette of the available clicks and a way to get a feel for the material. So I created the iOS app TapticMe (download from the App Store), where you can find such a palette and experience every available Taptic Engine click for real.

TapticMe iOS app

For example, you may notice that when you tap the “Light” button in the “Impact” section, you feel the same click as when you toggle a switch or change an alarm time in the Clock app.
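This also makes the hand-off to a programmer simple: a choice from the palette translates to a single, unambiguous call. A hypothetical spec note like “Impact / Light on switch toggle” would become something along these lines:

```swift
import UIKit

// "Impact / Light" from the designer's spec: the same click as toggling a switch.
let feedback = UIImpactFeedbackGenerator(style: .light)
feedback.impactOccurred()
```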

Conclusion

So: try, investigate, and embed Taptic Engine feedback into your interfaces to create a better user experience, just as Apple has done in its own applications. Use this technology as a new space for design ideas and experiments that weren’t possible before. It’s also a chance to get your app featured in the App Store.

TapticMe iOS Application in App Store

Sergey Germanovich

Founder: Habit — Daily Tracker. Head of iOS. UI/UX lover.