Tightening the screws on haptics

Ed Colgate
Published in Feeling Disruptive
Sep 25, 2016 · 4 min read

A few days ago I was up on a ladder mounting a light fixture on our front porch. As I tightened the final screw, a thought suddenly occurred to me: How do I know that the screw is tight? For me, this was a “why do apples fall?” kind of moment. The apple is ripe, the screw is … tight.

But really, how do you know?

Is it because of the amount of rotation? Somehow, you know that you’ve turned far enough? That can’t be right, because sometimes you turn and turn and turn, and sometimes it snugs right up. OK, then, is it because of the torque? At first blush, that makes a lot more sense. As the screw tightens, you twist harder until it feels like you’re twisting plenty hard. But is “plenty hard” the same thing for the little #6 screw on my lighting fixture as it is for the bolts on my head gasket? Somehow I don’t think so. Maybe I just learn the right amount of twist for each situation.

Maybe.

But here’s another thought: Perhaps it is the relationship between how much I’ve turned and how hard I’ve twisted. That seems much more like it. When the screw is loose, a bit of extra oomph gives me plenty of movement, but as it tightens, that same amount of extra oomph doesn’t give me nearly as much movement. In other words, the way that the screw responds to me changes.

And that was my epiphany. The information that a screw is tight lies not in a single signal like a twist angle or a torque, but in the relationship between those signals. And that is pretty exciting because it sets touch apart from the other senses that rely on signals, like the light that hits the retina, the sound waves that impinge on the ears, or the chemical soup that bathes the nose or tongue.
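To make the idea concrete, here is a minimal sketch of a “tightness detector” built on that relationship. Everything here — the function names, the data format, the stiffness threshold — is an illustrative assumption, not a real measurement API; the point is simply that tightness lives in the slope of torque versus angle, not in either signal alone.

```python
# Hypothetical sketch: infer "tight" from the torque-angle relationship
# (local stiffness), rather than from torque or angle by themselves.
# All names and the threshold value are illustrative assumptions.

def is_tight(samples, stiffness_threshold=5.0):
    """samples: list of (angle_rad, torque_Nm) pairs, in turning order.
    Returns True once the local stiffness (d torque / d angle)
    exceeds the threshold -- i.e., extra twist stops producing movement."""
    for (a0, t0), (a1, t1) in zip(samples, samples[1:]):
        d_angle = a1 - a0
        if d_angle <= 0:
            continue  # skip pauses or reversals in the twisting motion
        stiffness = (t1 - t0) / d_angle
        if stiffness > stiffness_threshold:
            return True
    return False

# A loose screw: torque grows slowly over a lot of rotation.
loose = [(0.0, 0.1), (1.0, 0.2), (2.0, 0.3)]
# A screw snugging up: torque shoots up over a tiny rotation.
tight = [(0.0, 0.5), (0.1, 0.6), (0.15, 1.2)]
```

Note that neither trace can be classified by its torque values or its angle values alone; only the ratio between their changes separates them.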

To be sure, movement matters in those other senses as well — you focus your eyes, tilt your head, sniff the air, and chew your food. But those movements serve to guide us to the interesting signals and to improve signal acuity. The information still resides in the signals rather than in relationships between signals.

Haptics stands apart because it is bilateral.

What we feel depends on the two-way interaction between what we do and how the world responds. A light switch feels like a light switch because of the way it responds to me. So does an overcooked steak, or a car door or a combination lock.

Of course, my epiphany is something that product designers have known for a long time. Think of the lovely feeling of opening your laptop, or the feel of switches in a luxury car — those are all carefully designed haptic experiences.

Unfortunately, this satisfying response has been less the norm in programmable haptic interfaces. Most of us have tried a haptic key click at some point. These are those touch screens that vibrate a bit when you press a button. They typically feel kind of squirmy. It’s hard to imagine that feel is what a product designer really intended. More likely, it is what the designer settled for because the technology was still in its infancy.

Moving toward more satisfying haptic responses

We are fortunate to live in a world where haptic-powered products now respond in delightful ways — even if it’s more the exception than the rule. Case in point: I recently replaced my MacBook Pro with a brand-spanking-new one. The old one had a mechanical click on the touchpad, and the new one is electronic, using the Taptic Engine. And it is so awesome. It is crisp and responsive. It is better than mechanical. Sometimes I click it just to feel the darned click. And don’t even get me started on the “deep click.”

Of course, this isn’t exactly programmable — it’s just one really nice click. But I think it makes the point that electronic haptics has the potential to feel really, really nice. Moreover, that should be our expectation not just for clicks, but for all the richness that the world of programmable touch has to offer. I, for one, can’t wait!

Postscript: No, I haven’t felt the home button on the new iPhone 7 yet, but it seems that reviews are mixed, at best. I think this just goes to show that designing bilateral effects isn’t so easy. Think about it: the “signature” of a physical button is the way it moves in response to your push. The new home button wants to provide that same signature, but it can’t actually move (i.e., it doesn’t actually depress into the case)! You still might be able to fool the brain with a vibration that masquerades as the button’s movement (this is what the trackpad does so well). But it’s tough to fool the brain when the device is lightweight and hand-held and all of the other fingers feel the vibration as well.

So, has my enthusiasm for the future of programmable touch been, shall we say, dampened? Not at all! There is so much cool haptic technology coming down the pike. For example, a recent paper by Monnoyer et al. (including my former postdoc Michaël Wiertlewski) demonstrated that a sudden reduction of friction as a finger pressed down on a surface could create the sensation of a button click, presumably due to the skin “letting go” and spreading out across the suddenly slippery surface. The team built a demo which I felt and found pretty convincing. More importantly, this is a completely local effect that would not be perceived by any other fingers, even on a mobile device. Or so I think: when it comes to the future of haptics, I’m definitely an optimist!
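The friction-drop click described above can be sketched as a tiny state machine: when the finger’s pressing force crosses a threshold, the surface friction suddenly drops, and the skin “letting go” reads as a click. The threshold, the friction coefficients, and the function names below are all illustrative assumptions of mine, not a description of the actual apparatus in the Monnoyer et al. paper.

```python
# Hypothetical sketch of a friction-reduction "click": when normal
# force crosses a press threshold, the commanded friction coefficient
# drops abruptly. Values are illustrative assumptions.

HIGH_FRICTION = 0.9       # sticky surface before the click
LOW_FRICTION = 0.1        # suddenly slippery surface after the click
PRESS_THRESHOLD_N = 1.5   # normal force (newtons) that triggers the click

def friction_for_force(normal_force_n, clicked):
    """One control tick: return (friction_coefficient, clicked).
    The click latches, so friction stays low until the press ends."""
    if not clicked and normal_force_n >= PRESS_THRESHOLD_N:
        clicked = True  # the sudden drop is what the skin feels as a click
    return (LOW_FRICTION if clicked else HIGH_FRICTION), clicked

# Simulate a finger press ramping from 0 to 2 N.
clicked = False
trace = []
for force in [0.0, 0.5, 1.0, 1.6, 2.0]:
    mu, clicked = friction_for_force(force, clicked)
    trace.append(mu)
```

The key property, and the reason this effect suits a hand-held device, is that the friction change happens only under the pressing finger — there is no device-wide vibration for the other fingers to feel.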
