Bad UX: Apple Watch
When the first Apple Watch came out, I was excited to try it; a year later, I’m less than hopeful about the future of the Apple Watch, or of smartwatches in general. It comes down to a fundamental usability flaw that makes the Watch less convenient than the phone it pairs with: it takes two hands to use.
I have the iPhone 7 Plus and I had the iPhone 6S Plus before it. As large as those screens are, they still let me use them with one hand. Although Reachability is a feature that lacks conviction, apps like Gboard have made one-handed typing a pleasurable experience. Snapchat relies on a swipe-based interface (rather than a tap-based one) that lets you take advantage of the large screen real estate without, for the most part, having to stretch to the very top of the screen. I’m certain other apps will recognize the benefit of keeping the majority of interactive zones toward the bottom of the screen as time goes on.
I can send a text with one hand. I can start a workout or call an Uber with one hand. I can control HomeKit with one hand. I believe Apple has tried to make these things hands-free with Siri, but Siri is often slow to respond and quick to misinterpret. Besides, I don’t want to dictate text messages or summon an Uber out loud in public; it feels pretentious, and those interactions are often private matters anyway.
Apple has essentially created a miniature, less efficient iPhone that lives on your wrist. The things it does most efficiently are biometric tracking and notifications. If Apple were to go back to the drawing board, I think they’d recognize the value of those niche abilities. Imagine an Apple Watch built around gestures and vibration instead.
Last night, I was delighted to feel the subtle popping of fireworks in iMessage. Even without sound, it felt closer to a real fireworks experience than watching a video. It wasn’t the auditory feedback but the tactile feedback that evoked the memory of fireworks, elevating the animation from cheesy to meaningful. That’s the incredible power of the Taptic Engine. And it isn’t the first time I’ve experienced it: whether I’m scrolling through the date picker or feeling the subtle yet solid thunk of the notification tray hitting the bottom of the screen, the Taptic Engine enhances the experience tremendously.
This isn’t the same Taptic Engine from the 6S, either. The first version lacked emotion and follow-through. This one feels…authentic. But didn’t it all start in the Apple Watch?
Yes. I have the first-generation Apple Watch Sport, so I can’t say whether the tactile experience improved significantly in the Series 1 and Series 2, but I do know that gestures still haven’t been introduced to the wearable. Imagine slowly rotating your wrist and feeling subtle haptic feedback, like opening a combination lock. On its own that isn’t enough, but combined with the self-awareness that the Internet of Things brings, it could be incredibly meaningful.
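To make the combination-lock idea concrete, here’s a minimal sketch of how it could be prototyped with public APIs that exist today: Core Motion for wrist attitude and WatchKit for haptic playback. The class name and the 15-degree detent are my own assumptions, not anything Apple ships, and older hardware may not support device-motion updates at all.

```swift
import CoreMotion
import WatchKit

// Hypothetical "combination lock" gesture: as the wrist rolls,
// play a haptic tick each time it crosses a fixed angular detent.
final class WristDialController {
    private let motion = CMMotionManager()
    private let detent = 15.0 * Double.pi / 180.0  // one tick every 15 degrees
    private var lastDetentRoll: Double?

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] deviceMotion, _ in
            guard let self = self, let roll = deviceMotion?.attitude.roll else { return }
            if self.lastDetentRoll == nil { self.lastDetentRoll = roll }
            // Crossing a detent boundary produces the tactile "click"
            // of a lock tumbler falling into place.
            if abs(roll - self.lastDetentRoll!) >= self.detent {
                self.lastDetentRoll = roll
                WKInterfaceDevice.current().play(.click)
            }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```

The point isn’t the math; it’s that the tick arrives through your skin rather than your eyes, so the gesture works without ever looking at the screen.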
You walk into your living room. The Watch instantly recognizes the room you’re in and that your Apple TV has just turned on. You raise your hand and flick your wrist. Slowly you rotate your wrist, and your lights cycle through scenes, or your hand works as a dimmer. Maybe you scroll through the air to control your Apple TV, because even though the screen is across the room, the Taptic Engine lets you feel as though you’re touching it.
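The dimmer half of that scene is plausible with HomeKit as it already exists. Here’s a rough sketch, assuming you’ve resolved an HMAccessory for the light and mapped the wrist’s rotation to a 0-to-1 fraction; the function name is mine, and real code would cache the characteristic instead of searching for it on every update.

```swift
import HomeKit

// Hypothetical dimmer: map a normalized wrist angle (0.0...1.0)
// onto a HomeKit lightbulb's brightness characteristic.
func setBrightness(_ fraction: Double, on accessory: HMAccessory) {
    let clamped = min(max(fraction, 0.0), 1.0)
    for service in accessory.services
        where service.serviceType == HMServiceTypeLightbulb {
        for characteristic in service.characteristics
            where characteristic.characteristicType == HMCharacteristicTypeBrightness {
            // HomeKit expresses brightness as an integer percentage.
            characteristic.writeValue(Int(clamped * 100)) { error in
                if let error = error { print("Brightness write failed: \(error)") }
            }
        }
    }
}
```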
The number of gestures may seem limited at first, but this is where biometrics comes into play. Companies like Thalmic Labs and Vi-Band have built products on the principle of reading the muscles’ electrical impulses. Imagine that being integrated into an Apple Watch. Suddenly the gestures are endless and the interactions infinite. Starting your car, pulling up your wallet, controlling your TV, turning on your lights: your Apple Watch becomes your keys, your cash, your remote, and your light switch.
For now, the Apple Watch demands attention from both hands. Neither hand can be free, and browsing the myriad bands feels like shopping for your own shackles.