Forget wearables
It’s about magic and interfaces
It seems like every day a new wearable gets announced: Pebble, Samsung, maybe Apple someday soon. But this intense focus on the wrist distracts us from the real magic the internet of things will bring. Rather than just list all the truly magical experiences we’ll have once we reframe our view, let me walk you through a day in my life in the not-too-distant future.
At around 4:55 AM, the smart sensor in my bed decides that now is the perfect time to wake me up. Instead of a loud noise or vibration, it instructs the smart bulbs in my bedroom to turn on slowly, stepping through just the right phases of sunrise light. Once the lights are fully on, it tells my blinds to slowly open. I wake up naturally, easily, and without having to look at or “interact” with anything.
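To make the idea concrete, here is a minimal sketch of how that wake-up routine might be orchestrated. Everything in it is hypothetical: the bed_sensor, bulbs, and blinds objects and their methods stand in for whatever APIs such devices would actually expose.

```python
import asyncio

# Hypothetical sunrise ramp: warm amber stepping up toward daylight white.
SUNRISE_COLOR_TEMPS_K = [2000, 2500, 3000, 4000, 5000]

async def gentle_wake(bed_sensor, bulbs, blinds, ramp_minutes=15):
    """Wake the sleeper with light instead of sound (assumed device APIs)."""
    # Wait for the moment the bed sensor judges best, e.g. a light-sleep
    # phase close to the target alarm time.
    await bed_sensor.wait_for_light_sleep_window()

    # Slowly brighten the bulbs through sunrise-like color temperatures.
    step_seconds = ramp_minutes * 60 / len(SUNRISE_COLOR_TEMPS_K)
    for i, kelvin in enumerate(SUNRISE_COLOR_TEMPS_K, start=1):
        await bulbs.set_state(color_temp_k=kelvin,
                              brightness=i / len(SUNRISE_COLOR_TEMPS_K))
        await asyncio.sleep(step_seconds)

    # Only after the lights are fully on do the blinds begin to open.
    await blinds.open(duration_seconds=120)
```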
When I walk into the bathroom, I step on my weigh scale, which gives me a thumbs-up for staying the same weight and then displays today’s weather forecast. Ah, now I know what to wear today, and I didn’t need to “interact” with anything.
As I walk to my kitchen, my espresso machine has already woken up (it did when I did) and is warmed up. It knows that I slept especially well, so it recommends only two shots, instead of my usual three. Sounds like a good idea, I think, putting a mug in the machine and pressing the brew button.
As I walk to my living room, coffee mug in hand, my TV turns on and displays the weather, top news headlines, and my agenda for the day. I gesture upward, Minority Report-style, with my hand; my TV’s 3D camera recognizes the gesture and my calendar scrolls up.
“Turn on BBC News,” I say. Ordinarily I’d have to say “okay Siri…”, but since the camera knows I’m looking right at the TV, there’s no ambiguity in my instruction and the news starts playing. As I watch the news, I see a glowing orb in the top-right of the screen. I make a swiping gesture to see it, and I realize I need to leave now to get to the office on time.
As I walk to my car, it sees that I have both my phone and my smart bracelet. Since I’ve got both devices and the car is sitting in my garage, it’s almost certainly me, so the doors unlock and the engine starts. As I sit down, the camera embedded in my dash looks at me and verifies my identity. If I didn’t look like me, the engine would stop, and I’d have to authenticate biometrically to restart it.
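Here is a rough sketch of that layered trust check, with made-up signal names (none of this corresponds to any real car’s API): two trusted devices plus a familiar location are enough to unlock, but the in-cabin camera gets the final say.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PresenceSignals:
    phone_nearby: bool        # e.g. Bluetooth LE proximity
    bracelet_nearby: bool     # second trusted device
    parked_at_home: bool      # geofence: the car is in my garage
    face_match: Optional[bool] = None  # None until the dash camera has looked

def may_unlock_and_start(s: PresenceSignals) -> bool:
    # Two trusted devices in a familiar place: likely enough to open the
    # doors and start the engine before the driver even sits down.
    return s.phone_nearby and s.bracelet_nearby and s.parked_at_home

def may_keep_running(s: PresenceSignals) -> bool:
    # Once seated, the camera is the stronger signal. Anything short of a
    # confirmed face match stops the engine and demands explicit biometrics.
    return s.face_match is True
```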
The center display in my car shows a pre-populated list of destinations. At this time of day, the office is, unsurprisingly, at the top of the list. But this time it’s glowing amber, which it normally isn’t. I drive to the office every day, so it’s not as if I need directions, but it’s calling for my attention because traffic is especially bad and it wants to suggest a more efficient route than the one I normally take. One tap on the display and the car begins navigation. Notably, all of this navigation intelligence comes from Waze, running on my phone. The phone just knows that the best place to surface Waze isn’t always its own screen.
When I get to the office, I jump on a conference call and place a Bluetooth headset in my ear. Once I’m off the call, I leave it in as I walk to my next meeting. On the way, the meeting gets cancelled (gotta love last-minute changes). My headset listens to my environment, realizes that it’s pretty quiet and that I’m not talking, and simply whispers in my ear, “Your meeting with Joe was just cancelled.” Great! Time for more coffee, I think, and walk to the Starbucks across the street. This time it’s a bit noisier, so instead of a whisper I hear a soft beep. I tap the headset to acknowledge the notification and it says, “Your wife’s flight from London has been delayed by three hours.” “Okay, I’m picking her up, so please update my schedule and see if Chris is free for an early dinner.”
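The interesting part of that headset is the tiny decision it keeps making: how to interrupt me, if at all. Here is one way that logic might look; the noise threshold and the headset methods are assumptions for illustration, not any real product’s API.

```python
QUIET_DBA = 45  # below roughly this ambient level, a whisper is comfortable

def deliver(notification_text, ambient_noise_dba, wearer_is_speaking, headset):
    """Pick an interruption style based on context (assumed headset API)."""
    if wearer_is_speaking:
        # Never talk over the wearer; hold the message for later.
        headset.queue(notification_text)
    elif ambient_noise_dba < QUIET_DBA:
        # Quiet room, idle wearer: just whisper the message.
        headset.speak(notification_text, volume="whisper")
    else:
        # Noisy street: a short beep, and read the message aloud only after
        # the wearer taps to acknowledge it.
        headset.beep()
        headset.speak_on_acknowledge(notification_text)
```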
Finally, at the end of the night, my wife and I get home. As I walk to the bedroom, my bed sensor recognizes that I’m in the room (it uses the motion sensor built into my thermostat) and that it’s bedtime. Since I’m about to go to sleep, my phone alerts me that I’ve left it on the sofa and haven’t plugged it in. A dead phone in the morning sucks, so I grab it off the sofa, drop it on my coffee table, which has an inductive charging loop built in, and finally crawl into bed.
All through this magical day, I interacted with many different systems around me, and I didn’t once glance at a smartwatch or other display-enabled wearable. I’ll be the first on my block with a Moto 360, but we need to realize that it’s just one device and one interface. Instead, we must consider how people live, communicate (through sensations, sound, images, gestures, and expressions), and interact. We must design products and experiences that are contextually aware and that use the best device available right now to serve our many needs. Wearables are part of the magic, not all of it.
I’m @suthakamal, a product entrepreneur and executive. If you’re building or thinking about something in this space, I’d love to hear from you.
If you found this interesting, you should follow me on Twitter.
Thanks to Ethan Stock (@onohoku), Christian Gammill (@gammill), Seema Kumar (@seemakumar), Steve Garrity (@stgarrity), and Drake Martine (@withdrake) for commenting on drafts of this.