How iOS 9 Will Get The Apple Watch To Click

The Apple Watch is Apple’s most heavily constrained device: a tiny screen, a short battery life, and limited developer access. Prior to launch, the media framed it within its limitations: depending on an iPhone, lacking a keyboard, missing a web browser, being a fiddly gaming platform, and so on. The developer community, too, is focused on the limitations, particularly those of WatchKit extensions and glances when compared to the rich development environment we’re used to with iOS. Much of the chatter prior to the announcement of iOS 9 on Monday has focused on hoping the developer constraints will be lessened and we’ll get richer access via a native Apple Watch SDK.

The ever-insightful Marco Arment wrote a list of questions about the upcoming developer kit: a grounded, incremental view of what iOS 9 might mean for the Apple Watch.

The problem is, after weeks with the watch, I don’t see ANY value in the existing developer app extensions, and I only ever access the app grid accidentally. Moreover, I don’t see any more value in more complex or richer apps enabled by native developer access. To put it bluntly, I would be just as happy with the Apple Watch if it were app-less, akin to the original iPhone.

It’s a very nice watch. It’s customizable and “smart” in that it is an effective window into the lock screen of the device in my pocket. But first and foremost, it’s a watch. The apps, glances, and interactions with notifications feel disconnected and distant. So an incremental evolution to support native development won’t help much; instead, it may exacerbate other limitations: battery life, lack of connectivity, etc. Marco’s questions are guarded by those same concerns.

I don’t see the developer-generated content becoming dramatically more usable (or usable in the first place) with a simple evolution of WatchKit and a move to a native SDK. For that to happen, I’m betting on the introduction of a paradigm shift, and for it to arrive in iOS 9.

Making Apple Watch Smart

The truth is, I think apps, glances, and their WatchKit UI are great. How unused and impractical they are stems from how unnatural it is to navigate a visual, touch-based UI to expose functionality whose utility is measured in seconds. But if that functionality were available via the primary, natural input to the Apple Watch, things would get interesting. And that input is, of course, voice.

Voice dispatch

Let me explain with something I do every single morning:

Grind beans, half fill French press with water.

To Siri: “Set a timer for 1 minute”. Timer goes off

Stir the bloom, and top off with water

To Siri: “Set a timer for 3 minutes”. Timer goes off

Push down plunger and enjoy.

(Based on Stumptown Coffee Roasters brew guide)

I’ve done this for months with my iPhone, and yet every day since I got my watch, which is much more accessible and convenient, the routine feels cumbersome. I love those moments where your subconscious raises a little flag saying “well, that didn’t feel quite right”. I have a hard time ignoring my subconscious, so I’ve been mulling over what the device should do differently.

My belief is that it boils down to making Siri the entry point to ALL apps on Apple Watch. Siri isn’t going to have OS-level support for coffee brewing methods, nor will it gain them through its Wolfram Alpha integration, so it’s up to the development community to fill in the missing parts and explore the potential of this established but as-yet closed-off paradigm, much like when the App Store opened the original iPhone to developer code.

That is MUCH more exciting than anything promised for iOS 9, or even the watch announcement itself. New development paradigms like the one I’m imagining don’t come around often.

“What can I help you with?”

Since Siri was introduced back in 2011, there has been an expectation that developers would get access to it. Apple has, in regular Apple form, defied expectations, but I get the feeling enough stars may have aligned in Cupertino for it to finally make Apple-sense in the context of the Apple Watch.

So what would my morning routine look like with developer access to Siri?

Grind beans, half fill French press with water.

To Siri: “I’m making my morning coffee”

UI updates to allow user to pick coffee brew method. French press is selected

UI updates to show instruction “Half fill French press”

User taps accompanying “Next” button

UI updates to show 1:00 timer counting down

Timer goes off

UI updates to show “Stir bloom and top off French press”

UI shows 3:00 timer counting down

Timer goes off

UI updates to say “Enjoy!”
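
The walkthrough above is really just a data-driven sequence of steps, some of which carry a timer. As a purely hypothetical sketch — none of these types exist in any Apple SDK — here is how an app might model that guided flow in Swift:

```swift
// Hypothetical model of a guided brew flow. BrewStep and BrewGuide are
// invented for illustration; they are not part of WatchKit or any SDK.

struct BrewStep {
    let instruction: String
    let timerSeconds: Int?   // nil means wait for a "Next" tap instead
}

struct BrewGuide {
    let name: String
    let steps: [BrewStep]

    // Return the step after the given index, or nil when the brew is done.
    func step(after index: Int) -> BrewStep? {
        let next = index + 1
        return next < steps.count ? steps[next] : nil
    }
}

let frenchPress = BrewGuide(name: "French press", steps: [
    BrewStep(instruction: "Half fill French press", timerSeconds: nil),
    BrewStep(instruction: "Wait for bloom", timerSeconds: 60),
    BrewStep(instruction: "Stir bloom and top off French press", timerSeconds: 180),
    BrewStep(instruction: "Enjoy!", timerSeconds: nil)
])
```

The glance would simply render the current step, start a timer when `timerSeconds` is set, and advance when the timer fires or the user taps “Next”.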

Supporting the Paradigm Shift in iOS 9

So what does this developer access to Siri look like in iOS 9? How do I make this coffee brewing utility a reality? My guess is something like this:

  • Apps will be able to declare instruction patterns they can respond to, optionally parameterized. Perhaps initially you’d have to mention the name of the app in the verbal instruction to Siri
  • The instruction would be parsed and dispatched, with any parameters, by the OS/Siri. I expect the entry point to the app would be similar to how push notifications work
  • The app would run some logic to validate/process the parameters, possibly asking for clarification via another Siri code hook. It would then configure and dispatch to a “glance” extension that serves the action concisely
  • The glance would be presented to the user and stay in the “foreground” for the duration of the task. The screen would still dim, etc., and the task could be resumed with a double tap of the crown
  • The glance would have access to configure and display/embed “OS” type utility functionality (timers, reminders, messages, etc.)
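
To make the first two bullets concrete, here is a toy sketch of what pattern declaration and dispatch might look like. Everything here — the `{name}` placeholder syntax, `InstructionPattern`, `SiriDispatcher`, the handler signature — is invented for illustration; it is not a real Apple API, just the shape of the mechanism I’m imagining:

```swift
// Hypothetical registration/dispatch flow. All types are invented.

struct InstructionPattern {
    let template: String   // e.g. "make coffee for {name}"

    // Match a spoken instruction against the template, returning any
    // captured parameters, or nil if the instruction doesn't fit.
    func match(_ spoken: String) -> [String: String]? {
        let templateWords = template.lowercased().split(separator: " ")
        let spokenWords = spoken.lowercased().split(separator: " ")
        guard templateWords.count == spokenWords.count else { return nil }
        var params: [String: String] = [:]
        for (t, s) in zip(templateWords, spokenWords) {
            if t.hasPrefix("{") && t.hasSuffix("}") {
                params[String(t.dropFirst().dropLast())] = String(s)
            } else if t != s {
                return nil
            }
        }
        return params
    }
}

final class SiriDispatcher {
    private var handlers: [(InstructionPattern, ([String: String]) -> String)] = []

    func register(_ pattern: InstructionPattern,
                  handler: @escaping ([String: String]) -> String) {
        handlers.append((pattern, handler))
    }

    // Route a spoken instruction to the first matching handler, the way
    // the OS might hand off to an app extension's entry point.
    func dispatch(_ spoken: String) -> String? {
        for (pattern, handler) in handlers {
            if let params = pattern.match(spoken) {
                return handler(params)
            }
        }
        return nil
    }
}
```

An app would register something like `InstructionPattern(template: "make coffee for {name}")`, and a spoken “Make coffee for Madeline” would arrive at its handler with the parameter already extracted — the app only fills in the domain logic.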

Using the iPhone to manage Apple Watch settings works perfectly for setting up enumerated parameters and configurations. Maybe for French press I want an initial 2-minute soak, followed by 5 minutes when topped off. Maybe I can say “Help me make coffee for Madeline”, and based on Family Sharing, the watch can help me brew coffee just like my wife has it configured on her phone.

It’s easy to see the potential here, and see how these interactions would come together:

  • Play the latest Game of Thrones episode in the living room
  • Make sure my garage door is closed
  • Bench Tom Brady from my fantasy team
  • What is this song? Create a Pandora station based on it

Siri as a miniature OS

No Apple article has stuck with me more than one about a Siri job posting, in which the description asks potential candidates to do the following:

Consider it an entire miniature OS within the OS, and you get a good idea of the scope

I think big things are coming in this area, and I’m excited to see if Monday will deliver the first steps into developer access.

Siri has matured, is central to CarPlay, and will soon be part of the Mac and Apple TV. With developer access in iOS 9, it would dramatically change the utility of those nice-looking watches by making apps useful, and their development actually worthwhile. I’d be so excited to see the magic that could be created with that power exposed to the dev community.

We’re getting closer.


Originally published at slytrunk.com.
