A developer on dub dub ‘17

Andrew Procter · Treble Apps
Jun 12, 2017

I’ve been following Apple’s WWDC in one form or another since around 2005, and writing apps for Apple platforms since the inception of the App Store. I’ve never attended the conference in person though. Until this year.

I’m not sure how I was lucky enough to get a ticket. Maybe fewer people entered the ticket lottery since the conference moved back to San Jose from the flashier San Francisco. Maybe the ghost of Steve Jobs just smiled upon me this year. Either way, I’m grateful, because this year’s conference was a good one. Heck, Michelle Obama even showed up.

As is tradition, Apple trotted out new beta versions of tvOS, watchOS, macOS, and iOS. All the updates were welcome, and iOS was particularly feature-rich, especially for the iPad. Developers, though, were there to get into the under-the-hood stuff: new APIs for us to play with. These are the tools that help us make our apps better.

The big themes this year were Augmented Reality and Machine Learning – ARKit and Core ML, respectively. While ARKit is certainly exciting and probably the best AR framework for mobile, I’m more excited about Core ML.

When I was a kid I thought James Cameron made the term “neural net” up just because it sounded cool.

Machine learning has been a hot topic for years, but Core ML’s presence in the foundation of Apple’s operating systems opens up a ton of opportunity for developers who may not previously have been well versed in the topic. Take a look at Apple’s slide on what it can do:

This slide is from the “Introduction to Core ML” session, which is worth a watch. In it, Apple engineers run through an example app that detects flower types when fed an image. It works really well, even on images you might not expect it to handle, like shots from awkward angles.

One of the most exciting things about Core ML is that it runs entirely on-device. It’s not relying on a server for processing – or worse, sending off your personal data.
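For a sense of how little code this takes, here’s a minimal sketch of running a classifier on-device with Core ML and the new Vision framework. The function and names are my own, not Apple’s sample code; `model` stands in for any classification model in the .mlmodel format, like the session’s flower classifier:

```swift
import CoreML
import UIKit
import Vision

// Minimal sketch: on-device image classification with Core ML + Vision.
// `model` is any classification model in Core ML's .mlmodel format.
// Nothing here touches the network; inference runs entirely on-device.
func classify(_ image: UIImage, with model: MLModel) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: model) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Vision returns classifications sorted by confidence, highest first.
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("Looks like: \(top.identifier) (\(Int(top.confidence * 100))%)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

A nice detail is that Vision takes care of scaling and cropping the image to whatever input size the model expects, which is usually half the battle.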

Apple also came up with a new, open format for machine learning models (.mlmodel) that it hopes will become “the PDF of machine learning models”. Along with that, they provided a Python package, coremltools, for converting existing models from popular training frameworks into the new format.
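One nice consequence: because a converted .mlmodel is just a file, an app can fetch one at runtime and compile it on-device instead of baking it in at build time. A rough sketch (the URL and function name here are hypothetical):

```swift
import CoreML

// Rough sketch: loading a converted .mlmodel at runtime, e.g. one the app
// has just downloaded. Core ML first compiles the portable .mlmodel into
// an optimized .mlmodelc bundle on-device, then loads it as usual.
func loadModel(at rawModelURL: URL) throws -> MLModel {
    let compiledURL = try MLModel.compileModel(at: rawModelURL)
    return try MLModel(contentsOf: compiledURL)
}
```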

I’ve already been toying with Core ML – emotion detection and sentiment analysis – for an upcoming project that I’m really excited about. I’m even more excited to see what people smarter than me do with it.

Some other unrelated sessions I enjoyed:

Introducing the New App Store – Thought processes and tidbits behind the big App Store redesign, and new ways for developers to drive downloads.

Design for Everyone – Apple employees evangelize the too-often-overlooked task of making your apps accessible.

Designing Sound – While I didn’t see this one in person, I heard a lot of chatter about how great it was and watched it following the conference. It’s a captivating look at how carefully sound is considered at Apple.

Overall, WWDC renewed my belief that Apple makes the best mobile OS, and did its job getting me excited about the future, both for Apple and for computing in general. Now, back to Xcode (9)!
