Apple Restarted Their AR Platform Strategy, and Nobody Noticed

Andrew Hart
Jun 14, 2019 · 4 min read

Last week, Apple invited developers to their annual WWDC conference to unveil everything new across all of their platforms. For the past few years at WWDC, Augmented Reality has been a star of the show, with the presence of an “AR demo table” on stage signalling new updates.

This year, the AR announcements seemed more incremental and muted, coming toward the end of a long presentation. I asked a few developer friends (outside of AR circles) their impressions of what was announced. “People occlusion? There was an awkward Minecraft demo.”

After watching the keynote, I had a similar impression. But then I started digging into the documentation the next day, and I realised that the changes are actually really significant — Apple has made a major restart with AR.

A History

Apple’s ARKit platform was originally built on top of SceneKit — a 3D graphics framework they’d developed a few years earlier for mobile games. That gave them a solid foundation where they could leverage SceneKit’s rendering technology and APIs, and build their AR features on top.

But it came with some cruft — there are concepts in SceneKit which aren’t relevant for AR, such as configuring the virtual camera, which in AR is driven by the device’s own camera rather than the developer. SceneKit has also been neglected in recent years, falling behind other 3D engines like Unity and Unreal. One of the top items on my WWDC wish list was a new visual editor, to bring SceneKit up to speed.

Suffice it to say, this didn’t happen, and SceneKit hasn’t received any major updates this year. What Apple announced instead has turned out to be quite dramatic — a new framework called RealityKit.

Initially, I thought this would be an abstraction layer on top of SceneKit — a way to make it easier for regular mobile developers to get started with AR, without the learning curve of 3D development. But it turns out that RealityKit is far more significant than that: it’s a ground-up replacement. It comes with a brand new API, and no backwards compatibility with SceneKit.
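To make the break concrete, here’s a minimal sketch contrasting the two APIs. The framework types are real (SCNNode and ARSCNView in SceneKit, Entity and ARView in RealityKit), but the scene contents are hypothetical examples of my own:

```swift
import ARKit
import SceneKit
import RealityKit

// SceneKit: a node-based scene graph. You position nodes manually
// relative to the root, and attach geometry to them.
func addBox(to sceneView: ARSCNView) {
    let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                       length: 0.1, chamferRadius: 0))
    box.position = SCNVector3(0, 0, -0.5)  // half a metre in front of the camera
    sceneView.scene.rootNode.addChildNode(box)
}

// RealityKit: an entity/component model. Content hangs off anchors
// that RealityKit binds to the real world (here, a horizontal plane).
func addBox(to arView: ARView) {
    let anchor = AnchorEntity(plane: .horizontal)
    let box = ModelEntity(mesh: .generateBox(size: 0.1))
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```

Note how anchoring is a first-class concept in the RealityKit version — you describe *where* content belongs and the framework keeps it attached, rather than managing node transforms yourself.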

To be clear — they haven’t replaced the underlying algorithms they use for recognising the world. But they have replaced their rendering engine, which displays 3D content in-place and provides developers with an interface to build their experiences.

So, What Does This Mean for Developers?

First of all, SceneKit is still around, so existing AR developers can continue using it in the short term and maintain their backwards compatibility. This release of RealityKit is definitely a “v1”, so it’s still missing some advanced features, such as custom geometry and particle effects — those will probably come with time.

If you’ve previously tried developing with ARKit+SceneKit but found it too overwhelming, I’d encourage you to give RealityKit a try. You’re still developing for a 3D environment, but the API is far more familiar and less scary. Plus there’s a new app — Reality Composer — which lets you build your AR scenes without any code.
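The no-code and code worlds connect cleanly, too: Xcode generates a Swift loader class from each Reality Composer project. A sketch of what that looks like — “Experience” and “loadBox” are the names Xcode generates for the default project template, so your own project and scene names will differ:

```swift
import UIKit
import RealityKit

class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // The generated loader returns the scene as an anchored entity,
        // which can be added straight to the view's scene.
        if let boxScene = try? Experience.loadBox() {
            arView.scene.addAnchor(boxScene)
        }
    }
}
```

So a designer can lay out a scene visually, and a developer can load it with one call and then manipulate its entities in code.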

Apple’s AR arcade game, built with RealityKit and Reality Composer. I hope you have a large living room.

Now that AR development is open to everyone, I’d hope to see a larger community begin to grow — something similar to the strong community that Unity has built with their Asset Store.

What Does This Mean for Apple?

It’s a significant move for Apple to leave behind a two-year-old platform and replace it with something completely new. They’ve obviously made that decision with a long-term view — SceneKit had some downsides, and they thought a new framework would give them a better foundation for the future.

With that in mind, it’s interesting to think about what RealityKit brings — an AR-only framework with high performance and a modern, Swift-only API that regular developers can use. I don’t like to speculate too much, but I’d imagine that this is Apple’s path towards wearable AR. And features that we’re seeing emphasised today, such as anchored content, shared experiences and Quick Look — all now supported through RealityKit — could be a primary part of a future wearable product.

With last week’s introductions of Combine, SwiftUI, and RealityKit, it seems developing for an Apple platform for the future may be a very different experience to today.

I’ve been involved with AR development for two years, initially developing a popular open-source library which pioneered location-based AR, and more recently starting Dent Reality, to bring Indoor AR navigation to shopping centres and other large public spaces.

You can follow me on Twitter, where I frequently tweet my thoughts on AR, and share AR demos of our work.
