What is People Occlusion and what does it mean for the future of AR?

Steven Brammah
Twinkl Educational Publishers
4 min read · Sep 24, 2019

iOS 13 has finally been released alongside the new iPadOS. Along with a bunch of software advancements including improved device performance, support for motion capture, smaller app downloads and a brand new dark mode, the new systems bring with them People Occlusion, a long-awaited missing aspect of current AR apps. So, what is Apple’s People Occlusion and what benefits does it add to the AR experience?

What is occlusion?

Essentially, occlusion means that virtual objects can be hidden or ‘occluded’ by real-world objects. If a real-world object (like a person) stands between your device’s camera and a virtual object, the virtual object will now become hidden behind that real-world object. Whereas virtual objects would previously exist on a flat plane irrespective of the real-world objects and environments surrounding them, they can now interact realistically with their real-world surroundings. Since the release of ARKit 3, developers such as myself have had the pleasure of implementing these advancements in occlusion and, in terms of improving immersion and interactivity, the opportunities for innovative AR applications are extensive and extremely exciting.
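For developers curious what this looks like in code, here’s a minimal sketch of switching on People Occlusion in an ARKit session. The function and variable names are mine for illustration; the configuration API itself is ARKit 3’s.

```swift
import ARKit
import RealityKit

// Minimal sketch: enable People Occlusion on an existing RealityKit ARView.
// `arView` is assumed to be an ARView already set up elsewhere in your app.
func enablePeopleOcclusion(on arView: ARView) {
    // Occlusion requires the A12 chip or later, so check support at runtime.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        print("People Occlusion is not supported on this device")
        return
    }

    let configuration = ARWorldTrackingConfiguration()
    // .personSegmentationWithDepth hides virtual content behind people,
    // using their estimated distance from the camera.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    arView.session.run(configuration)
}
```

There’s also a `.personSegmentation` option without depth, which always draws people in front of virtual content; the depth variant shown above is what produces the realistic behind/in-front behaviour described here.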

A Twinkl Quick Look model without occlusion enabled
A Twinkl Quick Look model with occlusion enabled

ARKit 3, Apple’s People Occlusion and more

As mentioned, iOS 13 finally brings People Occlusion to widely available commercial devices. ARKit 3 introduced the technology to developers earlier this year and, with it, Apple demonstrated a real focus on becoming a market leader in AR. By allowing apps to place people within developers’ augmented worlds, the feature transforms the user experience; having seen People Occlusion in action first-hand, I believe its impact will be hugely positive. Real-world objects within the AR scene no longer break immersion but, instead, elevate it to whole new heights. Previously, children would wave a hand in front of the camera and the illusion of a realistic AR world was broken, because distant virtual objects appeared in front of the hand rather than behind it. Now, that issue has been resolved and the same action is significantly more immersive.

On top of that, rather than all the interactivity coming from users’ interactions with a touchscreen, the iOS 13 update means that people can move and touch objects in the real-world and impact the AR world with their body movements. Improved motion tracking alongside detection of up to three faces at once means that AR developers can introduce a huge assortment of new mechanics to their apps, bringing them to life.
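The three-faces-at-once capability mentioned above maps directly onto ARKit 3’s face-tracking configuration. A short sketch, with the function name being my own:

```swift
import ARKit

// Sketch: track up to three faces simultaneously with ARKit 3.
func runMultiFaceTracking(on session: ARSession) {
    // Face tracking needs a TrueDepth front camera, so check first.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // ARKit 3 raised the limit from one tracked face to three.
    configuration.maximumNumberOfTrackedFaces = 3
    session.run(configuration)
}
```

Each detected face then arrives as an `ARFaceAnchor` in the session’s anchor callbacks, which is where an app would hook in its own mechanics.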

See more of what our Digital team gets up to by visiting the Twinkl Reality page

This is just one of the advancements brought in by the systems update. Additionally, the arrival of collaborative sessions means that users can now work in tandem with each other in shared virtual worlds. This is coupled with the ability for apps to use the front and rear-facing cameras concurrently, allowing users to control the AR world with facial movements. By adjusting facial expressions, winking an eye or even using eye movements, users can now interact with virtual objects in an incredibly fluid and easy-to-use way.
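Both of those features are opt-in flags on the world-tracking configuration. A hedged sketch (function name mine; the transport for sharing collaboration data, typically MultipeerConnectivity, is left out for brevity):

```swift
import ARKit

// Sketch: a world-tracking session that shares anchors with nearby users
// (collaborative session) while also tracking the user's face with the
// front camera — the two ARKit 3 features described above.
func runCollaborativeSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Share collaboration data (anchors, feature points) with peers;
    // the app is responsible for actually transmitting it.
    configuration.isCollaborationEnabled = true

    // Drive interactions from facial expressions captured by the front
    // camera while the rear camera renders the AR scene.
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        configuration.userFaceTrackingEnabled = true
    }
    session.run(configuration)
}
```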

Who can use People Occlusion?

Apple is bringing occlusion to all its devices with the A12 chip and beyond. For phones, that means the iPhone XR and later are occlusion-ready whilst, for tablets, the third-generation iPad Pro, third-generation iPad Air and fifth-generation iPad mini all have occlusion capabilities with iOS 13 or iPadOS, respectively. If you’ve got access to any of these devices, I’d really recommend trying these features out for yourself to understand just how impressive they are.
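Rather than hard-coding a device list, an app can simply ask ARKit at runtime whether the current hardware supports People Occlusion:

```swift
import ARKit

// Runtime capability check — true on A12-and-later devices running iOS 13.
let occlusionReady = ARWorldTrackingConfiguration
    .supportsFrameSemantics(.personSegmentationWithDepth)
```

This is the safer pattern in practice, since it keeps working as Apple ships new hardware.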

The team at Twinkl have been working hard for months to integrate Apple’s newest features into our apps and the effects have been staggering. We’ve implemented occlusion into our world-first multiplayer AR education app, Little Red Coding Club (soon to be released), as well as our educational Quick Look models.

Wrap-up

The iOS 13 and iPadOS update has tackled arguably the greatest obstacle yet to delivering immersive AR. People Occlusion has long been necessary for a truly interactive experience, and its arrival is amazing news for developers and users alike, both of whom will be excited to see how this cutting-edge technology develops. We’re going to see a lot of inventive ways of integrating these latest advancements, and our team is already working to bring People Occlusion to the full range of Twinkl educational AR apps, with even more to unveil soon. It’s an extremely exciting time to be an AR developer.

About Steven Brammah

Steven is a Lead AR Applications Engineer on Twinkl’s App Development Team. During his time with the company, he’s overseen the development of several internationally-acclaimed AR education apps including Little Red Coding Club, ARchitect and Twinkl Robotics.
