Bringing next level AR to the web with iOS 13 and .reality files — the big picture

Thorsten Hary
9 min read · Sep 20, 2019

Let's face it — with Apple's rumored "Garta" AR headset not out in the wild just yet, today's AR headsets are still niche products. Both Microsoft HoloLens and Magic Leap One haven't exactly set the world on fire: hampered by first-generation hardware and aimed mostly at businesses and developers rather than consumers, they remain expensive prototypes, only hinting at the potential of the full-fledged AR worlds of the future.

Without the backing of a giant tech company (HoloLens) or multibillion-dollar funding (Magic Leap), AR HMD companies like Daqri or Meta are shutting down left and right.

At zeit:raum, our XR unit has been cutting its teeth developing for both major AR HMD platforms, but we are hard pressed to recommend these solutions to our clients given the current state of the hardware, with its limited field of view and less than stellar graphics performance.

Want to develop a prototype to get a head start on your competitors? Sure! But for building real, shipping products? Not so fast there, buddy…

Magic Leap One — a first step in a long journey to consumer AR headsets

Hence, for deploying reliable, scalable AR solutions, mobile AR is where it's at for now. These apps might not have the appeal of stereo rendering or fancy hand-gesture recognition, but hey — at least the market is not constrained to a few thousand headsets: virtually all modern smartphones are capable of displaying stunning AR content using the phones' cameras and vendor-provided AR frameworks.

The rise of mobile AR

Mobile AR apps on iOS and Android are powered by mature frameworks, ARKit and ARCore respectively. These help bring great-looking AR content to billions of phones using the built-in cameras.

The AR frameworks provided by Apple and Google take the most tedious tasks of AR development out of developers' hands: high-quality real-time rendering, matching environment lighting, positioning objects, and more.

Where in the past you would have to rely on printed markers to anchor virtual objects in the real world, these new frameworks automatically detect horizontal and vertical planes in the image and position the content accordingly in the frame, making markers all but obsolete (except when a marker doubles as a QR code communicating a URL or additional information).

In addition to planes, both frameworks can detect and augment faces and, in the case of ARKit, also detect 3D objects for object tracking. They are feature rich, powerful and easy to implement.

Write once - thanks to Unity

At zeit:raum XR, writing and maintaining code for two separate platforms on every project often isn't viable, depending on deadlines and budget. Luckily, tools like Unreal Engine and Unity make this process easier: write once, deploy on all major platforms.

AR Foundation is the Unity framework that bundles most of the advanced features of both ARCore and ARKit into a single package, allowing native AR apps to be deployed on both platforms with minimal fuss.

AR apps have become more and more popular lately, permeating every facet of brand experiences and advertising, from McDonald's Happy Meal tie-in toys to furniture and eyewear companies and trade-show floors: mobile AR is here to stay, in no small part thanks to ever-increasing rendering quality.

Hardware features enable advanced AR on mobile

On iOS in particular, it feels like it's full steam ahead for AR, with extended abilities like people occlusion (rendering a virtual AR object BEHIND a person without relying on film tricks like green screen) or automatic face tracking introduced in the latest release, ARKit 3. Rendering takes camera noise, motion blur and other artifacts into account when matching 3D graphics to real-world environments.

The resulting apps are often compelling and feel realistic because of the frameworks' tight hardware integration: full access to the high-res raw camera feed, gyroscope support, and machine-learning-based algorithms boosted by dedicated hardware (e.g. the Neural Engine, or even the new ultra-wideband chip in the iPhone 11).

What about the open web?

Since web apps are sandboxed inside the browser, they can't make use of most of these advanced hardware features for security reasons and have to rely on slower, less optimized cross-platform JavaScript frameworks and SLAM algorithms.

A clear gap in quality is obvious when comparing these solutions to their native app counterparts. The experience seems stuck in 2017 — not an accident once you realize that most web-based AR experiences still rely on AR.js, a JavaScript framework for image tracking that, at its core, hasn't been updated since, well… 2017.

AR.js based Web AR Experience for trade show use by zeit:raum

Commercial web AR solutions like 8th Wall make the best of the limited possibilities on the web, deploying their own SLAM-based image analysis for markerless tracking. The results, however, still trail the quality of native AR apps distributed via the major app stores. Still, for some use cases, the web simply is where the action needs to be. And things are looking up.

WebGL frameworks and 3D rendering quality on the web have come a long way in recent years, with frameworks like Babylon.js, three.js and A-Frame leading the way. At zeit:raum, we have deployed major projects, smaller projects and countless tests using most of these frameworks in the past year, and the experience and results are encouraging. WebGL is ready for prime time.

Customize e-commerce products in 3D — The WebGL based 3D product-configurator by zeit:raum

Upcoming technologies like WebGPU and WSL (the proposed Web Shading Language) will only further this trend, bringing high-end graphics performance to the web.

While rendering quality keeps improving at a rapid pace, the corresponding image-tracking components of the open web still rely on pre-machine-learning computer-vision algorithms, and it shows. Web-based AR apps continue to look a bit… janky, their tracking capabilities less reliable and feature-rich than those of their app-based counterparts.

There is hope on the horizon in the shape of WebXR, however. The WebXR spec is in the final stages of being ratified and is already being implemented by most major browser vendors (except for Safari).

Poised to replace the abandoned WebVR API of years past, WebXR will focus on VR features first in order to reach parity with the now-deprecated WebVR, and only then move ahead on advanced AR features. A deep dive on the state of WebXR by Brandon Jones can be found on YouTube. He knows a thing or two about WebXR, since he is in charge of the spec.

Brandon Jones — The State of WebXR

With Google incorporating AR media in search results on Android since summer 2019, the road seems clear: web-based AR will grow stronger in the years to come, while still playing catch-up to native AR apps in terms of raw quality, at least near term.

Enter AR Quick Look, part of Apple's AR play

AR Quick Look was introduced in iOS 12, and as a technology it embodies the best and worst of Apple software. It enables quick previews of 3D objects in native ARKit-quality graphics with full hardware support: files based on the USDZ format (originally developed by Pixar) can be distributed as a download link on the open web, viewable right inside the browser.

At least kind of "in your browser": the AR view opens in a modal window — a walled garden that leaves the browser and webpage behind until the experience is quit. So, while technically "on the web", the AR view itself is provided by the operating system; it is separated from the DOM tree and website rendering and can therefore not interact with the rest of the site. Yet.

Still, for the first time, app-quality AR content could be distributed on the open web without having to download an app.

This is a big deal. For some experiences, people just can’t be bothered to download yet another app to be used once, only to be deleted a few days later. Sometimes, you just want to get in and out of an AR experience quickly.

AR Quick Look makes this possible on Apple devices starting with iOS 12, and does so in a way that is technically far superior to common web-based AR solutions.
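For context, embedding such a preview on a page is refreshingly simple: Safari opens any link marked with rel="ar" directly in the AR Quick Look viewer. A minimal sketch, with placeholder file names, written as a shell snippet that generates the page:

```shell
# Generate a minimal page linking a USDZ model for AR Quick Look.
# model.usdz and preview.jpg are placeholder file names; Safari on
# iOS 12+ opens rel="ar" links in the system AR viewer.
cat > ar-preview.html <<'EOF'
<a rel="ar" href="model.usdz">
  <img src="preview.jpg" alt="View this model in AR">
</a>
EOF
```

Note that per Apple's documentation the anchor must contain a single image child; without it, Safari treats the link as a plain download.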

But at the end of the day, AR Quick Look is just a fancy way to view 3D files in AR. What if we want to do more? Add interactive elements and app-like features?

.reality files on iOS 13 — new adventures in interactivity

zeit:raum AR maintenance showcase using .reality files (German video)

With iOS 12, for the first time, we could distribute high quality, app-grade AR Content on the open web, no app necessary. iOS 13 takes these possibilities even further, providing rich interactions.

Thorsten Hary, CEO zeit:raum

Taking this concept even further in iOS 13 are .reality files. Building on the foundation of AR Quick Look, .reality files don't just display 3D models in AR; they also add rich interaction.

These files are authored in a new Apple developer app called Reality Composer. While clearly a 1.0 release in some regards, Reality Composer makes authoring AR scenes easy and, dare I say it, even fun. Functionality that previously required hundreds or thousands of lines of code can now be achieved with a few simple taps on an iPad.

Apple Reality Composer on iOS and Mac. Image Copyright: Apple Inc.

But make no mistake, "easy" is still relative. While adding functionality to existing scenes is pretty straightforward (that alone is a huge win), content creation requires deep knowledge of a 3D editor like Cinema 4D, Blender or Maya. While Apple ships a solid amount of 3D "clip art" as part of the Reality Composer app, this content won't get you far when it comes to client work.

At zeit:raum, we spent some of our time during this year's iOS 13 beta cycle building up a pipeline for getting 3D files in and out of Reality Composer. Things have been a bit rocky on this front, since we are still at the bleeding edge of this workflow.

Building a pipeline for generating .reality content

Cinema4D GLTF Export

These are the building blocks of our pipeline: macOS Catalina, Xcode 11, iOS 13, Apple Reality Composer, Apple USDZ Tools, Cinema 4D R20, and the GLTF Exporter plugin from Maxon Labs.

Note: You will need C4D R20, since the GLTF Exporter plugin does not yet support R21.

After downloading the USDZ Tools from Apple, launch the USD.command file in the downloaded folder; a terminal window will pop up with the tools ready to use. By doing everything inside this terminal, you avoid dealing with environment variables yourself.
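If you prefer working in your own shell session instead of the spawned terminal, you can set the variables manually. A sketch, assuming the tools were unpacked to ~/usdpython; the exact folder layout is an assumption and may differ between releases of the tools:

```shell
# Manual alternative to USD.command: put the converter on PATH and
# make the bundled USD Python libraries importable. The install
# location and folder names below are assumptions; adjust them to
# wherever you unpacked Apple's USDZ Tools.
USDZ_TOOLS="$HOME/usdpython"
export PATH="$USDZ_TOOLS/usdzconvert:$PATH"
export PYTHONPATH="$USDZ_TOOLS/USD/lib/python:$PYTHONPATH"
```

Add these lines to your shell profile if you convert assets regularly.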

Once you have exported your glTF content from Cinema 4D, it's time to convert it to USDZ. The Apple-provided converter operates out of your user folder by default. For a basic asset, the following works out of the box:

usdzconvert myMesh.gltf
usdzconvert myMesh.gltf myMeshName.usdz

This will output the USDZ file in the same folder (the second form lets you name the output). If you need to add textures, you will need to do so manually. For example:

usdzconvert myMesh.gltf -diffuseColor Color.png

You can also set parameters here:
usdzconvert myMesh.gltf -diffuseColor 1,0,0

…which will result in a red color for the converted asset.
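Beyond -diffuseColor, the converter exposes flags for the other common PBR maps as well. A sketch with placeholder file names; the available flags can vary between releases of the tools, so check `usdzconvert -h` for the exact list shipped with your version:

```shell
# Convert a glTF and wire up a full set of PBR texture maps.
# All file names here are placeholders for your own exported maps.
usdzconvert myMesh.gltf myMesh.usdz \
  -diffuseColor albedo.png \
  -normal normal.png \
  -metallic metallic.png \
  -roughness roughness.png \
  -occlusion ao.png
```

Keeping these invocations in a small script per asset makes re-exports after model changes painless.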

Next up, you can import the generated files into Reality Composer and use the intuitive tools to add interactions. This is the fun part, rejoice!

Reality Composer is a joy to work with

Once you are done, go ahead and export the project's .reality file.

The result will be viewable on any iOS 13 device and can be distributed on the web — no app download necessary! This instant accessibility opens up new possibilities on the web. Imagine a world where, underneath a product on the IKEA website, you can not only download a PDF instruction manual but also view AR instructions in your own space, discover them from all angles, and use the physical object as an anchor for additional information. This is just one of many possible use cases.
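One practical note when hosting these files yourself: the web server has to send the right Content-Type, or Safari may treat the file as a plain download instead of offering the AR view. A sketch for Apache via .htaccess; model/vnd.usdz+zip is the registered USDZ type, while model/vnd.reality for .reality files is our assumption based on Apple's sample configurations:

```shell
# Register MIME types for USDZ and .reality files (Apache .htaccess).
# model/vnd.reality is assumed per Apple's sample configs; verify it
# against Apple's current AR Quick Look documentation.
cat >> .htaccess <<'EOF'
AddType model/vnd.usdz+zip .usdz
AddType model/vnd.reality .reality
EOF
```

Other servers (nginx, CDNs) need the equivalent type mappings in their own configuration.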

The result — our bike showcase

To explore these possibilities, we built our own bicycle maintenance showcase. You can view it live on your iOS 13 device by going here: (sorry, German only for now) and tapping the AR icon.

.reality files on iOS 13 enable rich AR experiences in native app quality on Apple mobile devices without an App Store download. Android users will have to wait a bit for Google to catch up and provide a similar solution on their devices.

We are excited to see what Google comes up with in this regard (perhaps a more capable version of the Google model-viewer web component?) and where Apple will take their sizable lead on this front.



Thorsten Hary

CEO at zeit:raum, passionate about film, CGI and spatial computing