A Look Inside The New World of AR

Augmented Reality applications on mobile devices have long been overshadowed by traditional mobile apps, which remain confined to the boundaries of the phone screen. The game is about to change. The recently released Apple ARKit and Google ARCore frameworks make it much easier to create highly functional, potentially impactful Augmented Reality applications that immerse users in their surrounding environment.

Apple ARKit

Introduced in iOS 11, ARKit pushes the experience beyond the screen by anchoring virtual objects in the real-world environment and providing intuitive interactions between the real and virtual worlds.

The framework uses Visual Inertial Odometry (VIO) and SLAM algorithms, which combine real-time data from motion sensors and the device camera(s) to provide metrically accurate positional tracking. These same technologies allowed NASA’s rovers on the 2004 Mars mission to successfully orient themselves on the red planet!
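The core idea behind VIO can be illustrated with a toy sketch (this is a conceptual simplification, not Apple’s actual implementation): the inertial sensors produce a fast position estimate that drifts over time, while camera-based feature tracking produces a slower but drift-free estimate, and a filter blends the two. A simple complementary filter, with an assumed blending weight, looks like this:

```python
# Conceptual sketch of the sensor fusion behind VIO (not the ARKit API).
# A complementary filter blends a fast-but-drifting inertial position
# estimate with a slower, drift-free visual estimate.

def fuse(inertial_position, visual_position, alpha=0.98):
    """Blend two 3D position estimates; alpha weights the inertial term."""
    return [alpha * i + (1 - alpha) * v
            for i, v in zip(inertial_position, visual_position)]

# IMU integration accumulates drift; the camera-based estimate corrects it.
imu_estimate = [1.05, 0.02, 2.10]     # metres, slightly drifted
camera_estimate = [1.00, 0.00, 2.00]  # metres, from feature tracking

fused = fuse(imu_estimate, camera_estimate)
```

Real VIO pipelines use far more sophisticated estimators (typically an extended Kalman filter over pose, velocity, and sensor biases), but the blending intuition is the same.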

Here’s where Apple has innovated. Rather than relying on dedicated depth-sensing hardware, ARKit does most of the work in software, using input from the camera and motion sensors. Both have been standard components of modern smartphones for years.

Out of the box, ARKit provides key features like positional tracking, lighting estimation, and horizontal plane detection. These address some of the unique challenges of designing and developing for Augmented Reality. For instance, environmental imperfections, such as low lighting or highly reflective surfaces, have to be accounted for within the experience of an application. But with ARKit, actions like placing a virtual teacup on a real table, with the teacup aligned and lit in accordance with the physical environment, are fairly easy.
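Placing that teacup boils down to a “hit test”: cast a ray from the camera through the tapped screen point and intersect it with a detected plane. The sketch below shows the underlying geometry in plain Python (a simplified stand-in, not the ARKit hit-test API), assuming a horizontal plane at a given height:

```python
# Conceptual sketch of a hit test against a detected horizontal plane
# (simplified math, not the ARKit API): intersect a camera ray with
# the plane y = plane_height.

def hit_test_horizontal_plane(ray_origin, ray_direction, plane_height=0.0):
    """Return the world-space point where the ray meets the plane, or None."""
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_direction
    if abs(dy) < 1e-9:   # ray parallel to the plane: no hit
        return None
    t = (plane_height - oy) / dy
    if t < 0:            # plane is behind the camera
        return None
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Camera 1.5 m up, looking down and forward at a table detected at y = 0.7 m.
anchor = hit_test_horizontal_plane((0.0, 1.5, 0.0), (0.0, -1.0, 1.0), 0.7)
# The virtual teacup would then be anchored at this world-space point.
```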

Exclusive to the iPhone X, Face Tracking uses hardware embedded in the “notch” to accurately track rich face data and create an accurate 3D model of the user’s face for use in applications. The technology powers Apple’s newest device security feature, Face ID, which replaces Touch ID to authenticate the user or authorize transactions. Apple’s Animoji provide a great visual demonstration of just how sophisticated face tracking is on the iPhone X, and how well it can be used for entertainment purposes in augmented reality.
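One common way such rich face data drives a character like an Animoji is through blend shapes: the tracker outputs a coefficient per expression (jaw open, eyebrow raise, and so on), and each coefficient adds a weighted vertex offset to a neutral mesh. A toy sketch of that idea, with hypothetical names and a two-vertex mesh:

```python
# Conceptual sketch of blend-shape animation, the kind of technique that
# lets tracked face coefficients drive a 3D character. Names and data
# here are illustrative, not Apple's actual face-tracking output.

def apply_blend_shapes(neutral_vertices, blend_shapes, weights):
    """neutral_vertices: list of (x, y, z); blend_shapes: name -> per-vertex
    offsets; weights: name -> coefficient in 0..1 from the tracker."""
    result = [list(v) for v in neutral_vertices]
    for name, coeff in weights.items():
        for i, (dx, dy, dz) in enumerate(blend_shapes[name]):
            result[i][0] += coeff * dx
            result[i][1] += coeff * dy
            result[i][2] += coeff * dz
    return [tuple(v) for v in result]

# Two-vertex toy mesh with a single "jaw_open" shape at 50% strength.
neutral = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
shapes = {"jaw_open": [(0.0, -0.2, 0.0), (0.0, 0.0, 0.0)]}
mesh = apply_blend_shapes(neutral, shapes, {"jaw_open": 0.5})
```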

ARKit is expected to reach 500 million consumer devices over the next year. It works on a wide range of iOS devices with A9 or newer processors, which covers nearly every iOS device released in the past two years. The best tracking quality will be achievable on the iPhone 8, iPhone 8 Plus, and iPhone X, thanks to their better-optimized hardware, a motivator to upgrade for many this holiday season.

Google ARCore

With its similarities to Apple’s product, ARCore may seem like Google’s response to ARKit for its Android platform. However, all of its essential technology existed in Google’s labs long before Apple even started work on ARKit, and it even found its way into a number of consumer devices over the past few years.

Google Tango, as a platform, was originally released back in 2014 and has gone through several iterations since. However, it has never achieved wide adoption. One reason is its need for very specific hardware (a depth camera sensor) that Google had to convince manufacturers to include. At the moment, only two Android devices on the market are built on the Tango platform, and their sales and adoption rates are extremely low.

While it’s an oversimplification, ARCore is essentially the same technology as Tango with the depth-sensor hardware stripped out. This means that, while it is technically less accurate than Tango, it should find far more success, since it can now run on potentially any Android device with a decent camera.

The hardware and software still have to be reasonably well calibrated to allow for a proper AR experience. With the iPhone, calibration isn’t much of an issue, since Apple is the sole manufacturer and controls both the hardware and the software. But the Android world is much more diverse. Since 2016, Google engineers have been calibrating the newest smartphones (particularly Samsung’s Galaxy S8 and Google’s Pixel and Pixel XL) to fully support Daydream, which made them perfect candidates to be the first to support ARCore as well.

Going forward, Google is asking OEMs to calibrate their own hardware, which should allow for much faster and wider adoption of ARCore.

What’s the difference?

There are many similarities between ARKit and ARCore, such as lighting estimation and horizontal surface detection, and you can achieve nearly the same results with both. ARCore does have better mapping technology: for the user, this means that if you lose tracking, it should recover more gracefully than ARKit, with minimal impact on the AR scene. In practice, though, they both work remarkably well.
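Lighting estimation, one of those shared features, is conceptually simple: both frameworks expose an ambient intensity derived from the camera feed, which the app uses to shade virtual objects so they match the room. The sketch below shows one plausible way to compute such a value from raw pixels (an illustration of the idea, not either framework’s actual implementation):

```python
# Conceptual sketch of lighting estimation: average the luminance of a
# camera frame and use it to scale a virtual object's material colour.
# This is an illustration of the idea, not ARKit's or ARCore's actual code.

def estimate_ambient_intensity(pixels):
    """pixels: list of (r, g, b) in 0..255; returns intensity in 0..1."""
    # Rec. 601 luma weights approximate perceived brightness.
    luma = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return sum(luma) / (255.0 * len(luma))

def shade(material_rgb, intensity):
    """Darken the virtual object's base colour to match estimated lighting."""
    return tuple(round(c * intensity) for c in material_rgb)

frame = [(200, 200, 200), (100, 100, 100)]  # a dim-ish toy "camera frame"
tint = shade((255, 240, 220), estimate_ambient_intensity(frame))
```

In a real app you would feed the framework’s reported intensity into your renderer’s ambient light rather than computing it yourself, but the effect on the virtual object is the same.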

Both ARKit and ARCore support Unity3D and Unreal Engine for scene rendering, allowing you to create a single AR experience that works equally well on both iOS and Android. This lets designers and developers focus on innovating rather than on tracking the differences between the platforms and their varied hardware.

Glimpse into the future

Through ARKit and ARCore, Augmented Reality will become an integral part of the next generation of smartphones, just like GPS and location-based services were for the previous generation. Forecasts suggest ARKit will achieve higher adoption than ARCore, thanks to Apple’s full control over its hardware and iOS’s extremely high upgrade rates. But both platforms combined will be installed on hundreds of millions of devices over the next few years. Together with efforts from other companies such as Microsoft, Facebook, Amazon, Kudan, Vuforia, Wikitude, and Escher Reality, that means an abundance of innovative AR experiences is coming soon.


But few of us dream of walking around pointing a smartphone at the surrounding environment and interacting with objects through a screen. Rather, as the technology progresses, Augmented Reality will likely move from our hands to our heads: first displayed through glasses, then directly into our eyes in the form of contact lenses or something embedded even deeper.

While a consumer-ready implementation of something so integrated is likely several years away, several companies, such as Magic Leap, are actively moving in that direction. It wouldn’t be surprising to see early prototypes in the next year or two. And while Apple and Google are staying quiet, it’s safe to assume they are already conducting R&D on more advanced technologies. When that future arrives, iOS and Android developers already intimately familiar with ARKit and ARCore will be well positioned to jump-start adoption of the new platforms.