2019: The year of “real 3D” mass adoption

Why ARKit and ARCore will move to real 3D

When Apple introduced its first step into Augmented Reality with ARKit at WWDC 2017, it set the direction for the whole industry. Google quickly abandoned its Project Tango, which was based on real 3D sensors, and transformed it into ARCore, which relies solely on software to create the Augmented Reality experience. Both approaches are based on the same principle and follow the same goal: give every user of an up-to-date smartphone access to Augmented Reality (AR) by combining the 2D camera and the accelerometer with a lot of image processing to create artificial 3D sensing. ARKit and ARCore provide a critical first generation of AR applications and put them into the hands of a broad audience. Apple is particularly open about how it sees the future of AR.

“We’re already seeing things that will transform the way you work, play, connect and learn. Put simply, we believe AR is going to change the way we use technology forever.”

– Apple CEO Tim Cook, Apple earnings call, November 2, 2017

In retrospect, Google’s Project Tango was ahead of its time (like Google Glass), but the market was not ready yet. It is just a matter of time until more smartphones with 3D sensors become broadly available. Most high-end phones already come with a front-facing 3D sensor to enable FaceID or Animojis.

What are 3D sensors and how do they work?

3D sensors based on Time-of-Flight technology offer the highest flexibility for any kind of use case. By pulsing infrared light into a scene with the help of a VCSEL, a 3D sensor measures the time it takes for the light to bounce off an object and travel back to the sensor, and from that it accurately calculates the distance to the object. The sensor makes use of the pmd principle developed by pmdtechnologies, which measures the difference in energy levels to obtain exact 3D information from every pixel of the sensor. By combining the depth data, a 3D map of the environment or of a single object can be created.
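
As a rough illustration of the measurement principle, the sketch below computes a distance from the phase shift of modulated light, using the classic four-sample formulation of continuous-wave Time-of-Flight. The sample values and the modulation frequency are made up for the example and do not reflect pmd’s actual implementation.

```cpp
#include <cmath>
#include <cstdio>

constexpr double kSpeedOfLight = 299792458.0; // m/s
constexpr double kPi = 3.14159265358979323846;

// Estimate the phase shift of the returning light from four equally spaced
// correlation samples (the textbook "four-bucket" formulation).
double phaseFromSamples(double a0, double a1, double a2, double a3) {
    return std::atan2(a3 - a1, a0 - a2); // radians, in (-pi, pi]
}

// Convert the phase shift into a distance for a given modulation frequency.
double distanceFromPhase(double phaseRad, double modulationHz) {
    // A full 2*pi phase wrap corresponds to half a modulation wavelength,
    // because the light travels to the object and back.
    const double unambiguousRange = kSpeedOfLight / (2.0 * modulationHz);
    const double normalized = std::fmod(phaseRad + 2.0 * kPi, 2.0 * kPi) / (2.0 * kPi);
    return normalized * unambiguousRange;
}

int main() {
    // Hypothetical correlation samples for one pixel at 80 MHz modulation.
    const double d = distanceFromPhase(phaseFromSamples(0.8, 0.3, 0.2, 0.7), 80e6);
    std::printf("estimated distance: %.2f m\n", d); // roughly 0.18 m for these samples
    return 0;
}
```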

From 2D AR to real 3D

ARKit and ARCore will introduce more and more consumers to AR, and they establish the ecosystem and test cases for developers and users to experiment with. All this will turn features and apps such as image beautification, artificial bokeh, FaceID, Animojis, AR games, smart navigation and measurement into an expected standard.

ARKit and ARCore with the current 2D sensors, however, have drawbacks. As seen with the original Pokémon Go, AR with existing sensors provides an “okay” experience, but one far from lifelike: the Pokémon did not scale well based on their distance to the phone. Further, they did not interact properly with other objects or people in the scene, i.e. they did not disappear behind an object (occlusion) or reappear based on the viewer’s angle to another object in the scene.

Additionally, all applications using the 2D camera are highly dependent on the lighting conditions of the environment, and they require a lot of processing power to interpret the scene and to identify surfaces and objects. For basic applications this approach works, but it is limited and quickly reaches its limits. Using the data from a 3D sensor together with ARKit and ARCore will add realism, immersion and detail that are not achievable with a 2D camera alone. 3D sensors can determine whether other objects are in a scene, allowing virtual objects to interact properly with real-world objects and people, for example by being occluded by them, as sketched below. This adds a new layer of quality to today’s applications and experiences. In short, 3D sensors make an AR scene more lifelike.
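
As a concrete illustration of the occlusion point above, the minimal sketch below composites a rendered virtual object over the camera image with a per-pixel depth comparison: a virtual pixel is only drawn when it is closer to the camera than the real surface measured by the 3D sensor at the same pixel. The buffer layout and the compositeFrame() helper are assumptions made for this example and are not part of ARKit, ARCore or any pmd SDK.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Per-pixel distances to the real scene, as measured by the 3D sensor.
struct DepthFrame {
    int width;
    int height;
    std::vector<float> metersPerPixel;
};

// The virtual object as produced by the renderer: its depth and its color.
struct VirtualLayer {
    std::vector<float>    depthMeters;
    std::vector<uint32_t> colorRGBA; // 0 means "nothing rendered here"
};

// Composite the virtual layer over the camera image, hiding virtual pixels
// that lie behind real geometry (occlusion).
void compositeFrame(const DepthFrame& sensed,
                    const VirtualLayer& virt,
                    std::vector<uint32_t>& cameraImage) {
    for (int i = 0; i < sensed.width * sensed.height; ++i) {
        const bool virtualIsCloser  = virt.depthMeters[i] < sensed.metersPerPixel[i];
        const bool virtualIsVisible = virt.colorRGBA[i] != 0;
        if (virtualIsCloser && virtualIsVisible) {
            cameraImage[i] = virt.colorRGBA[i]; // virtual object in front: draw it
        }
        // otherwise keep the camera pixel: the real object occludes the virtual one
    }
}

int main() {
    // Two pixels: a real surface at 1 m and a wall at 3 m; the virtual object sits at 2 m.
    DepthFrame sensed{2, 1, {1.0f, 3.0f}};
    VirtualLayer virt{{2.0f, 2.0f}, {0xFF00FF00u, 0xFF00FF00u}};
    std::vector<uint32_t> camera{0xFF101010u, 0xFF101010u};

    compositeFrame(sensed, virt, camera);

    // Pixel 0 keeps the camera color (occluded), pixel 1 shows the virtual object.
    std::printf("pixel 0: %08X, pixel 1: %08X\n",
                (unsigned)camera[0], (unsigned)camera[1]);
    return 0;
}
```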

Comparison of using 2D sensors vs. 3D sensors for AR

At pmdtechnologies, we are confident that starting in 2019 most new flagship smartphones will be equipped with a front- and/or world-facing 3D sensor to enable many different new features and applications.

As an official partner of Google’s Project Tango, pmd developed the 3D sensors that went into the Google Tango phones from Lenovo and Asus, the first of their kind on the market. Even today, two years later, the AR experience of these phones still beats, from a realism standpoint, what an iPhone XS can achieve with ARKit.

Today our sensors are even smaller and more powerful, small enough to fit into a smartphone’s top bezel. They have already been integrated into many products such as AR headsets, phones, cars, robots and smart home cameras, with many more to come soon.

If you are a developer or a company that wants to start building an application or product that uses 3D depth sensing for a better experience and more realism, we have powerful Development Kits for you. They come with our 3D sensor in a compact housing, a USB interface and our Software Development Kit.

Learn more about our 3D Development Kits at:
https://pmdtec.com/picofamily.

pmd 3D Development Kit comparison

To make 3D depth sensing technology much more accessible, pmdtechnologies ag is happy to announce a partnership with VR First, joining the consortium alongside Intel, HTC Vive, LeapMotion, SpringboardVR, Mixcast and Futuremark. VR First has successfully been working on the democratization of VR/AR, with a constantly growing network of over 800 universities, 410 Science Parks, founders and students, and 52 VR/AR First labs at universities all over the world. By joining forces with VR First, we aim to democratize 3D depth camera technology for developers, researchers, enthusiasts and startups worldwide.

If you are an academic institution, researcher or student, please feel free to reach out to VR First for a discount on the device (email info@vrfirst.com).