Build your first AR App

Bhawna Bhandari
Walmart Global Tech Blog
7 min read · May 23, 2019

At Walmart Labs, we research and implement new technologies as they come to market. This article focuses on building a sample AR app and shows how it can be used to solve practical industry problems.

Apple announced ARKit back in 2017: an augmented reality framework that allows developers to create AR experiences for iOS devices. ARKit provides advanced features such as real-time world tracking, hit testing, plane detection, and the ability to apply realistic lighting to virtual objects.

But where do you get started? Come with me on an Augmented Reality journey to build your first AR App.

Augmented Reality

Augmented reality (AR) is the ability to place virtual elements into the real-world and interact with these elements as if they were actually present.

“Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device’s camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences using either the back camera or front camera of an iOS device.”

About ARKit

At its core, ARKit lets you drop virtual objects into an environment and manipulate them using the touchscreen. Beyond recognizing real-world surfaces, its main job is to keep track of those objects as the device moves. As soon as tracking is lost, the illusion is shattered.

If you want more in-depth details, I highly recommend reading Apple's About ARKit page or watching their WWDC 2017 talk on ARKit. I would also recommend the Understanding ARKit Tracking and Detection talk and the ARKit 2 video from WWDC 2018.

Practical, Relevant Industry Problems

Use Case 1: A customer can see how an item looks and fits in their room before actually buying it.

Use Case 2: Measure a space and search for furniture based on those dimensions.

For this post we will focus on building a demo app that lets the user place a 3D model into their space.

Let's get started

You’re going to create your first iOS augmented reality app!

Prerequisites

To implement ARKit, you’ll need a Mac with Xcode 9 or newer. ARKit also requires an A9 or newer processor to perform all of its computations in real time, so you’ll need a device with at least an A9 chip running iOS 11 or later.

Note: ARKit apps won’t run in the Simulator.

Open up Xcode and choose File -> New -> Project. Choose the Augmented Reality App template:

Choose Template

Hit Next, then pick a name for your project (e.g., “FirstARApp”). You can keep the defaults for the other settings.

Click Next then create the project in a folder of your choice.

If you run the app now, here’s what you’ll see:

Let's remove the auto-generated 3D-scene initialization code from the ViewController’s viewDidLoad() method, which should now contain only the following lines of code:
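If you started from the standard Augmented Reality App template, the trimmed-down method should look roughly like this:

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Set the view's delegate so we receive ARSCNViewDelegate callbacks
    sceneView.delegate = self

    // Show statistics such as fps and timing information (optional)
    sceneView.showsStatistics = true
}
```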

Next, we need to enable plane detection in ARKit. This is done by setting the planeDetection property of the session configuration object to ARWorldTrackingConfiguration.PlaneDetection.horizontal before running the session in viewWillAppear(), as shown in the sketch below.

To visualize how plane detection works, we also add the ARSCNDebugOptions.showFeaturePoints value to the scene view’s debugOptions:
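Here is a minimal sketch of viewWillAppear() covering both of the steps above (the debug option is just for development and can be removed later):

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a world-tracking configuration and enable horizontal plane detection
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal

    // Visualize the feature points ARKit is tracking
    sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints]

    // Run the view's session with this configuration
    sceneView.session.run(configuration)
}
```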

With all the settings in place, the session will detect horizontal (flat) surfaces in the real-world geometry captured by the device’s rear camera.

Running the app will now show feature points, as ARKit starts finding surfaces in real-time:

But how do we know it has detected horizontal planes? Let’s work on visualizing the detected planes.

When horizontal plane detection is enabled, ARKit automatically calls the renderer(_:didAdd:for:) delegate method whenever it detects a new horizontal plane, and adds a new node for it. We receive the anchor of each detected flat surface, which will be of type ARPlaneAnchor.

An ARAnchor is used for tracking the real-world position and orientation of a real or simulated object relative to the camera.

An ARPlaneAnchor represents a planar surface in the world, extending along the X and Z axes, with Y as the plane’s normal.

Let’s implement the renderer(_:didAdd:for:) method. We will process only anchors of type ARPlaneAnchor, since we’re only interested in planes. To visualize the detected planes, we will use an SCNPlane object, a SceneKit type representing a one-sided plane geometry.

Next, we set the plane’s width and height to the anchor extent’s X and Z values. The ARPlaneAnchor extent property provides the size of the detected plane, so the SCNPlane object will have the same size as the detected plane in the world.

An SCNNode is then created with the plane geometry. The node’s position should match the anchor’s position to give us an accurate visual, so we use the anchor’s center coordinates to create a 3D vector. SCNPlane is one-sided and stands perpendicular to the surface by default, so we rotate the planeNode by -90 degrees (−π/2 radians) around its x-axis to make it lie flat.
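Putting the last few paragraphs together, here is a sketch of how the delegate method might look (the semi-transparent white material is just my choice for making the plane visible):

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Only handle plane anchors; ignore everything else
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    // Create an SCNPlane matching the detected plane's extent (x and z)
    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                         height: CGFloat(planeAnchor.extent.z))
    plane.firstMaterial?.diffuse.contents = UIColor.white.withAlphaComponent(0.5)

    let planeNode = SCNNode(geometry: plane)
    // Position the visualization at the anchor's center
    planeNode.position = SCNVector3(planeAnchor.center.x,
                                    planeAnchor.center.y,
                                    planeAnchor.center.z)
    // SCNPlane stands upright by default; rotate it to lie flat on the surface
    planeNode.eulerAngles.x = -.pi / 2

    node.addChildNode(planeNode)
}
```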

Running the app will produce results like this:

ARKit keeps monitoring the environment and updates previously detected anchors. We can receive these updates by implementing the renderer(_:didUpdate:for:) delegate method.
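A sketch of the update handler, assuming the plane visualization added in didAdd is the first child of the anchor’s node:

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let plane = planeNode.geometry as? SCNPlane else { return }

    // Resize and re-center the visualization as ARKit refines its estimate of the plane
    plane.width = CGFloat(planeAnchor.extent.x)
    plane.height = CGFloat(planeAnchor.extent.z)
    planeNode.position = SCNVector3(planeAnchor.center.x,
                                    planeAnchor.center.y,
                                    planeAnchor.center.z)
}
```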

Now we are able to detect larger, continuous surfaces.

So let’s start adding objects into the room.

Download the 3D model from Apple’s gallery: https://developer.apple.com/arkit/gallery/

For this demo I downloaded gramophone.usdz. Now add the downloaded 3D model to art.scnassets.

Here is the scene graph for the model we just added.

If you check the Node Inspector for the node, you’ll see the scale is far too large. To view the model properly, let’s change the scale to 0.01 for x, y, and z. Xcode will ask to convert the document; hit Convert.

The model is converted to a gramophone.scn file. Change the scale values as described above for the demo.

Note: We could also use the usdz file directly, but it is large and not to scale.

Now, in order to interact with the scene, we need a gesture recognizer. Let’s add a UITapGestureRecognizer.
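One way to wire it up, assuming a didTap(_:) handler that we will fill in below:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true

    // Register a tap gesture recognizer on the AR scene view
    let tapGesture = UITapGestureRecognizer(target: self, action: #selector(didTap(_:)))
    sceneView.addGestureRecognizer(tapGesture)
}
```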

What we are interested in is adding objects to a detected plane. To do this, we use the existingPlaneUsingExtent result type when hit-testing. Using the extent makes the hit test respect the plane’s limited (estimated) size instead of treating the plane as infinite.

The hit test gives us back an ARHitTestResult object containing a worldTransform, which we can use to get the position in 3D space. From it we create an SCNVector3, which denotes a position in 3D space.
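A sketch of the tap handler: it hit-tests the tapped screen point against detected planes, reads the position out of the world transform, and hands it to a placement helper (addItem(at:) is a name I’m assuming here; we define it next):

```swift
@objc func didTap(_ recognizer: UITapGestureRecognizer) {
    let tapLocation = recognizer.location(in: sceneView)

    // Hit-test against detected planes, respecting their estimated extent
    let results = sceneView.hitTest(tapLocation, types: .existingPlaneUsingExtent)
    guard let result = results.first else { return }

    // The last column of worldTransform holds the translation (position) in world space
    let translation = result.worldTransform.columns.3
    let position = SCNVector3(translation.x, translation.y, translation.z)

    // addItem(at:) is a helper we define below (name assumed)
    addItem(at: position)
}
```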

Once we have the position, let’s add the 3D model at that position.
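A minimal placement helper, assuming the converted gramophone.scn lives in art.scnassets and the model’s node is named "gramophone" (check the scene graph if yours differs):

```swift
func addItem(at position: SCNVector3) {
    // Load the converted SceneKit file and pull out the model node (node name assumed)
    guard let scene = SCNScene(named: "art.scnassets/gramophone.scn"),
          let node = scene.rootNode.childNode(withName: "gramophone", recursively: true) else { return }

    // Place the model at the tapped position on the detected plane
    node.position = position
    sceneView.scene.rootNode.addChildNode(node)
}
```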

If you end up using the usdz file directly, load the scene from the usdz instead.
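An alternative sketch of the same helper that loads the usdz directly (SceneKit can read USDZ from iOS 12 onward); note the manual rescale, since the raw model is not to scale:

```swift
func addItem(at position: SCNVector3) {
    // Load the usdz file straight from the bundle (file name assumed)
    guard let url = Bundle.main.url(forResource: "gramophone", withExtension: "usdz"),
          let scene = try? SCNScene(url: url, options: nil) else { return }

    // Wrap the scene's contents in a single node so we can scale and position it
    let node = SCNNode()
    scene.rootNode.childNodes.forEach { node.addChildNode($0) }
    node.scale = SCNVector3(0.01, 0.01, 0.01)  // raw usdz model is not to scale (see note above)
    node.position = position
    sceneView.scene.rootNode.addChildNode(node)
}
```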

Call the placement method from didTap(_:), as in the sketch above, to add the item into the 3D space.

Run the application. Once a plane is detected, tap on it to place the gramophone.

Conclusions

To summarize, in this post we went through the basics of Augmented Reality and Apple’s ARKit. We created a simple application that adds our 3D model to the world.

