ARKit Advanced Debugging

Bar Fingerman
6 min read · Jun 10, 2020

--

ARKit brings awesome augmented reality experiences to the mobile world. As developers, we face several challenges that come with this technological advancement, and one of them is debugging. In this post I will try to provide tools that can make debugging AR in production apps more efficient and scalable.

Typical debugging

Usually when debugging our systems we print console messages, log to a remote service, or add breakpoints.

console logs output
remote service logs (Grafana)

This is often enough to debug a system, but when we dive into the AR world we have more verticals to cover, and these methods frequently fall short.

Data Visualisation

A better approach is to create a visualisation of the system’s behaviour; this way we can see through the “black box” using images, videos or any other visual aid.

Take machine learning, for example. It is based on the idea that systems can learn from data, but the way these algorithms work makes them hard to understand and debug, which is why engineers must come up with smart tools to deeply understand what is happening behind the scenes.

TensorSpace, a 3D neural network visualisation framework, is a great example of that.

This tool enables engineers to make deep learning more tangible, so they can understand how the model is trained and how it predicts its results.

AR debugging is no different. We can’t just rely on simple logs; we need to think more like data scientists and create visualisations of our AR experience in order to understand what is going on in our 3D space at any given time.

Today’s AR Debugging

The Apple ecosystem gives developers the ability to debug AR experiences. Here are some examples:

Xcode view hierarchy debugger:

With this you can debug an SCNScene and play around with various parameters like camera position, objects, lighting and more. The tool is similar to the UIKit view hierarchy debugger: you pause the session and can inspect the frame at the specific point where the session stopped. You can find more in Apple’s docs, “Debugging with Xcode”.

ARSCNView debug configuration

ARSCNView has several debug options; here are two of them:

sceneView.debugOptions = [
    .showBoundingBoxes,  // draw each node's bounding box in the scene
    .showFeaturePoints   // render the raw feature points ARKit is tracking
]

These options give a real-time visualisation of the feature points and of each object’s bounding box. Find more in Apple’s official docs.

Xcode Instruments

When AR applications become complex with lots of animations and custom physics, this tool can help with performance debugging.

These tools are great, but they are focused on debugging during development; sometimes we need to profile our apps while our end users are using them. So what if we could take some core functionality from each of the tools above and create our own tool for advanced production debugging? Wouldn’t that be great?

What Advanced Production Debugging Can Look Like

By now it should be clear that in order to debug an AR experience we need more than traditional logs. A good example was provided by Apple at WWDC 2017–2018.

Here you can see a visualisation of an AR session; it includes the user’s tracking path, camera position, feature points and pixel colors.

ARWorldMap from Apple WWDC 2018

Here you can see a 2D visualisation of live world tracking in an AR session: the world origin is in the centre, surrounded by feature points and hit-test visuals.

AR world tracking from Apple WWDC 2017

Apple demonstrated these visuals as part of the ARKit keynote, but make no mistake: these are not Photoshop illustrations, they are real examples from Apple’s internal tools.

It is too bad that we as developers don’t have access to these tools, but don’t worry: we can get this type of visualisation without much effort, as I’m about to show you.

Creating custom visual debugging

First I would like to introduce the .ply file (Polygon File Format). The format is very simple and lets developers store vertices and other attributes in a single file.
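
For reference, a minimal ASCII .ply file looks roughly like this (three example vertices with colors; the header declares which properties each vertex line carries):

ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
property uchar red
property uchar green
property uchar blue
end_header
0.0 0.0 0.0 255 0 0
1.0 0.0 0.0 0 255 0
0.0 1.0 0.0 0 0 255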

Here is how you can generate a .ply file in your app using ARWorldMap.

sample code for generating a .ply file from ARWorldMap
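
A minimal sketch of the idea looks like this (the function name and output path are illustrative, not the exact code from the sample):

import ARKit
import Foundation

// Dump the raw feature points of an ARWorldMap into an ASCII .ply file.
func exportPLY(from worldMap: ARWorldMap) throws -> URL {
    // World-space feature points collected by the session, as [simd_float3].
    let points = worldMap.rawFeaturePoints.points

    // .ply header: a single vertex element with float x/y/z properties.
    var ply = "ply\nformat ascii 1.0\n"
    ply += "element vertex \(points.count)\n"
    ply += "property float x\nproperty float y\nproperty float z\n"
    ply += "end_header\n"

    // One "x y z" line per feature point.
    for p in points {
        ply += "\(p.x) \(p.y) \(p.z)\n"
    }

    let url = FileManager.default.temporaryDirectory.appendingPathComponent("worldMap.ply")
    try ply.write(to: url, atomically: true, encoding: .ascii)
    return url
}

You would typically call this from ARSession’s getCurrentWorldMap(completionHandler:), which hands you the current ARWorldMap once enough of the environment has been mapped, and then share the file via AirDrop, the Files app or your backend.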

I used ARWorldMap as my data source, but collecting feature points from other sources works just the same.
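
For example, the same kind of points can be pulled from every ARFrame through the session delegate (a sketch; the class name is illustrative):

import ARKit

// Accumulate the per-frame feature points that ARKit exposes on each ARFrame.
final class FeaturePointCollector: NSObject, ARSessionDelegate {
    private(set) var collectedPoints: [simd_float3] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // rawFeaturePoints is nil until world tracking has found features in the scene.
        guard let cloud = frame.rawFeaturePoints else { return }
        collectedPoints.append(contentsOf: cloud.points)
    }
}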

After creating a .ply file you can use Blender to render it.

import .ply file

Here is a real example of a file I generated by scanning a small section of the office.

A point cloud generated from ARWorldMap

The bottom plane is the floor and the vertical areas are walls (the rest are white walls, which are difficult for ARKit to detect).

Next, I added world tracking to my file by collecting the point cloud while the device was moving.

world tracking visualisation
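
One way to collect that path looks roughly like this (a sketch; class and property names are illustrative): sample the camera’s world position on every frame, and each sample later becomes one extra vertex in the .ply file.

import ARKit

final class CameraPathCollector: NSObject, ARSessionDelegate {
    private(set) var cameraPath: [simd_float3] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The camera's world position is the translation column of its transform.
        let t = frame.camera.transform.columns.3
        cameraPath.append(simd_float3(t.x, t.y, t.z))
    }
}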

The .ply format supports a color attribute for each vertex, which can be used to create a heat map or to mark events. Let’s focus on a simple use case where we would like to debug user acceleration and high resource usage, and define the following color language: for acceleration, yellow is normal, green is slow and red is fast; for resource usage, the point cloud turns red when memory goes above 100 MB.
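
A sketch of that color language (the thresholds here are illustrative, not exact values) maps the sampled acceleration magnitude and memory usage to an RGB vertex color:

// Map a sample to the r g b values written after "x y z" for each vertex in the .ply body.
func vertexColor(accelerationMagnitude a: Double, memoryMB: Double) -> (r: UInt8, g: UInt8, b: UInt8) {
    if memoryMB > 100 { return (255, 0, 0) }      // red: memory above 100 MB
    switch a {
    case ..<0.1: return (0, 255, 0)               // green: slow movement
    case ..<1.0: return (255, 255, 0)             // yellow: normal movement
    default:     return (255, 0, 0)               // red: fast movement
    }
}

The acceleration itself could come from CMMotionManager’s userAcceleration, and the memory figure from whatever profiling hook you already report to your logs.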

Now the results of my scan look like this:

I am now able to detect anomalies in my session and also see the user’s interaction in space.

Where to go from here?

I hope you found some useful capabilities in this post and that by now you have started to create your own custom visual debugging. I use these methods in my day-to-day work, and they have really changed the way I do production debugging.

To conclude this post, here are a few more things you can do with this new approach:

  • Add a 3D point where events occur in your standard log system; afterwards you can go back to the scan and see the physical location of the log entry in space (a small sketch follows this list).
  • Integrate scan results into your QA process: usually when testing AR applications you take the app into the field and run tests alongside automated UI/unit tests. When the manual tests are complete, you can upload the scans to a remote service and review them by looking at the images and verifying that no colors indicate errors according to the color language you defined. This way you can scale up your testing and get results in a short time.
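
For the first item, a minimal sketch of such a log entry could look like this (the type and its fields are illustrative):

import ARKit

// Attach the camera's current world position to an ordinary log entry so the event can
// later be placed back inside the exported point cloud.
struct SpatialLogEntry: Codable {
    let message: String
    let timestamp: TimeInterval
    let position: [Float]   // x, y, z in world space
}

func logEvent(_ message: String, frame: ARFrame) -> SpatialLogEntry {
    let t = frame.camera.transform.columns.3
    return SpatialLogEntry(message: message,
                           timestamp: frame.timestamp,
                           position: [t.x, t.y, t.z])
}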

There are more things you can do, but I will cover them in the next post.
