SLAM and XR Tracking

Danielle Flinn
desn325-EmergentDesign
Apr 30, 2019

Many XR applications have two aims that must be met for the experience to succeed: creating an accurate digital representation of the real environment, and understanding where the user is within that environment. It is not difficult to imagine, then, that if an environment is mapped incorrectly, a location is misjudged, or the two fall out of sync, the entire experience is derailed. Fortunately, SLAM provides a solution to this dilemma.

SLAM stands for “Simultaneous Localization And Mapping”. When an XR application implements SLAM, it uses sensor data and location tracking together to build a model of the environment it is in. It combines the data currently coming in with what it has already “seen” to continuously refine both a digital map of the environment and its estimate of its own location within that map.
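As a toy illustration of that loop (hypothetical names, not any real SLAM library), here is a one-dimensional sketch in which each new measurement simultaneously re-estimates the device’s position and refines the map:

```python
def slam_step(landmarks, measurements):
    """One toy 1-D SLAM update: each measurement is the observed
    distance from the device to a mapped landmark.
    Localization: every landmark "votes" for where the device must be.
    Mapping: each landmark is nudged toward the position the new pose
    implies, so the map and the pose refine each other."""
    # Localize: a landmark at lm seen at distance m puts the device at lm - m.
    pose_votes = [lm - m for lm, m in zip(landmarks, measurements)]
    pose = sum(pose_votes) / len(pose_votes)
    # Map: blend old landmark estimates with what the new pose implies.
    landmarks = [0.9 * lm + 0.1 * (pose + m)
                 for lm, m in zip(landmarks, measurements)]
    return pose, landmarks

pose, refined = slam_step([5.0, 10.0], [3.0, 8.0])
print(pose)  # device localized at 2.0
```

A real system does this in 3D with camera features and probabilistic filters, but the same two-way update is the heart of the idea.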

By scanning and taking in data to build the environment, SLAM identifies “anchor points”: areas that change little or not at all. It does this by continuously comparing incoming data from the device’s camera with data it has previously “seen”. With these anchor points identified, SLAM can then calculate its location in the environment based on its proximity and position relative to them, reducing drift and improving the user’s experience.
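That “compare new data with what it has seen” step can be sketched as nearest-neighbour matching of feature descriptors between frames. The descriptors below are tiny made-up vectors for illustration (real systems use descriptors such as ORB):

```python
import numpy as np

def match_keypoints(desc_prev, desc_curr, max_dist=0.5):
    """Match each current-frame descriptor to its nearest previous-frame
    descriptor. Points that keep finding a close match frame after frame
    are stable, and thus good anchor-point candidates."""
    matches = []
    for i, d in enumerate(desc_curr):
        dists = np.linalg.norm(desc_prev - d, axis=1)
        j = int(np.argmin(dists))
        if dists[j] < max_dist:          # close enough to call it the same point
            matches.append((j, i))
    return matches

# Toy data: three stable points plus one brand-new point in the current frame.
prev = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
curr = np.array([[0.05, 0.0], [1.0, 0.05], [0.0, 0.95], [5.0, 5.0]])
print(match_keypoints(prev, curr))  # [(0, 0), (1, 1), (2, 2)] — 4th point is new
```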

SLAM then turns these 2D key points into 3D landmarks that it can continue to reference to adjust its position.
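One simplified way to picture that 2D-to-3D step: if the same key point is seen from two camera positions a known distance apart, the pixel shift between the views (the disparity) reveals its depth. This is a hypothetical side-by-side-geometry sketch, not the full triangulation a SLAM system performs:

```python
def landmark_depth_from_two_views(u_first, u_second, focal_px, baseline_m):
    """Depth of a 2D key point seen from two views a known baseline
    apart (simplified parallel geometry): z = f * b / d."""
    disparity = u_first - u_second  # pixel shift between the two views
    return focal_px * baseline_m / disparity

# A key point at pixel column 320, reappearing at 310 after the camera
# moved 0.1 m sideways, with a 500 px focal length:
print(landmark_depth_from_two_views(320.0, 310.0, 500.0, 0.1))  # 5.0 m away
```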

SLAM has been one of the best tracking solutions for AR since the industry began. After reading about SLAM, I realize the Amazon View in a Room application I reviewed a few weeks ago even implements it. Think about it: in just a matter of seconds after opening the app, you can place an item into your “room”, and the application will automatically scale it to size, allow you to rotate it, and adjust the view as you move around the item. If that isn’t SLAM, I don’t know what is… The only data the app seems to lack is responsiveness to lighting. Currently, many of the products placed into a room do not adjust to match the room’s lighting. This is where the developers could utilize SLAM’s environment data a bit more to add realistic effects and augment the experience even further.

WITHIN, on the other hand, did not seem to implement SLAM of any kind. As a mobile VR experience application/viewer, it does not give SLAM much to track. Instead, WITHIN simply tracks the orientation of the device: if the gyroscope faces forward, you see in front of you; backward, behind; left, left; and right, right. In some experiences it also tracked the distance of the device from the floor, shifting the environment slightly to adjust to the viewer’s height. Since orientation is all that is being tracked and no environment is being mapped, SLAM is not being used.
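That orientation-only (3-degrees-of-freedom) tracking is much simpler than SLAM. A minimal sketch, assuming a yaw angle read from the device’s gyroscope/IMU (the function and names are illustrative):

```python
import math

def view_direction(yaw_deg):
    """3-DoF tracking: only the device's orientation matters, not its
    position. The yaw angle picks a viewing direction on a unit circle
    (0 deg = forward, 90 = right, 180 = behind, 270 = left)."""
    yaw = math.radians(yaw_deg % 360.0)
    return (math.sin(yaw), math.cos(yaw))  # (right, forward) components

for yaw in (0, 90, 180, 270):
    print(yaw, view_direction(yaw))
```

Because no map is built and no position is estimated, turning in place is all the viewer can respond to — which is exactly why WITHIN does not need SLAM.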

SLAM is a powerful tracking tool for the AR industry, and it will be exciting to see where it goes in future development.
