Using Data to Improve Mixed Reality Apps

Erik Iverson
Published in Vrtigo Blog
Aug 18, 2017

We released the first version of our Vrtigo VR Analytics SDK about a year ago. Since then, we have collected data on millions of VR sessions, giving content creators important insights into how users experience 360 videos and VR environments. While the field of VR continues to push forward in interesting directions, consumers have recently gained access to augmented reality (AR) and mixed reality (MR) applications.

AR and MR Platforms

The primary driver of AR in the consumer space at this point is Apple. With the announcement of ARKit at WWDC 2017, Apple simultaneously turned millions of phones and tablets into AR-capable devices, and delivered developers a user-friendly platform to implement novel applications using familiar development tools and languages.

Microsoft has also released development kits for their new Windows Mixed Reality (WMR) platform, which allows developers to target devices ranging from traditional VR headsets to the HoloLens. We are currently adding support for WMR to our 360 video product and have been exploring ways to build analytics tools for its mixed reality features.

AR and MR Analytics

What sorts of analytics are available in AR and MR environments? These applications typically involve rendering objects and models “on top of” some representation of the “real world”. That representation may reach the user through a camera lens (as with ARKit) or directly through the eye (as with the HoloLens). In either case, we can collect data about how users in the space interact with the virtual objects they see.

This setup prompts several analytics questions:

  • Are users drawn towards certain objects in a scene?
  • How long do users spend looking at certain objects in a scene?
  • What parts of objects are users most interested in?
  • What role do color and other aesthetic elements play in users’
    interest in certain objects?
  • How do users interact with objects in a scene?
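The second question, dwell time, can be answered directly from logged gaze data. Here is a minimal sketch (not the actual Vrtigo SDK) that assumes gaze samples arrive as hypothetical time-ordered `(timestamp, object_id)` pairs, where `object_id` is the object the gaze ray currently hits (or `None` for empty space):

```python
from collections import defaultdict

def dwell_times(samples):
    """Aggregate per-object gaze dwell time (in seconds).

    `samples` is a time-ordered list of (timestamp_seconds, object_id)
    pairs; each sample's duration is the gap to the next sample.
    """
    totals = defaultdict(float)
    for (t0, obj), (t1, _) in zip(samples, samples[1:]):
        if obj is not None:
            totals[obj] += t1 - t0
    return dict(totals)

# Example: gaze sampled at 0.5 s intervals over two objects
samples = [(0.0, "cube"), (0.5, "cube"), (1.0, "sphere"),
           (1.5, None), (2.0, "cube"), (2.5, "cube")]
print(dwell_times(samples))  # → {'cube': 1.5, 'sphere': 0.5}
```

The same aggregation, grouped by session or by user, also feeds the first question: objects with consistently high dwell time are the ones users are drawn to.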

ARKit Demo Application

We created a demo ARKit application, following these instructions, that collects data about users’ experiences in AR and summarizes it on an analytics dashboard.

Vrtigo ARKit demo application, collecting data from the tablet and summarizing it on an analytics dashboard

Our ARKit application simply shows a cube in space that the user can move around. The analytics dashboard shows a real-time view of where on the object the user is looking, and tracks historical viewing data that can be used for generating heatmaps. You can see the demo in use in the gif above. Notice that as the tablet moves toward and away from the cube in space, the analytics dashboard updates to show the same view as the user. In addition, the dashboard shows a real-time heatmap of where the user is looking at the cube.
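The core of such a heatmap is mapping the device's gaze ray to a point on the object and binning it into a grid. Below is a simplified sketch of that idea (again, illustrative rather than our production code): it intersects a gaze ray with just the front face of an axis-aligned cube and accumulates hits into an n-by-n grid. The function names and the single-face assumption are ours; a real system would test all faces using the camera transform reported by the AR framework.

```python
def gaze_hit_front_face(cam, direction, center, half):
    """Intersect a gaze ray with the front (z = center_z - half) face of
    an axis-aligned cube; return (u, v) in [0, 1] on a hit, else None."""
    plane_z = center[2] - half
    if abs(direction[2]) < 1e-9:
        return None          # ray parallel to the face
    t = (plane_z - cam[2]) / direction[2]
    if t <= 0:
        return None          # face is behind the camera
    hx = cam[0] + t * direction[0]
    hy = cam[1] + t * direction[1]
    u = (hx - (center[0] - half)) / (2 * half)
    v = (hy - (center[1] - half)) / (2 * half)
    return (u, v) if 0 <= u <= 1 and 0 <= v <= 1 else None

def bin_hit(heatmap, uv):
    """Accumulate a (u, v) hit into an n-by-n heatmap grid."""
    n = len(heatmap)
    row = min(int(uv[1] * n), n - 1)
    col = min(int(uv[0] * n), n - 1)
    heatmap[row][col] += 1

heatmap = [[0] * 8 for _ in range(8)]
uv = gaze_hit_front_face(cam=(0, 0, -2), direction=(0, 0, 1),
                         center=(0, 0, 0), half=0.5)
if uv is not None:
    bin_hit(heatmap, uv)     # a centered gaze lands in the middle cell
```

Summing such grids across sessions is what turns per-frame gaze samples into the historical heatmaps shown on the dashboard.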

We are also able to compute the user’s distance from objects in the scene, allowing us to quantify which objects a user is drawn to. This type of data can be very powerful in AR and MR apps, for example for quantifying interest in objects displayed in virtual showrooms.
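This distance computation is straightforward once the device's position is known each frame. A minimal sketch, assuming a hypothetical map of tracked object positions and a camera position taken from the AR frame's camera transform:

```python
import math

def distances_to_objects(cam_pos, objects):
    """Distance from the camera (device) to each tracked object.

    `objects` maps object_id -> (x, y, z) world position; `cam_pos` is
    the camera's world position in the same coordinate space.
    """
    return {name: math.dist(cam_pos, pos) for name, pos in objects.items()}

# Example: a showroom with two objects; the user stands at the origin
objects = {"cube": (0.0, 0.0, 1.0), "sphere": (2.0, 0.0, 1.0)}
d = distances_to_objects((0.0, 0.0, 0.0), objects)
closest = min(d, key=d.get)  # → "cube", the object the user moved nearest to
```

Logging the closest object (or the full distance map) per frame gives a simple proxy for physical interest that complements the gaze-based metrics above.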

Next Steps

We are just beginning to explore what metrics and summaries are useful to creators of AR and MR applications. We are excited to continue development of AR analytics solutions as users begin to consume these types of experiences more frequently in the coming months.

Vrtigo is the next-generation VR analytics platform for content creators, editors, producers, and marketers.
