First steps into an AR Cloud: Relocalization
Back in June of 2017, on the tails of the WWDC 2017 ARKit announcement, when the world’s imagination began to run wild with the sudden availability of smartphone AR features on tens of millions of iOS devices, Ori Inbar of Super Ventures wrote a seminal post about the “AR Cloud”. In his post, Ori explains what is needed to transition from the novelty AR apps we see today toward the AR future of which we’ve been dreaming — a revolutionary new ubiquitous computing interface providing natural interaction with virtual objects and interfaces. @keiichiban’s Augmented City paints a vivid picture:
If you missed Ori’s article, check it out, or watch his supplementary video here:
Since then, many more “AR Cloud” posts have appeared on Medium, largely supporting Ori’s position, with some finer points made about the relationship between AR Cloud and ephemeral shared state solutions like Google Anchors.
As of today, just days after the Augmented World Expo 2018, the most topical and hotly contested AR Cloud technology is relocalization. (Relocalization enables your device to accurately learn its position and orientation in real world space, which in turn enables multi-user and persistent AR experiences.) Until fairly recently, AR positioning has required markers, such as those used by Vuforia:
Markers have been a staple in AR tracking for years, but markerless versions of AR tracking tech are now rapidly emerging. By my count there are now over half a dozen markerless relocalization plays in motion. Here they are, in no particular order:
Google’s Visual Positioning Service (VPS)
Following Apple’s launch of ARKit, Google responded by launching its own functional equivalent: ARCore. While it may seem that Google is playing catch-up here, ARCore is really a stripped-down repackaging of 2014’s Google Tango project. Through the use of “area definition files”, Tango offered relocalization three years ago. Unfortunately, Tango also required an infrared depth camera, which severely limited the number of supported devices.
Once Apple launched ARKit, Google made the decision to drop the depth camera requirement and pursue RGB-only SLAM techniques. In doing so, area definition files were dropped from the newly repackaged ARCore, and Android-based relocalization has been off the table ever since. Almost a year later, Google’s Visual Positioning Service is resurfacing, promising relocalization without an infrared depth camera.
As of the writing of this post, Google VPS is not yet available to the public.
Vertical’s Placenote SDK
Shortly after the launch of ARKit, Vertical published Placenote AR, a free app demonstrating relocalization as a core feature. I spoke to the founders last year, and learned that their relocalization offering was created out of necessity, in response to Google’s announcement that they were sunsetting Tango.
On its own, this free app is not an all-in-one AR Cloud solution, but it offers people the opportunity to experiment with relocalization today. For app developers, Vertical has launched the Placenote SDK so that more specific use cases can be addressed. Unlike many of the other SDKs outlined in this post, the Placenote SDK is openly available now.
6D.AI
To get a sense of the excitement and opportunity surrounding relocalization, look no further than 6D.AI. Less than a year ago, Matt Miesnieks wrote many of the most widely shared and insightful articles about the coming challenges in AR. Since then, he has transitioned his full attention to founding and building 6D.AI, which promises fast, cross-platform relocalization:
Additionally, 6D has recently released video demonstrating real-time mesh construction, which has interesting use cases for AR applications that benefit from topological information about the immediate environment.
As of the writing of this article, the 6D.AI SDK is in beta via invitation only. Applications can be submitted here.
Fantasmo’s Camera Positioning Standard (CPS)
Among the contenders in the relocalization space, Fantasmo stands out in a few ways:
- At first glance, the size of the tracked space appears to be larger than in some of the other demo videos.
- They call out autonomous robot navigation as a use case for a “machine readable” 3D map.
- Their pending Camera Positioning Standard (CPS) proposal promises future interoperability (though the standard itself appears to be unpublished at the time of the writing of this article). Details here are scant, but these are the first traces of shared open infrastructure in the service of creating AR common spaces. More on this later.
Fantasmo’s approach is an ambitious one. To learn more, visit camerapositioning.io.
Selerio
Selerio is unique amongst the SDKs listed here in that its core proposition appears to be recognizing objects in the scene, thus enabling virtual interactions with real-world objects. Nevertheless, relocalization appears to be part of their proposition:
Selerio is also accepting applications to access their SDK.
Scape
In the outdoor category, Scape is offering relocalization in city-sized outdoor spaces. This initially struck me as a larger, more difficult version of the indoor positioning problem (given that the outdoor environment is subject to seasonal changes). However, in conversation with Edward Miller at AWE 2018, he explained that though the tech would work just as well indoors, his team’s background and expertise were strategically positioned to have a greater impact on the city-scale version of this problem. For those of us with concerns about receiving our AR services from ad-motivated, data-stockpiling tech giants, this is a promising alternative to Google VPS.
Unlike the indoor propositions, Scape’s relocalization is global, meaning that once your device is localized, you know your precise position in world coordinates, not just local, relative coordinates. (More on this in an upcoming post.)
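To make the local-versus-global distinction concrete: a globally relocalized device can take a pose measured in its session’s arbitrary local frame and express it in Earth-fixed coordinates. As a rough sketch (the type names, conventions, and the flat-earth approximation here are mine, not Scape’s), converting a local offset in meters into latitude and longitude, given a geo-referenced origin, might look like this:

```swift
import Foundation

/// Hypothetical geo-reference for an AR session: the real-world
/// latitude/longitude (degrees) of the session's local origin, plus the
/// rotation (radians) from the local frame to east/north axes.
struct GeoAnchor {
    let latitude: Double
    let longitude: Double
    let heading: Double
}

/// Convert a local (x, z) offset in meters into global latitude/longitude,
/// using a flat-earth approximation adequate at room-to-city scales.
func globalCoordinate(localX: Double, localZ: Double,
                      anchor: GeoAnchor) -> (lat: Double, lon: Double) {
    // Rotate the local offset into east/north components.
    let east  = localX * cos(anchor.heading) - localZ * sin(anchor.heading)
    let north = localX * sin(anchor.heading) + localZ * cos(anchor.heading)

    // One degree of latitude is ~111,320 m; longitude shrinks with latitude.
    let metersPerDegreeLat = 111_320.0
    let metersPerDegreeLon = metersPerDegreeLat * cos(anchor.latitude * .pi / 180)

    return (anchor.latitude  + north / metersPerDegreeLat,
            anchor.longitude + east  / metersPerDegreeLon)
}
```

A purely local relocalization service stops at the rotated offset; a global one, like Scape claims to provide, supplies the anchor as well.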
Scape is currently accepting applications for their private beta which offers relocalization in London, England.
Sturfee
Like Scape, Sturfee is offering city-scale outdoor positioning. They appear to be targeting developers of location-based outdoor AR games, but I have no doubt that a variety of gaming and non-gaming applications are possible.
I can’t be sure, but I believe Sturfee may also provide global coordinates.
Sturfee is accepting signups for its SDK now.
Vuforia’s Object Tracking
Previously, I mentioned Vuforia as an example of a mature marker-based AR tracking system. Vuforia continues to develop their marker-based tracking, with support for a variety of different marker types beyond 2D images. Most notably, 3D objects are identifiable as scene markers, meaning relocalization may be achievable by referencing unique 3D objects in your location, such as distinctive architecture or furniture elements.
In previous releases, Vuforia has showcased “Extended Tracking”, which, like ARKit and ARCore, uses sensor data to track device pose even when markers have left the camera’s field of view. (It is unclear to me whether this work is compatible with ARKit or ARCore.)
Jidomaps
The persistence contender I confess to knowing the least about is Jidomaps. Jidomaps differentiates itself from other relocalization offerings by clarifying that it is not an AR Cloud service. Instead, the team describes it as a simpler way to achieve multi-user AR experiences, though I imagine their state could additionally be saved to a cloud-based service if they chose to pursue that capability. Having not yet popped the hood, I believe Jidomaps shares state for multi-user AR via local networking, but please do correct me if you know better.
Apple’s ARKit 2.0
Apple’s annual WWDC was earlier this week, and the AR community is still processing the announcement of ARKit 2.0. Much like Jidomaps, Apple is enabling developers to share state between AR-enabled devices, but has not yet created a cloud infrastructure to store and retrieve these 3D feature maps on behalf of developers. Instead, it is up to developers to serialize, save, transmit, and restore the AR feature maps themselves. Nevertheless, this reaffirms Apple’s position as an important leader in the AR space.
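That developer-side responsibility can be sketched using Apple’s iOS 12 ARWorldMap API. The function names below are mine and error handling is abbreviated; the ARKit calls themselves (getCurrentWorldMap, initialWorldMap) are Apple’s:

```swift
import ARKit

// Saving: capture the session's current world map and archive it to disk.
// The same Data blob could instead be transmitted to a peer over the network.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable: \(error?.localizedDescription ?? "unknown")")
            return
        }
        // ARWorldMap conforms to NSSecureCoding, so it serializes to Data.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

// Restoring: unarchive the map and hand it to a fresh tracking configuration.
// ARKit relocalizes against it once the camera sees matching geometry.
func restoreWorldMap(into session: ARSession, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Everything between those two calls — storage, discovery, and transport — is left to the developer, which is exactly the gap a cloud service would fill.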
Now, just a few days later, the first multi-user ARKit 2.0 test applications are appearing:
It is important to call out two caveats to everything that has been said thus far:
First, no attempt has been made to test the veracity of any of the claims made by these companies; nor has any testing been done to measure the relative performance of the techniques employed under the myriad of possible environments and circumstances. There is a huge amount of work to be done by the community at large to sift through and sort out how best to use these technologies. In other, more mature industries, I have seen head-to-head “shoot-out” tests in which third-party labs test for accuracy and speed — this style of testing is a much needed next step for those of us collectively trying to build in Augmented Reality.
Second, and yet more important to my own thesis, relocalization is just the most immediate of many obstacles between the novelty pop-up-book AR and the AR future of our dreams (or nightmares). My hope is that the discussion around “AR Cloud” will come to include some of the other necessary services toward internet-scale AR. Challenges surrounding privacy, security, application fragmentation, open standards development, content management, identity management, community management — each in their own right difficult in a world without augmented reality, yet exacerbated in an augmented world.
Hopefully this post will help set the stage for that discussion. Stay tuned for follow-up posts, and please feel free to reach out with comments or questions.