Enhancing locations with hyper-accurate AR and 5G.

Luke Ritchie
7 min read · Oct 17, 2019


Rendered in real time from an S10 5G phone (on a drone!)

This post is about the experiences and technologies Nexus Studios designed to make the AT&T Stadium the first venue in the world to fully harness the AR-Cloud, with hyper-accurate AR powered by 5G.

Almost two years in the making, we now have a rapid process for making locations AR-ready (the AT&T Stadium is our first public location). We’re launching a new service so clients can benefit from all the tools and techniques we’ve built to bring these experiences and utilities to other venues.

This is Gilda

Technically speaking, this is the concept of the MirrorWorld, or Metaverse:

a persistent 3D digital copy of the real world; a machine-readable, 1:1 scale model of the world that is continually updated in real time.

The AT&T Stadium, while currently existing as a “private” AR-Cloud, is now fully mapped and relocalizable. The tools we’ve written and the technology we’ve integrated could now allow anyone to design “realities” on top of the venue.

Our project, the Samsung 5G Fan Experience, was always as much a proof of concept as it is an experience for Dallas Cowboys fans.

Using the Samsung S10 5G phone at the AT&T Stadium, the fan experience is transformed — here are some of the highlights…

Hype Up Chants

On game day the most iconic Dallas Cowboys players stand 100ft tall, integrated into the architecture of the venue, performing their signature moves to get fans excited.

Time Tackle

Our half-time AR game, Time Tackle, was designed to be played from any of the 102,000 seats in AT&T Stadium. Tap to control the Dallas Cowboy, who must weave his way through the giant 80ft-tall defensive bots.

Live Stats

For the first time ever, we’ve brought one of the best features of watching sports at home into the stadium. Real-time AR stats are rendered onto the field, including passing yards, sacks, touchdowns, player stats and many more.

Hall of Heroes

In the Hall of Heroes, fans can now meet their heroes and pose for a photo or video to share with friends. These iconic players are the highest-resolution AR holograms ever created, streamed over 5G to your phone.

Background

Our journey really begins with HotStepper, our lovable dude who can walk you anywhere in the world. You can download the app here; it was released in 2017, shortly after ARKit.

We had two primary ambitions: first, to demonstrate that AR could ‘come with you’ (at the time most AR demos were table-top), and second, that stories could adapt to your location.

HotStepper’s haircut changes outside any barbershop in the world

But he also had to be a pretty good guide in AR. This unearthed a whole new world of pain for us as we learned about the inaccuracies of GPS (it’s obviously still a great technology) and that Apple compass...

I could say a lot about this, but to keep things short: you have to walk (let’s say 3-5m) before GPS has any usable data to work with. Until then, we don’t know where you are or which direction you might be walking; you only get an answer once you’re somewhere down that street.
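To make that concrete, here is a minimal sketch (our own illustration, not HotStepper’s actual code) of why heading has to wait for movement: a bearing only becomes meaningful once the accumulated GPS displacement clears a few metres of noise.

```python
import math

# Illustrative only: GPS can't tell you which way a user faces from a
# standing start; a usable heading only emerges after a few metres of walking.

MIN_DISPLACEMENT_M = 4.0  # roughly the 3-5m of walking mentioned above
EARTH_RADIUS_M = 6_371_000

def local_offset_m(start, end):
    """Approximate east/north offset in metres between two (lat, lon)
    fixes; a flat-earth approximation, fine over a few metres."""
    lat0 = math.radians(start[0])
    east = math.radians(end[1] - start[1]) * math.cos(lat0) * EARTH_RADIUS_M
    north = math.radians(end[0] - start[0]) * EARTH_RADIUS_M
    return east, north

def usable_heading_deg(fixes):
    """Return a compass bearing only once the user has moved far enough
    for the GPS track to mean anything; before that, we simply don't know."""
    east, north = local_offset_m(fixes[0], fixes[-1])
    if math.hypot(east, north) < MIN_DISPLACEMENT_M:
        return None  # still within GPS noise: no usable direction yet
    return math.degrees(math.atan2(east, north)) % 360
```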

Funky GPS readings
still learning the Highway Code

Ultimately HotStepper was partially fixed through “creative hacks”. My favorite was a ‘confidence score’ based on the GPS data: a sort of invisible dog leash that connects HotStepper to the phone. If he’s confident, he walks further ahead; if not, he’s more like a badly behaved dog who needs to be by your side.
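A hedged sketch of that leash idea is below; the thresholds and the mapping from GPS quality to leash length are illustrative guesses, not the production values.

```python
# Sketch of the "invisible dog leash": fold GPS quality into a score,
# then let the score decide how far ahead the character may roam.

def gps_confidence(horizontal_accuracy_m, fix_age_s):
    """Fold GPS quality signals into a 0..1 confidence score."""
    accuracy_term = max(0.0, 1.0 - horizontal_accuracy_m / 20.0)
    freshness_term = max(0.0, 1.0 - fix_age_s / 5.0)
    return accuracy_term * freshness_term

def leash_length_m(confidence, min_m=0.5, max_m=8.0):
    """High confidence lets the character walk further ahead; low
    confidence reels him back to the user's side."""
    return min_m + confidence * (max_m - min_m)
```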

Enter Visual Positioning

We were happy with the reception the funny-looking fella received, but disappointed we couldn’t have exactly what we wanted.

So we started looking at Computer Vision solutions, or, as the approach is known, VPS: a Visual Positioning System.

We started to deep-dive into the ability to use Computer Vision to locate you on a worldwide scale.

The promise was centimeter accuracy from a standing start.

A global VPS would solve HotStepper’s inaccuracy almost overnight (hence the beta release this year of Google Maps AR, minus the wolf). The ability to locate a device with an absolute position in real-world space is really what underpins the future of location-based AR.

However, while HotStepper might be able to stop walking into the road, he still wouldn’t be able to occlude behind a corner. It was during this time that we began to realize that if we could combine a dense visual map of the location with visual positioning, not only could HotStepper occlude around a corner or walk through a door, we could build an entire virtual world on top of ours.
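The core mechanic is simple to sketch. In the hypothetical snippet below (the names are ours, not a real SDK), a VPS query returns the device’s absolute pose in the world frame; composing it with the AR session’s local pose gives a single transform that drops any world-anchored content, and the pre-scanned occlusion mesh, into the live session.

```python
import numpy as np

def session_from_world(T_world_device, T_session_device):
    """4x4 transform mapping world coordinates into AR-session
    coordinates, given the same device pose expressed in both frames."""
    return T_session_device @ np.linalg.inv(T_world_device)

def to_session(T_sw, point_world):
    """Re-express a world-frame point (e.g. a hologram's feet, or an
    occluder vertex) in session coordinates for rendering."""
    p = np.append(point_world, 1.0)  # homogeneous coordinates
    return (T_sw @ p)[:3]
```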

We assumed big tech was well into building its visual-positioning algorithms (more urgently for self-driving cars and robots than for AR), but in early 2018 it didn’t feel like we’d see an API any time soon. So we got to know the start-ups...

The short story is that we got to know the team at Scape Technologies very well and thought they were super talented, so we asked them to collaborate with us and they said ‘yes’ ;)

early results of VPS at the Nexus Studio in London; notice the offset
example of relocalization (currently 2–5 seconds) at the AT&T Stadium

And so began the best part of a year of R&D together. It’s been a close relationship, engineer to engineer, refining algorithms and developing processes tailored to a particular location or building.

Designing for a Location

Right now you can’t just open a hyper-accurate, three-dimensional view of the real world or a particular location. A global public AR Cloud hasn’t been created yet.

Google’s VPS will likely be one of the strongest services, but it’s unlikely to include a super-detailed mesh and render of a location; that will require a more bespoke process.

We want AR characters or objects to occlude behind balconies, doors or rooftops; we want AR to know the physical dimensions of the real world; we want 80ft football players to jump up and touch the jumbotron!

an AR player touching the real 7-story high jumbotron

Also, if our future is filled with giant photo-real holograms, they should be able to interact with real-world architecture, like when Ezekiel Elliott pulls himself up on the real iron beams…

Holograms integrated into the architecture

So you must be able to design to the location; otherwise you’re left with floating contextual information that doesn’t know about, or interact with, the environment in an intelligent way.

So we’ve designed a mapping solution that is fast to capture and process, letting us open up the location inside an editor (like Unity or Unreal) and design and position AR content right on top of the world.
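As a rough illustration of that workflow (the format and names below are invented, not our actual pipeline), content positioned against the scanned stadium mesh in the editor can be saved with a world-frame pose, then resolved at runtime once VPS has relocalized the phone, using a session-from-world transform like the one sketched earlier.

```python
import numpy as np

# Hypothetical authoring record: a hologram placed in the editor,
# expressed in the stadium's world/map frame.
placement = {
    "asset": "giant_player.veg",             # invented asset name
    "position_world": [812.4, 41.0, -96.7],  # metres, stadium map frame
    "yaw_deg": 135.0,
    "scale": 24.0,                           # ~80ft-tall player
}

def resolve_position(placement, T_session_from_world):
    """Re-express the authored world-frame position in the live AR
    session frame so the engine can spawn the asset there."""
    p = np.append(placement["position_world"], 1.0)
    return (T_session_from_world @ p)[:3]
```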

Holograms, 5G and compression.

The emergence of 5G was an exciting addition to our own research in hyper-accurate location-based AR, and Samsung (our client) wanted to demonstrate the power of 5G and AR together.

With 5G, you quickly get to photo-real holograms: perhaps because of their sci-fi ancestry, or just because, from a data perspective, they’re impossible to load on a 4G connection.

We’d had experience with volumetric stages in the past, and had even built our own, but this project presented a very real opportunity to render the best-quality AR hologram you’d ever seen. So we went to talk to our friends at MetaStage in L.A., an official Microsoft Capture stage. Short story: they pushed their capture to the limits and we left with fantastic high-res files ;)

As part of our suite of tools, we’ve developed a proprietary compression algorithm for volumetric footage, named .VEG, that compresses the data down to about 3% of its original size. It is also perceptually lossless, meaning it retains information down to 0.01 of a centimeter (about 0.004 of an inch), which on a full-body shot is very small. And it decompresses very efficiently: it happily carries on decompressing frames while playing back 80k polygons per frame and a 3k video texture, on a single thread on a modern smartphone, in AR, with full artistic re-lighting (and with normals).
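.VEG’s internals aren’t public, so the snippet below is only a generic sketch of one way a volumetric codec can stay “perceptually lossless” at the stated 0.01cm precision: snap vertex positions to a 0.1mm grid and store compact integers instead of 32-bit floats.

```python
import numpy as np

STEP_M = 0.0001  # 0.01 of a centimetre, the precision quoted above

def quantize(vertices_m):
    """float (n, 3) positions in metres -> int32 grid indices."""
    return np.round(np.asarray(vertices_m) / STEP_M).astype(np.int32)

def dequantize(indices):
    """Grid indices back to metres; error is at most half a step."""
    return indices.astype(np.float32) * STEP_M

verts = np.random.rand(80_000, 3).astype(np.float32) * 2.0  # one ~80k-vertex frame
error = np.abs(dequantize(quantize(verts)) - verts).max()
assert error <= STEP_M / 2 + 1e-9  # every vertex within 0.005cm of the original
```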

Oh, and it’s a streaming format, so there’s no download (on 5G). These giant, larger-than-life holograms are mesmerizing to see, and even though it’s still early days for 5G, it’s pretty exciting to watch gigabytes of data stream effortlessly.

Last Bits

So we’ll continue to refine the experiences and technology at the AT&T Stadium while hoping for a speedier rollout and adoption of 5G.

Our tools, processes, and technology are designed for 4G as well, and we’re hoping to bring more venues and exciting locations to life.

We’re also very excited about the new experiences and services that this type of technology will facilitate, so we’re pressing ahead on that too.

As a gift for reading this far, here’s a video shot from a drone with a Samsung S10 5G strapped to it. I wanted to see if it could relocalize from hundreds of feet in the air. Of course, the AR was likely to drift eventually, but would it work, even just for a second? I can’t imagine anyone’s ever tried this.

So ignore the camera shake (I need a better drone-phone): it worked, it drifted, I don’t care. This is really, really cool.

thanks for reading
