Developing A Mixed Reality App for MetaQuest 2 — Pt. 1: Getting Signals

Jad Meouchy · Published in badvr · Dec 29, 2022

(If you’re the type that likes to skip ahead, you can find links to part 2 and 3 at the end of this article!)

Remember when you were a kid and you saw your first magic show? Maybe it was live and you were lucky — or maybe you watched the magic trick happen on TV. Regardless, that sense of wonder at seeing something seemingly magic happen right in front of your eyes is a memory that stays with all of us.

But! What if you could step back into that magical realm? What if stepping into the magic was as simple as putting on a pair of goggles?

Well, thanks to Meta and their amazing device, the Meta Quest 2, you can now enter the magic once again. You can even give yourself special powers, like the ability to "see" the wireless signals all around you, as if you had x-ray vision! How rad would it feel to reach out and move your hands across your Wi-Fi network, literally feeling the radio waves under your fingertips?

Download SeeSignal today — via the MetaQuest App Lab!

Wave your hands through space to "feel" the Wi-Fi signals

Good news! This IS possible! Our app, SeeSignal, uses mixed reality and holographic projections to paint your signal data directly onto the real world, empowering you with the awe-inspiring magic powers of an ancient, elemental god! The god (or goddess) of signals.

Documented in this 3-part blog series is our grand vision — and a magical user manual — showing you how to develop and create your very own signal superpowers!

Summary

The ideal audience for this series is enthusiasts as well as aspiring developers. We chronicle the technical journey of building and launching a non-gaming VR app into the App Lab. Be aware there is jargon spread throughout the post.

But don’t let that stop you, if you’re a non-technical reader. There’s enough here to get the gist of things without needing to be a high-end developer whiz. Plus, we have a bunch of rad gifs and images! So keep reading. Regardless, you’ll find some value (and inspiration!) in our journey. We promise!

Vision

This originally started as a concept application in the Magic Leap AR headset store, designed for telecommunication companies deploying next-generation 5G networks. Check out this explainer video to get a visual understanding of that specific use case. But for Meta Quest, we are bringing this industrial technology out of the lab and into the home.

Imagine using enterprise-grade signals tech to map out the Wi-Fi in your den or living room, seeing exactly where the signal is strongest and weakest. Figure out the best spot in your home to play Rec Room or Virtual Desktop, or to use Link for extreme VR gaming.

As social VR grows bigger every day, a productivity tool like SeeSignal helps improve existing experiences and ultimately enables new ones. If nothing else, let it inspire you with the endless possibilities of the magical medium of mixed reality!

Let the Signals Flow

The first step in seeing Wi-Fi was actually gathering the raw data, which required that the user be connected to the wireless network they wanted to visualize. Luckily, the Quest is a lot like an Android smartphone, and almost all devices are already connected to Wi-Fi before SeeSignal is even launched. After all, you need to be connected to tap into the hundreds (if not thousands) of other gaming and productivity experiences inside the App Lab and Quest Store!

Once the application is launched, it's constantly monitoring the active wireless connection, listening in on the subtle and not-so-subtle messages flowing across the network. These data packets get queued into batches and handed off to a machine learning algorithm for the rapid construction of a complex three-dimensional signal model.
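The post doesn't spell out exactly how the readings are captured, but since the Quest is Android-based, one plausible approach is to poll the connected network's RSSI through the standard Android WifiManager, reached from Unity via AndroidJavaObject, and queue the readings into batches. The sketch below is a hedged illustration of that idea only, not SeeSignal's actual pipeline; the batch size and hand-off step are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative only: samples the RSSI of the currently connected Wi-Fi network
// on an Android-based headset and batches readings for later modeling.
// Requires the ACCESS_WIFI_STATE permission in the Android manifest.
public class WifiSignalSampler : MonoBehaviour
{
    private readonly List<int> _rssiBatch = new List<int>();
    private const int BatchSize = 32;   // hypothetical batch size

    private void Start()
    {
        InvokeRepeating(nameof(SampleRssi), 1f, 0.5f); // poll twice per second
    }

    private void SampleRssi()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var wifiManager = activity.Call<AndroidJavaObject>("getSystemService", "wifi"))
        using (var info = wifiManager.Call<AndroidJavaObject>("getConnectionInfo"))
        {
            int rssiDbm = info.Call<int>("getRssi"); // e.g. -55 dBm
            _rssiBatch.Add(rssiDbm);
        }
#endif
        if (_rssiBatch.Count >= BatchSize)
        {
            HandOffBatch(_rssiBatch.ToArray());
            _rssiBatch.Clear();
        }
    }

    private void HandOffBatch(int[] samples)
    {
        // Placeholder: this is where a batch would feed the 3D signal model.
        Debug.Log($"Collected {samples.Length} RSSI samples");
    }
}
```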

The first major technical challenge we encountered was enabling a full, house-scale Wi-Fi analysis to happen so quickly that you could walk around and see it update live, right before your eyes. While the Quest is battery-powered and effectively a mobile platform, its integrated Qualcomm XR2 SoC has a surprising amount of computing power available. Still, we needed to be frugal so we could dedicate every spare cycle to modeling radio signals. Great performance would be critical to a smooth user experience.

Unity in "Deep Profile" mode, which amplifies performance issues for closer analysis

Our solution was to think tactically about what aspects of rendering were absolutely necessary. Since this would be a mixed reality application, a lot of the visual field would be filled with the video passthrough of the user’s environment. They’d see their walls, ceiling, couch, tables, lights, etc. Realistic shadows that typically add immersion to a VR experience might actually break immersion in mixed reality if they didn’t perfectly match the ceiling lights in the real world. Bye bye, shadows and lighting!

Disabling shadows for a mixed reality passthrough application yields no noticeable decrease in user experience
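As a rough, generic sketch of what cutting rendering to the bone can look like in Unity code, the snippet below turns off real-time shadows and per-pixel lights at startup. These are stock QualitySettings calls, not SeeSignal's exact configuration; in practice much of this is set in the Quality Settings panel rather than in script.

```csharp
using UnityEngine;

// Generic sketch: strip out rendering work that a passthrough mixed reality
// scene doesn't need, freeing GPU/CPU cycles for the signal model.
public class FrugalRenderingSetup : MonoBehaviour
{
    private void Awake()
    {
        QualitySettings.shadows = ShadowQuality.Disable; // no real-time shadows
        QualitySettings.shadowDistance = 0f;             // and no shadow rendering distance
        QualitySettings.pixelLightCount = 0;             // rely on emissive / "fake" lit shaders
    }
}
```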

Mixed Reality Passthrough

The next challenge was enabling the video passthrough mode and tuning it to perfection. Inside Unity, using the Meta Quest SDK, there were multiple technical steps to both enable and position the video layer. At the code level, it’s a well-named component called OVRPassthroughLayer with a few options to toggle.

Then, there are a few extra settings to enable in the Player build settings, and a decision to make about whether the video would be layered above or below the holograms. Since we wanted the signals to appear on top of the real world, we configured the passthrough as an underlay. This meant anything we added would be rendered last, on top of the video feed.

Since the Quest's display panels are quite bright, the passthrough layer can be intense. This is why the Opacity value was adjusted ever so slightly, to 0.9, effectively blending in a 10% solid black background. Typically, a full-screen transparency layer would cause overdraw on a tiled GPU and a noticeable negative impact on performance. However, there was no such effect in this case; the OVR Passthrough Layer is very performant, and you should feel comfortable using it in your own application.
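Putting those pieces together in code looks roughly like the sketch below, which configures an OVRPassthroughLayer as an underlay and dials its opacity down to 0.9. The field names come from the Oculus Integration package at the time of writing and may differ in newer Meta XR SDK versions, so treat this as a hedged example rather than a drop-in recipe.

```csharp
using UnityEngine;

// Hedged sketch: put the passthrough video feed underneath the holograms
// and tone it down slightly so virtual content reads clearly.
[RequireComponent(typeof(OVRPassthroughLayer))]
public class PassthroughSetup : MonoBehaviour
{
    private void Start()
    {
        var passthrough = GetComponent<OVRPassthroughLayer>();

        // Render the video feed behind everything else (the "underlay");
        // holograms drawn by the scene camera then appear on top of it.
        passthrough.overlayType = OVROverlay.OverlayType.Underlay;

        // Blend in roughly 10% black so the bright panels don't wash out the sticks.
        passthrough.textureOpacity = 0.9f;
    }
}
```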

At the time of development, we were constantly rebuilding and re-deploying the application to the headset because the passthrough layer was still experimental. Fortunately, Meta answered our prayers and built a full integration into the Oculus Link system such that we could fully test at our desks, inside Unity, without having to go build and deploy each time.

We highly recommend going through the steps linked here to set up both Unity and the headset for this capability. If you're also interested in building passthrough applications, these resources will make your life so much easier.

Testing an intermediate build with and without passthrough

A Field of Data Sticks

Signals are represented by a field of floating holographic sticks. Or are they candles? Where the signal is strong, the sticks are colored green. Where the signal is weak, the sticks are red. Yellow and orange are the in-between. Gray, colorless sticks show areas that have yet to be explored or predicted.
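To make the color scheme concrete, here's a tiny illustrative mapping from a normalized signal strength to a stick color, with gray reserved for cells that haven't been sampled or predicted yet. The thresholds are invented for the example; the post doesn't publish SeeSignal's actual values.

```csharp
using UnityEngine;

// Illustrative color ramp: strong = green, mid = yellow/orange, weak = red,
// unknown = gray. Thresholds here are hypothetical.
public static class SignalColors
{
    public static Color ForStrength(float? normalizedStrength)
    {
        if (!normalizedStrength.HasValue)
            return Color.gray;                         // not yet explored or predicted

        float s = Mathf.Clamp01(normalizedStrength.Value);
        if (s > 0.75f) return Color.green;
        if (s > 0.50f) return Color.yellow;
        if (s > 0.25f) return new Color(1f, 0.5f, 0f); // orange
        return Color.red;
    }
}
```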

As the user walks around the perimeter of their area, the application uses that finely tuned (and patented) AI/ML system mentioned above to build a visual representation of the entire room's signal strength, almost like a 3D grid made up of familiar Wi-Fi or cellular signal-bar icons.
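SeeSignal's predictor is a patented AI/ML system and isn't described in this post, so as a deliberately simple stand-in, the sketch below fills a grid point by inverse-distance weighting of the samples collected while walking around. It only illustrates the "dense grid from sparse samples" concept, not the real model.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Stand-in for the real predictor: estimate signal strength at a grid point
// by inverse-distance weighting of nearby measured samples.
public struct SignalSample
{
    public Vector3 Position;   // where the headset was when the reading was taken
    public float Strength;     // normalized 0..1
}

public static class SignalGrid
{
    public static float? Estimate(Vector3 gridPoint, IReadOnlyList<SignalSample> samples)
    {
        if (samples.Count == 0) return null;   // nothing measured yet -> gray stick

        float weightedSum = 0f, weightTotal = 0f;
        foreach (var sample in samples)
        {
            float distance = Vector3.Distance(gridPoint, sample.Position);
            if (distance < 0.01f) return sample.Strength;  // right on top of a sample
            float weight = 1f / (distance * distance);     // closer samples count more
            weightedSum += weight * sample.Strength;
            weightTotal += weight;
        }
        return weightedSum / weightTotal;
    }
}
```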

This stick-field visual works in augmented reality because most people don't typically have colorful candles hanging in mid-air in their room… There are just enough of them to fill the room, but not so many that they add clutter. For rooms that are larger or smaller, users can fine-tune the density of sticks through the application's settings interface, which will be covered in the next post.

Explore the whole room to find all signals

Use Your Hands

Another major technical challenge was user input. The Quest supports both handheld controllers and full hand tracking, but very few applications support both modes simultaneously. To maximize the potential impact of this application and really show what’s possible, the decision was made to implement both controllers and hands.

Special settings had to be enabled inside the application to tell the Oculus subsystem to listen for both, and also to detect when the user switches between them. On the app side, the different input methods were abstracted away into a standardized interaction model. A finger pinch and a controller trigger squeeze would cause the same activation to occur, staying as close as possible to the well-designed Oculus home screen experience.
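A hedged sketch of that kind of abstraction: a single "select" query that is satisfied either by a controller index-trigger squeeze (via OVRInput) or an index-finger pinch (via OVRHand). The hand references and wiring are assumptions for the example; SeeSignal's real interaction layer is more involved.

```csharp
using UnityEngine;

// Hedged sketch: unify "select" across controllers and tracked hands so the
// rest of the app doesn't care which input mode is active.
public class SelectInput : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;   // assumed references to the
    [SerializeField] private OVRHand rightHand;  // OVRHand components on the rig

    public bool IsSelecting()
    {
        // Controller path: either index trigger squeezed.
        if (OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger) ||
            OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger))
            return true;

        // Hand-tracking path: an index-finger pinch on either hand.
        if (leftHand != null && leftHand.IsTracked &&
            leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index))
            return true;
        if (rightHand != null && rightHand.IsTracked &&
            rightHand.GetFingerIsPinching(OVRHand.HandFinger.Index))
            return true;

        return false;
    }
}
```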

The one deviation was in keeping the hands as regular hands rather than turning them into laser pointers. In SeeSignal, you need to actually move around to reach out and tap on holograms. You can't just point from across the room; you have to walk to the button and push it! This kind of interaction model suited the application experience by encouraging people to get up and explore.

Buttons physically respond to your direct touch

If the whole user interface could be controlled from afar, one might opt to remain seated throughout, and then the application would struggle to measure the signal across the room. With mixed reality, the room is always visible and almost becomes a character in the experience. Stand up, walk around, don't be shy!

Tap on the floating stick to inspect signal details

Technical Tips and Tricks

  • Rapidly test passthrough mode — Speed up development by enabling passthrough inside Oculus Link, which allows you to preview the experience (and take screenshots) from inside Unity.
  • Render only what's needed — To maximize performance, cut down to one or no lights, use emissive textures or "fake" lighting shaders, and disable shadows completely.
  • Target 90+ FPS — The Oculus cameras operate at a high frame rate to minimize any chance of discomfort. When you start compositing or overlaying holograms on top, any disparity between the frame rate of the background and the foreground can induce motion sickness. Consider bumping up your FPS target to keep the mixed reality experience comfortable (see the sketch after this list).
  • Seamlessly switch between controllers and hands — The Oculus SDK now provides prefabs/templates for automatic controller switching and hand activation, but for optimizing the user experience of an app or game that leverages mixed reality, consider a more physical interface rather than the default "laser beam" fingertips.
  • Enable "Stage" Origin Mode — The default "Eye Level" and "Floor Level" settings for Tracking Origin Type are straightforward interpretations of what the 0,0,0 origin point represents: either the midpoint of the user's eyes or the center of their Guardian-assigned floor. However, mixed reality applications need a stronger anchor in the real world, and the "Stage" option allows the virtual room to always start in the same real-world position. This disables recentering, which here is intentional and desirable, because it prevents the virtual data from drifting away from its real-world anchor (also shown in the sketch after this list).
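As a hedged illustration of the last two tips, the snippet below requests a 90 Hz display refresh rate and switches the tracking origin to Stage. The property and enum names come from the Oculus Integration's OVRPlugin/OVRManager at the time of writing, and both settings can also be configured on the OVRCameraRig in the inspector instead of in code.

```csharp
using UnityEngine;

// Hedged sketch: bump the display refresh rate and use the Stage tracking
// origin so holograms stay pinned to the real room.
public class MixedRealityRigSetup : MonoBehaviour
{
    private void Start()
    {
        // Prefer 90 Hz if the headset supports it (Quest 2 offers 72/80/90/120 Hz).
        OVRPlugin.systemDisplayFrequency = 90.0f;

        // Stage origin: 0,0,0 stays fixed to the Guardian-defined space and
        // ignores user recentering, so virtual data doesn't drift.
        if (OVRManager.instance != null)
            OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.Stage;
    }
}
```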

What’s Next

Stay tuned for the next post, where the user interface design and development process will be covered, including a variety of new engineering challenges and best practice solutions. The app really starts to take shape during this step!

Download SeeSignal today — via the MetaQuest App Lab!

Feedback

  • Where would you find this application most useful, at home or in the office?
  • Is the red/yellow/green color palette easy to understand, or would you like other options (like for color blindness)?
  • How are you using mixed reality, and what application is your favorite?
  • How does it feel to have a magical superpower?

Please share your thoughts and honest feedback (positive and negative) in the comments below. It ain’t always easy being a magical superbeing, we get it. We’d like to improve however we can and welcome that open dialogue!

What’s Next? Parts 2 and 3!

Interested in continuing to follow our development journey? Follow the links below to read on:

Part 2 — https://medium.com/badvr/mixed-reality-for-meta-quest-user-interface-3b9a084198c3

Part 3 — https://medium.com/badvr/mixed-reality-for-meta-quest-finishing-touches-ffdc54590311
