Getting started with mixed reality in Unity

Setting up a mixed reality project for Quest

Kimberly Siva
4 min read · Jan 11, 2024

This article is part of a series in which we build a mixed reality app from scratch. See the previous part here.

Our world puzzle app is designed for mixed reality. I’m picturing my kids working together on a large globe in the middle of the living room. Or, a smaller globe that sits on my desk and becomes part of my office decor. There’s joy in experiencing virtual content and noticing when your cat enters the room. Mixed reality allows us to still be present in our environment, which is why I’m so excited by it.

Lucky for us, Unity has made it much easier to get started with developing mixed reality apps in recent months. We’re going to use the new Mixed Reality template as our starting point.

Creating the Unity project

I’m using Unity 2022.3.16f1 for this project. We also need a recent version of the Unity Hub. From the Hub, create a new project and choose the Mixed Reality template.

Once the project loads, go to Build Settings and switch the platform to Android.
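As an aside, if you find yourself setting up projects often, the platform switch can also be scripted. Here's a minimal editor sketch (the menu path and class name are my own); it must live in a folder named Editor:

```csharp
using UnityEditor;

// Editor-only helper: switches the active build target to Android,
// equivalent to choosing Android in Build Settings and clicking Switch Platform.
public static class SwitchToAndroid
{
    [MenuItem("Tools/Switch Platform to Android")]
    public static void Switch()
    {
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);
    }
}
```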

That’s it! We now have a Unity project set up to use OpenXR on the Quest. It’s also configured with AR Foundation and Unity’s XR Interaction Toolkit. These are all cross-platform tools, so it should be easy to port this project to other mixed reality headsets in the future.

The template comes with a sample scene showing how to use passthrough, plane detection, hand tracking, interactable objects, and interactive UI. It’s a great starting point for any mixed reality project.
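To give a feel for the APIs involved, here's a minimal sketch of listening for detected planes through AR Foundation's ARPlaneManager (the script name is mine; the manager sits on the XR Origin in the template):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs planes as AR Foundation detects them in the room.
// Attach next to the ARPlaneManager (on the XR Origin in the template).
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}, size: {plane.size}");
    }
}
```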

Setting up your space

We’re going to build and run the sample project in just a minute. But first, we need to run Space Setup on the Quest. This will prepare our environment for mixed reality applications. On the device, go to Settings > Physical Space > Space Setup and follow the guide.

The system will automatically detect your walls and floor. After that, you can add furniture manually. I suggest adding at least one non-floor surface, such as a desk.
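Those Space Setup surfaces are what AR Foundation reports as planes, each tagged with a classification. As a hedged sketch (the class name is mine), here's how you might pick out a table surface later, say for the desk-sized globe:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Searches the tracked planes for one classified as a table,
// e.g. to anchor a desk-sized globe on it.
public class TableFinder : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    public ARPlane FindTable()
    {
        foreach (var plane in planeManager.trackables)
        {
            if (plane.classification == PlaneClassification.Table)
                return plane;
        }
        return null; // no table surface was set up in Space Setup
    }
}
```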

Deploying to Quest

Make sure your Quest is ready for development. Developer mode should be enabled and it should be connected to your computer. I like using the Meta Quest Developer Hub to ensure my device is connected and ready. Here, you can also enable ADB over Wi-Fi for wireless connectivity.

Now we can Build and Run from Unity. This should install the app on your headset and launch it automatically. For mixed reality apps, passthrough and plane detection only work in a build (not over a Link cable), so we’ll be deploying to the device regularly.
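Once deploys become frequent, Build and Run can be automated with an editor script too. A minimal sketch, assuming placeholder scene and output paths:

```csharp
using UnityEditor;

// Editor-only helper: builds the Android player and auto-runs it on the
// connected headset, like File > Build And Run.
public static class QuestDeploy
{
    [MenuItem("Tools/Build and Run on Quest")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/SampleScene.unity" }, // placeholder
            locationPathName = "Builds/WorldPuzzle.apk",          // placeholder
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer
        };
        BuildPipeline.BuildPlayer(options);
    }
}
```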

Play around with the sample app for a bit. Try out hand tracking, toggle passthrough on the wrist-flip menu, and test the affordances of the different interactable models. There’s a lot we can learn and reuse in this sample!
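For a sense of what the passthrough toggle is doing under the hood: with AR Foundation on Quest, passthrough typically shows wherever the camera output is transparent. This is a rough sketch of that idea, not the template's actual script:

```csharp
using UnityEngine;

// Rough sketch: flip between a transparent clear color (the real world
// shows through as passthrough) and a skybox (fully virtual background).
public class PassthroughToggle : MonoBehaviour
{
    [SerializeField] Camera xrCamera;
    bool passthroughOn = true;

    public void Toggle()
    {
        passthroughOn = !passthroughOn;
        if (passthroughOn)
        {
            xrCamera.clearFlags = CameraClearFlags.SolidColor;
            xrCamera.backgroundColor = Color.clear; // alpha 0 reveals passthrough
        }
        else
        {
            xrCamera.clearFlags = CameraClearFlags.Skybox;
        }
    }
}
```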

Tracking performance

One final note for today’s tutorial. After developing many VR apps, I’ve learned to monitor performance from day one. That way, we catch bottlenecks as they’re introduced.

Let’s go ahead and turn on the Metrics HUD. In the Meta Quest Developer Hub, connect your Quest, go to the Device Manager, and toggle it on. You may be prompted to install it and restart your device.

You can launch the OVR Metrics Tool on the Quest to move the HUD somewhere comfortable. Use the Pitch and Yaw settings to reposition it.

Now we can watch the FPS change in real time as we develop our app.
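The HUD covers most needs, but a tiny in-app logger can also be handy, since frame drops then show up in the device logs during playtests. A minimal sketch (the script name is mine):

```csharp
using UnityEngine;

// Logs a smoothed FPS estimate once per second so frame drops
// are visible in the device logs (adb logcat) during playtests.
public class FpsLogger : MonoBehaviour
{
    float smoothedDelta;
    float nextLogTime;

    void Update()
    {
        // Exponential moving average of the frame time.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.05f);

        if (Time.unscaledTime >= nextLogTime && smoothedDelta > 0f)
        {
            Debug.Log($"FPS: {1f / smoothedDelta:F1}");
            nextLogTime = Time.unscaledTime + 1f;
        }
    }
}
```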

That’s it for today! We have a good starting point for developing our app in Unity. We’ve created our project and have a sample app with a wealth of features to explore. We can deploy to Quest, run our app, and monitor its performance. Next time we’ll bring in our custom puzzle pieces and turn them into interactable elements.

Part 4: Making the world interactable

Kimberly Siva

Augmented and virtual reality developer since 2008. Master's in computer science from Georgia Tech, experience across industry from Qualcomm to the CDC.