Creating a Mixed Reality App for Meta Quest 3 Using Unity: Part 2

Satya Dev
Published in Antaeus AR
3 min read · Jan 1, 2024

Implementing Plane Detection

Source: AI-generated

Welcome back to our tutorial series on creating mixed reality experiences for the Meta Quest 3! In our previous session, we set up a Unity project, incorporated the XR Interaction Toolkit and Unity OpenXR Meta package, established an XR rig, and introduced a virtual character (Animal) into our mixed reality scene. If you haven’t completed these steps, refer to the first tutorial here.

This tutorial focuses on a fascinating feature called plane detection. The Meta Quest 3 can scan your environment, creating a 3D mesh that maps the room’s contours, including floors, ceilings, walls, and furniture. The headset stores this data as a set of 3D planes in a room model, which plane detection relies on. Before diving in, ensure a room scan is saved in your Quest headset, covering furniture, doors, and windows.

Step 1: Physical Space Setup

Ensure your Quest 3 has a complete scan of your room saved, including furniture and other objects like doors and windows. This scan will serve as the basis for plane detection.

Physical space setup in Quest 3

Step 2: Importing Starter Assets

Head to the Unity Package Manager and import the ‘Starter Assets’ and ‘AR Starter Assets’ samples from the XR Interaction Toolkit. The latter includes the AR Feathered Plane prefab, essential for visualizing detected planes.

Import ‘Starter Assets’ and ‘AR Starter Assets’ from package manager

Step 3: AR Foundation and Plane Detection

Add the AR Plane Manager component to your XR Origin in Unity. This component pulls planes from the Quest 3’s spatial data and creates GameObjects in your scene to represent them. Drag the AR Feathered Plane prefab into the Plane Prefab slot of the AR Plane Manager.

Add component -> AR Plane Manager
Search and drag ‘AR Feathered Plane’ to Plane prefab
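Once the AR Plane Manager is in place, you can also react to planes from script as they appear. Here is a minimal sketch, assuming AR Foundation 5.x (where ARPlaneManager exposes a planesChanged event); the PlaneLogger class name is my own:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper: attach next to the AR Plane Manager on the XR Origin.
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
    }

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Each added plane is a GameObject instantiated from the Plane Prefab slot.
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}, alignment {plane.alignment}");
    }
}
```

This is handy for verifying on-device that the Quest 3’s room model is actually being surfaced to AR Foundation before you build anything on top of it.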

Step 4: Touch Controller Input Setup

In your Unity scene:

  1. Expand the XR Origin and find the left- and right-hand controllers.
  2. Add XR Controller components to these controllers and select the matching presets for input actions.
  3. Add the ‘XRI Default Input Actions’ asset to the Input Action Manager.

Add ‘XRI Default Left Controller’ preset to the left controller
Add ‘XRI Default Right Controller’ preset to the right controller
Add ‘XRI Default Input Actions’ to the Input Action Manager
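With the presets wired up, reading controller input from your own scripts goes through the Input System. A small sketch, assuming Unity’s Input System package and an action reference you assign in the Inspector (the TriggerListener class name and the specific action binding are illustrative, not from the tutorial):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical example: reads a button action from the XRI Default Input Actions.
// Assign an action (e.g. an Activate/trigger action) in the Inspector.
public class TriggerListener : MonoBehaviour
{
    [SerializeField] InputActionReference triggerAction;

    void OnEnable()  => triggerAction.action.Enable();
    void OnDisable() => triggerAction.action.Disable();

    void Update()
    {
        if (triggerAction.action.WasPressedThisFrame())
            Debug.Log("Trigger pressed");
    }
}
```

Using an InputActionReference keeps the script decoupled from any particular controller: the same component works for either hand depending on which action you drop into the slot.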

Step 5: Building and Deploying the App

Once you have made these changes, build and deploy the app to your Quest headset. You should now see colored, feathered planes overlaid on the walls, floor, ceiling, and furniture of your room.

Connect Quest 3 to your laptop and press Build and Run
Dotted plane detection on walls, floor, ceiling and furniture.
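If you want the plane colors to reflect what each surface actually is, you can tint planes by the classification stored in the room scan. A sketch, assuming AR Foundation 5.x (where ARPlane exposes a single classification property; in AR Foundation 6 this became a classifications flags field), with a color scheme of my own choosing:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical tweak: attach to the plane prefab to tint each plane by its
// room-scan classification (walls, floor, ceiling, furniture, etc.).
[RequireComponent(typeof(ARPlane), typeof(MeshRenderer))]
public class PlaneColorizer : MonoBehaviour
{
    void Start()
    {
        var plane = GetComponent<ARPlane>();
        var meshRenderer = GetComponent<MeshRenderer>();

        // The classification comes from the saved Quest 3 room scan.
        meshRenderer.material.color = plane.classification switch
        {
            PlaneClassification.Wall    => Color.blue,
            PlaneClassification.Floor   => Color.green,
            PlaneClassification.Ceiling => Color.yellow,
            PlaneClassification.Table   => Color.red,
            _                           => Color.gray,
        };
    }
}
```

Because the component lives on the plane prefab, every plane the AR Plane Manager spawns picks up its color automatically, with no extra wiring in the scene.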

Additional Tips

  • Ensure your Quest headset has an updated room scan for accurate plane detection.
  • Customize the plane prefab for better visualization suited to your app’s style.
  • Continuously test the app on your headset for the best user experience.

Conclusion

This tutorial has guided you through enhancing your mixed reality app with plane detection, improving visualization, and setting up controller input for interaction. Stay tuned for the next part, where we’ll explore further uses of plane data in mixed reality experiences. Keep experimenting and enhancing your MR skills!
