How To Instantiate a Prefab in a Mixed Reality Environment with Unity

Taikonauten Magazine
3 min read · Feb 16, 2024

👀 Stumbled here by accident? Start with the introduction!

📚 The objective of this article is to demonstrate how to instantiate a Prefab — in our example, a door — and subsequently anchor it within the MR environment. This process is key to effectively integrating virtual objects, like the door, into the MR space, ensuring they are correctly positioned and maintained within the real-world context.

ℹ️ If you find yourself facing any difficulties, remember that you can always refer to or download the code from our accompanying GitHub repository

First, let’s create the door Prefab required for this article. Download the following archive door.rar and extract its contents into your Assets/3DModel folder. Confirm the following dialog with Fix now after dropping the files.

Confirm with Fix now to mark the texture as a normal map

Then, create a new Prefab in your Assets/Prefabs folder using the same method as we did for the reticle in a previous article. Once you’ve completed these steps, your Prefab should look like this:

The Door Prefab in edit mode

We are now set to instantiate the door Prefab and anchor it within the environment. This action will occur when the user hovers over a valid plane and presses the Trigger button.

Here’s how the updated MRArticleSeriesController script will look:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.Interaction.Toolkit;

namespace Taikonauten.Unity.ArticleSeries
{
    public class MRArticleSeriesController : MonoBehaviour
    {
        public ActionBasedController controller;
        public InputActionReference buttonAction;
        public XRRayInteractor rayInteractor;
        public ARAnchorManager anchorManager;
        public GameObject door;

        void OnEnable()
        {
            Debug.Log("MRArticleSeriesController -> OnEnable()");
            buttonAction.action.performed += OnButtonPressedAsync;
        }

        void OnDisable()
        {
            Debug.Log("MRArticleSeriesController -> OnDisable()");
            buttonAction.action.performed -= OnButtonPressedAsync;
        }

        private async void OnButtonPressedAsync(InputAction.CallbackContext context)
        {
            Debug.Log("MRArticleSeriesController -> OnButtonPressedAsync()");

            if (rayInteractor.TryGetCurrent3DRaycastHit(out RaycastHit hit))
            {
                Pose pose = new(hit.point, Quaternion.identity);
                Result<ARAnchor> result = await anchorManager.TryAddAnchorAsync(pose);

                result.TryGetResult(out ARAnchor anchor);

                if (anchor != null)
                {
                    // Instantiate the door Prefab at the ray cast hit point
                    GameObject _door = Instantiate(door, hit.point, Quaternion.identity);

                    // Unity recommends parenting your content to the anchor
                    // instead of adding an ARAnchor component to the GameObject
                    _door.transform.parent = anchor.transform;
                }
            }
        }
    }
}

Let’s review the updates quickly:

  1. A public GameObject member named door has been added, enabling us to select the door Prefab in the Inspector.
  2. Instead of simply logging the anchor instance, we now verify if the returned anchor is not null.
  3. The door Prefab is instantiated at the ray cast hit point.
  4. The door (our content) is parented to the anchor, following Unity's recommended practices.
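As a small variation on points 2–4, you could also handle the failure case explicitly rather than silently doing nothing when no anchor is returned. The sketch below is our illustrative addition, not part of the article's final script; the warning message is hypothetical:

```csharp
// Sketch: same anchor check as in the script above, but with an
// explicit failure branch. The LogWarning call is an illustrative
// addition, not part of the final MRArticleSeriesController.
if (result.TryGetResult(out ARAnchor anchor) && anchor != null)
{
    GameObject _door = Instantiate(door, hit.point, Quaternion.identity);
    _door.transform.parent = anchor.transform;
}
else
{
    Debug.LogWarning("Anchor creation failed; the door was not instantiated.");
}
```

Logging the failure makes it much easier to tell, from the device logs, whether a missing door is caused by anchor creation failing or by the ray simply not hitting a valid plane.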

With these updates to the MRArticleSeriesController Script, don't forget to select the door Prefab in the Inspector to complete the setup.

Selecting the door Prefab in our MRArticleSeriesController Script

To test the application, choose Build and Run, as detailed in the previous articles. Hover over a plane and press the Trigger button. If everything is configured correctly, you should see a result similar to the one in the next video:


Next article: Voice SDK

In our forthcoming article, we will take an exciting step in our MR journey by integrating the Voice SDK. This addition will enable voice interaction within our MR environment, focusing in particular on using voice commands to interact with the door we've created. We'll guide you through integrating the Voice SDK into your Unity project, setting up voice recognition, and configuring it to recognize specific commands, such as "open the door".

Click here for the next article “Voice SDK”.
