Building Reality…

Neelarghya
Published in XRPractices
8 min read · Aug 1, 2020

As pain needs to be felt, reality needs to be perceived.

Disclaimer: This article doesn’t deal with the philosophy around perception and reality. If you are here for such things, turn back… or find me... :P

Ok, now that the romantics are gone… This article outlines a detailed technical approach to creating a virtual reality (VR) experience and designing interactions around it.
We will be using Unity3D and building for Google Cardboard.

Let’s get this party started…

1. Project Setup

Start by setting up a basic Unity project; if in doubt, refer to the article below.

You can find the complete code in the following repository (I’m using Unity3D 2019.4.4, but any 2019.4 version should work). To get synced up, you can use the following…

git clone git@github.com:Neelarghya/reality-vr.git
git checkout 1429b49

2. Setting up SDKs for VR

Once the project is ready, we will set up the SDKs required for VR.

  • [Alternative] If you want to use the legacy GoogleVR SDK (legacy but mature — it’s simple to start with and has good examples and docs, literally a dev’s … dream! XD), feel free to follow the article below…
  • For our case, we will target Google Cardboard using the Cardboard SDK; refer to the setup below…
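As a quick sketch of what that setup involves: the Cardboard XR Plugin is distributed as a git package, so (at the time of writing) your Packages/manifest.json gains an entry along these lines. Treat the exact package name and URL as assumptions and verify them against Google’s quickstart guide:

```json
{
  "dependencies": {
    "com.google.xr.cardboard": "https://github.com/googlevr/cardboard-xr-plugin.git"
  }
}
```

You can also add it from the Unity Package Manager UI via “Add package from git URL…”, which writes the same entry for you.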

[Commit: 43a9fca]

3. Setting up a basic Scene

  • Start by organizing the Scene into a Player (0, 0, 0) > Head (0, 1.8, 0) > MainCamera (0, 0, 0) structure. Note that the Head is at a height above the body; we will come back to this later.
Player and MainCamera set up
  • Add an environment to your Scene. I personally am a sucker for a Plane with a Grid texture and ProBuilder’s bathroom-tile-like wall material, so… :P
    Let’s also add something that looks interactable, for good measure.
Game view for the basic scene setup

[Commit: 36f900d]

4. “How is this VR..?”

  • Start by adding a TrackedPoseDriver component to the MainCamera; this will sync the camera’s orientation to the user’s head movement.
MainCamera components
  • Next, we need to set up the whole VR context by configuring the XR-Cardboard plugin. We will add a basic setup script that checks for updates in the device parameters and manages closing the app, and add it to a GameObject.
using UnityEngine;
using Google.XR.Cardboard;

public class CardboardSetup : MonoBehaviour
{
    public void Start()
    {
        // Keep the screen awake while the headset is in use
        Screen.sleepTimeout = SleepTimeout.NeverSleep;

        if (!Api.HasDeviceParams())
            Api.ScanDeviceParams();
    }

    public void Update()
    {
        if (Api.IsGearButtonPressed)
            Api.ScanDeviceParams();

        if (Api.IsCloseButtonPressed)
            Application.Quit();

        if (Api.HasNewDeviceParams())
            Api.ReloadDeviceParams();
    }
}
Adding CardboardSetup script
  • The next thing we need is basic interactability.
public class GazeInteractionSource : MonoBehaviour
{
    [SerializeField] private float intractableDistance = 10;
    ...

We start with a basic script (GazeInteractionSource) with an exposed field (intractableDistance) that defines how close you need to be to an object to interact with it.

    private GameObject _gazedObject;
    private PointerEventData _eventData;

    private void Start()
    {
        _eventData = new PointerEventData(EventSystem.current);
    }
    ...
...

We also define two variables: one to keep track of the object we are focusing/gazing at (_gazedObject), and another for the pointer event data we use to invoke pointer events (_eventData).

public void Update()
{
    UpdateInteraction();
}

private void UpdateInteraction()
{
    if (Physics.Raycast(transform.position, transform.forward,
        out var hit, intractableDistance))
    {
        if (_gazedObject != hit.transform.gameObject)
        {
            if (_gazedObject)
                _gazedObject.GetComponent<IPointerExitHandler>()?
                    .OnPointerExit(_eventData);

            _gazedObject = hit.transform.gameObject;
            _gazedObject.GetComponent<IPointerEnterHandler>()?
                .OnPointerEnter(_eventData);
        }
    }
    ...
...

So on every update we check whether a raycast from the current object hits another GameObject; if so, that becomes our newly gazed GameObject. We call OnPointerExit on the object that loses focus and OnPointerEnter on the object that gains focus.

    ...
    else if (_gazedObject)
    {
        _gazedObject.GetComponent<IPointerExitHandler>()?
            .OnPointerExit(_eventData);
        _gazedObject = null;
    }
    ...

And if the raycast misses, we let the previously focused object lose focus and get rid of its reference.

    if (_gazedObject != null &&
        Google.XR.Cardboard.Api.IsTriggerPressed)
    {
        _gazedObject.GetComponent<IPointerClickHandler>()?
            .OnPointerClick(_eventData);
    }
}

Finally, whenever the Cardboard API’s trigger is pressed, we want to perform the click action on the focused object.

Now let’s add the script to our MainCamera and an EventTrigger to our interactable object; also make sure it has a Collider.
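If you prefer code over an Inspector-configured EventTrigger, a minimal sketch of an interactable component could look like the one below. This is a hypothetical example (not part of the repo) that simply reacts visibly to the same pointer events our gaze source invokes:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to an object with a Collider and Renderer to verify
// that gaze enter/exit/click events arrive.
public class InteractableHighlight : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler
{
    [SerializeField] private Color hoverColor = Color.yellow;

    private Renderer _renderer;
    private Color _defaultColor;

    private void Start()
    {
        _renderer = GetComponent<Renderer>();
        _defaultColor = _renderer.material.color;
    }

    public void OnPointerEnter(PointerEventData eventData) =>
        _renderer.material.color = hoverColor;

    public void OnPointerExit(PointerEventData eventData) =>
        _renderer.material.color = _defaultColor;

    public void OnPointerClick(PointerEventData eventData) =>
        transform.Rotate(0, 45, 0); // visible response to a click
}
```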

Setting up interactions (GazeInteractionSource and EventTrigger)

[Commit: e9c95e3]

VR view, (screen recording)

So what we get is the ability to look around and interact with different objects. This is where you can make a build and experience it in your VR Box/Cardboard.

5. “How am I not gazing at it..?”

That seems like an odd question, but it’s a very valid query when it comes to immersive design, simply because gaze is subjective (unless you are using eye tracking). What I mean is that the interaction point on the screen is only at the centre, but as a user I am free to gaze at, say, the top-right corner of my FOV (field of view). So even if I’m gazing at an interactable object, I might not be gazing at it (i.e. the raycast might not hit it).
Thus visually representing the point of interaction is paramount! This point is represented by a Reticle/Gaze Pointer in most XR mediums. That’s exactly what we are missing…

  • Start by adding a World Space Canvas as a child to the MainCamera.
Setting up Canvas to display Reticle

Notice the Event Camera and the RectTransform properties. There is also a more advanced way of displaying the reticle at the raycast hit point, scaled based on distance, to get around the parallax effect the current method has. But that would just bloat things.

  • Let’s design a Reticle then!
    All you need to start with is some UI and animation… For our purposes we will set up a basic Reticle with two animations, one for hover and one for click. All in all, things should look something like…
Reticle’s Animator Controller
Reticle Roll (in and out) and Click Animations

One callout: don’t spend too much time designing the reticle (…I spent way more time than I’m willing to admit on these god-awful designs :| …)
More importantly, you will need to add a material with a custom shader to the reticle that turns off the ZTest, i.e. renders the reticle above the rest of the objects regardless of its Z depth in the scene.
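A minimal sketch of such an overlay shader in ShaderLab could look like this (names and properties are illustrative, not the repo’s exact shader); the key lines are ZTest Always and ZWrite Off:

```shaderlab
Shader "Custom/ReticleOverlay"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "Queue" = "Overlay" "RenderType" = "Transparent" }
        ZTest Always   // draw on top regardless of scene depth
        ZWrite Off     // don't occlude anything behind the reticle
        Blend SrcAlpha OneMinusSrcAlpha

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            fixed4 _Color;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv) * _Color;
            }
            ENDCG
        }
    }
}
```

The Overlay queue makes the reticle render after everything else, which pairs with ZTest Always to keep it visible through geometry.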

  • What’s more important than spending 4 hrs designing and animating a damn reticle is making it actually work with the gaze interactions… :P
    Let’s start by adding the events to our GazeInteractionSource class.
public class GazeInteractionSource : MonoBehaviour
{
    [SerializeField] private UnityEvent onFocusIntractable;
    [SerializeField] private UnityEvent onLoseFocus;
    [SerializeField] private UnityEvent onClick;
    ...

We will update the Update function (:P) to accommodate these events.

...
    if (_gazedObject != hit.transform.gameObject)
    {
        if (_gazedObject)
        {
            _gazedObject.GetComponent<IPointerExitHandler>()?
                .OnPointerExit(_eventData);

            if (IsGazedObjectIntractable())
                onLoseFocus?.Invoke();
        }

        _gazedObject = hit.transform.gameObject;
        _gazedObject.GetComponent<IPointerEnterHandler>()?
            .OnPointerEnter(_eventData);

        if (IsGazedObjectIntractable())
            onFocusIntractable?.Invoke();
    }
}
else if (_gazedObject)
{
    _gazedObject.GetComponent<IPointerExitHandler>()?
        .OnPointerExit(_eventData);
    _gazedObject = null;
    onLoseFocus?.Invoke();
}

if (_gazedObject != null && Api.IsTriggerPressed)
{
    _gazedObject.GetComponent<IPointerClickHandler>()?
        .OnPointerClick(_eventData);

    if (IsGazedObjectIntractable())
        onClick?.Invoke();
}
...

We also add a function to check whether the currently hovered object is interactable. (I will skip optimizations so as not to bloat this article.)

private bool IsGazedObjectIntractable()
{
    return _gazedObject.GetComponent<IEventSystemHandler>() != null;
}

Link the reticle’s animator to these events and you should be good.
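One way to do that linking is a tiny glue component whose methods you bind to the three UnityEvents in the Inspector. This is a hypothetical sketch (not from the repo), and the trigger names are assumptions that must match your Animator Controller:

```csharp
using UnityEngine;

// Bind OnFocus/OnLoseFocus/OnClick to the GazeInteractionSource
// UnityEvents so the reticle Animator reacts to gaze state.
public class ReticleAnimationDriver : MonoBehaviour
{
    [SerializeField] private Animator reticleAnimator;

    public void OnFocus() => reticleAnimator.SetTrigger("RollIn");
    public void OnLoseFocus() => reticleAnimator.SetTrigger("RollOut");
    public void OnClick() => reticleAnimator.SetTrigger("Click");
}
```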

[Commit: 534acc3]

6. “But I want to explore!”

The worst feeling is to be in a new world and not be able to explore it! In the current state the user can only look around and interact with objects, which feels restrictive, so let’s add some movement…
Let’s start by baking a NavMesh for our Environment and adding a NavMeshAgent to the Player (keep angularSpeed at 0). (Ref: https://docs.unity3d.com/Manual/nav-BuildingNavMesh.html)

Baking NavMesh and adding NavMeshAgent to Player

Next, we need the Player to move, or should I say navigate… So let’s add a NavigationController.

public class NavigationController : MonoBehaviour,
    IPointerClickHandler
{
    [SerializeField] private NavMeshAgent playerNavMeshAgent;

    public void OnPointerClick(PointerEventData eventData)
    {
        playerNavMeshAgent.SetDestination(
            eventData.pointerPressRaycast.worldPosition);
    }
}

But we also need to set this worldPosition before we can use it properly. So in GazeInteractionSource.Update(), we update _eventData.pointerPressRaycast…

...
if (_gazedObject != null && Api.IsTriggerPressed)
{
    var clickHandler = _gazedObject
        .GetComponentInParent<IPointerClickHandler>();

    if (clickHandler != null)
    {
        _eventData.pointerPressRaycast = new RaycastResult
        {
            worldPosition = hit.point
        };

        clickHandler.OnPointerClick(_eventData);
    }

    if (IsGazedObjectIntractable())
        onClick?.Invoke();
}
...

We can also change all the GetComponent calls to GetComponentInParent, keeping in mind that child objects can contribute to the colliders.

[Commit: 3b9162a]

With all that we should be ready to experience the Reality that we have built!

A few lines of code here, a new skybox there, and we have…

Post polishing, view for user
Look and feel, source: https://www.youtube.com/channel/UC64M3FR1UBOdn6cL9PnXbgA (self)

[Commit: 36f08a9]

…Wow, still here? Quite the journey! I hope this gave you a head start in VR, and I’m looking forward to seeing the exciting things you folks come up with! :D
