Porting Daydream’s 3DOF Arm Model to Oculus Go and Beyond

Arm Model Demo From Daydream Elements.

Unlike Oculus Go, Daydream offers an open-source project that details best practices for its platform: Daydream Elements. One of the most useful things Daydream Elements provides is an “Arm Model” script that extrapolates arm movement from device rotation. This script lets developers fine-tune tracking parameters to create a sense of presence while receiving only the controller's rotational input. However, the script only supports the Daydream device, so how would we port this mechanic to, say, the Oculus Go?

Porting the Daydream “Arm Model” to Unity XR Nodes

Screenshot taken from the asset pack at the end of the tutorial.

Creating the Base

We will use a base class for all our arm models so they can be referenced generically. This is useful when extending the arm calculations. For example, the XRArmVisualizer (included in the asset pack at the end of this tutorial) can reference either an XRArm or an XRTransitionArm. We will call this class XRBaseArmModel.cs.

Used for easy referencing.
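As a sketch, the base class only needs to expose the pose that concrete models produce. The member names below are my own assumptions, not necessarily what the asset pack uses:

```csharp
using UnityEngine;

// Minimal sketch of the shared base class. The property names are
// assumptions; the asset pack's version may expose different members.
public abstract class XRBaseArmModel : MonoBehaviour
{
    // Final pose produced by a concrete arm model (XRArm, XRTransitionArm).
    public Vector3 WristPosition { get; protected set; }
    public Quaternion WristRotation { get; protected set; }
}
```

A visualizer can then hold a single XRBaseArmModel reference and swap implementations freely.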

Getting Device Data

We start by creating a script called XRArm.cs. This script will calculate the player's arm position and rotation based on the headset rotation and the 3DOF controller. This version of the Arm Model uses the following information from the device:

The dominant hand of the player (left or right hand)

Hand Rotation

Gaze Direction

Headset’s angular velocity

Headset’s position (reported by Unity’s InputTracking)

To get access to the tracked devices’ sensors, we import the UnityEngine.XR namespace. Unity uses XRNodeState to get the state of a tracked device, whether it’s a headset, controller, or pointer. You can read more about this class in the documentation: https://docs.unity3d.com/ScriptReference/XR.XRNodeState.html

To store the states of the devices, we create a list to which the data can be assigned; we call this list “nodeStates”. To get the data from the tracked devices, we call InputTracking.GetNodeStates() in the Update function, which ensures we refresh our data once per frame. This function accepts a list of XRNodeStates, so we pass in the previously created “nodeStates” list. After the data is assigned, we can iterate through the nodes and “try” to get the data that we need. We use the “Try” functions because the nodes are generic and certain values may not be tracked, depending on the device. The “Try” functions on node states generate a lot of garbage, which can be eliminated if the device you are targeting exposes the headset velocity directly. For example, Oculus’s OVRManager allows developers to access the headset’s angular velocity using OVRManager.display.angularVelocity.

The script debugs the data that is received from the XR nodes.
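A sketch of that per-frame polling, assuming the right-hand node (wiring up the player's actual handedness comes later):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch of per-frame device polling. XRNode.RightHand is assumed here;
// a real implementation would pick the node from the player's handedness.
public class XRArm : MonoBehaviour
{
    // Reused each frame so GetNodeStates does not allocate a new list.
    private readonly List<XRNodeState> nodeStates = new List<XRNodeState>();

    private Quaternion controllerRotation = Quaternion.identity;
    private Vector3 gazeDirection = Vector3.forward;
    private Vector3 angularVelocity = Vector3.zero;
    private Vector3 headPosition = Vector3.zero;

    private void Update()
    {
        InputTracking.GetNodeStates(nodeStates);

        foreach (XRNodeState state in nodeStates)
        {
            // Each TryGet returns false when the device does not track
            // that value, so we only overwrite on success.
            if (state.nodeType == XRNode.Head)
            {
                if (state.TryGetRotation(out Quaternion headRotation))
                    gazeDirection = headRotation * Vector3.forward;
                if (state.TryGetAngularVelocity(out Vector3 velocity))
                    angularVelocity = velocity;
                if (state.TryGetPosition(out Vector3 position))
                    headPosition = position;
            }
            else if (state.nodeType == XRNode.RightHand)
            {
                if (state.TryGetRotation(out Quaternion rotation))
                    controllerRotation = rotation;
            }
        }
    }
}
```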

Interpreting the position of the hand based on our data

In this section we will go over interpreting the arm rotation and position based on the device data gathered in the last section.

Handed Multiplier

Now that we have all our device data, we can calculate the arm position and rotation. We start by creating a function that accepts all of the device parameters mentioned earlier, and we call it `UpdateArmData`:

public void UpdateArmData(bool isLeftHanded, Quaternion controllerRotation, Vector3 gazeDirection, Vector3 angularVelocity, Vector3 headPosition){}

Here we will perform a series of functions that calculate our arm position. We start by creating a Vector3 whose x component equals 1 if the player is right-handed, -1 if the player is left-handed, and 0 if handedness is centered or not provided. We store this value as “handedMultiplier”.

Example of code that includes handedness
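Mirroring the Daydream Elements convention, the multiplier lives in the x component so that later offsets can be flipped across the body's mid-line. A sketch, with a hypothetical helper name:

```csharp
using UnityEngine;

public class XRArm : MonoBehaviour
{
    private Vector3 handedMultiplier;

    // Hypothetical helper: x = 1 for right-handed, -1 for left-handed
    // (a 0 would mean centered / not provided). y and z stay 1 so only
    // lateral offsets are mirrored.
    private void UpdateHandedness(bool isLeftHanded)
    {
        handedMultiplier.Set(isLeftHanded ? -1f : 1f, 1f, 1f);
    }
}
```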

Storing the local controller rotation

We will reference the controller rotation a few times in our calculation, so for easier readability we store its rotation in a separate function called “UpdateControllerReferenceRotation”, which accepts a quaternion. We call the stored variable “localControllerRotation”.

Interpreting the player’s torso direction

Usually a person’s torso faces the direction of their gaze. When the player is moving, however, the direction of the torso is affected by the user’s rotational acceleration. Because of this, we must use an algorithm to calculate the user’s torso direction. We do this calculation in a function called UpdateTorsoDirection, which accepts the gazeDirection and angularVelocity values. We store the calculated values in two variables: Vector3 torsoDirection and Quaternion torsoRotation.
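A sketch of that filter, closely following the Daydream Elements arm model (the filter constants are taken from their script; treat them as tuning starting points):

```csharp
using UnityEngine;

// Sketch of the torso-direction filter: when the head turns quickly,
// the torso is pulled toward the gaze direction; when it turns slowly,
// the torso lags behind, which reads as natural body motion.
public class XRArm : MonoBehaviour
{
    private Vector3 torsoDirection = Vector3.forward;
    private Quaternion torsoRotation = Quaternion.identity;

    private void UpdateTorsoDirection(Vector3 gazeDirection, Vector3 angularVelocity)
    {
        // Flatten the gaze so the torso only rotates around the up axis.
        gazeDirection.y = 0f;
        gazeDirection.Normalize();

        // The faster the head rotates, the more strongly the torso
        // follows the gaze (constants from Daydream Elements).
        float filterStrength =
            Mathf.Clamp((angularVelocity.magnitude - 0.2f) / 45.0f, 0f, 0.1f);

        torsoDirection = Vector3.Slerp(torsoDirection, gazeDirection, filterStrength);
        torsoRotation = Quaternion.FromToRotation(Vector3.forward, torsoDirection);
    }
}
```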

Merging Our Script with Daydream Elements

Now that our code closely mirrors the Daydream Elements code, and because that code is well commented, I decided to skip the explanation of the remaining calculations and provide the code along with additional ports from the Daydream Elements package.

Download Package and Source Code

https://github.com/Babilinski/3DofArmController

The attached scripts are almost exact copies of the Google Daydream Elements Arm Model scripts. They include a visualizer and a script that transitions between two separate arms, which can be useful when you want to toggle between two arm calculations, such as pointing and throwing.

Important Notes

You will have to use your device’s SDK to set the IsLeftHanded value.

The reason we use the UpdateArmData function instead of doing the calculation directly in Update is so that this system can be ported to Unity’s new Entity Component System, or into a manager that passes the data to the XRArms. This is particularly helpful if you are using the XRTransitionArm, since otherwise every XRArm performs its own calculation.
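For example, a hypothetical manager could poll the devices once per frame and fan the data out to every arm:

```csharp
using UnityEngine;

// Hypothetical manager (not part of the asset pack): gathers the device
// data once per frame and pushes it into every arm, instead of each
// XRArm polling the XR nodes on its own.
public class ArmModelManager : MonoBehaviour
{
    [SerializeField] private XRArm[] arms;
    [SerializeField] private bool isLeftHanded;

    private void Update()
    {
        // Placeholder values; in practice these come from
        // InputTracking.GetNodeStates() or the device SDK.
        Quaternion controllerRotation = Quaternion.identity;
        Vector3 gazeDirection = Vector3.forward;
        Vector3 angularVelocity = Vector3.zero;
        Vector3 headPosition = Vector3.zero;

        foreach (XRArm arm in arms)
        {
            arm.UpdateArmData(isLeftHanded, controllerRotation,
                gazeDirection, angularVelocity, headPosition);
        }
    }
}
```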


Krystian Babilinski is an experienced Unity developer with extensive knowledge in 3D design. He has been developing professional AR/VR applications since 2015.
