Recording Animations with the Vive Trackers, Part 2

David Wallin
Aug 28, 2017


This is a follow-up to my earlier blog post on using the Vive Trackers to record animations. In that post, I shared my method for puppeting characters in order to record animations for my in-game characters. My requirement was to animate the upper body; I don’t particularly care about the legs, since they won’t be visible in game. These samples use two trackers, but with more trackers you could use similar techniques to animate the rest of the body. Or you could blend traditional walking animations in with the tracked animations using Unity’s state system. But that’s a topic for another blog post. 😉

One thing a few people requested was a way to make the setup a bit more automatic. Previously you had to manually set up rotational offsets so that your tracked objects matched up with the bones in your models. In my latest iteration, you simply assume a pose similar to your model’s and click a button. From then on, the controlled parts of your model will assume an orientation relative to your tracked objects’ starting orientations. We’re using the rotations and ignoring the positions (though we do use the positions to figure out some faked ‘IK’ for the shoulders and chest of your character). The advantage of this approach is that you can puppet a character with different proportions than your own. This is something traditional VR IK-based approaches struggle with (often resulting in wonky-looking characters). The downside to this method is that the hand positions won’t match up exactly with my own hands, which can be a deal-breaker for presence in VR if used for the player’s character. My use case is animating other characters, so this isn’t a problem for me.
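To make the idea concrete, here’s a minimal sketch of that calibration step in Unity C#. The class and field names here are my own illustration, not the actual code from TrackerPuppet2.cs:

```csharp
using UnityEngine;

// Illustrative sketch: drive a bone from a tracker using only a
// rotation offset captured in a one-time calibration pose.
public class TrackerBoneDriver : MonoBehaviour
{
    public Transform tracker; // the Vive Tracker's transform
    public Transform bone;    // the model bone it should drive

    private Quaternion offset;
    private bool calibrated;

    // Call this while standing in the same pose as the model
    // (e.g. when the controller trigger is pressed).
    public void Calibrate()
    {
        // Rotation that maps the tracker's current orientation onto the bone's.
        offset = Quaternion.Inverse(tracker.rotation) * bone.rotation;
        calibrated = true;
    }

    void LateUpdate()
    {
        if (!calibrated) return;
        // Apply rotation only; since positions are ignored, the puppeted
        // character's proportions don't need to match your own.
        bone.rotation = tracker.rotation * offset;
    }
}
```

Because only rotations are copied, the bone moves with the tracker’s orientation changes relative to the calibration pose, which is exactly why mismatched proportions aren’t a problem.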

To make it easy for other developers to try this out, I put together a sample project. You’ll need two Vive Trackers for this, worn on your upper arms. The important script is TrackerPuppet2.cs. Depending on how your bones are named, you may need to edit some of the model paths in this file. Otherwise, things are fairly automatic: just drop the script on the root of your model, drop in the TrackedObjects prefab, and you’re good to go. Stand inside the character, assume a similar pose, then click the trigger on the controller.

The script will find the bones in your model and automatically assign the tracker device IDs based on their positions, so there’s no fumbling around with hard-coded IDs that change every time you restart SteamVR.
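One simple way to do that kind of position-based assignment (an illustrative sketch only; the project’s actual logic may differ) is to sort the detected tracker transforms along the head’s left–right axis:

```csharp
using System.Linq;
using UnityEngine;

// Illustrative sketch: decide which tracker is on which arm by position,
// instead of relying on device IDs that change between SteamVR sessions.
public static class TrackerAssignment
{
    // Given the head transform and the detected tracker transforms,
    // return them as (leftArm, rightArm).
    public static (Transform left, Transform right) AssignByPosition(
        Transform head, Transform[] trackers)
    {
        // Project each tracker onto the head's right axis;
        // the most negative value is the leftmost tracker.
        var ordered = trackers
            .OrderBy(t => Vector3.Dot(t.position - head.position, head.right))
            .ToArray();
        return (ordered[0], ordered[ordered.Length - 1]);
    }
}
```

Running this once at calibration time is enough, since the assignment only needs to be correct at the moment you strike the pose.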

To record animations, I’m still using Unity Runtime Animation Recorder. Fellow VR dev @Verbatetim made a few improvements to this script, which I’ve included: namely, fixing some issues with blending quaternions and adding a tolerance factor to cut down on the number of keyframes recorded. Also, see my past blog post about setting the ‘legacy’ animation mode if you plan to use these animations on your own models. I think there’s still room for improvement in lowering the file size, as these animations tend to be a bit large. For now, you can play with the Tolerance parameters and/or lower the FPS variable if you want to cut down on the file size (see UnityCurveContainer.cs).
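The tolerance idea can be sketched like this (a simplified illustration, not the actual UnityCurveContainer.cs code):

```csharp
using UnityEngine;

// Illustrative sketch: only add a keyframe when the sampled value has
// moved more than a tolerance since the last recorded key. This trades
// a little fidelity for much smaller .anim files.
public class ToleranceCurve
{
    public AnimationCurve curve = new AnimationCurve();
    public float tolerance = 0.001f;

    private float lastValue;
    private bool hasKey;

    public void AddValue(float time, float value)
    {
        // Skip near-duplicate samples; flat or slow-moving stretches
        // collapse into far fewer keyframes.
        if (hasKey && Mathf.Abs(value - lastValue) < tolerance)
            return;
        curve.AddKey(time, value);
        lastValue = value;
        hasKey = true;
    }
}
```

Raising the tolerance or lowering the sampling FPS both shrink the file, at the cost of smoothing out small motions.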

Download the project on Github:


David Wallin

I work as a researcher and developer at The Archer Group (http://www.archer-group.com) and work on games and music apps in my spare time.