Motion Recording for Oculus Avatars & Game Objects in Unity With VR Labs’ Free MotionTool

Hristo Zaprianov
Published in Telerik AR VR
May 16, 2019
The sample scene that comes pre-built with the package

This article is a sequel to the one written by fellow VR dev Deyan Yosifov about his endeavors to record a high-quality (stereo) 360° video. This time, I will go into more detail about the tool we used to record the motions of the Oculus avatars and some “common” objects, like the chart and the input pointer, before recording the video. MotionTool was written out of necessity, since there were no ready-to-use tools, plug-ins or assets in Unity’s Asset Store or elsewhere (and there still aren’t today) that get the job done. So if you want to record the movement of Oculus avatars or plain Unity objects, or are curious how this can be accomplished, download our free MotionTool and give it a try.

Recording Avatars, but why?

One possible reason, as mentioned above, is the need to record a stereo 360° video containing Oculus avatars among other stuff. But being able to record and replay the actions of avatars opens up a whole new world of possibilities. You can:

  • have a prerecorded avatar that greets users and guides them through your experience
  • simulate different scenarios in environments like classrooms, workplaces, public institutions, banks, hospitals and so on
  • use avatars to teach or demonstrate certain actions inside your application
  • prerecord avatars and replay them as reactions to your user’s actions
  • use recorded avatars to aid your development and test your application
  • use avatars as NPCs

There are many more ways to make use of prerecorded avatars, and in the end it comes down to your own specific use case. But all those cases share the same prerequisite: the ability to record and replay the avatars. This is where our tool comes into play, saving you time and pain.

But why a tool?

Why would you bother using a tool in the first place? If you are reading this article, then you probably know the answer already: Oculus does not provide a tool that does this, and we did not find any third-party tool that does it either. The other, more important reason is that the process is somewhat complicated and lacks transparency. Oculus created a mechanism to transfer avatar pose information over the network, since the core idea of the avatar construct is to give users an appearance, which improves the social experience and interaction between them. The way they accomplish this is by writing binary “packets” at fixed time intervals (most commonly 30 times a second) and sending them over the network, where the other participants or clients read them, “decode” them and apply them to the avatar representing the person who sent them. The encoding and decoding of the packets, and their application to an avatar, is completely opaque to the developer, so you have to work with the packets as they are, whether you like it or not. This might be a bit too “low-level” for many developers, especially when they have other tasks at hand and their time is precious. Based on those packets and a couple of examples from the Oculus SDK, we created our tool, which can store all the avatar pose information for a given period of time in a separate file. It can later apply the stored poses to one or more avatars, thus creating something like “canned” avatar animations.
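To make the packet flow more concrete, here is a minimal sketch of the kind of code such a recorder is built around. It follows the Avatar SDK’s network loopback sample; the names it uses (RecordPackets, PacketRecorded, CAPI.ovrAvatarPacket_Write/Read, OvrAvatarRemoteDriver.QueuePacket) come from the Avatar SDK version available around the time of writing and may differ in other releases, and the component itself is an illustration, not part of MotionTool.

    using System;
    using System.Collections.Generic;
    using Oculus.Avatar;
    using UnityEngine;

    // Sketch only: capture the pose packets emitted by a local OvrAvatar and
    // feed them into a second avatar's OvrAvatarRemoteDriver, the same way a
    // networked client would with packets received over the wire.
    public class AvatarPacketCaptureSketch : MonoBehaviour
    {
        public OvrAvatar sourceAvatar;    // avatar to record from
        public OvrAvatar playbackAvatar;  // avatar to replay onto

        private readonly List<byte[]> recordedPackets = new List<byte[]>();
        private int sequence;

        void Start()
        {
            // Ask the SDK to produce pose packets (roughly 30 per second)
            // and notify us whenever one is ready.
            sourceAvatar.RecordPackets = true;
            sourceAvatar.PacketRecorded += OnPacketRecorded;
        }

        private void OnPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
        {
            // Serialize the native packet to raw bytes. A real tool would
            // write these bytes to a file instead of keeping them in memory.
            uint size = CAPI.ovrAvatarPacket_GetSize(args.Packet.ovrNativePacket);
            byte[] data = new byte[size];
            CAPI.ovrAvatarPacket_Write(args.Packet.ovrNativePacket, size, data);
            recordedPackets.Add(data);
        }

        public void Replay()
        {
            // Decoding is symmetric: read each packet back and queue it on the
            // remote driver, which applies the poses to the playback avatar.
            var driver = playbackAvatar.GetComponent<OvrAvatarRemoteDriver>();
            foreach (byte[] data in recordedPackets)
            {
                IntPtr nativePacket = CAPI.ovrAvatarPacket_Read((uint)data.Length, data);
                driver.QueuePacket(sequence++, new OvrAvatarPacket { ovrNativePacket = nativePacket });
            }
        }
    }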

How to use it?

The tool comes with examples, and there is a series of tutorial videos that I encourage you to watch before using it for the first time:

  1. Baseline tutorial
  2. Working with Avatars

The tool contains different Unity components that you attach to GameObjects in your scenes and point at the objects you want to record from or play back onto. It stores the data in its own .asset file format. There are also a couple of synchronization components in case you want to record or play back to/from multiple data files at once. If your use case is similar to ours, i.e. you want to play the motions back on the same avatars and objects you recorded from, you can use our special “Record Director” editor utility. It automates the setup process entirely: you just tell it which avatars and objects you want to record, and it takes care of everything.
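To illustrate the general idea for plain (non-avatar) objects, here is a hypothetical sketch, not MotionTool’s actual API: a recorder component samples a Transform at a fixed interval and appends the samples to a ScriptableObject, which Unity can save as a .asset file. All class and field names below (PoseClip, TransformRecorderSketch, sampleInterval and so on) are made up for the example.

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical data container: one recorded clip of positions/rotations,
    // stored as a ScriptableObject so it can live in a .asset file.
    [CreateAssetMenu(menuName = "Sketch/Pose Clip")]
    public class PoseClip : ScriptableObject
    {
        public float sampleInterval;
        public List<Vector3> positions = new List<Vector3>();
        public List<Quaternion> rotations = new List<Quaternion>();
    }

    // Hypothetical recorder component: attach to a GameObject, assign the
    // target Transform (e.g. the input pointer) and a PoseClip asset, then
    // toggle isRecording to capture samples at a fixed rate.
    public class TransformRecorderSketch : MonoBehaviour
    {
        public Transform target;
        public float sampleInterval = 1f / 30f;
        public PoseClip clip;
        public bool isRecording;

        private float timer;

        void Update()
        {
            if (!isRecording || clip == null || target == null) return;

            timer += Time.deltaTime;
            if (timer < sampleInterval) return;
            timer -= sampleInterval;

            clip.sampleInterval = sampleInterval;
            clip.positions.Add(target.position);
            clip.rotations.Add(target.rotation);
        }
    }

In a real setup the clip asset would be created in the editor, for example via AssetDatabase.CreateAsset, which is an editor-only API. (In a real Unity project, each MonoBehaviour would also go into its own file named after the class.)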

At the time of writing this article, you can use MotionTool for recording only in the Unity Editor. You can’t record in a built Player, no matter the target platform. You can, though, play back recorded data in a Player.

Share your feedback

Share your experience with the tool, send us suggestions for useful features or improvements, tell us what you would like to see and what would benefit you (for example, record-in-Player functionality), or report any issues and bugs you have stumbled upon. All feedback is greatly appreciated and will help us improve the tool even further!

We look forward to hearing from you; you can reach out here!
