How to Capture High-Quality Motion Data With Your Phone And Import Into Omniverse

Move.ai’s motion capture extension allows 3D artists and developers to access markerless motion data for their creative pipelines, absolutely free.

NVIDIA Omniverse
4 min read · Jan 27, 2023

By: Paul Cutsinger, Director of Omniverse Exchange, NVIDIA

Creating animation data can be costly and time-consuming, but with the new Move.ai Omniverse extension, anyone can generate high-fidelity motion data from video captured on their phone.

The extension plugs into Move.ai’s database of motion assets to allow users to import into NVIDIA Omniverse with just one click. These animations can be easily retargeted to characters and linked with rigs in Autodesk Maya or scenes in Unreal Engine and Unity game engines.

“We firmly believe that NVIDIA Omniverse is enabling the pipework to create expansive virtual worlds,” said Niall Hendry, Head of Partnerships and Delivery at Move.ai. “Human motion is essential to bringing these digital experiences and environments to life.”

Connecting to Omniverse enables Move.ai’s motion data to proliferate across an array of software ecosystems, allowing the company to reach new audiences. It has enabled them to cross connect with other software instances without the need to build a plugin for each. For example, Omniverse users can apply animations from Move.ai to characters generated from other extensions like in3D.

Building the Extension

To build the extension, Denis Cera, a developer at Move.ai, used Omniverse Kit’s Python-based SDK along with video tutorials for creating Omniverse extensions and the Omniverse Kit documentation.

To create the user interface, Cera added a `VGrid` to a `ScrollingFrame` and populated it with image previews of the animations. For each image, he also created a button that allows the user to import the corresponding animation.
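In `omni.ui` terms, that layout can be sketched roughly as follows. This is a minimal illustration, not Move.ai's actual code: it assumes a running Omniverse Kit app, and the `motions` argument is a hypothetical list of dicts with `thumbnail_url` and `title` keys.

```python
def build_motion_grid(motions):
    """Sketch: a scrollable grid of animation previews, one import button each.

    Assumes the Omniverse Kit runtime; `motions` is a hypothetical list of
    dicts like {"thumbnail_url": ..., "title": ...}.
    """
    import omni.ui as ui  # only importable inside a Kit-based app

    with ui.ScrollingFrame(height=400):
        # VGrid lays its children out in columns of a fixed width
        with ui.VGrid(column_width=160):
            for motion in motions:
                with ui.VStack():
                    ui.Image(motion["thumbnail_url"], height=120)
                    ui.Button("Import motion", name=motion["title"])
```

Nesting the `VGrid` inside a `ScrollingFrame` keeps the grid usable as the motion library grows beyond one screen.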

After creating the UI, he assigned different functions with different arguments to the buttons. Here is how he did it:

```python
from functools import partial
from pathlib import Path

import omni.ui as ui

button = ui.Button(
    text="Import motion",
    name=motion["title"],
    alignment=ui.Alignment.RIGHT,
    width=0,
    height=0,
    asset_path=self.asset_path,
    # partial() bakes this button's download path and animation name
    # into the shared download_motion callback
    clicked_fn=partial(
        download_motion,
        Path(self.asset_field.model.get_value_as_string()),
        motion["title"],
    ),
)
```

For every button in the UI, there is a corresponding function that has its own set of arguments, such as the download path and animation name. This allows the function to perform different actions or operations depending on which button is clicked.
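The pattern here is plain `functools.partial`: one shared callback, with each button's arguments bound at construction time so the resulting callable takes no arguments. A standalone sketch (the names and paths are illustrative, not Move.ai's):

```python
from functools import partial
from pathlib import Path


def download_motion(download_dir: Path, title: str) -> str:
    # Stand-in for the real download logic
    return f"downloading '{title}' to {download_dir}"


# One zero-argument callable per button, each carrying its own bound arguments
callbacks = {
    title: partial(download_motion, Path("/tmp/motions"), title)
    for title in ("run_cycle", "backflip")
}

print(callbacks["backflip"]())  # the bound arguments travel with the callable
```

Because `partial` freezes the arguments when the button is built, a single `download_motion` function serves every button without any lookup of "which button was clicked" at call time.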

Omniverse is built on top of the Universal Scene Description (USD) interchange, but Move.ai’s API currently only provides access to FBX (Filmbox) data. Fortunately, Omniverse has a built-in converter that can convert `.fbx` files to `.usd` files. You can either use this converter explicitly or import the `.fbx` file directly and let Omniverse handle the conversion automatically.

To import the `.fbx` data into Omniverse, Cera used `CreateReferenceCommand` to reference the `.fbx` file directly from the `.usd` stage, as shown in the `moveai.assets.extension.utils.import_file_to_scene()` function:

```python
from pathlib import Path

import omni.kit.commands
import omni.usd


def import_file_to_scene(usd_path: Path):
    stage = omni.usd.get_context().get_stage()
    if not stage:
        return

    # Derive a unique prim path from the file name, e.g. /my_motion, /my_motion_01, ...
    name = usd_path.stem
    prim_path = omni.usd.get_stage_next_free_path(stage, "/" + name, True)

    # Reference the file on a new prim; Omniverse converts the .fbx on the fly
    omni.kit.commands.execute(
        "CreateReferenceCommand",
        path_to=prim_path,
        asset_path=str(usd_path),
        usd_context=omni.usd.get_context(),
    )
```

For explicit conversion of files to `.usd`, see the `moveai.assets.extension.utils.convert_asset_to_usd()` function.
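Explicit conversion typically goes through Omniverse's `omni.kit.asset_converter` extension. A rough sketch of that route, assuming a Kit runtime with the extension enabled (this is not Move.ai's exact implementation):

```python
async def convert_fbx_to_usd(fbx_path: str, usd_path: str) -> bool:
    """Sketch: convert an .fbx file to .usd with Omniverse's built-in converter.

    Assumes the omni.kit.asset_converter extension is enabled in a Kit app.
    """
    import omni.kit.asset_converter  # only importable inside a Kit-based app

    def on_progress(current_step, total):
        pass  # hook for progress reporting in the UI

    task = omni.kit.asset_converter.get_instance().create_converter_task(
        fbx_path, usd_path, on_progress
    )
    # Returns False (with an error message on the task) if conversion fails
    return await task.wait_until_finished()
```

Converting explicitly produces a standalone `.usd` asset on disk, whereas the reference approach above keeps the `.fbx` as the source of truth and converts at load time.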

“USD will ultimately become the industry standard for metaverse applications, and working with NVIDIA helped fast-track our integration with the USD interchange paradigm,” said Hendry.

The source code for Move.ai’s extension is available to everyone on GitHub.

Democratizing Access to High-Quality Motion Data

Move.ai’s mission is to lower the barrier to entry for using high-fidelity motion capture and animation data. With Omniverse, anyone can access Move.ai’s high-quality motion data library with just a single click.

The connection to Omniverse also lets Move.ai’s licensing team automatically release free Motion Packs of animations to the creative community, making the motion data easy to use in their projects. In the future, this will allow users to add their own data to Omniverse.

Launching this March, Move.ai’s iPhone app will be available in the App Store for anyone to use. You can sign up for the Beta version at www.move.ai and download the NVIDIA Omniverse extension here.

Watch a video walkthrough showing how to get started below:

To learn more about building Omniverse applications, extensions, and microservices, check out Omniverse sessions at GTC, a developer conference for the era of AI and the metaverse March 20–23, 2023. Move.ai will be presenting at the conference along with numerous startups and industry leaders advancing the metaverse. Register free and add their session to your calendar today.

Move.ai is a member of the NVIDIA Inception program, which empowers more than 13,000 of the world’s cutting-edge startups. You can apply to be a member today.

Get started developing your own Omniverse extension by visiting the Omniverse Developer resource center and download Omniverse for free today.

Visit the Omniverse Developer Resource Center and the USD page for additional resources, view the latest tutorials on Omniverse, and check out the forums for support. Join the Omniverse community, Discord server, and livestreams to chat with the community, and subscribe to get the latest Omniverse news.

Follow NVIDIA Omniverse on Instagram, Twitter, LinkedIn, YouTube and Medium for additional resources and inspiration.

Meet the Author

Paul Cutsinger is director of Omniverse Exchange at NVIDIA, where he’s focused on tooling for real-time, true-to-reality simulation. With a career spanning Amazon, Disney, and Microsoft, Paul’s work centers on enabling creators to take their ideas into production.
