How to Become a Virtual YouTuber/Influencer

List of technical solutions from avatar creation to motion capture

hyprsense
8 min read · Sep 23, 2019

According to a ‘Think with Google’ article, “as the technology used to create VTubers becomes more accessible, their numbers are expected to rise.”

Example of a virtual human, Avery, animated in real-time using Hyprsense facial motion capture software

Becoming a virtual character is a charming alternative for those who seek fame yet do not want to reveal their faces. One question arises — how can we become a virtual celebrity in a practical sense? What kind of technical solutions can we utilize?

Related article 1: ‘How Many Virtual Celebrities Are There?’
Related article 2: ‘Digital Human is Becoming Prevalent’
Related article 3: ‘Is the Virtual Celebrity Industry Still on the Rise in 2020?’

Edit (Aug 24th, 2020): Alongside this research, we finally released a VTuber PC app of our own! The app, Hyprmeet, is currently free to download as a beta, so check it out and let us know what you think.

hyprmeet.com

It contains a built-in avatar customizing tool, a virtual webcam feature for streaming/video call software such as OBS, and our real-time facial mocap technology. We are planning to add more features such as VRoid 3D model support, button-triggered animation, etc. so stay tuned!

The Process of Becoming a Virtual Celebrity

We all know that Kizuna AI and Lil Miquela are not real, but few know what powers them. The rise of virtual celebrities has kept pace with advances in real-time graphics and motion capture technology, which enable graphic content to be created instantly. Grafted onto the traditional SNS influencer industry, virtual humans have become celebrities, posting cool photos of themselves every day just like any other influencer.

Technology/Service Landscape for Virtual Celebrities

Let’s move on to the technical solutions in practice — which solutions do virtual celebrities utilize? We created a landscape map of the solutions and categorized them into avatar creation, motion capture, live streaming, and management.

Technology/service for virtual celebrities landscape

As the virtual celebrity ecosystem is in its early stage, existing solutions are somewhat fragmented and no dominant solution has emerged yet. This means that each solution supports different platforms/formats. Depending on who the user is — individual or enterprise — the price range, quality, and installation process vary substantially.

A brief description and video are attached to each solution on the raw data page.* To learn more, click ‘open as a page’ next to a solution, as shown below.

Brief description and video about the solutions

*We collected and listed technical solutions/services/software related to virtual celebrities including avatar creation tools, motion capture systems, live streaming platforms, and talent management agencies. Our raw data page is available here.

Can a Non-expert Create a Real-time 3D Model?

Avatar creation tools for virtual celebrities

First, take a look at avatar creation tools for 3D modeling and rigging. Traditionally, you would have to use tools like Maya and Blender to create 3D characters from scratch. These tools give you the freedom to create your own unique avatars without any constraints. If you are not a professional 3D modeler or rigger, however, it is difficult even to get started, not to mention how time-consuming it is to create a character.

Hyprsense’s Characters built with Maya

There are tools such as VRoid and Live2D that let illustrators create anime-style characters as if they were drawing. The best part of these tools is that you do not need to work on the rigging process. For example, VRoid provides its own pre-rigged base model so that a user can add details by drawing over it.

Tools that are optimized for VTuber: (left) VRoid Studio (right) Live2D

Another way is to use simple customizing applications reminiscent of the dress-up games of our childhood. You can mix and match pre-made options and use the created avatar for games or animation. Unfortunately, these tools are less than ideal for VTubers, since they do not let you export the 3D model and restrict avatar use to their own platforms.

Simple customization tool: (left) Facemoji (right) Vkatsu

As the VTuber trend emerged, anime-style avatar creation tools became more accessible. We expect other types of avatar creation tools to become accessible to the general public soon, as the virtual character trend spreads to other cultures and gains more popularity.

Is a Bodysuit Necessary for the Body Mocap?

Regarding motion capture, you have to make a choice — do you value tracking quality over cost (time, labor, and budget), or do you prefer efficiency over pure performance? High-quality solutions require complicated pre-setup in both hardware and software, often meaning extra bundled equipment. For example, you have to order smart gloves and a bundled plugin to enable finger tracking.

Full-body motion capture setup example

An affordable and easier way is to hold VR positioning trackers instead of wearing a bodysuit. While not as natural as a bodysuit, VR trackers help you animate your hands and feet easily.

Give free tools a chance, too, if your budget is low but you want similar functionality. When live streaming, importing your 3D character into VRChat is a great option, letting you use the virtual space as your studio. You can also control your fingers by pressing preset buttons in the free software Wakaru.
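The preset-button approach can be sketched in a few lines. Below is a minimal, hypothetical example of mapping keys to hand poses — the key bindings and pose values are illustrative, not Wakaru's actual configuration.

```python
# Each preset pose maps finger names to a curl amount in [0, 1]
# (0 = fully extended, 1 = fully curled). Bindings are hypothetical.
HAND_POSES = {
    "1": {"thumb": 0.0, "index": 0.0, "middle": 0.0, "ring": 0.0, "pinky": 0.0},  # open hand
    "2": {"thumb": 1.0, "index": 1.0, "middle": 1.0, "ring": 1.0, "pinky": 1.0},  # fist
    "3": {"thumb": 1.0, "index": 0.0, "middle": 0.0, "ring": 1.0, "pinky": 1.0},  # peace sign
}

def on_key_press(key, rig):
    """Apply the preset pose bound to `key` to the avatar's hand rig."""
    pose = HAND_POSES.get(key)
    if pose is None:
        return False  # unbound key: leave the rig unchanged
    for finger, curl in pose.items():
        rig[finger] = curl
    return True

# Start from an open hand, then switch to the peace-sign preset.
rig = {"thumb": 0.0, "index": 0.0, "middle": 0.0, "ring": 0.0, "pinky": 0.0}
on_key_press("3", rig)
```

The appeal of this design is that a streamer never needs a glove or extra camera — a single key press swaps the whole hand into a recognizable pose.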

(left) Wakaru hand control with buttons (right) VTuber related free software list

For now, a bodysuit means higher-level tracking. However, the future might be heading in the opposite direction. We have been seeing camera-based body tracking demos at computer vision/graphics exhibitions — CVPR, SIGGRAPH, AWE. We believe that vision-based solutions that can track the entire body, including fingers, joints, and facial expressions, will be on the market very soon.

(left) Wrnch’s & (right) Octi’s body tracking demo at AWE

Facial Motion Capture with a Webcam

Facial motion capture is a crucial element for virtual celebrities who livestream. As the market responds to demand, many different solutions are available, from free open-source software to high-end systems bundled with a $10,000 camera. Using sound-based lip-sync software is also an easy way to enable facial animation.

Facial motion capture tools for virtual celebrities

What we discovered through our research is that facial capture equipment is becoming lighter and easier to use. Some solutions still require markers on your face and a helmet with a made-to-order camera to elevate production quality. But the market seems to be moving the other way — supporting regular streamer setups that use 2D webcams. With minimal requirements, these solutions allow anyone to use facial mocap anywhere.

Two examples of 2D webcam-based facial mocap solutions (left) Hitogata (right) Hyprface

Wakaru and Hitogata are powerful examples. These two free programs provide open-source facial tracking that can easily be integrated with a 3D character. Given the nature of open-source algorithms, the tracking quality is not the market's best, but the convenience outweighs the cons, at least for beginning VTubers.

If you are looking for higher-quality paid webcam solutions, there are Hyprface SDK and FaceRig. Hyprface is especially useful for integrating any pre-rigged 3D character model.
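At the core of integrating a tracker with a pre-rigged character is retargeting: converting the per-frame expression coefficients a tracker outputs into the character's own blendshape weights. The sketch below is illustrative only — the coefficient names, blendshape names, and gains are hypothetical, and real SDKs such as Hyprface define their own sets.

```python
# Map tracker coefficient names -> (character blendshape name, gain).
# All names here are hypothetical placeholders for illustration.
RETARGET_MAP = {
    "jawOpen":         ("MouthOpen", 1.0),
    "mouthSmileLeft":  ("Smile_L",   0.8),
    "mouthSmileRight": ("Smile_R",   0.8),
    "eyeBlinkLeft":    ("Blink_L",   1.0),
}

def retarget(tracked, mapping=RETARGET_MAP):
    """Convert per-frame tracked coefficients (0..1) into blendshape weights."""
    weights = {}
    for coeff, value in tracked.items():
        if coeff not in mapping:
            continue  # this character has no counterpart for the coefficient
        target, gain = mapping[coeff]
        # Clamp so an overshooting tracker value never breaks the mesh.
        weights[target] = min(1.0, max(0.0, value * gain))
    return weights

# One frame of tracker output; the overshooting smile clamps to 1.0.
frame = {"jawOpen": 0.45, "mouthSmileLeft": 1.5, "eyeBlinkLeft": 0.0}
print(retarget(frame))  # {'MouthOpen': 0.45, 'Smile_L': 1.0, 'Blink_L': 0.0}
```

The per-coefficient gain is the practical knob here: an exaggerated anime character might want gains above 1.0, while a realistic human model usually wants them dampened.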

What is Holding Back Individuals From Becoming Virtual Celebrities?

Despite all the solutions on the market, it still seems tough for an individual to become a virtual celebrity right now. One major issue is that no single solution covers the entire pipeline.

An Overview of Creating 3D Virtual YouTuber

All the steps for virtual celebrities are fragmented, supported formats/platforms vary, and sometimes bundled hardware/software is needed. This makes integrating different solutions into one pipeline harder than expected.

What should be done for individuals to become virtual celebrities?

  1. We need more avatar creation tools for non-experts that support diverse art styles.
  2. We need an all-in-one mocap solution covering face, body, and fingers using only a webcam. This would dramatically reduce the inconvenience and cost of hardware sensors.
  3. We need closer cooperation among solution providers to make the integration process easier. Each solution should unify or extend its supported platforms, or bundle other solutions together.
  4. We need easier UIs so that non-developers can use these solutions as easily as playing a game.

Providing the Right Solution for Potential Virtual Humans

The four requirements above are also our team’s major considerations for improving our product. At Hyprsense, we aim to provide the easiest facial mocap solution so that everyone can truly reveal themselves as the virtual humans that they have dreamed of.

Our additional guide document for retargeting process

That is why our team strives to make the tool simple and easy to use. As part of this, we started supporting platforms optimized for VTubers. We recently announced that our SDK supports VRM and Mixamo, with MMD and Live2D support on the way. The SDK also provides a guide and internal tools to help our users take full advantage of it. We will keep improving our product and staying involved in the virtual celebrity community by listening closely to their feedback.

We hope this article was helpful for potential virtual celebrities looking for the right tech solutions! If you are also developing solutions for individual virtual celebrities, let's get connected and work together to bring a total solution to the market.

Hyprsense develops real-time human sensing technology. Hyprface is our production-ready software, fully built in-house, that tracks expression values and remaps them onto a 3D character in real time. The SDK supports iOS, Android, Windows, Unity, and Unreal. If you are interested, feel free to ask us for a free 30-day trial SDK.


hyprsense

Hyprsense develops real-time facial expression tracking technology to light up the creation of live animation.