Stories in code

I’d Like To Introduce My Metahuman

The Metaverse will be inhabited by metahumans. Don’t worry. They’re mostly friendly.

F Bavinton
Predict

--

Headshot of metahuman Fin with animation face rig next to her

When asked about metahumans, Lex Luthor exclaimed:

Yes, the Metahuman Thesis! More likely than not, these exceptional beings live among us, the basis of our myths. Gods among men upon our, our little blue planet here! (DC Comics Wiki, 2021)

These are not the metahumans this article will be discussing. Nor are they Deepak Chopra’s Metahumans, who are supposed to go beyond human constructs and allow us to get in touch with our innate beings (Gutterman, 2019).

This is not to suggest that metahumans don't have the potential for maniacal world domination or for helping us achieve inner enlightenment, but at this early stage in their careers, it's probably expecting too much.

The metahumans that are the subject of this story are the inhabitants of the projected new immersive Internet called the Metaverse (Bavinton, 2021).

In April 2021, Epic Games released early access to MetaHuman Creator, a cloud-based app for creating real-time, photo-realistic, fully rigged digital humans (Epic Games, 2021). This coincided with the early access release of Nvidia's Omniverse, an open platform for virtual collaboration and real-time, physically accurate simulation (Omniverse, 2021). The Omniverse not only provides a space for collaboration; it is also a platform for creating and using the tooling needed to make immersive worlds and characters. One such tool is Audio2Face, an application that generates facial animation from an audio file (Audio2Face, 2021).
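
To make that idea concrete: Audio2Face itself drives a full 3D face with a deep neural network, and its actual interface is beyond the scope of this article, but the underlying idea of turning sound into per-frame animation values can be sketched in a few lines. The following toy Python example, in which the file name fin_dialogue.wav is hypothetical and a 16-bit mono WAV is assumed, simply maps an audio file's loudness envelope to a jaw-open blendshape weight for each animation frame.

```python
import wave

import numpy as np


def jaw_open_curve(wav_path: str, fps: int = 30) -> np.ndarray:
    """Map a mono 16-bit WAV's loudness envelope to per-frame
    jaw-open blendshape weights in the range [0, 1]."""
    with wave.open(wav_path, "rb") as wav:
        rate = wav.getframerate()
        samples = np.frombuffer(
            wav.readframes(wav.getnframes()), dtype=np.int16
        ).astype(np.float32)

    # Chop the samples into one chunk per animation frame.
    per_frame = rate // fps
    n_frames = len(samples) // per_frame
    chunks = samples[: n_frames * per_frame].reshape(n_frames, per_frame)

    # Root-mean-square loudness per frame, normalised so the
    # loudest moment fully opens the jaw.
    rms = np.sqrt((chunks ** 2).mean(axis=1))
    return rms / rms.max() if rms.max() > 0 else rms


weights = jaw_open_curve("fin_dialogue.wav")  # hypothetical audio file
print(f"{len(weights)} frames, peak weight {weights.max():.2f}")
```

A real system predicts dozens of correlated facial controls from the phonetic content of the audio, not just its volume, but the overall shape is the same: audio in, a weight per control per frame out.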

The Omniverse is touted as the machine shop for the Metaverse, as it sits atop Universal Scene Description (USD), a framework created by Pixar for the interchange of 3D computer graphics data (Pixar, 2021). This gives the different technology vendors a mechanism that allows their applications to talk and exchange data with other applications in a collaborative immersive workspace. It allows Unreal Engine developers and Unity developers, for example, to work on the same scene in real time even though they are using incompatible frameworks. MetaHumans are USD-compatible.
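
For the curious, USD is scriptable from Python, and creating a shared scene file takes only a few lines. Here is a minimal sketch using Pixar's published Python bindings; the file and prim names are made up for illustration, and the sphere is just a stand-in for geometry a character pipeline would supply.

```python
# Requires Pixar's USD Python bindings (e.g. pip install usd-core).
from pxr import Usd, UsdGeom

# Create a new USD stage: the shared scene description that different
# applications (Unreal Engine, Unity, Omniverse apps) can read and write.
stage = Usd.Stage.CreateNew("fin_scene.usda")

# Define a transform as the character's root, plus placeholder geometry.
root = UsdGeom.Xform.Define(stage, "/Fin")
head = UsdGeom.Sphere.Define(stage, "/Fin/HeadPlaceholder")
head.GetRadiusAttr().Set(12.0)

stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()
```

The resulting fin_scene.usda is a plain-text scene file that any USD-aware application can open, layer over, and edit collaboratively, which is precisely the interchange role USD plays in the Omniverse.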

As mentioned earlier, however, these technologies are still in early release, but as they emerge and evolve, so do the storytelling possibilities. It feels a little like what I imagine the pioneers of early cinema experienced when the first movie cameras appeared. We are still in the early stages of this journey, and metahumans are one of its exciting potentials.

Metahumans can take on many forms: a scripted animation driven by a computer; an avatar puppeted by a person; or a digital agent driven by AI. In turn, each of these presents potential for novel engagement with audiences. It also means that, as story creators and tellers, we need to develop a more sophisticated understanding of the emerging impacts metahumans will have on our relationship with audiences. This includes questions such as whether the origin of a metahuman's movement or speech affects our interaction with the digital humans we encounter. This will become more pressing as metahumans become increasingly hard to distinguish from real people.

Such questions may be getting a little too far ahead, particularly if you have not yet met a metahuman. In case you haven’t, let me introduce you to our metahuman, Fin.

Meet Fin, our metahuman. Here we run her through a range of actions from dancing to reciting poetry.

Part of Fin was created with Epic's MetaHuman Creator, but that was just the beginning of her evolution. Fin and other metahumans are helping us understand how to develop believable characters for 3D environments. It is not just a question of achieving a 'photo-real' face and appearance. The way a character moves, together with their speech, expressions, and body animation, is immensely important. The animation required to achieve kinematic photo-realism is computationally expensive and demands a significant amount of skill (and patience).
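
To give a feel for one small corner of that complexity, consider the forward and inverse kinematics referenced at the end of this article. Posing even a two-joint limb involves either forward kinematics (angles in, position out) or the harder inverse problem (position in, angles out), and a full body rig multiplies this across scores of joints. Here is a minimal Python sketch for a two-link planar arm; the unit link lengths are an assumption made purely for illustration.

```python
import math

L1, L2 = 1.0, 1.0  # upper-arm and forearm lengths (illustrative)


def forward(theta1: float, theta2: float) -> tuple[float, float]:
    """Forward kinematics: joint angles to wrist position."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y


def inverse(x: float, y: float) -> tuple[float, float]:
    """Analytic inverse kinematics (elbow-down solution)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(
        L2 * math.sin(theta2), L1 + L2 * math.cos(theta2))
    return theta1, theta2


# Round trip: pose the arm, then recover the pose from the wrist position.
wrist = forward(0.6, 0.8)
print(inverse(*wrist))  # approximately (0.6, 0.8)
```

Animators rarely solve these equations by hand, of course; the point is that every extra joint adds equations like these, and a believable full-body performance has to solve them consistently at every frame.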

Fin is also helping us address a range of emerging questions and concerns about the future use of this type of technology in storytelling. These include the increased use of digital doubles in film and the rise of what could be termed necromantic cinema, where we use digital doubles to resurrect dead actors.

Digital doubles have been used in film and entertainment for some time, notably in franchises to, for example, de-age the original actor. This was a feature of Tron: Legacy (2010), in which Jeff Bridges, who played Flynn in the first movie released in 1982, has a digital double depicting him as he appeared then. Most recently we have witnessed the virtual reinvigoration of ABBA (ABBA, 2021).

In 2012, audiences attending a Snoop Dogg and Dr. Dre set at Coachella were surprised when the late rapper Tupac Shakur appeared on stage in holographic form and performed a duet (SnoopDoggTV, 2012).

In Rogue One: A Star Wars Story (2016), the actor Peter Cushing reprised his role as Grand Moff Tarkin. The thing is, Peter Cushing died in 1994. The 2016 Grand Moff Tarkin was a digital double.

With the death of Carrie Fisher, loved for her role as Princess Leia, a debate emerged about the ethics of using digital doubles, particularly of deceased actors. Notably, it highlights that characters in well-loved stories tend to outlive the actors who played them. Digital doubles are clearly an option for keeping those characters alive and working.

But what about a digital double of the actor themselves, created so that they can keep working? Recently it was announced that 1950s heartthrob James Dean is being digitally resurrected to star in a new film. Describing the James Dean project, Travis Cloyd, CEO of Worldwide XR, who is leading the design, stated: “Our focus is on building the ultimate James Dean so he can live across any medium” (Dalton and Kemp, 2021).

Apart from broader questions about whether we should be resurrecting deceased actors for our entertainment, there are also practical questions. Who owns their likeness? What types of roles are appropriate (digital doubles in sex scenes, for example)? Who will puppet and voice the double? How will the double, the motion and voice actors, or the AI be credited, and who gets paid?

Fin is also helping us navigate this era of so-called fake news, in which we are troubled by questions about determining truth and trust. While the popular media scares us with deep fakes and rampant AI, more considered discussions are taking place among scholars and artists about shifting notions of realism in film, VR and, notably, documentary. Is photo-realistic film more truthful than animation? How and why have we come to trust images produced by cameras over other forms of image making? What does this tell us about the types of truths we are seeking? Fin is made possible by the same types of technologies that produce deep fakes. She will hopefully help us learn to understand them and contribute to their future development.

Without wishing to overburden her, Fin is also helping us better understand usability and audience engagement. This includes trying new things with story form (watch this space) and tackling issues such as the uncanny valley, a hypothesis that is still poorly understood in this space.

The concept of the uncanny valley was proposed by robotics professor Masahiro Mori in 1970. His original hypothesis was that as the appearance of a robot was made to look more human, some observers' emotional response to it would become increasingly positive and empathetic, but only up to a certain point. As the robot approached full humanness, responses turned from positive reception to strong revulsion (Mori et al., 2012).

While this hypothesis might hold for physical robots, how does it apply, if at all, to metahumans? Revulsion and disgust are complex. As William Miller discusses in The Anatomy of Disgust, disgust brings order and meaning to our lives even as it horrifies and revolts us (Miller, 1998). There is a profound connection between enculturation and our visceral responses.

That is not to dismiss the idea that CGI characters can produce negative feelings. The movie Cats (2019) showed that we are capable of having very strong reactions to created characters (Tayag, 2019). But we also need to be mindful of the difference between liking and disliking (a preference) and revulsion and disgust (potentially an enculturated unconscious bias). Fin will help us explore these dimensions.

We hope you enjoy meeting Fin. If you would like to ask her a question relating to the issues raised above, please add it to the comments section. You never know, she might answer.

You can catch up with Fin on Instagram @finsplace and on Twitter @FinRenaissance.

Stories in code is made for sharing. If you enjoyed this article, please show your support by giving it a clap, following me, and sharing it with others who you think might be interested.

References

ABBA (2021). ABBA Voyage Official Website — 2022 ABBA Concert in London [online]. Available from: https://abbavoyage.com/ [Accessed 26 November 2021].

Audio2Face (2021). Omniverse Audio2Face App [online]. Available from: https://www.nvidia.com/en-gb/omniverse/apps/audio2face/ [Accessed 22 November 2021].

Bailey, S.W. (2020). Applications of Machine Learning for Character Animation. UC Berkeley [online]. Available from: https://escholarship.org/uc/item/6zm942z6 [Accessed 22 November 2021].

Bavinton, F. (2021). “The Metaverse Is Coming!”, Cried the Straight White Billionaire Dude. Medium [online]. Available from: https://fbavinton.medium.com/the-metaverse-is-coming-cried-the-straight-white-billionaire-dude-5cba2238d6a5 [Accessed 25 November 2021].

Dalton, A. and Kemp, M. (2021). James Dean revival spurs debate on raising the digital dead [online]. Available from: https://apnews.com/article/careers-james-dean-us-news-ap-top-news-movies-f9786493b3d029dc18be88025b51298c [Accessed 26 November 2021].

DC Comics Wiki (2021). Metahumans [online]. Available from: https://dcextendeduniverse.fandom.com/wiki/Metahumans [Accessed 25 November 2021].

Epic Games (2021). Digital Humans | MetaHuman Creator [online]. Available from: https://www.unrealengine.com/en-US/digital-humans [Accessed 22 November 2021].

Failes, I. (2020). Real-Time CG Humans Hit the Big-Time. VFX Voice Magazine [online]. Available from: https://www.vfxvoice.com/real-time-cg-humans-hit-the-big-time/ [Accessed 21 November 2021].

Forward kinematics (2021). Wikipedia [online]. Available from: https://en.wikipedia.org/w/index.php?title=Forward_kinematics&oldid=1054686237 [Accessed 22 November 2021].

Gutterman, A. (2019). Deepak Chopra Wants You to Have a More Meaningful Life [online]. Available from: https://time.com/5706092/deepak-chopra-metahuman/ [Accessed 25 November 2021].

Inverse kinematics (2021). Wikipedia [online]. Available from: https://en.wikipedia.org/w/index.php?title=Inverse_kinematics&oldid=1056373182 [Accessed 22 November 2021].

Miller, W.I. (1998). The Anatomy of Disgust. Harvard University Press.

Mori, M., MacDorman, K. and Kageki, N. (2012). The Uncanny Valley [From the Field]. IEEE Robotics & Automation Magazine. Vol. 19. pp. 98–100 [online]. https://doi.org/10.1109/MRA.2012.2192811.

Motion capture (2021). Wikipedia [online]. Available from: https://en.wikipedia.org/w/index.php?title=Motion_capture&oldid=1056363385 [Accessed 22 November 2021].

Omniverse (2021). Omniverse Platform for Virtual Collaboration [online]. Available from: https://www.nvidia.com/en-gb/omniverse/ [Accessed 22 November 2021].

Pixar (2021). Introduction to USD [online]. Available from: https://graphics.pixar.com/usd/docs/index.html [Accessed 15 April 2021].

Procedural animation (2021). Wikipedia [online]. Available from: https://en.wikipedia.org/w/index.php?title=Procedural_animation&oldid=1014650463 [Accessed 22 November 2021].

SnoopDoggTV (2012). Tupac Hologram Snoop Dogg and Dr. Dre Perform Coachella Live 2012 [Film].

Tayag, Y. (2019). ‘Cats’ Trailer: Uncanny Valley Researchers Explain Why It’s So Creepy [online]. Available from: https://www.inverse.com/article/57870-cats-trailer-filmed-in-the-uncanny-valley [Accessed 26 November 2021].

--

F Bavinton
Predict

Storyteller and technologist. Revelling in the heady mix of algorithms, film and game engines. I love telling stories with and about code.