Bringing Deep Learning to Unreal Engine 5 — Pt. 1

Introduction to Neural Network Inference

Weird Frames
6 min read · Aug 23, 2022

by Giuseppe Vecchio

Note: This is Part 1 of a multi-installment series on how to bring deep learning to Unreal Engine 5. It focuses on the underlying technology, with particular attention to the ONNX standard, the NNI plugin for UE 5, and some tips about finding the right pretrained model for your task.

If you’re already familiar with those concepts and want to skip to the parts where we get our hands dirty, jump to Part 2 of this series.

1. Introduction

The last decade has seen artificial intelligence rise and establish itself as a leading technology in many fields, including robotics, computer graphics, and gaming.

AI, or more precisely deep learning in its various forms, has proven extremely good at solving repetitive, pattern-based tasks and has become increasingly present in everyday life. While a Terminator-like scenario is still far-fetched (even if desirable), AI applications are now pervasive in modern technology: the text-to-image generator Midjourney, Photoshop neural filters, Alexa, and many more.

These advancements have drawn the interest of the gaming industry, where deep learning can play a significant role in achieving better image quality (NVIDIA DLSS), developing smarter NPC AI (Sony AI's Sophy), or supporting the creation of assets and resources (DeepMotion).

With the increasing interest in deep learning, more and more platforms have adopted solutions to embed intelligent algorithms. One of the latest additions to the long list of tools supporting neural network inference is Unreal Engine, with its latest release, UE5. This iteration of the famous engine ships with the still-experimental Neural Network Inference (NNI) plugin, which allows importing and running neural networks directly in the engine.

With this article we want to share our knowledge and understanding of this amazing tool, hoping to fill the lack (at the moment of writing) of proper documentation and tutorials.

2. Neural Networks in production using ONNX

First, let’s try to understand the common solution for running neural network inference in production; then, in Section 3, we will see how Unreal makes use of this technology.

While the development and training of neural networks can rely on countless deep learning frameworks (mostly available for Python, e.g. PyTorch, TensorFlow, etc.), releasing neural networks to production has never been as straightforward. Even if Python is great for quick development and experimenting (excellent job there, Python 😎), it is not as good for production, especially when resource usage matters (😞). In some cases you might want to run your network in C#, Java, or C++ to squeeze out the best performance, but here’s the caveat: your framework of choice might not have a proper backend for that language or, even worse, it might be an absolute nightmare to use (believe us, we tried).

To reduce the gap between development and production, in 2017 Facebook and Microsoft released the first version of the Open Neural Network Exchange (ONNX), an open standard designed for machine learning model interoperability. ONNX is now supported by several industry partners, including Microsoft, Facebook, NVIDIA, Intel, and Amazon, and is available for most programming languages and platforms.

But what exactly is ONNX? Citing the website, “ONNX defines a common set of operators — the building blocks of machine learning and deep learning models — and a common file format to enable AI developers to use models with a variety of frameworks, tools, runtimes, and compilers.”
Basically, ONNX provides an intermediate representation for ML models and pipelines that makes it easier to exchange models between different frameworks.

Without diving too deep into ONNX technical details (you won’t need those), what you actually need to know is that UE5 relies on the ONNX Runtime to execute neural networks and can read ONNX models. This means that if you’re training a custom network in, let’s say, PyTorch, you first need to export it as an .onnx model to be able to import it into your Unreal project.
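As a quick sanity check on the format itself, you can open an exported .onnx file with the onnx Python package and verify that the graph is well formed. This is just a minimal sketch; "onnx_model.onnx" is a placeholder for your own exported file:

import onnx

model = onnx.load("onnx_model.onnx")              # parse the ONNX protobuf file
onnx.checker.check_model(model)                   # raises an error if the graph is malformed
print(onnx.helper.printable_graph(model.graph))   # human-readable list of the operators used
print(model.opset_import)                         # opset version(s) the model targets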

3. Neural Network Inference plugin in UE5

Starting from version 5.0, Unreal Engine includes a native plugin, called Neural Network Inference (NNI), designed to evaluate neural networks in real time inside the engine. NNI relies on the ONNX model format and can run any model exported as ONNX from standard ML training frameworks (PyTorch, TensorFlow, MXNet, etc.). This enables users to take deep learning models from anywhere and run them directly in the engine. For the creation of the NNI plugin, Epic Games worked with Microsoft to use their ONNX Runtime project as the core of the plugin’s inference system.

Citing the official documentation, NNI is focused on:

  • Efficiency: Underlying state-of-the-art accelerators (DirectML, AVX, CoreML, etc).
  • Ease-of-use: Simple but powerful API.
  • Completeness: All the functionality of any state-of-the-art deep learning framework.

At the moment of writing, NNI is only available via C++. In Part 2 of this series we will focus on loading a neural network, processing data, and running inference inside the engine. Note that NNI is not meant for training, nor does it support it. Additionally, keep in mind that the current version of the plugin supports CPU inference on PC (Windows/Linux/Mac) and consoles (PS5/Xbox Series X), while GPU evaluation is only supported on Windows with DirectX 12.

4. Finding the right network (or training your own)

Screenshot of Hugging Face landing page.

Unreal Engine’s support for deep learning models is, at the moment, limited to network inference (you can actually train a model inside Unreal using another plugin, but it only works in the Editor and cannot be deployed in a build). For this reason, you first need to train your network, then export it as an .onnx file, then import it into Unreal. Luckily, there is an alternative to training your own network (which generally requires a lot of time and resources): pretrained models are available online for most of the tasks you can think of. Platforms like Hugging Face or model zoos (which we can compare to GitHub for deep learning models) provide thousands of pretrained models grouped by task.

Screenshot of hugging face models page.
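To give a flavor of the pretrained route, here is a small sketch that skips training entirely: it grabs a ready-made classifier from torchvision (assuming a recent PyTorch/torchvision install; ResNet-18 is just an arbitrary example, not the model used later in this series) and converts it straight to ONNX:

# Sketch: export a pretrained torchvision classifier to ONNX without any training.
import torch
from torchvision.models import resnet18, ResNet18_Weights

net = resnet18(weights=ResNet18_Weights.DEFAULT)   # download pretrained ImageNet weights
net.eval()                                         # switch to inference mode
dummy = torch.rand(1, 3, 224, 224)                 # example input with the expected shape
torch.onnx.export(net, dummy, "resnet18.onnx")     # write the ONNX file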

Spoiler: the GitHub repo linked at the end of the article contains the ONNX files for the models we will be using.

Finally, if you’re still not convinced about using pretrained models, or if the task you aim to tackle is so novel that no pretrained network is available, you can still train your own model. Explaining how to train a deep learning network is out of the scope of this article; if you choose this path, we assume that you know what you’re doing.
Still, we want to give you some tips, so here’s our two cents’ worth: most modern deep learning frameworks now support exporting networks to ONNX, so you should be fine with whatever you choose, but we would suggest PyTorch for its wide community, extensive support, and the many official and unofficial implementations available. Additionally, we recommend checking the ONNX documentation, as not all operations are supported yet. That said, exporting a model as ONNX (in PyTorch) is as simple as:

import torch

net = MyAwesomeNetwork()                        # your own trained model class
net.load_state_dict(torch.load('trained.pt'))   # load the trained weights
net.eval()                                      # switch to inference mode
ex_input = torch.rand(1, 3, 244, 244)           # dummy input used to trace the graph
torch.onnx.export(net, ex_input, "onnx_model.onnx")
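Before importing the file into Unreal, it can be worth double-checking that the exported model actually runs outside PyTorch. Here is a minimal sketch using the onnxruntime Python package (the same runtime NNI builds on), assuming the file name and input shape from the snippet above:

import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("onnx_model.onnx")       # load the exported model (CPU by default)
input_name = sess.get_inputs()[0].name               # name of the graph's input tensor
dummy = np.random.rand(1, 3, 244, 244).astype(np.float32)
outputs = sess.run(None, {input_name: dummy})        # None = return all outputs
print(outputs[0].shape)                              # quick shape check of the first output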

5. Conclusion

In conclusion, we should now have a basic understanding of what ONNX is, how Unreal Engine uses it to run inference on neural networks and, last but not least, where to look for pretrained models or, if needed, how to train our own and export it as ONNX.

See you in Part. 2! 😉

P.S.: Oh! And don’t forget to check our other social media (links below) to keep updated. New resources, articles, tips-and-tricks, and tutorials are coming soon. Stay tuned!

Instagram: @weirdframesofficial
YouTube: WeirdFrames
Website: https://www.weirdframes.com

6. Resources

Part 2: Bringing Deep Learning to Unreal Engine 5 — Pt. 2 | by Weird Frames | Sep 2022 | Medium

GitHub repo: https://github.com/WeirdFrames/UE5-NNI-StyleTransfer

Pre-built demo: “Rain Princess” demo · WeirdFrames/UE5-NNI-StyleTransfer

Microsoft’s introduction to NNI: https://www.youtube.com/watch?v=7aJQlOe1QTA
