Bringing Deep Learning to UE5 — Pt. 2

Real-time style transfer with NNI

Weird Frames
10 min read · Sep 25, 2022

by Giuseppe Vecchio

Note: This is Part 2 of a multi-installment series on how to bring deep learning to Unreal Engine 5. It goes through the steps to create a new Unreal project and use the NNI plugin to run network inference in real time for style transfer on game frames.

This chapter will show the key steps to implement real-time style transfer using deep learning. Some steps and implementation details are specific to this task, but the main goal is to give an overview of what you need to use the NNI plugin effectively. We highly recommend you try to understand what we're actually doing, rather than just copy and paste the code. Each step is documented as thoroughly as possible, but we also tried not to make this a two-hour read. If at some point you feel lost or unsure about what we're doing, always refer to the code.

Final disclaimer: this chapter is intended for people with a decent understanding of C++ and Unreal Engine. The basics won't be covered, and we'll take for granted that you have some experience with Unreal.

Good luck!

Table of contents:

  1. Project setup
  2. Neural Network class
  3. View extension for real time style transfer
  4. Blueprint function library and view manager
  5. Final comments

1. Project setup

The first step in our journey to bring deep learning into Unreal Engine is to set up the project. For this tutorial we will start from the simple Third Person template. Make sure to create it as a C++ project.

Now let's get the boring stuff out of the way. Once the project is created, head to the Plugins window and enable the NeuralNetworkInference and OpenCV plugins, then restart the editor.

The last step in the project configuration is to add the required module dependencies for rendering and OpenCV. Head to the ProjectName.Build.cs file in your Source folder (in my case the file is named NNIStyleTransferTest.Build.cs); the dependencies we need to add are Renderer, RenderCore, RHI, RHICore, D3D12RHI, OpenCV, and OpenCVHelper.

Additionally, we need to add the include path for the Renderer headers, which is located under EngineDirectory/Source/Runtime/Renderer/Private.

Now your Build.cs file should look like this:

Source code for Build.cs
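For reference, here is a minimal sketch of what the file might contain (Build.cs files are C#; the NeuralNetworkInference dependency and the engine-path lookup are our assumptions, so adapt them to your setup):

using System.IO;
using UnrealBuildTool;

public class NNIStyleTransferTest : ModuleRules
{
    public NNIStyleTransferTest(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        PublicDependencyModuleNames.AddRange(new string[] {
            "Core", "CoreUObject", "Engine", "InputCore",
            "NeuralNetworkInference"
        });

        // Modules required by the view extension and the image processing.
        PrivateDependencyModuleNames.AddRange(new string[] {
            "Renderer", "RenderCore", "RHI", "RHICore", "D3D12RHI",
            "OpenCV", "OpenCVHelper"
        });

        // The view extension needs the Renderer's private headers.
        string EnginePath = Path.GetFullPath(Target.RelativeEnginePath);
        PrivateIncludePaths.Add(EnginePath + "Source/Runtime/Renderer/Private");
    }
}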

1.1 Importing ONNX models into Unreal

With the plugins active, it is now time to import the ONNX models into the project. This operation couldn't be easier: just drag and drop the file into the content browser and you're done!
If you want to use one of our models you can download it at this link: https://tinyurl.com/rainprincess-onnx.

ONNX models are imported as Neural Network assets. Opening one, we find several parameters controlling the device type for input and output data, as well as the device for the network itself, which can be either CPU or GPU. We can additionally choose whether the network should run synchronously (blocking the game until execution is completed) or asynchronously. Finally, if the execution is asynchronous, we can set whether the callback delegate should be called by the game thread or by any thread.
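The same options can also be set from C++ through the UNeuralNetwork API. A hedged sketch (we believe these are the NNI setter names, but double-check them against your engine version):

// Configure an imported network from C++ (sketch; verify the setter
// names against the NNI plugin in your engine version).
Network->SetDeviceType(ENeuralDeviceType::GPU);                    // run the model on GPU
Network->SetSynchronousMode(ENeuralSynchronousMode::Asynchronous); // don't block the game
Network->SetThreadModeDelegateForAsyncRunCompleted(ENeuralThreadMode::GameThread); // callback thread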

The screenshot below shows the window displayed when opening a Neural Network asset.

With the project fully set up it is now time to dive deeper into the interesting part!

2. The Neural Network class

The first step is to create a C++ class for handling the neural network and executing the inference step. Since we are implementing real-time style transfer, let's call the new class STNeuralNetwork and put it inside a StyleTransfer folder to keep everything tidy and clean.

Creation of the STNeuralNetwork C++ class

The USTNeuralNetwork class should have a Network attribute (to store the reference to the ONNX network) and a RunModel method, as well as a constructor. The RunModel method should receive as input an image, represented as an array of floats, and should output another image, this time represented as an array of 8-bit color values. Note that this input and output configuration is specific to this particular model.

The STNeuralNetwork.h file should look like:

Source code for STNeuralNetwork.h
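As a quick reference, here is a minimal sketch of the header matching the description above (the UPROPERTY specifiers and the API macro are our assumptions):

#pragma once

#include "CoreMinimal.h"
#include "NeuralNetwork.h"
#include "STNeuralNetwork.generated.h"

UCLASS()
class NNISTYLETRANSFERTEST_API USTNeuralNetwork : public UObject
{
    GENERATED_BODY()

public:
    USTNeuralNetwork();

    // Runs inference on a flattened float image and fills the output
    // with displayable 8-bit color values.
    void RunModel(TArray<float>& Image, TArray<uint8>& Results);

    // Reference to the imported ONNX network, set from Blueprints.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Style Transfer")
    UNeuralNetwork* Network;
};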

In the STNeuralNetwork.cpp file we implement the class methods. The constructor initializes the Network attribute to nullptr (we will set it later from Blueprints).

The RunModel method contains the logic to run our network and perform the style transfer. The steps it performs are the following:

  • Check if the network is set and loaded.
  • Set the network input.
  • Run the network.
  • Get the output tensor from the network.
  • Convert the network output (a float tensor) to an array of colors that can be displayed.

To perform the last step, we define a new function called FloatToColor. This function receives each float value from the output and converts it into a color channel, represented as a uint8, by clamping it between 0 and 255.
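A minimal sketch of FloatToColor, under the clamping behavior described above:

// Clamp a raw network output value into the displayable 0-255 range.
uint8 FloatToColor(float Value)
{
    return (uint8)FMath::Clamp<int32>(FMath::RoundToInt(Value), 0, 255);
}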

Note that at the end of the RunModel method we build the output image by manually setting the pixels' values in a for loop. To optimize execution, the loop is implemented using Unreal Engine's ParallelFor functionality. It is equivalent to:

for (size_t i = 0; i < channelStride; i++) {
    results.Add(FloatToColor(OutputTensor[channelStride * 2 + i]));
    results.Add(FloatToColor(OutputTensor[channelStride + i]));
    results.Add(FloatToColor(OutputTensor[i]));
}
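And here is a sketch of the equivalent ParallelFor version (preallocating the output so that iterations can write concurrently is our adaptation):

// Parallel version of the loop above: preallocate the output, then let
// each iteration fill its own pixel slot (same channel ordering as above).
results.SetNumUninitialized(channelStride * 3);
ParallelFor(channelStride, [&](int32 i) {
    results[i * 3]     = FloatToColor(OutputTensor[channelStride * 2 + i]);
    results[i * 3 + 1] = FloatToColor(OutputTensor[channelStride + i]);
    results[i * 3 + 2] = FloatToColor(OutputTensor[i]);
});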

Throughout the entire project, most of the sequential for loops have been replaced, where possible, with parallel loops, gaining several FPS over the non-parallel implementation.

Below you will find the full code for STNeuralNetwork.cpp:

Source code for STNeuralNetwork.cpp
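As a reference, here is a condensed sketch of what the file might contain. The NNI calls (IsLoaded, SetInputFromArrayCopy, Run, GetOutputTensor, FNeuralTensor::At) are from the plugin's UNeuralNetwork API as we know it; verify them against your engine version, and see the repo for the full file:

#include "StyleTransfer/STNeuralNetwork.h"
#include "Async/ParallelFor.h"

// FloatToColor as defined earlier in this article.
static uint8 FloatToColor(float Value)
{
    return (uint8)FMath::Clamp<int32>(FMath::RoundToInt(Value), 0, 255);
}

USTNeuralNetwork::USTNeuralNetwork()
{
    Network = nullptr; // Set later from Blueprints.
}

void USTNeuralNetwork::RunModel(TArray<float>& Image, TArray<uint8>& Results)
{
    // 1. Check that the network is set and loaded.
    if (Network == nullptr || !Network->IsLoaded())
    {
        return;
    }

    // 2. Set the network input.
    Network->SetInputFromArrayCopy(Image);

    // 3. Run the network.
    Network->Run();

    // 4. Get the output tensor.
    const FNeuralTensor& OutputTensor = Network->GetOutputTensor();
    const int32 channelStride = OutputTensor.Num() / 3;

    // 5. Convert the float output to colors with the parallel loop
    //    shown earlier.
    Results.SetNumUninitialized(channelStride * 3);
    ParallelFor(channelStride, [&](int32 i) {
        Results[i * 3]     = FloatToColor(OutputTensor.At<float>(channelStride * 2 + i));
        Results[i * 3 + 1] = FloatToColor(OutputTensor.At<float>(channelStride + i));
        Results[i * 3 + 2] = FloatToColor(OutputTensor.At<float>(i));
    });
}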

3. View extension for real time style transfer

To be able to read the screen pixels and replace what is displayed with the network's prediction, we need to create a custom scene view extension. Be warned: this is where things get a little more complicated.

First, we need to create a new C++ class. The class cannot be created (for some mysterious reason) from the Unreal Editor, so we will need to create it from Visual Studio.

To do so, select Add Class from the Project menu, or right-click on the project folder in the Solution Explorer. In the Add Class window, input the class name and the base class, and make sure that both the .h and .cpp files end up in the correct folder.

In our case we want to create a class called FRealtimeStyleTransferViewExtension (I know, it’s a long name). This class should inherit from the FSceneViewExtensionBase class.

The new class should override a couple of methods from the ISceneViewExtension interface. Have a look at the code under the //~ ISceneViewExtension interface comment to see which methods we are overriding.

Additionally, we need to define a public method to set the style by passing the chosen neural network.

The class attributes include: a parameter to control whether the extension is active; a neural network object; the raw captured image; input and stylized images stored on the CPU; model input and output images; and the model input width and height.

Internal operations, like image transfer from and to GPU, image resize, and style application are performed through a set of private methods.

In the FRealtimeStyleTransferViewExtension.h file include the previously created STNeuralNetwork header as:

#include "StyleTransfer/STNeuralNetwork.h"

The implementation of FRealtimeStyleTransferViewExtension is quite long and complex. Including the whole file would be quite sterile and make the article much more tedious to read. You can have a look at the full code in the repo or at the link below. In this paragraph we will focus on some relevant implementation details of the view extension.

Link to the code: RealtimeStyleTransferViewExtension.cpp

To enable or disable the custom view, we listen for a console variable, namely r.RealtimeStyleTransfer.Enable, which will be set to 1 or 0 (enabled or disabled, respectively) from Blueprints. Have a look at the docs to better understand how this works: Console Variables in C++ | Unreal Engine.

namespace RealtimeStyleTransfer {
    static int32 IsActive = 0;
    static FAutoConsoleVariableRef CVarStyleTransferIsActive(
        TEXT("r.RealtimeStyleTransfer.Enable"),
        IsActive,
        TEXT("Additional rendering to apply a neural style.\n")
        TEXT("=0:off (default), >0: on"),
        ECVF_Cheat | ECVF_RenderThreadSafe);
}

To set the neural network to use, and consequently the style, we implement a SetStyle method. It receives a pointer to a UNeuralNetwork as input, sets the device type to GPU and stores the reference to the network in the myNetwork object.

void FRealtimeStyleTransferViewExtension::SetStyle(UNeuralNetwork* Model)
{
    myNetwork = NewObject<USTNeuralNetwork>();
    Model->SetDeviceType(ENeuralDeviceType::GPU);
    myNetwork->Network = Model;
}

The core of the style transfer is the AddStylePass_RenderThread method, where the screen pixels are read and provided to the network to apply the style.

In order, the methods called inside AddStylePass_RenderThread are (a condensed sketch follows the list):

  • CopyTextureFromGPUToCPU, which reads the screen pixels and copies them from the GPU to the CPU;
  • ResizeScreenImageToMatchModel, which, intuitively, resizes the screen view to match the network's input size;
  • ApplyStyle, which runs the network over the captured frame;
  • ResizeModelImageToMatchScreen and CopyTextureFromCPUToGPU, which do the inverse of the first two steps.
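A condensed sketch of the method body, reflecting that order (the exact signature is our assumption, and the real method also validates the network and the texture; see the linked source):

void FRealtimeStyleTransferViewExtension::AddStylePass_RenderThread(
    FRHICommandListImmediate& RHICmdList, FRHITexture2D* Texture)
{
    // Early-out if style transfer is disabled via the console variable.
    if (!RealtimeStyleTransfer::IsActive)
    {
        return;
    }

    CopyTextureFromGPUToCPU(RHICmdList, Texture);   // screen -> CPU
    ResizeScreenImageToMatchModel();                // fit the model input size
    ApplyStyle();                                   // run the network
    ResizeModelImageToMatchScreen();                // back to screen resolution
    CopyTextureFromCPUToGPU(RHICmdList, Texture);   // CPU -> screen
}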

At the time of writing, CopyTextureFromGPUToCPU constitutes the bottleneck, leading to low FPS, especially at high resolutions. An update will be posted in case we find a better solution.
The implementation of CopyTextureFromGPUToCPU is shown below. Surface data is read as an array of FColor using the ReadSurfaceData function (which is very slow… 🥱), then each pixel is copied into an array of uint8 using a ParallelFor to achieve better performance.

void FRealtimeStyleTransferViewExtension::CopyTextureFromGPUToCPU(
    FRHICommandListImmediate& RHICmdList, FRHITexture2D* Texture)
{
    const int PixelCount = Width * Height;

    RHICmdList.ReadSurfaceData(
        Texture,
        FIntRect(0, 0, Width, Height),
        RawImage,
        FReadSurfaceDataFlags(RCM_UNorm, CubeFace_MAX));

    InputImageCPU.Reset();
    InputImageCPU.SetNumZeroed(PixelCount * 3);

    ParallelFor(RawImage.Num(), [&](int32 Idx) {
        const int i = Idx * 3;
        const FColor& Pixel = RawImage[Idx];
        InputImageCPU[i] = Pixel.R;
        InputImageCPU[i + 1] = Pixel.G;
        InputImageCPU[i + 2] = Pixel.B;
    });
}

Implementation of the other methods should be quite straightforward. Have a look at the source code for a better insight.
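As one example, here is a possible sketch of ResizeScreenImageToMatchModel built on the OpenCV plugin (the float conversion and the channel layout are assumptions about the model's input format):

#include "PreOpenCVHeaders.h"
#include <opencv2/imgproc.hpp>
#include "PostOpenCVHeaders.h"

void FRealtimeStyleTransferViewExtension::ResizeScreenImageToMatchModel()
{
    // Wrap the CPU-side pixels in a cv::Mat without copying.
    cv::Mat InputImage(Height, Width, CV_8UC3, InputImageCPU.GetData());

    // Resize to the model's expected input dimensions.
    cv::Mat ResizedImage;
    cv::resize(InputImage, ResizedImage, cv::Size(ModelInputWidth, ModelInputHeight));

    // Convert the resized uint8 pixels to the float array fed to the
    // network (the channel layout expected by the model may differ).
    const int32 ValueCount = ModelInputWidth * ModelInputHeight * 3;
    ModelInputImage.SetNumUninitialized(ValueCount);
    const uint8* Data = ResizedImage.data;
    ParallelFor(ValueCount, [&](int32 Idx) {
        ModelInputImage[Idx] = (float)Data[Idx];
    });
}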

3.1. Registering the view extension

Once the custom view extension class is created, we need to instantiate it in our project. To do so we need to edit the ProjectName.h and ProjectName.cpp files. If you followed the previous steps, those two files are called NNIStyleTransferTest.h and NNIStyleTransferTest.cpp.

In NNIStyleTransferTest.h we store a reference to the view extension by adding a protected shared pointer to a FRealtimeStyleTransferViewExtension object. In NNIStyleTransferTest.cpp we initialize the object. The two files will look like this:
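A minimal sketch of the two files (deferring creation to OnPostEngineInit is our choice, to make sure the engine is ready before registering the extension):

// NNIStyleTransferTest.h
#pragma once

#include "CoreMinimal.h"
#include "Modules/ModuleManager.h"
#include "StyleTransfer/RealtimeStyleTransferViewExtension.h"

class FNNIStyleTransferTestModule : public FDefaultGameModuleImpl
{
public:
    virtual void StartupModule() override;

protected:
    // Keeps the view extension alive for the lifetime of the module.
    TSharedPtr<FRealtimeStyleTransferViewExtension, ESPMode::ThreadSafe> ViewExtension;
};

// NNIStyleTransferTest.cpp
#include "NNIStyleTransferTest.h"
#include "Misc/CoreDelegates.h"

void FNNIStyleTransferTestModule::StartupModule()
{
    // Create the extension once the engine is fully initialized.
    FCoreDelegates::OnPostEngineInit.AddLambda([this]() {
        ViewExtension = FSceneViewExtensions::NewExtension<FRealtimeStyleTransferViewExtension>();
    });
}

IMPLEMENT_PRIMARY_GAME_MODULE(FNNIStyleTransferTestModule, NNIStyleTransferTest, "NNIStyleTransferTest");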

4. Blueprint function library and view manager

4.1 STFunctionLibrary

Finally, the last piece of C++ we need is a function exposed to Blueprints to set the desired network. To do so, we create a Blueprint Function Library called STFunctionLibrary (what an original name! 😂), which contains the method to set the style. This method receives as input a reference to a UNeuralNetwork. By changing the neural network, we will be able to change the style!

First, in the STFunctionLibrary.cpp file we need to include the previously created view extension. The “include” line should look something like:

#include "StyleTransfer/RealtimeStyleTransferViewExtension.h"

The implementation of the SetStyle function is quite straightforward. We add the passed Model object to the root set (so it won't be garbage collected), then we call the SetStyle function of the view extension. The .cpp file should look something like this:
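A sketch of the implementation (how the library reaches the view extension instance is an assumption here; GetInstance is a hypothetical static accessor, so use whatever mechanism your project exposes, e.g. going through the game module):

#include "StyleTransfer/STFunctionLibrary.h"
#include "StyleTransfer/RealtimeStyleTransferViewExtension.h"

void USTFunctionLibrary::SetStyle(UNeuralNetwork* Model)
{
    if (Model == nullptr)
    {
        return;
    }

    // Prevent the network from being garbage collected.
    Model->AddToRoot();

    // Forward the model to the view extension. GetInstance() is a
    // hypothetical accessor; adapt it to how your project exposes
    // the extension.
    if (auto Extension = FRealtimeStyleTransferViewExtension::GetInstance())
    {
        Extension->SetStyle(Model);
    }
}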

This was the last bit of C++ code we needed. Now it’s time to jump to the unreal editor and make the magic happen! 🪄

4.2 BP_StyleTransferManager

In the editor we need to create a single Blueprint actor called BP_StyleTransferManager, which will be responsible for setting the style and enabling the view extension. The new actor should contain an instance-editable NeuralNetwork variable (just enable the eye icon near the variable name in the Blueprint editor 👁️).

Now we need to implement the logic to enable/disable the style transfer. The Blueprint logic is quite simple: 1) we check that the neural network reference is valid; 2) we enable the style transfer by executing a console command (setting r.RealtimeStyleTransfer.Enable to 1); 3) finally, we set the style by calling our function from the function library.

The blueprint implementation is shown below.

And it’s done! 🎆
Now you simply need to place a BP_StyleTransferManager in the scene and set its Neural Network variable.

5. Final comments

If you’re still alive at this point, you truly deserve a medal!

Jokes aside, in this article we discovered how to use the NNI plugin to run a real-time style transfer network. This implementation is far from perfect, and anyone is welcome to improve it, but it serves as a good showcase of what can be achieved with this incredible plugin.

Now it's your turn to create something awesome using neural networks in Unreal Engine.

Finally, we want you to know that this episode is not the end of this series. Future installments will dive deeper into less explored areas, like the view extension and the use of OpenCV. If you have questions, suggestions, or something specific that you think we should cover let us know.

Remember to follow us on Instagram and YouTube to be the first to get the latest updates.

See you in Part 3!

Instagram: @weirdframesofficial
YouTube: WeirdFrames
Website: https://www.weirdframes.com
