Behind our new integrations with Unreal Engine and Unity

Alex Borghi
Published in WildMeta
Sep 7, 2021 · 3 min read

Earlier this year we unveiled our very first public demonstration, with our bots playing Dota 2 using our reinforcement learning tech stack. Since then, we’ve been busy making our work compatible with technologies commonly used by video game developers. Today we are excited to introduce our new library, written entirely in C++, which enables easy integration with most modern game engines. We have also developed our own integrations for Unity and Unreal Engine, so we can spend more time on what’s specific to each game.

Our bots exploring their world during training with our new integrations in Unreal Engine (left), Unity (centre) and Xonotic/Darkplaces (right). UE and Unity run on Windows, while Xonotic runs on Linux.

To showcase our recent work, we integrated our C++ library, via our C wrapper, into Xonotic, a free and open-source video game built on the Darkplaces engine (derived from Quake), in less than a day and without any prior knowledge of the game’s or engine’s codebase. We also implemented a simple game in Unity and modified the Third Person sample project of Unreal Engine to demonstrate our integrations (see videos above).

When working with studios, one of our goals is to fit seamlessly into their codebase, so we designed a game-developer-friendly API that hides the machine learning details and bridges to our internal machinery. This way we rely on concepts that are well known to game developers, without requiring any machine learning expertise, which helps both ease of adoption and future maintenance.
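As an illustration, a game-facing API in this spirit could look like the sketch below. The names (`botlib`, `Bot`, `Observation`, `Action`, `act`) are our own assumptions for the sake of example, not WildMeta’s actual interface.

```cpp
// Hypothetical sketch of a game-developer-facing bot API.
// All names and signatures are illustrative assumptions.
#include <string>
#include <vector>

namespace botlib {

// Observations are built from concepts a gameplay programmer already has
// (positions, velocities, health, ...); no ML terminology is required.
struct Observation {
    std::vector<float> features;  // handcrafted gameplay features
};

// Actions are expressed in game terms rather than tensors.
struct Action {
    float move_x = 0.0f;
    float move_y = 0.0f;
    int ability = -1;  // -1 means "no ability this tick"
};

class Bot {
public:
    // Load a trained model shipped with the game (e.g. an ONNX file).
    explicit Bot(std::string model_path) : model_path_(std::move(model_path)) {}

    // Called once per game tick: pass the current observation,
    // get back the action to apply to the bot-controlled character.
    Action act(const Observation& obs) {
        // A real implementation would run model inference here;
        // this stub simply stands still.
        (void)obs;
        return Action{};
    }

private:
    std::string model_path_;
};

}  // namespace botlib
```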

For games developed with Unity or Unreal Engine we can use our dedicated integrations for these engines, while our generic C++ version can be used for games built on custom engines. We also provide an additional C wrapper for extra compatibility with languages less commonly used in game development. Our game-side tech stack is designed with portability and performance in mind: the library has only a very small number of dependencies, all open-source, notably for the inference of our ONNX models, and all easily replaceable if necessary.
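A C wrapper along these lines typically exposes the C++ objects behind opaque handles and plain functions. The sketch below is our own illustration of that pattern, building on the hypothetical `botlib::Bot` above; the `botlib_*` names and the `botlib.hpp` header are assumptions, not WildMeta’s actual API.

```cpp
// Illustrative C wrapper around the hypothetical botlib::Bot sketched above.
// A C ABI like this can be consumed from C#, Rust, or any language with a C FFI.
#include "botlib.hpp"  // assumed header containing the C++ API sketch

extern "C" {

// Opaque handle hiding the C++ type from the caller.
typedef struct botlib_bot botlib_bot;

botlib_bot* botlib_bot_create(const char* model_path) {
    return reinterpret_cast<botlib_bot*>(new botlib::Bot(model_path));
}

void botlib_bot_destroy(botlib_bot* bot) {
    delete reinterpret_cast<botlib::Bot*>(bot);
}

// One call per game tick: plain arrays in, plain floats/ints out,
// so no C++ types cross the language boundary.
void botlib_bot_act(botlib_bot* bot,
                    const float* features, int feature_count,
                    float* out_move_x, float* out_move_y, int* out_ability) {
    botlib::Observation obs;
    obs.features.assign(features, features + feature_count);
    botlib::Action a = reinterpret_cast<botlib::Bot*>(bot)->act(obs);
    *out_move_x = a.move_x;
    *out_move_y = a.move_y;
    *out_ability = a.ability;
}

}  // extern "C"
```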

On the machine learning side, we made sure to improve the flexibility of our initial work so that integration is faster and requires as little custom development as possible. We now support more flexible and more advanced observation and action spaces: our bots can take as input any number of handcrafted features and/or images (possibly with different resolutions and numbers of channels) and can output discrete, continuous, discrete-continuous or hierarchically structured actions.
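To make that concrete, a description of such observation and action spaces could be declared roughly as below. The struct and field names are illustrative assumptions, not the actual schema used by our library.

```cpp
// Illustrative sketch of flexible observation and action space descriptions.
// Types and field names are assumptions made for this example.
#include <vector>

// Several image inputs, each with its own resolution and channel count,
// plus an arbitrary number of handcrafted feature values.
struct ImageSpec {
    int width;
    int height;
    int channels;
};

struct ObservationSpace {
    int feature_count;              // handcrafted scalar features
    std::vector<ImageSpec> images;  // zero or more image inputs
};

// Actions can mix discrete choices (e.g. which ability to use) with
// continuous values (e.g. aim direction), and can be nested hierarchically:
// pick a high-level option first, then its parameters.
struct DiscreteAction   { int num_choices; };
struct ContinuousAction { int dimensions; };

struct ActionSpace {
    std::vector<DiscreteAction> discrete;
    std::vector<ContinuousAction> continuous;
    std::vector<ActionSpace> children;  // hierarchical sub-actions
};
```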

In this blog post, we have shown three demonstrations of our integrations with game engines: one in C++ via our C wrapper, and two using well-known game engines, Unity and Unreal Engine.

For our journalist friends: you can find our press release blog post here and the full press kit there.

Want to hear more? Do not hesitate to reach out at contact@wildmeta.com or meet us IRL at Develop: Brighton this October.

Be sure to also follow us on Medium, Twitter or LinkedIn!
WildMeta, AI for video games.

Alex Borghi
CTO at WildMeta | Machine learning research scientist | Ex Graphcore, Imagination Technologies & Feral Interactive