How AI Beats Physics at its Own Game

Kaedim · 5 min read · Aug 3, 2020

Have a look around you and you’ll see the laws of physics busy at work.

If you drop your phone, you’ll see it fall towards the ground, accelerating as it goes, before coming to a sharp halt as it hits the floor.

As we go about our day-to-day lives, we are constantly witnessing how objects interact according to these rules.

It follows that, to create a virtual game world that feels realistic to the player, the way virtual objects interact should replicate how they interact in real life.

This is no easy feat; huge amounts of time and money are spent creating accurate physics engines for this task.

From the rudimentary calculations of Atari’s Pong, which were limited to determining the direction at which the projectile bounces given its incoming direction, to the way your character’s cape waves in the wind as you dash around in today’s AAA games, game physics has come an extremely long way.

With the increasing realism of video games nowadays, physics is something that developers are paying scrupulous attention to.

For instance, in FIFA 18, developers were required to fix code related to the drag coefficient of air, due to complaints about the way the football moved through the air in previous games.
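As a toy illustration of what that kind of fix touches (this is not EA’s actual code; the drag coefficient and ball cross-section below are assumed values), the standard quadratic drag equation a physics engine might evaluate looks like this:

```python
def drag_force(velocity, drag_coefficient=0.25, air_density=1.225, area=0.038):
    """F_drag = 0.5 * rho * Cd * A * v^2, acting opposite the direction of motion.

    All parameter defaults here are illustrative, not values from any game.
    """
    return 0.5 * air_density * drag_coefficient * area * velocity ** 2

# A ball struck at 30 m/s experiences roughly 5.2 N of drag with these values:
print(round(drag_force(30.0), 1))
```

Tuning a single constant like the drag coefficient changes the trajectory of every ball in the game, which is why players notice when it is wrong.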

But while developers want to implement more and more complex in-game physics, our technology is struggling to keep up.

So, what exactly is a Physics engine?

A physics engine is computer software that provides an approximate simulation of certain real world physical systems and phenomena.

While they have important uses in other industries such as film and TV, their main uses are in video games.

Physics engines in fields like CGI have the luxury of running the required calculations over a prolonged period of time, but gaming physics engines require those calculations to be made almost instantaneously, so that the result can be output in real time.

Physical simulations employed in modern video games include:

Ragdoll Physics — Simulates the interaction of a physical body with its geometric environment, e.g. a body falling down a flight of stairs, with an individual calculation for the collision and motion of each limb and stair.

Particle Physics — Simulates the motion of many small things from a common source, e.g. the shrapnel from a bomb

Cloth Modelling — Simulates the realistic motion of cloth, e.g. a flag or cape.
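A particle system of the kind described above can be sketched in a few lines (a minimal illustration, not any engine’s real implementation): fragments burst from a common source, each with its own velocity, and are updated every frame under gravity.

```python
import random

GRAVITY = -9.81  # m/s^2, pulling along the negative y-axis

def spawn_particles(n, origin=(0.0, 0.0), speed=10.0):
    """Create n fragments at a common origin with random initial velocities."""
    particles = []
    for _ in range(n):
        vx = random.uniform(-speed, speed)
        vy = random.uniform(0.0, speed)
        particles.append({"pos": list(origin), "vel": [vx, vy]})
    return particles

def step(particles, dt=1 / 60):
    """Advance every particle by one frame: gravity, then position update."""
    for p in particles:
        p["vel"][1] += GRAVITY * dt
        p["pos"][0] += p["vel"][0] * dt
        p["pos"][1] += p["vel"][1] * dt

particles = spawn_particles(100)
for _ in range(60):  # simulate one second at 60 steps per second
    step(particles)
```

Real engines add collision, lifetimes, and rendering, but the core loop is this simple: every fragment is an independent body integrated forward in time.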

How do they work?

Physics engines are complex pieces of software that run vast numbers of calculations. Below is a simple example of the type of computation they have to do, in this case simulating an object’s motion in any number of dimensions:

1. Figure out what the forces are on an object (e.g. gravity, drag)

2. Add those forces up to get a single “resultant” or “net” force

3. Calculate the object’s acceleration due to those forces (using F=ma)

4. Use the object’s acceleration to calculate the object’s velocity

5. Use the object’s velocity to calculate the object’s position

6. Since the forces on the object may change from moment to moment, repeat this process from #1, forever.
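The six steps above can be sketched as a single update function (a minimal illustration using semi-implicit Euler integration for one object; the drop-from-10-metres scenario is just an example):

```python
def simulate_step(pos, vel, mass, forces, dt):
    # Steps 1-2: sum all forces into a single net force per axis.
    net = [sum(f[i] for f in forces) for i in range(len(pos))]
    # Step 3: acceleration from F = m*a, i.e. a = F / m.
    acc = [f / mass for f in net]
    # Step 4: integrate acceleration into velocity (semi-implicit Euler).
    vel = [v + a * dt for v, a in zip(vel, acc)]
    # Step 5: integrate velocity into position.
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel

# Step 6: repeat every frame. Dropping an object from rest at 10 m:
pos, vel = [0.0, 10.0], [0.0, 0.0]
gravity = [0.0, -9.81]
for _ in range(60):  # one second at 60 steps per second
    pos, vel = simulate_step(pos, vel, mass=1.0, forces=[gravity], dt=1 / 60)
```

After one simulated second the object has fallen roughly five metres, matching what basic kinematics predicts for free fall from rest.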

As you would expect, the continuous loop of calculations that physics engines run can become very taxing on your computer when it involves multitudes of complex calculations, such as accurate fluid dynamics for a river.

And so, when creating video games, developers have to take into account the capabilities of the hardware their game will run on, making sure that the physical simulations in their games are able to run in real-time.

There is no point in implementing lots of complex and interesting physics phenomena into a game if the GPU simply cannot do all of the mathematical calculations required fast enough for real-time output.
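To put that constraint in numbers (the 25% share below is an assumed figure, purely for illustration): at 60 frames per second, a game has under 17 milliseconds per frame for everything, and physics only gets a slice of it.

```python
# Real-time means finishing physics, game logic, AND rendering within one frame.
fps = 60
frame_budget_ms = 1000 / fps         # ~16.7 ms of wall-clock time per frame
physics_share = 0.25                 # assumed: physics gets a quarter of the frame
physics_budget_ms = frame_budget_ms * physics_share

print(f"{physics_budget_ms:.1f} ms for physics each frame")  # ~4.2 ms
```

Any simulation that cannot fit its full update loop into that few-millisecond window either gets simplified or cut.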

A new type of Physics engine?

As games become more immersive and realistic, and we attempt to implement more intricate in-game physics, we often find ourselves focusing on how we can improve our hardware to be able to run the increasingly complex calculations required.

However, what if we were to take a step back, and instead think about making these physics engines themselves easier to run, rather than just relying on improvements in hardware to keep up?

An exciting project by a Montreal-based research team proposes to do just that.

Instead of using a simulator to run the laws of physics, they used a new learning-based method.

Using this new method, the standard physical simulations can be sped up by a factor of 300 to 5,000.

How does it work?

Below is an example of how this new method is used to model the soft body interaction between a ball and rabbit:

Training data X and Y are acquired offline and compressed into Z and W respectively.

This compressed training data is then used to train the neural network to recurrently predict the compressed states of the objects.

The output from the neural network can then be used to compute the interaction between the ball and rabbit, which is then rendered on the screen.

Since this is a neural network-based project, many hours of simulation data are needed for training. But this only needs to be done once.

And then once the neural network is trained, we can give it all the positions, forces, and other information, and it will be able to give us the outcomes far more quickly than the traditional method.
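A heavily simplified sketch of that idea follows (illustrative only: the compression here is plain PCA via SVD and the “network” a single linear layer fit by least squares, not the team’s actual architecture — the dimensions are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline "training data": 1000 frames of a 300-dimensional simulation state X.
X = rng.standard_normal((1000, 300))

# Compress each full state into a 16-dimensional code Z (PCA via SVD).
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
basis = Vt[:16].T                      # (300, 16) projection matrix
Z = X @ basis                          # compressed states, one row per frame

# Train a linear predictor to map each compressed state to the next one.
W, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)

# At runtime: step forward in the cheap compressed space, then decompress
# back to the full state for rendering.
z_next = Z[-1] @ W                     # predicted next compressed state
x_next = z_next @ basis.T              # reconstructed full 300-D state
```

The speed-up comes from doing the per-frame work in the tiny compressed space instead of on the full simulation state.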

However while this new method may seem great, it is not without its limitations.

While the neural network is able to extrapolate slightly from the examples it has been trained on, if we present it with something that is far outside of what it has seen in the training domain, it will fail.

This does not happen with the conventional handcrafted method we use.

What does this mean for the future of gaming?

The implementation of AI for in-game physics is definitely an exciting prospect.

By enabling developers to implement the features they desire without having to worry “will it run smoothly?”, we can expect a huge surge in the level of realism we see in games.

Whether it’s the way leaves fall off the trees or the way your boots leave footprints in the mud, gamers are definitely going to be in for an immersive gaming experience in the future.

Here at Kaedim, we are enabling the creators of the future, by using AI to accelerate the 3D creative workflow.

Click here to see our current available job vacancies.

For frequent updates and useful information from the team at Kaedim, follow our LinkedIn page here.
