Physics-based simulation via backpropagation on energy functions

Junior Rojas
May 10, 2020 · 6 min read

These days, most neural network frameworks are essentially collections of functions on multidimensional arrays with added support for a particular type of automatic differentiation (backpropagation).

These features are generally useful for any task that requires numerical optimization (deep learning happens to be one of them), and in this article I want to show how they can be particularly useful for physics-based simulation. I believe that being able to implement physics engines in deep learning frameworks makes it very convenient to prototype new environments in the context of reinforcement learning. That is how the physics engine for this project I presented at NeurIPS 2019 was implemented.


This engine was implemented using the finite element method and implicit numerical integration in PyTorch. The paper doesn't include many implementation details about the engine, so in this article I want to explain the main ideas behind it and why backpropagation was useful not only for training the neural network controllers, but also for implementing the physics engine itself.

The key idea is that we can use differentiable energy functions to define state transitions. This also has some interesting connections to what Yann LeCun refers to as energy-based models in self-supervised learning and could address some inefficiencies of current supervised learning and model-free reinforcement learning algorithms (there’s a good talk online about it).

(Talk: Energy-based Approaches to Representation Learning — Yann LeCun)

Funnily enough, my physics engine was implemented using energy functions, but all the locomotion controllers were optimized using model-free RL. There might be some opportunities to use some of the energy-based modeling ideas for training in the future, and understanding how energy functions fit into physics-based simulation and deep learning might provide some guidance for future work.

Physics-based energy functions as loss functions

Mass-spring systems are one of the simplest examples I can think of to demonstrate how this works. The parameters we optimize are the vertex positions of the system, and our goal is to find a configuration that minimizes the total elastic energy. Every spring has a preferred rest length, and springs naturally tend to recover their rest shapes over time.

(Figure: a mass-spring system)

We can define the energy of a spring using Hooke's law (this is not the only option, but it's probably the most common). For a spring with stiffness k connecting two vertices at positions a and b, with rest length l₀, its elastic energy E is defined by:

E = ½ k (‖b − a‖ − l₀)²

Hooke's law
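In PyTorch, this energy is a one-line differentiable function. Here is a minimal sketch (the function name and the default stiffness k are illustrative, not the engine's actual code):

```python
import torch

def spring_energy(a, b, l0, k=1.0):
    """Elastic energy of a spring (Hooke's law) between vertices a and b."""
    length = torch.norm(b - a)            # current spring length
    return 0.5 * k * (length - l0) ** 2   # zero when the spring is at rest
```

Because every operation here is differentiable, PyTorch can backpropagate through this energy with respect to the vertex positions.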

Computation graph

(Figure: spring energy as a computation graph)

Note that in my PyTorch implementation I used a dense matrix for the linear layer (x to d) for simplicity, but there are opportunities to optimize this with sparse operations. Although sparse matrices are available in some deep learning frameworks, support for them is limited in general. Something interesting to note is that if the topology of the mesh were a regular grid, the linear layer could be implemented using convolutions, which is a particular type of sparsity that is well supported in most deep learning frameworks.
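As a sketch of the dense approach, a difference matrix D can serve as the linear layer mapping stacked vertex positions x to per-spring difference vectors d (the helper name and shapes here are illustrative, not the engine's actual code):

```python
import torch

def difference_matrix(num_vertices, springs):
    """Dense linear map D such that D @ x yields per-spring difference vectors.
    `springs` is a list of (i, j) vertex index pairs; x has shape
    (num_vertices, dim) and D @ x has shape (num_springs, dim)."""
    D = torch.zeros(len(springs), num_vertices)
    for s, (i, j) in enumerate(springs):
        D[s, i] = -1.0  # tail vertex contributes negatively
        D[s, j] = 1.0   # head vertex contributes positively
    return D

# Three vertices, two springs forming a chain
x = torch.tensor([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
D = difference_matrix(3, [(0, 1), (1, 2)])
d = D @ x  # each row is b - a for one spring
```

Each row of D has exactly two nonzero entries, which is why the sparse (or, on a regular grid, convolutional) formulation is attractive.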

You might have seen physics update rules expressed in terms of forces instead of energy minimization. Defining update rules in terms of forces is also an option; just remember that the force is the negative gradient of the potential energy, so running backpropagation on the energy computes the force automatically.
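For instance, here is a minimal sketch of recovering spring forces from the energy via backpropagation (the specific numbers are illustrative):

```python
import torch

k, l0 = 1.0, 1.0
# Two vertices connected by one spring, stretched to length 2
x = torch.tensor([[0.0, 0.0], [2.0, 0.0]], requires_grad=True)

# Total elastic energy of the spring (Hooke's law)
length = torch.norm(x[1] - x[0])
energy = 0.5 * k * (length - l0) ** 2

# Backpropagation on the energy; the force is the negative gradient
energy.backward()
force = -x.grad  # each vertex is pulled back toward rest length
```

The stretched spring pulls the two vertices toward each other, exactly as the force formulation would prescribe, but we never had to derive the force expression by hand.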

Minimizing the elastic potential energy is not the whole story, however. The animation shows multiple Adam iterations, which produce a reasonable-looking simulation, probably because Adam incorporates momentum. There are more principled ways to introduce momentum into the simulation, also formulated as minimization, but I'll leave that for another article.
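A bare-bones version of such a simulation loop might look like this (a sketch, not the engine's actual code; the learning rate and iteration count are arbitrary):

```python
import torch

# Three collinear vertices forming a chain of two compressed springs
springs = [(0, 1), (1, 2)]
k, l0 = 1.0, 1.0
x = torch.tensor([[0.0, 0.0], [0.2, 0.0], [0.4, 0.0]], requires_grad=True)

opt = torch.optim.Adam([x], lr=0.05)
for _ in range(500):
    opt.zero_grad()
    # Total elastic energy of the system
    energy = sum(0.5 * k * (torch.norm(x[j] - x[i]) - l0) ** 2
                 for i, j in springs)
    energy.backward()  # backprop computes forces automatically
    opt.step()         # one "simulation" step of energy minimization
```

After enough iterations, both springs approach their rest length of 1, i.e. the system recovers its rest shape.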

Optimization methods for physics-based simulation

I am most familiar with simulation methods developed for computer graphics and physics-based animation, where second-order optimization methods are often preferred over first-order methods. This is in contrast to common practice in deep learning which tends to favor first-order methods such as SGD and Adam. In computer animation, Newton and quasi-Newton methods are more common, but in my experience they are harder to implement, especially without automatic differentiation.

Regardless of the optimization method used, I think that expressing the loss or energy we want to minimize as a composition of differentiable operations is a good first step. Having this implemented in a system with automatic differentiation features could be helpful to prototype new optimization methods.

Next steps

I presented mass-spring systems as energy-based models in this article to illustrate how one would use backpropagation to implement physics, but this is just a very simple example; we can do much more. I have not talked about how to properly incorporate mass or inertial effects into the simulation: I only mentioned particle positions, never particle velocities. We can also implement collisions, gravity, friction, and elasticity models based on the finite element method. For now, I can say that implementing all these additional features boils down to defining new energy functions; the simulation will still look a lot like an energy minimization loop where backpropagation allows us to automatically compute forces, as illustrated in the mass-spring example.
