Physics Informed Machine Learning — The Next Generation of Artificial Intelligence & Solving Optimization Problems

Dominik Andrzejczuk
The Quantum Data Center
Jun 5, 2023


Isaac Newton’s Work on Calculus — De analysi (1711)

The invention of calculus by Isaac Newton and Gottfried Wilhelm Leibniz in the 17th century transformed our understanding of the world (priority is still disputed, so we’ll be sure to credit them both) and paved the way for advances in physics, chemistry, biology, economics, pure mathematics, and every branch of engineering. The genius of calculus lies in its ability to deal with change: it lets us analyze how one quantity varies in relation to another, typically time.

Differential equations, a central part of calculus, encapsulate these relations. They’re used to depict how things change and evolve over time, under the influence of various factors. Think of a double pendulum, for instance. The motion of this system is intricate and unpredictable. However, with differential equations, we can represent this complexity and study how the system evolves over time. In essence, calculus, and by extension, differential equations, have become our primary tool for understanding dynamic systems in nature and technology.

The equations of motion for a double pendulum capture the system’s dynamics, illustrating how its parts interact and how the whole evolves over time. However, they depict an idealized situation, devoid of real-world effects such as friction or air resistance. Consequently, when these models are applied to practical scenarios, the results deviate from the theoretical predictions because of these omitted factors.
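To make this idealized-vs-real gap concrete, here is a minimal sketch that integrates a single pendulum with and without a friction-like damping term. All constants (g, L, dt, the damping coefficient) are illustrative choices, not values from the article:

```python
import math

# A minimal sketch: Euler integration of a single pendulum, with and
# without a friction-like damping term. All constants are illustrative.
g, L, dt = 9.81, 1.0, 0.001

def simulate(theta, omega, steps, friction=0.0):
    """Integrate theta'' = -(g/L)*sin(theta) - friction*theta' and return the angles."""
    angles = []
    for _ in range(steps):
        alpha = -(g / L) * math.sin(theta) - friction * omega
        omega += alpha * dt
        theta += omega * dt
        angles.append(theta)
    return angles

ideal = simulate(math.pi / 4, 0.0, 5000)                 # frictionless textbook model
damped = simulate(math.pi / 4, 0.0, 5000, friction=0.3)  # closer to a real pendulum

# After a few seconds the damped pendulum swings with a visibly smaller amplitude,
# so the idealized model's prediction drifts away from the "real" one.
print(max(map(abs, ideal[-2000:])), max(map(abs, damped[-2000:])))
```

Even this crude damping term is enough to make the two trajectories diverge within seconds, which is exactly the kind of discrepancy a learned correction is meant to absorb.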

On the other hand, machine learning (ML) approaches these problems from a different perspective, aiming to decipher the system’s physics from the data it is given. Despite its ability to start from scratch and approximate anything, machine learning’s strength can also be its weakness. It requires substantial datasets for training and considerable computational resources. Moreover, because it’s a generalist tool — capable of addressing a wide range of tasks — it may not excel at solving specific, complex problems as efficiently as specialized approaches.

Imagine the relationship between machine learning, first principles physics, and implementation complexity represented on a graph:

It illustrates that pure machine learning, despite its generalist nature, is difficult to implement for modeling complex physics systems. At the same time, capturing minute discrepancies using only first principles physics — the foundational mathematical equations that describe our system — also presents a significant challenge.

By capitalizing on the advantages of both first principles physics and machine learning, we create a synergistic approach that optimizes model training. This integrated method demands significantly less data and computational resources, delivering solutions more rapidly compared to exclusive reliance on either first principles physics or pure machine learning. This harmonized approach presents a highly efficient and effective path for solving complex problems.

(using first principles physics only — idealized model)
(using first principles physics combined with machine learning to model discrepancies)

How does this translate into real-world applications?

Modeling a double pendulum with physics-informed machine learning is a fun proof of concept, but on its own it remains a science experiment. The real question is: how do you use these concepts to solve real-world problems?

Traffic management presents a compelling application of physics-informed machine learning. If we envision traffic as a fluid flowing within a conduit — vehicles being the ‘fluid’ and roadways the ‘conduit’ — we begin to see the underlying physics at play, namely fluid dynamics.

Traffic flow shares common variables with fluid flow, including density, pressure, and flow velocity, all of which fluctuate over time. The Navier-Stokes partial differential equations, which are fundamental to fluid dynamics, can serve as our base model:
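For reference, a standard form of the incompressible Navier–Stokes equations (momentum balance plus the incompressibility constraint) is:

```latex
\rho \left( \frac{\partial \mathbf{u}}{\partial t} + \mathbf{u} \cdot \nabla \mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u} + \mathbf{f},
\qquad
\nabla \cdot \mathbf{u} = 0
```

where **u** is the flow velocity, p the pressure, ρ the density, μ the dynamic viscosity, and **f** any body forces acting on the fluid.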

However, real-world traffic is more complex than fluid in a pipe. It’s here that machine learning steps in to capture the discrepancies. For instance, whereas fluids tend to flow faster in narrower conduits, traffic flow slows down as the number of lanes decreases. Machine learning can model these inconsistencies, refining our traffic model.
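The division of labor can be sketched in a few lines: a first-principles traffic model supplies the baseline, and a regression learns only the discrepancy against data. The Greenshields speed–density relation and every number below are illustrative assumptions, with a small polynomial fit standing in for a neural network:

```python
import numpy as np

# A toy sketch of the hybrid approach: physics baseline + learned correction.
# The Greenshields relation and all numbers here are illustrative assumptions.
v_max, rho_max = 100.0, 120.0   # free-flow speed (km/h), jam density (veh/km)

def physics_speed(rho):
    # Greenshields model: speed falls linearly as density rises.
    return v_max * (1.0 - rho / rho_max)

# Synthetic "observed" speeds: the real road is slower than the ideal model
# predicts, due to an unmodeled congestion effect.
rho = np.linspace(5, 110, 30)
observed = physics_speed(rho) - 0.002 * rho ** 2

# Machine learning models only the discrepancy (observed minus physics),
# here via a small polynomial regression standing in for a neural network.
residual_fit = np.polyfit(rho, observed - physics_speed(rho), deg=2)
hybrid = physics_speed(rho) + np.polyval(residual_fit, rho)

physics_err = np.abs(observed - physics_speed(rho)).max()
hybrid_err = np.abs(observed - hybrid).max()
print(physics_err, hybrid_err)  # the hybrid tracks the data far more closely
```

Because the correction only has to capture what the physics misses, it needs far less data than a model asked to learn the entire speed–density relationship from scratch.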

This enhanced, physics-informed model can then be integrated into a product that monitors, predicts, and manages the impact of disruptions like construction or road closures on traffic flow. It could optimize detour routes and manage traffic to minimize disruption, saving significant time and cost for all road users. This example shows how fusing first principles physics with machine learning can deliver tangible, real-world benefits.

One Small Caveat

Physics-Informed Machine Learning (PIML) holds great potential, but its implementation hits a stumbling block. There is a well-known communication gap between physicists and computer scientists, largely because they use different vocabularies and approaches. The motion of a simple pendulum, described by a straightforward differential equation in physics:
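In standard notation (θ the angular displacement, g the gravitational acceleration, L the pendulum length), that equation of motion is:

```latex
\frac{d^{2}\theta}{dt^{2}} + \frac{g}{L} \sin\theta = 0
```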

translates into a somewhat less intuitive script in Python, a popular high-level programming language:

def recomputeAngle(self):
    """Advance the pendulum by one time step with a Runge-Kutta-style update."""
    scaling = 3000.0 / (SWINGLENGTH ** 2)

    # angular acceleration at the current angle
    firstDDtheta = -sin(radians(self.theta)) * scaling
    midDtheta = self.dtheta + firstDDtheta
    midtheta = self.theta + (self.dtheta + midDtheta) / 2.0

    # re-evaluate the acceleration at the midpoint estimate
    midDDtheta = -sin(radians(midtheta)) * scaling
    midDtheta = self.dtheta + (firstDDtheta + midDDtheta) / 2
    midtheta = self.theta + (self.dtheta + midDtheta) / 2

    midDDtheta = -sin(radians(midtheta)) * scaling
    lastDtheta = midDtheta + midDDtheta
    lasttheta = midtheta + (midDtheta + lastDtheta) / 2.0

    # final correction pass, then commit the new state
    lastDDtheta = -sin(radians(lasttheta)) * scaling
    lastDtheta = midDtheta + (midDDtheta + lastDDtheta) / 2.0
    lasttheta = midtheta + (midDtheta + lastDtheta) / 2.0

    self.dtheta = lastDtheta
    self.theta = lasttheta
    self.rect = pygame.Rect(
        PIVOT[0] - SWINGLENGTH * sin(radians(self.theta)),
        PIVOT[1] + SWINGLENGTH * cos(radians(self.theta)),
        1, 1)

Fortunately, a relatively new language named Julia offers a solution. Julia is designed to make complex mathematical concepts more readable, while still retaining high-level programming attributes. Consider the simple pendulum example from earlier:

# differential equation for the simple pendulum
# (g, L, tspan, and dtframe are defined elsewhere in the full program)
function simplependulum(du, u, p, t)
    θ, dθ = u[1], u[2]
    du[1] = dθ
    du[2] = -(g / L) * sin(θ)
end

# boundary conditions: pin the angle at the midpoint and the endpoint
function bc2(residual, u, p, t)
    residual[1] = u[end ÷ 2][1] + pi / 2
    residual[2] = u[end][1] - pi / 2
end

bvp2 = BVProblem(simplependulum, bc2, [pi / 2, pi / 2], (tspan[1], tspan[end]))
sol2 = solve(bvp2, MIRK4(), dt = dtframe)  # MIRK4 solver for the two-point BVP

(These code snippets were taken from the Rosetta Code task for animating a pendulum, in both Julia and Python.)

The Julia equivalent is not only easier to read and understand, but also performs comparably to low-level languages like C and C++. This is a game-changer: traditionally you had to trade readability for performance, and vice versa. Julia breaks this norm, letting us leverage the best of both worlds, and thus becoming a promising tool for realizing PIML’s potential.

Bringing Scientific Computing (and Quantum) to the Masses

The realm of scientific machine learning is poised for a transformation with Julia, unlocking the power of high-performance scientific computing for machine learning engineers, data scientists, and non-technical users for the first time. This revolution doesn’t just stop at high-performance computing; it ushers us into the quantum-ready future. At QDC.ai, we’re pioneering this seismic shift, crafting a range of products designed to make High-Performance Computing (HPC) and quantum computing accessible to all.

One of today’s key challenges is the intimidating learning curve associated with HPC, making it difficult for an average developer to tap into the potential of mathematical solvers for tackling complex combinatorial optimization problems. QDC.ai is bridging this divide, making large-scale optimization problems more approachable by merging the universes of HPC and conventional machine learning, while keeping our eyes firmly on the quantum horizon.

Optimization and Modeling Neural Intelligence System (OMNIS)

Imagine the power of integrating our product, OMNIS, with Snowflake, defining your optimization problem in natural language, and then witnessing the system autonomously generate all the differential equations that describe your problem within hours. This process is not just streamlined and efficient, but it also lays the groundwork for quantum computing, making your operations quantum-ready right from the get-go. Execute your algorithm, retreat for a coffee, and return to gauge the outcome and refine as necessary.

This quantum-ready vision is the future we foresee for scientific computing and large-scale optimization problem-solving. The need to recruit specialized math and physics PhDs to model your optimization problems is fading away. With the power of generative A.I., you can speed up your processes, cut costs, and eliminate countless consultation cycles spent defining your optimization problem, all while being prepared for quantum computation.

The blend of Generative AI and scientific programming is not only set to enhance our capability to solve intricate optimization problems, but it also paves the way for quantum-ready solutions, all while substantially reducing computational resource demand. As AI continues to expand, we need sustainable solutions that aren’t reliant exclusively on escalating computing power. In our data-abundant world, where decision-making processes can be unbearably protracted, we’re developing a solution that enables swift, efficient decision-making and alleviates analysis paralysis.

At QDC.ai, we’re crafting a future that democratizes access to high-level, quantum-ready scientific computing, and makes the world more efficient. Our mission is to revolutionize the approach to large-scale optimization problems and shape the future of decision-making in a data-centric, quantum-ready world.
