Formulating mathematical representations of 3D rotations
There’s this TikTok filter involving a horizontal reflection that’s been getting a lot of attention. Here’s an example of TikTok user @eatmytoonies using it to add some flair to a lip sync cover.
This filter is interesting, not only because it makes slight facial asymmetries pop — human faces are bilaterally symmetric for the most part — but also because it makes evident the way physicists think about symmetries. The concept of a symmetry is likely familiar to most people because of its ubiquity in nature, but how do you model symmetries with mathematics?
Mathematical modeling often reduces to pinning down a suitable definition for the concepts at hand. Let’s start by considering what we can say about symmetries in general. One thing you might notice from the above image is that symmetries are associated with a sense of ‘sameness’. In the case of the fly — or with humans — the left side looks sort of the same as the right side. The aforementioned TikTok video provides some extra intuition here: the appearance of the fly is left the same after a reflection about its bilateral axis. A conclusion to draw is that symmetries are associated with transformations on objects that leave them looking the same. Notice that this association carries over to the other examples in the above image. Not only is the same reflection symmetry present in the flower, but we also have the further ability to leave its appearance the same by rotating it through an angle of 72°. The bacteria has even more symmetry since it will look identical after rotating it by any angle, no matter how small or large.
This is an important step towards understanding how we use symmetries in math and physics, but how do we actually write them down? The answer is that it depends on the mathematical representation of your objects. Let’s consider a still image from that TikTok video. We can drop a set of coordinate axes on the image and label pixels based on their positions. A bilateral symmetry is then associated with reflecting about the y axis, resulting in a negation of all the x values on the pixel labels.
This association is made fully concrete through the language of matrix arithmetic. Our pixels are 2D vectors, and a reflection about the y axis is just a matrix operation on those vectors:
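$$
\begin{pmatrix} x' \\ y' \end{pmatrix}
=
\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} -x \\ y \end{pmatrix}
$$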
You can also chain operations together by multiplying matrices. Two reflections should map a pixel’s location back onto itself, so we should expect two reflection matrices to multiply to the identity matrix, which they indeed do:
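$$
\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}
\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}
=
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
= \mathbb{I}
$$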
The 𝕀 in the above equation is the identity matrix.
Rotational Symmetry
Let us now consider the significantly more symmetric case of our spherical bacteria.
What are the transformations that leave it looking the same? One thing you may notice is that its appearance remains unchanged no matter how you rotate it. This tells us to associate spherical symmetry with something that is invariant under rotation.
We will take the lesson learned from the TikTok image and label the points of the bacteria as locations in 3D space centered on the origin. This will allow us to represent our transformations as matrices like we did last time. A sphere can then be defined by all the points that lie at some constant distance from the origin. Rotations leave spheres looking the same, so we need to consider the mathematical transformations that leave our definition of the sphere invariant. The transformations to consider will be those that preserve the length of vectors.
The length of a vector is given by the Pythagorean theorem, and it can be written in terms of matrix multiplication like so:
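$$
|v|^2
=
\begin{pmatrix} x & y & z \end{pmatrix}
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
= x^2 + y^2 + z^2
$$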
The vector on the left is a row vector, while the one on the right is a column vector. If you are familiar with matrix arithmetic, then you might have seen the operation of transposition. Transposing a matrix simply switches its rows with its columns. Transposition allows you to think of a row vector as a transposed column vector. The previous equation is therefore the same as saying,
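$$
|v|^2 = v^T v
$$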
where the ‘T’ indicates transpose.
Normally the T is omitted from the left vector in most texts, since only a row vector can multiply from the left anyway. I have made it explicit here.
Vectors can be transformed into other vectors via multiplication with a matrix. Schematically this looks like:
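$$
v' = M v
$$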
for a matrix M and a vector v; v’ is our newly ‘transformed’ vector. An example of this is:
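$$
\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} -y \\ x \end{pmatrix}
$$

This particular matrix happens to rotate a 2D vector by 90°, which is a preview of where we’re headed.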
The question therefore becomes, “what are the matrices that leave the length of a vector invariant?”
We will constrain this problem by using the previously stated fact about transposition and the definition of length in terms of the Pythagorean theorem. Here is our plan of attack: we will consider two vectors v and v’, which are related by some matrix transformation, and demand that they have the same length.
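$$
v'^{\,T} v' = v^T v,
\qquad
v' = M v
$$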
Expanding the left-hand side using the fact that (Mv)ᵀ = vᵀMᵀ, this equality holding requires that
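$$
v^T M^T M\, v = v^T v
\quad\Longrightarrow\quad
M^T M = \mathbb{I},
\qquad\text{i.e.}\qquad
M^T = M^{-1}
$$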
Now we have a constraint on the behavior of our matrices. We want to consider the class of matrices whose transpose is the same as their inverse (these are known as orthogonal matrices).
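Here is a minimal numerical sketch of this constraint, using numpy and an illustrative rotation of 30° about the z axis (any rotation matrix would do):

```python
import numpy as np

# An illustrative example: rotation by 30 degrees about the z axis.
theta = np.pi / 6
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

v = np.array([1.0, 2.0, 3.0])

# The transformed vector has the same length as the original...
print(np.linalg.norm(R @ v), np.linalg.norm(v))

# ...because R's transpose is its inverse: R^T R = I.
print(np.allclose(R.T @ R, np.eye(3)))
```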
Generating Rotational Symmetry
It would be nice if there were some way to generate all the matrices that represent a rotation. Perhaps we can impose additional constraints that would allow us to do so? One aspect of rotation that we can leverage is that it is normally considered to be a continuous transformation. Continuity has various rigorous mathematical definitions, but when a physicist says ‘continuous’, they usually mean ‘analytic’. We will therefore assume that our rotation is continuous in some parameter θ and do a power series expansion in that variable. Let’s cover some background first.
Taylor Series Expansions
Analyticity is a concept usually covered in a second course in calculus. The idea is that certain functions — including most of the ones you ever deal with in physics — can be represented as a series through the equation:
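$$
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(x_0)}{n!}\,(x - x_0)^n
$$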
known as a Taylor series expansion (the special case where x₀ = 0 is called a Maclaurin series). This equation tells us that if we know all the derivatives of f at x₀, then we can figure out what value f will take at x.
We can see how this works for eˣ as an example, expanding around x₀ = 0. Since every derivative of eˣ is eˣ, and e⁰ = 1, we have:
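$$
e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}
= 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots
$$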
3Blue1Brown has more on this in his video on Taylor Series
A fun fact about derivatives is that they can be thought of as generators of transformations themselves. To see this, let’s write the Maclaurin series expansion (expanding around 0 and evaluating at x₀) in the following way:
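$$
f(x_0) = \sum_{n=0}^{\infty} \frac{x_0^{\,n}}{n!}\,\frac{d^n f}{dx^n}\bigg|_{x=0}
$$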
We can therefore define an ‘operator’, which we’ll call T̂(x₀), that translates f from its value at 0 to its value at x₀:
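$$
\hat{T}(x_0) \equiv \sum_{n=0}^{\infty} \frac{1}{n!}\left(x_0\,\frac{d}{dx}\right)^{\!n},
\qquad
f(x_0) = \hat{T}(x_0)\, f(x)\Big|_{x=0}
$$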
The last trick up my sleeve is that I will use the fact that,
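$$
e^{A} = \sum_{n=0}^{\infty} \frac{A^n}{n!}
$$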
to write our translation operator as:
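$$
\hat{T}(x_0) = e^{\,x_0 \frac{d}{dx}}
$$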
The above formulation is called an exponential map. One thing to notice here is that this operator is built only out of powers of d/dx (scaled by the parameter x₀), so we say that d/dx is a generator of translations.
Back to Rotations
Let’s use the same logic that we did for translations to generate rotations. Let’s call the generator of rotations L and try to figure out what it is. By analogy with the translation operator, a rotation through an angle θ should take the form:
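$$
R(\theta) = e^{\theta L} \approx \mathbb{I} + \theta L
\quad \text{for small } \theta
$$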
Now if we apply our previously derived constraint, RᵀR = 𝕀, to this small-θ expansion, we’ll have that:
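$$
R^T R = \left(\mathbb{I} + \theta L^T\right)\left(\mathbb{I} + \theta L\right)
= \mathbb{I} + \theta\left(L^T + L\right) + \mathcal{O}(\theta^2)
= \mathbb{I}
$$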
which tells us that,
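$$
L^T = -L
$$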
Matrices satisfying this condition are called anti-symmetric. We can build any 3×3 anti-symmetric matrix as a linear combination of the following 3 generators (written here with one common choice of sign convention):
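$$
L_x = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{pmatrix},
\qquad
L_y = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ -1 & 0 & 0 \end{pmatrix},
\qquad
L_z = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}
$$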
Notice that Lx swaps the y and z components of a vector (up to a sign) while leaving x alone, so this must then be the generator of rotations about the x-axis. We can then generate any finite transformation by exponentiating the generators — this follows from the power series assumption.
A finite rotation about the x-axis is thus:
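$$
R_x(\theta) = e^{\theta L_x}
=
\begin{pmatrix}
1 & 0 & 0 \\
0 & \cos\theta & -\sin\theta \\
0 & \sin\theta & \cos\theta
\end{pmatrix}
$$

If you’d like to see the exponential map in action numerically, here is a minimal sketch using numpy and scipy (the angle of 0.7 radians is just an arbitrary choice for illustration):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

# Generator of rotations about the x axis (anti-symmetric).
L_x = np.array([
    [0.0, 0.0,  0.0],
    [0.0, 0.0, -1.0],
    [0.0, 1.0,  0.0],
])

theta = 0.7  # an arbitrary angle in radians

# Exponentiate the generator to get the finite rotation...
R = expm(theta * L_x)

# ...and compare against the closed-form rotation matrix about x.
R_closed = np.array([
    [1.0, 0.0,            0.0],
    [0.0, np.cos(theta), -np.sin(theta)],
    [0.0, np.sin(theta),  np.cos(theta)],
])

print(np.allclose(R, R_closed))         # True
print(np.allclose(R.T @ R, np.eye(3)))  # True: the transpose is the inverse
```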
We can repeat the same procedure for the other generators to get the other rotation matrices. If you would like to dig deeper into this topic, check out “Lie Algebras in Particle Physics” by Howard Georgi.