Linear Algebra 101 — Part 7: Eigendecomposition when symmetric

Sho Nakagome · Published in sho.jp · Oct 31, 2018

Today, we are studying more advanced topics in Linear Algebra, ones that are especially relevant and useful in machine learning.

We are building this knowledge on top of what we have already covered, so if you haven’t studied the previous materials, make sure to check them out first.

For the materials and structure, I'm following the famous and wonderful lectures by Dr. Gilbert Strang of MIT, and you can watch his lecture on today's topic:

I strongly recommend watching his video lectures because he explains the concepts very well. There are also some minor materials I'm skipping in these stories (and some things I'm adding that he didn't cover!), so it's worth watching his videos nonetheless.

That being said, let’s get started.

Materials covered in this story:

  • Symmetric Matrix
  • Eigendecomposition when the matrix is symmetric
  • Positive Definite Matrix

We have stepped into more advanced topics in linear algebra, and to understand them really well, I think it's important that you actually understand the basics covered in the previous stories (Parts 1–6). So if some of that knowledge feels rusty, take some time to go back, because it really helps you grasp the advanced concepts better and more easily.

Symmetric Matrix

First, let's recap what a symmetric matrix is. I hope you are already familiar with the concept!

It's just a matrix that comes back to itself when transposed, i.e., A = Aᵀ.

Let’s take a quick example to make sure you understand the concept.
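
For example, a small 2×2 matrix like the one below (the particular numbers are just one illustration I picked; only the mirror symmetry across the diagonal matters):

```latex
A = \begin{bmatrix} 1 & 7 \\ 7 & 4 \end{bmatrix},
\qquad
A^{\mathsf{T}} = \begin{bmatrix} 1 & 7 \\ 7 & 4 \end{bmatrix} = A
```

Swapping rows and columns gives back exactly the same matrix, so A is symmetric.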

So the question is, why are we revisiting this basic concept now?

The thing is, if the matrix is symmetric, it has a very useful property when we perform eigendecomposition. Before showing how it is useful, let’s first understand the underlying properties when a matrix is symmetric.

If a matrix is symmetric, its eigenvalues are REAL (not COMPLEX numbers) and its eigenvectors can always be chosen to be perpendicular (orthogonal to each other).

Why do we have such properties when a matrix is symmetric? Let’s take a look at the proofs.
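
In outline, the standard argument for the first property goes like this (x is an eigenvector, λ its eigenvalue, and a bar denotes complex conjugation):

```latex
% A real and symmetric, Ax = \lambda x with x \neq 0
\begin{aligned}
\bar{x}^{\mathsf{T}} A x &= \lambda \, \bar{x}^{\mathsf{T}} x
  && \text{(multiply } Ax = \lambda x \text{ by } \bar{x}^{\mathsf{T}} \text{ on the left)} \\
\bar{x}^{\mathsf{T}} A x &= (A\bar{x})^{\mathsf{T}} x = \bar{\lambda} \, \bar{x}^{\mathsf{T}} x
  && \text{(conjugate } Ax = \lambda x \text{, transpose, and use } A^{\mathsf{T}} = A) \\
\lambda &= \bar{\lambda}
  && \text{(compare the two lines; } \bar{x}^{\mathsf{T}} x = \textstyle\sum_i |x_i|^2 > 0)
\end{aligned}
```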

So this proof shows that the eigenvalues have to be REAL numbers for the comparison to hold. Dr. Gilbert Strang also explains it this way in the video, so check it out if this part isn't completely clear.

The proof of the 2nd property is actually a little trickier.

You can also take a look at this awesome post.

Here’s the proof for the 2nd property.
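
In short, take two eigenpairs with distinct eigenvalues λ₁ ≠ λ₂ and eigenvectors x and y (the symbols are just my notation for the sketch below), and use symmetry to move A from one side of the inner product to the other:

```latex
% Ax = \lambda_1 x, \;\; Ay = \lambda_2 y, \;\; \lambda_1 \neq \lambda_2, \;\; A = A^{\mathsf{T}}
\begin{aligned}
\lambda_1 \, x^{\mathsf{T}} y
  &= (A x)^{\mathsf{T}} y
   = x^{\mathsf{T}} A^{\mathsf{T}} y
   = x^{\mathsf{T}} (A y)
   = \lambda_2 \, x^{\mathsf{T}} y \\
&\Longrightarrow\; (\lambda_1 - \lambda_2)\, x^{\mathsf{T}} y = 0
\end{aligned}
```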

From here, since λ₁ ≠ λ₂, we get xᵀy = 0. In other words, eigenvectors belonging to different eigenvalues are orthogonal (and within a repeated eigenvalue's eigenspace we can always pick an orthogonal basis).

OK, that's it for the special properties of eigenvalues and eigenvectors when the matrix is symmetric. By using these properties, we can rewrite the eigendecomposition in a more useful form. Let's take a look at it in the next section.

Eigendecomposition when the matrix is symmetric

If the matrix is symmetric, its eigendecomposition takes a very simple yet useful form.
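
Writing Q for the matrix whose columns are the orthonormal eigenvectors and Λ for the diagonal matrix of eigenvalues (the usual notation), the two forms compare like this:

```latex
\begin{aligned}
\text{general diagonalizable matrix:} \quad & A = S \Lambda S^{-1} \\
\text{symmetric matrix:} \quad & A = Q \Lambda Q^{\mathsf{T}},
  \qquad Q^{\mathsf{T}} Q = I \;\Rightarrow\; Q^{-1} = Q^{\mathsf{T}}
\end{aligned}
```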

Notice the difference from the normal square-matrix eigendecomposition we did last time? Yes, the matrix of eigenvectors is now orthogonal, so its inverse can be replaced by its transpose, which is much easier than computing an inverse.
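
As a quick sanity check, here is a short NumPy sketch (the matrix A is just an arbitrary symmetric example I made up) that verifies both facts numerically:

```python
import numpy as np

# An arbitrary symmetric matrix (equal to its own transpose).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric (Hermitian) matrices:
# it returns real eigenvalues and an orthonormal set of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)
Lambda = np.diag(eigenvalues)

print("eigenvalues (all real):", eigenvalues)
# Q is orthogonal, so its inverse is just its transpose.
print("Q^T Q = I ?", np.allclose(Q.T @ Q, np.eye(3)))
# The symmetric eigendecomposition A = Q Lambda Q^T.
print("A = Q Lambda Q^T ?", np.allclose(A, Q @ Lambda @ Q.T))
```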

Positive Definite Matrix

This is a very important concept in Linear Algebra, and it's particularly useful when it comes to machine learning. I will be covering its applications in more detail in the next story, but first let's try to understand its definition and meaning.

First, a "Positive Definite Matrix" has to satisfy the following conditions:

  • The matrix is symmetric.
  • All of its eigenvalues are positive.
  • All of its subdeterminants (the determinants of the upper-left 1×1, 2×2, … blocks, also called leading principal minors) are positive.

It might not be clear from these conditions alone, so let's take a look at an example.
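
For example, the following 2×2 matrix (the particular numbers are just one illustration I picked) satisfies all three conditions, so it is positive definite:

```latex
A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}:
\qquad
A = A^{\mathsf{T}},
\qquad
\lambda_1 = 1 > 0, \;\; \lambda_2 = 3 > 0,
\qquad
\det\begin{bmatrix} 2 \end{bmatrix} = 2 > 0, \;\;
\det\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} = 4 - 1 = 3 > 0
```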

Here is another example for the third condition, the subdeterminants.
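
Take, for instance, the 3×3 symmetric matrix below (again, the particular entries are just one illustration I picked) and compute its three upper-left determinants:

```latex
A = \begin{bmatrix} 2 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 2 \end{bmatrix},
\qquad
\det\begin{bmatrix} 2 \end{bmatrix} = 2,
\quad
\det\begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix} = 3,
\quad
\det(A) = 4
```

All three are positive (and the eigenvalues 2 - √2, 2, and 2 + √2 are positive as well), so this matrix passes every condition and is positive definite.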

So to summarize: if the matrix is symmetric, all its eigenvalues are positive, and all its subdeterminants are also positive, we call it a positive definite matrix. Try defining your own matrix and checking whether it's positive definite or not.
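
If you want to try that in code, here is a minimal NumPy sketch (check_positive_definite is just a helper name I made up for this post) that tests the three conditions for any matrix you define:

```python
import numpy as np

def check_positive_definite(A, tol=1e-10):
    """Check the three conditions: symmetry, positive eigenvalues, positive subdeterminants."""
    A = np.asarray(A, dtype=float)
    # 1) Symmetric: A equals its own transpose.
    symmetric = np.allclose(A, A.T)
    # 2) All eigenvalues positive (eigvalsh assumes a symmetric matrix, so gate on symmetry).
    eigenvalues_positive = symmetric and bool(np.all(np.linalg.eigvalsh(A) > tol))
    # 3) All leading subdeterminants (upper-left k x k determinants) positive.
    minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
    minors_positive = all(m > tol for m in minors)
    return symmetric and eigenvalues_positive and minors_positive

print(check_positive_definite([[2, -1, 0], [-1, 2, -1], [0, -1, 2]]))  # True
print(check_positive_definite([[1, 2], [2, 1]]))                       # False (eigenvalue -1)
```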

Summary

To summarize:

  • Symmetric Matrix

It's a matrix that doesn't change when you take its transpose (A = Aᵀ).

  • Eigendecomposition when the matrix is symmetric

The matrix of eigenvectors in the decomposition is now an orthogonal matrix. Therefore, you can simply replace the inverse of that orthogonal matrix with its transpose, giving A = QΛQᵀ.

  • Positive Definite Matrix

A matrix that 1) is symmetric, 2) has all positive eigenvalues, and 3) has all positive subdeterminants is positive definite.

I hope this helps! See you next time!
