Linear Algebra — Part 6: eigenvalues and eigenvectors

Sho Nakagome · sho.jp

Oct 14, 2018 · 5 min read

Welcome to this series of stories toward understanding Linear Algebra. You can take a look at the previous stories here:

I’m following the basic structure and materials of Dr. Gilbert Strang from MIT, and this story is about “Eigenvalues and Eigenvectors”! You can watch his lectures on YouTube, and I’m posting the relevant video here:

Before going into the details of eigenvalues and eigenvectors, I would like you to watch the 20-minute video from 3Blue1Brown. It definitely helps you grasp the concepts and makes the equations easier to follow, so try to take some time to watch it.

That being said, I think we are ready to step into this important concept in Linear Algebra. Let’s get started!

Materials covered in this story

  • Eigenvalues and Eigenvectors
  • Diagonalization & Eigendecomposition
  • Underlying assumptions behind diagonalization and eigendecomposition

These concepts come up a lot in machine learning when deriving key techniques such as PCA and when analyzing optimization problems, so make sure you understand them by heart!

Introduction to eigenvalues and eigenvectors

If you watched the video from 3Blue1Brown, you should know the meaning of eigenvalues and eigenvectors by now.

As an equation, it’s written like this:

Av = λv

where “A” is a square matrix, v is a nonzero vector (the eigenvector), and λ is a scalar (the eigenvalue). In matrix form, a 2 x 2 case looks something like this:

[ a11  a12 ] [ v1 ]     [ v1 ]
[ a21  a22 ] [ v2 ] = λ [ v2 ]

Just as a recap: eigenvectors are the vectors that do not change their direction (they stay on their own span) when the matrix is applied; they just get scaled by a factor of their corresponding eigenvalue.

To solve for eigenvalues and eigenvectors, here are the steps you need to take:

  1. Rewrite Av = λv as (A - λI)v = 0, where I is the identity matrix.
  2. A nonzero v can only exist if (A - λI) is singular, so solve the characteristic equation det(A - λI) = 0; its roots are the eigenvalues.
  3. Plug each eigenvalue λ back into (A - λI)v = 0 and solve for v to get the corresponding eigenvector.

Let’s take a quick example using a 2 x 2 matrix.

By solving det(A - λI) = 0, we get the eigenvalues. Now we just need to consider each eigenvalue case separately.

Now you have one of the eigenvectors. Moving on to the next one.

Great! Now you’ve solved the eigenvalue and eigenvector problem!
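To make those steps concrete, here’s a minimal Python/NumPy sketch that follows the same recipe. The sample matrix A = [[2, 1], [1, 2]] is my own pick for illustration:

```python
import numpy as np

# Sample 2 x 2 matrix (an assumption for illustration, chosen to have
# nice eigenvalues: 3 and 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Step 2: for a 2 x 2 matrix, the characteristic polynomial is
# lambda^2 - trace(A) * lambda + det(A) = 0; its roots are the eigenvalues.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
eigenvalues = np.roots(coeffs)
print(eigenvalues)  # [3. 1.]

# Step 3: for each eigenvalue, solve (A - lambda * I) v = 0.
# Any nonzero vector in the null space of (A - lambda * I) works; the
# right-singular vector for the (near-)zero singular value gives us one.
for lam in eigenvalues:
    M = A - lam * np.eye(2)
    _, _, Vt = np.linalg.svd(M)
    v = Vt[-1]  # unit-norm eigenvector (overall sign may vary)
    print(lam, v)
```

In practice you would just call np.linalg.eig(A), which does all of this in one shot; the manual version above simply mirrors the hand calculation.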

Let’s take a look at the results and make sure that what the 3Blue1Brown video was saying makes sense.

As you can see above, no matter what kind of transformation matrix “A” you have, if you manage to find its eigenvalues and eigenvectors, applying “A” to an eigenvector does not change its direction; it just scales it by a factor of the corresponding eigenvalue. This is very important, so make sure you understand it!
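Here’s a quick numerical sanity check of that claim with NumPy’s built-in eigensolver, np.linalg.eig, on the same sample matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # same sample matrix as above

eigenvalues, eigenvectors = np.linalg.eig(A)  # eigenvectors are the *columns*

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Applying A keeps the direction and scales by the eigenvalue.
    print(np.allclose(A @ v, lam * v))  # True
```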

Introduction to diagonalization and eigendecomposition

Next, let’s move on to one of the very useful applications of eigenvalues and eigenvectors. It’s called “diagonalization”.

The cool thing about diagonalization is that as long as your n x n matrix “A” has n linearly independent eigenvectors, you can turn it into a diagonal matrix! A diagonal matrix is very easy to deal with because it only has elements on its diagonal, and the rest of the elements are zeros.
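As a quick illustration of that convenience (a sketch of my own, not from the original post): raising a diagonal matrix to a power just raises each diagonal entry to that power.

```python
import numpy as np

D = np.diag([3.0, 1.0])

# The general way: repeated matrix multiplication.
print(np.linalg.matrix_power(D, 5))

# For a diagonal matrix this collapses to elementwise powers
# of the diagonal entries: diag(3^5, 1^5) = diag(243, 1).
print(np.diag(np.diag(D) ** 5))
```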

Let’s see why this is possible. Take the n linearly independent eigenvectors of “A” and place them side by side as the columns of a matrix “S”. Applying “A” to each column just scales it by its eigenvalue, so in matrix form we get AS = SΛ, where Λ is the diagonal matrix with the eigenvalues on its diagonal.

From here, we could derive two equations:

  • diagonalization
  • eigendecomposition

Note that depending on which side of AS = SΛ you multiply by the inverse of “S”, you get the equation for “diagonalization” (Λ = S^-1 A S) or for “eigendecomposition” (A = S Λ S^-1).

Eigendecomposition is another very useful application of eigenvalues and eigenvectors. As long as the matrix satisfies the underlying assumptions I’m going to explain in a minute, you can easily decompose the matrix into a more usable form.
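Here’s a minimal sketch of both equations in NumPy, again on the sample matrix from earlier. The last check shows one big payoff: once you have A = S Λ S^-1, powers of “A” become trivial to compute.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, S = np.linalg.eig(A)  # columns of S are the eigenvectors
S_inv = np.linalg.inv(S)
Lam = np.diag(eigenvalues)         # capital lambda: eigenvalues on the diagonal

# Diagonalization: S^-1 A S = Lambda
print(np.allclose(S_inv @ A @ S, Lam))  # True

# Eigendecomposition: A = S Lambda S^-1
print(np.allclose(A, S @ Lam @ S_inv))  # True

# Payoff: A^5 = S Lambda^5 S^-1, and Lambda^5 is just elementwise powers.
print(np.allclose(np.linalg.matrix_power(A, 5),
                  S @ np.diag(eigenvalues ** 5) @ S_inv))  # True
```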

Underlying assumptions behind diagonalization and eigendecomposition

There is one thing to be careful about with diagonalization and eigendecomposition: there are two assumptions behind these techniques. We already went through the first one. The matrix “A” must be a square matrix. The second one is the one we just skimmed over, so let’s check it again.

So to summarize, in order for the matrix “A” to be either diagonalized or eigendecomposed, it has to meet the following criteria:

  • Must be a square matrix
  • Must have a full set of linearly independent eigenvectors

When an eigenvalue is repeated, the matrix can end up with only one independent eigenvector for it.

Therefore, the matrix “S” we were using is going to be singular: its columns are not linearly independent, so its inverse does not exist and the decomposition breaks down.

One thing to be careful about is that full rank does not necessarily guarantee that the matrix has linearly independent eigenvectors. See the example below.
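As a stand-in for that example, a classic case is the shear matrix: it has full rank, but its repeated eigenvalue comes with only one independent eigenvector, so “S” ends up singular.

```python
import numpy as np

# Shear matrix: det = 1, so it has full rank...
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.linalg.matrix_rank(A))  # 2

# ...but its only eigenvalue, 1, is repeated, and the two eigenvector
# columns NumPy returns point along (numerically) the same direction.
eigenvalues, S = np.linalg.eig(A)
print(eigenvalues)  # [1. 1.]
print(S)

# S is (numerically) singular, so S^-1 does not exist and
# A cannot be diagonalized.
print(np.linalg.det(S))  # ~0
```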

Alright that’s it for today!

Summary

  • Eigenvalues and Eigenvectors

Eigenvectors are the vectors that do not change their direction when multiplied by the transformation matrix; they just get scaled by a factor of the corresponding eigenvalue.

  • Diagonalization & Eigendecomposition

These are a few applications of eigenvalues and eigenvectors that are very useful when handling data in matrix form, because you can decompose a matrix into pieces that are easy to manipulate.

  • Underlying assumptions behind diagonalization and eigendecomposition

Make sure that the matrix you are trying to decompose is a square matrix and has a full set of linearly independent eigenvectors (all-distinct eigenvalues guarantee this, but they are not strictly required).

I hope this helps! See you next time!
