What happens when a Matrix hits a Vector?

Understanding eigenvectors, eigenvalues and eigendecomposition. Let’s understand a matrix better.

Parth Patel
Analytics Vidhya
4 min read · Jan 6, 2020


Have you ever wondered how Google can crawl through thousands of web pages to provide you with better search results? Or how you would calculate the 100th power of a matrix given its properties? Let’s try to answer some of these questions in this series.

What makes a matrix so important?
The properties of a matrix make its operations computationally fast. Recall how easily you were able to solve a system of linear equations with the help of matrix operations. Do you know how images are stored by your computer? A set of pixels, with RGB channels? It’s a matrix again.
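For example, here is a minimal NumPy sketch of solving a small 2x2 system (the system itself is made up purely for illustration):

```python
import numpy as np

# Solve the made-up system  x + 2y = 5,  2x + y = 4,  written as A @ s = b
A = np.array([[1, 2],
              [2, 1]])
b = np.array([5, 4])

s = np.linalg.solve(A, b)  # solves the linear system in one call
print(s)                   # [1. 2.]  ->  x = 1, y = 2
```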

Let’s get back to my question. Say I have a 2x1 vector which is multiplied by a 2x2 matrix. For illustration purposes, I have represented the vectors on a Cartesian plane, before and after multiplication.

(figure 1 : multiplication of Matrix A and Vector v)

A = [[1,2],[2,1]], v = [[1],[3]], x = [[1],[1]]
Av = [[7],[5]]

(figure 2: multiplication of Matrix A and Vector x)

Ax = [[3],[3]] = 3[[1],[1]] = λx, where λ = 3 is a scalar.

I would suggest you take two minutes and observe what happened when we multiplied matrix A with vector v, and what happened when we multiplied A with vector x. Can you spot any difference?

Upon multiplication of A with v:
1. The magnitude of the resultant vector changes.
2. The direction of the resultant vector also changes.
We can say that the resultant vector is deviated as well as scaled (its magnitude increases/decreases).

Upon multiplication of A with x:
1. The magnitude of the resultant vector changes.
2. The direction of the resultant vector remains the same.
We can say that the resultant vector is scaled but not deviated.
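If you’d like to verify this numerically, here is a quick NumPy check of the magnitude and the direction (the angle with the x-axis) of each vector before and after multiplication:

```python
import numpy as np

A = np.array([[1, 2],
              [2, 1]])

for name, u in [("v", np.array([1, 3])), ("x", np.array([1, 1]))]:
    Au = A @ u
    # magnitude (length) before and after multiplication by A
    print(name, round(np.linalg.norm(u), 2), "->", round(np.linalg.norm(Au), 2))
    # direction: angle with the x-axis in degrees, before and after
    print(name, round(np.degrees(np.arctan2(u[1], u[0])), 2), "->",
          round(np.degrees(np.arctan2(Au[1], Au[0])), 2))
```

Only x keeps its angle (45°) after the multiplication; v is rotated as well as stretched.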

The vector x is called an eigenvector. Mathematically, an eigenvector x of a matrix A is a vector which satisfies the condition:

Ax = λx, ______(1)

where λ is a scalar known as an eigenvalue of matrix A, and x is a non-zero vector.

TO-DO: Can you prove that eigenvectors can be defined only for square matrices? (Hint: heard of something called the dimension of a matrix?)

Now that we know what eigenvectors and eigenvalues are, let’s understand how to obtain them.

We can write Ax = λx as
Ax − λx = 0 ⇔ Ax − λIx = 0 ⇔ (A − λI)x = 0, where I is the identity matrix.
Remember we are looking for a non-zero vector, and hence:

The equation Ax = λx has nonzero solutions for the vector x if and only if the matrix (A − λI) has zero determinant.

(A − λI is singular), i.e. we solve the equation

det(A − λI) = 0 ______(2)

By solving this equation we can obtain values for λ. We can have up to N values of λ for an NxN matrix.

Expanding equation (2) gives an Nth-degree polynomial in λ (the characteristic polynomial). Solving that polynomial for its zeros gives the eigenvalues of matrix A. By substituting each eigenvalue back into equation (1) and solving the resulting system, we get the eigenvector x corresponding to that eigenvalue λ.
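As a rough sketch of that procedure in NumPy, np.poly returns the characteristic polynomial coefficients of a square matrix and np.roots finds its zeros:

```python
import numpy as np

A = np.array([[1, 2],
              [2, 1]])

# characteristic polynomial coefficients, highest degree first:
# λ² - 2λ - 3  ->  [1, -2, -3]
coeffs = np.poly(A)
print(coeffs)

# its zeros are the eigenvalues of A
print(np.roots(coeffs))   # 3 and -1
```

(NumPy computes these coefficients via the eigenvalues internally, so treat this as a check rather than an independent derivation.)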

Let’s take the above example and solve it manually.

(calculation for eigenvalues)
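Written out for our matrix A, equation (2) becomes:

det(A − λI) = det([[1 − λ, 2], [2, 1 − λ]]) = (1 − λ)² − 4 = λ² − 2λ − 3 = (λ − 3)(λ + 1)

Setting this to zero gives λ₁ = 3 and λ₂ = −1.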

Now that we have the eigenvalues, let’s solve equation (1) and find the eigenvectors x.

(calculation for eigenvectors: eigenvectors corresponding to eigenvalues 3 and −1 respectively)
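Written out, substituting each eigenvalue back into (A − λI)x = 0 gives:

For λ = 3: (A − 3I)x = [[−2, 2], [2, −2]]x = 0 ⇒ x₂ = x₁, so x = t[[1], [1]] for any non-zero scalar t.
For λ = −1: (A + I)x = [[2, 2], [2, 2]]x = 0 ⇒ x₂ = −x₁, so x = t[[1], [−1]] for any non-zero scalar t.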

Observed something? No? Look at the eigenvectors again… Yes, a matrix can have infinitely many eigenvectors: any non-zero scalar multiple of an eigenvector is also an eigenvector. I would suggest you play around a bit with eigenvalues and eigenvectors and come up with similar interesting observations. What happens to the eigenvalues if a matrix is squared? What would the eigenvalues be for the nth power of matrix A? What is the eigenvalue of the inverse of A?
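If you prefer to experiment numerically rather than on paper, here is a small NumPy sketch to play with. It prints the eigenvalues of A, A², A⁵ and A⁻¹ so you can spot the pattern yourself:

```python
import numpy as np

A = np.array([[1, 2],
              [2, 1]])

print(np.linalg.eigvals(A))                             # eigenvalues of A
print(np.linalg.eigvals(A @ A))                         # eigenvalues of A squared
print(np.linalg.eigvals(np.linalg.matrix_power(A, 5)))  # eigenvalues of A^5
print(np.linalg.eigvals(np.linalg.inv(A)))              # eigenvalues of A inverse
```

Compare each line with the eigenvalues 3 and −1 of A, and the pattern should jump out.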

The NumPy implementation for the same is shared in a Google Colab notebook here.
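The notebook itself isn’t reproduced here, but a minimal sketch of the same computation with np.linalg.eig (variable names are my own, not necessarily the notebook’s) would look like this:

```python
import numpy as np

A = np.array([[1, 2],
              [2, 1]])

# eig returns the eigenvalues and a matrix whose columns are the
# corresponding (unit-length) eigenvectors
values, vectors = np.linalg.eig(A)
print(values)
print(vectors)

# sanity check: A @ x should equal λ * x for every eigenpair
for lam, x in zip(values, vectors.T):
    print(np.allclose(A @ x, lam * x))   # True
```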

In the next part we will learn about diagonalization and eigendecomposition of a matrix. Stay tuned. Feel free to give any suggestions. Got any ideas? Let’s discuss (LinkedIn). Thanks Venktesh (aka Venky) for constantly nudging me towards ML.
