Mathematics for Artificial Intelligence (AI)

Eigenvectors and Eigenvalues in Machine Learning

Vectors are among the most fundamental objects in linear algebra and are widely used in machine learning and deep learning. Eigenvectors and eigenvalues, in particular, have some of the widest applications in machine learning and computer vision.

In this tutorial, we will walk through how the eigenvectors and eigenvalues of linear algebra can solve many problems in machine learning, and especially in computer vision. I hope you are as excited as I am to learn about their applications and mathematical background.

What are eigenvectors and eigenvalues?

An eigenvector is a vector whose direction remains unchanged when a linear transformation is applied to it; the transformation only scales it by a factor called the eigenvalue.

Image Credits: Computer Vision for dummies

The eigenvector (shown in red) does not change its direction after the transformation is applied.

Applications and domains

Let's pause for a while to appreciate its beauty and how wide its domain is. Ranging from machine learning and quantum mechanics to pure mathematics, it solves many engineering problems. Wait! How can I forget to mention its use in Google PageRank?
The well-known example is PCA (Principal Component Analysis) for dimensionality reduction, which is built on the eigendecomposition of the covariance matrix.

Machine learning has the ability to extract the principal components of information. Eigenvectors and eigenvalues help to extract that information: they let us discard noise and keep the valuable directions in the data.
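As a minimal sketch of that idea (an illustration of mine, not code from the original article), here is how PCA can reduce dimensionality through an eigendecomposition of the covariance matrix in NumPy:

import numpy as np

# toy data: 200 samples with 3 features (assumed example data)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

# center the data and compute the covariance matrix
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)

# eigendecomposition of the symmetric covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# sort by decreasing eigenvalue and keep the top 2 principal components
order = np.argsort(eigenvalues)[::-1]
components = eigenvectors[:, order[:2]]

# project the data onto the principal components
X_reduced = X_centered @ components
print(X_reduced.shape)   # (200, 2)

The directions with the largest eigenvalues carry the most variance, which is why dropping the small-eigenvalue directions removes noise while keeping the valuable information.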

Mathematical View

A vector is an eigenvector if it satisfies the following equation.

A · v = λ · v
therefore, (A − λI) v = 0

This is the eigenvalue equation. In the above equation, A is a square matrix, v is an eigenvector of the matrix, the Greek letter λ (lambda) represents the eigenvalue, and I is the identity matrix. You can also write the equation without the dot, as Av = λv. It means that the action of the linear transformation A on the vector v is completely described by the scalar λ.
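As a quick illustration (an example of mine, not from the original article), take

A = [ 2  0 ]       v = [ 1 ]
    [ 0  3 ]           [ 0 ]

Then A · v = (2, 0) = 2 · v, so v is an eigenvector of A with eigenvalue λ = 2. By contrast, (1, 1) is not an eigenvector, because A · (1, 1) = (2, 3) points in a different direction.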

Assume v is a non-zero vector. Then the equation (A − λI) v = 0 can only have a solution if (A − λI) is not invertible, which means its determinant must be zero:

det(A − λI) = 0, which we mark as equation (a)

The eigenvalue is the scaling coefficient applied to the eigenvector.

Calculating eigenvalues: Let's say we are given a square matrix A.

Substituting A into equation (a) would give

Calculating the determinant gives λ₁ = −1 and λ₂ = 4.
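The matrix A itself appears only as an image in the original post. As an illustrative stand-in consistent with the stated eigenvalues (my assumption, not necessarily the author's matrix), take

A = [ 1  2 ]
    [ 3  2 ]

Then equation (a) becomes

det(A − λI) = (1 − λ)(2 − λ) − 2 · 3 = λ² − 3λ − 4 = (λ − 4)(λ + 1) = 0

which indeed yields λ₁ = −1 and λ₂ = 4.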

Now we have the eigenvalues, which specify how much each eigenvector is scaled.

We calculate the first eigenvector using the equation (A − λI) v = 0, where A is the same matrix as defined above and λ = −1. Solving the equation, we get
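Continuing with the assumed matrix above, λ = −1 gives (A + I) v = 0:

[ 2  2 ] [ v₁ ]   [ 0 ]
[ 3  3 ] [ v₂ ] = [ 0 ]

so v₁ + v₂ = 0, and one eigenvector is v = (1, −1); any non-zero multiple of it works too.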

Now, let's calculate the second eigenvector using the same equation (A − λI) v = 0, with A as above and λ = 4. Solving the equation, we get
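Again with the assumed matrix, λ = 4 gives (A − 4I) v = 0:

[ -3   2 ] [ v₁ ]   [ 0 ]
[  3  -2 ] [ v₂ ] = [ 0 ]

so 3v₁ = 2v₂, and one eigenvector is v = (2, 3).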

Hence we have seen how to calculate eigenvectors from eigenvalues.

We have already talked about eigendecomposition in the applications part, so let's see how to approach it in Python. The eigendecomposition is calculated in NumPy using the eig() function.

The eigendecomposition is calculated on a square matrix, returning the eigenvalues and the eigenvectors.
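Since the original code appears only as a screenshot, here is a minimal sketch of that step (reusing the assumed matrix from the worked example above):

import numpy as np

# the assumed example matrix from the worked example
A = np.array([[1, 2],
              [3, 2]])

# eig() returns the eigenvalues and the eigenvectors (as columns)
eigenvalues, eigenvectors = np.linalg.eig(A)

print(eigenvalues)   # expected: 4 and -1 (order may vary)
print(eigenvectors)  # each column is a normalized eigenvector

# sanity check: A · v = λ · v for every eigenpair
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, eigenvalues[i] * v)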

Original example source: Search Techniques for Multimedia Databases

Conclusion

In this tutorial, we just scratched the surface. Still, the discussion above should give a view of the importance of eigenvectors and eigenvalues. These techniques are used in computer vision and machine learning, and also in face recognition through eigenfaces.

Hope you enjoyed the article. If you liked it, please clap 👏; it will motivate me to keep going. Thanks for reading.


Seeratpal Jaura
Secure and Private AI Math Blogging Competition
