Linear Algebra for Machine Learning Part 4 — Linear Transformation and Eigendecomposition (EVD)

Falguni Mukherjee
2 min read · Jul 26, 2018

Linear Transformation:

Figure: anticlockwise rotation of vectors about the origin
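As a sketch of the rotation shown above (using NumPy, which the original post does not reference), an anticlockwise rotation by an angle θ about the origin is the linear transformation given by the 2×2 matrix R = [[cos θ, −sin θ], [sin θ, cos θ]]:

```python
import numpy as np

# Anticlockwise rotation by theta about the origin:
# R = [[cos θ, -sin θ],
#      [sin θ,  cos θ]]
theta = np.pi / 2  # 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])  # a vector along the x-axis
rotated = R @ v           # rotating 90° maps it onto the y-axis
print(np.round(rotated, 6))  # [0. 1.]
```

Because rotation is linear, rotating a sum of vectors gives the same result as summing the rotated vectors: R(u + v) = Ru + Rv.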

Additional notes & references:

There are also many other operations that can be achieved with linear transformation matrices, for example scaling (multiplication by a diagonal matrix) and reflection. Refer to the pages below if you want to know more.
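To illustrate the two operations just mentioned (a minimal NumPy sketch, not from the original post): a diagonal matrix scales each axis independently, and flipping the sign of one diagonal entry gives a reflection across the other axis.

```python
import numpy as np

v = np.array([2.0, 3.0])

# Scaling: a diagonal matrix stretches/shrinks each axis independently
S = np.diag([2.0, 0.5])   # double x, halve y
print(S @ v)              # [4.  1.5]

# Reflection across the x-axis: negate the y-component
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])
print(F @ v)              # [ 2. -3.]
```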

Eigendecomposition (a.k.a. Eigenvalue Decomposition or EVD):

Additional notes & references:

The eigenvectors of a symmetric matrix are orthogonal to each other (proof). That means they form an orthogonal basis aligned with the orientation of the dataset. We will see the use of this property in the next blog about PCA.
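This orthogonality property can be checked numerically (a NumPy sketch with an arbitrary example matrix, not taken from the original post). `np.linalg.eigh` computes the EVD of a symmetric matrix; stacking its eigenvectors as the columns of Q gives QᵀQ = I, and A can be reconstructed as Q diag(λ) Qᵀ:

```python
import numpy as np

# An arbitrary symmetric matrix (e.g. a covariance matrix in PCA)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's EVD routine for symmetric/Hermitian matrices;
# Q holds the eigenvectors as columns
eigvals, Q = np.linalg.eigh(A)

# Eigenvectors of a symmetric matrix are orthogonal: Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(2)))              # True

# Eigendecomposition: A = Q diag(lambda) Q^T
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, A))   # True
```

Because Q is orthogonal, its inverse is just its transpose, which is exactly what makes this decomposition so convenient for PCA.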

Graphical Representation

