Book note: Linear Algebra for Machine Learning (Jason Brownlee)

Solomon Xie
Machine Learning Study Notes
4 min read · Jan 8, 2019

“Linear algebra is a pillar of machine learning.” — Jason Brownlee

See the book: Jason-Brownlee-Basics-for-Linear-Algebra-for-Machine-Learning-Discover-the-Mathematical-Language-of-Data-in-Python-2018

Linear Algebra Is Important in Machine Learning

Study Linear Algebra Too Early

Study Too Much Linear Algebra

Study Linear Algebra Wrong

A Better Way To Study Linear Algebra

What you will learn in this book

  • Vector norms
  • Matrix multiplication
  • Matrix properties
  • Tensor & its operations
  • Matrix factorization: Eigendecomposition & Singular Value Decomposition (SVD)
  • Principal Component Analysis (PCA)
  • Linear Least Squares Regression

Types of Matrices

  1. Square Matrix
  2. Symmetric Matrix
  3. Triangular Matrix
  4. Diagonal Matrix
  5. Identity Matrix
  6. Orthogonal Matrix
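
A minimal NumPy sketch of each of these types (my own illustration, not from the book):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])            # square matrix (n x n)
S = np.array([[2, 1], [1, 2]])            # symmetric: S == S.T
L = np.tril(np.array([[1, 2], [3, 4]]))   # lower triangular
D = np.diag([1, 2, 3])                    # diagonal matrix
I = np.eye(3)                             # identity matrix
Q = np.array([[0, 1], [-1, 0]])           # orthogonal: Q.T @ Q == I

assert np.allclose(S, S.T)
assert np.allclose(Q.T @ Q, np.eye(2))
```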

Matrix Operations

  1. Transpose
  2. Inverse
  3. Trace: The sum of the diagonal entries of a matrix
  4. Determinant
  5. Rank: The number of linearly independent rows or columns in a matrix
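
A minimal NumPy sketch of these operations (assuming the small float matrix below, which is invertible):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])

At   = A.T                      # transpose
Ainv = np.linalg.inv(A)         # inverse (A must be non-singular)
tr   = np.trace(A)              # trace: sum of diagonal entries -> 5.0
det  = np.linalg.det(A)         # determinant -> -2.0
rank = np.linalg.matrix_rank(A) # number of independent rows/cols -> 2

assert np.allclose(A @ Ainv, np.eye(2))
```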

Sparse Matrix

Matrices that contain mostly zero values are called sparse, distinct from matrices where most of the values are non-zero, called dense.

Very large matrices require a lot of memory, and many of the very large matrices we want to work with are sparse. In practice, most large matrices are sparse: almost all entries are zeros.
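
A small sketch with SciPy's scipy.sparse.csr_matrix, which stores only the non-zero entries:

```python
import numpy as np
from scipy.sparse import csr_matrix

# A mostly-zero (sparse) matrix stored densely...
dense = np.array([
    [1, 0, 0, 1, 0, 0],
    [0, 0, 2, 0, 0, 1],
    [0, 0, 0, 2, 0, 0],
])

# ...and the same matrix in Compressed Sparse Row form.
sparse = csr_matrix(dense)
print(sparse.nnz)        # 5 non-zero values out of 18
print(sparse.toarray())  # round-trip back to dense
```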

Matrix Decompositions

Most common types of matrix decomposition:

  • LU Decomposition
  • QR Decomposition
  • Cholesky Decomposition

LU Decomposition

The LU decomposition factors a square matrix as A = L · U, where the factors L and U are lower and upper triangular matrices. It is the factorization that comes from Gaussian elimination.

LUP Decomposition
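
The LUP decomposition is the numerically stable variant of LU: the rows are re-ordered (partial pivoting) and a permutation matrix P records the re-ordering, so that A = P · L · U. A minimal sketch using SciPy's scipy.linalg.lu, which computes this pivoted form:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

# SciPy returns the pivoted (LUP) form: A = P @ L @ U,
# where P permutes rows for numerical stability.
P, L, U = lu(A)
assert np.allclose(A, P @ L @ U)
```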

QR Decomposition
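
QR factors a matrix as A = Q · R, where Q has orthonormal columns and R is upper triangular; unlike LU, it is not limited to square matrices. A minimal NumPy sketch:

```python
import numpy as np

A = np.random.rand(4, 3)        # works for rectangular matrices
Q, R = np.linalg.qr(A)          # 'reduced' mode by default

assert np.allclose(A, Q @ R)
assert np.allclose(Q.T @ Q, np.eye(3))
```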

Cholesky Decomposition

The Cholesky decomposition is for square symmetric matrices where all eigenvalues are greater than zero, so-called positive definite matrices.

A = L · Lᵀ

where L is the lower triangular matrix and Lᵀ is its transpose. Or:

A = Uᵀ · U

where U is the upper triangular matrix and Uᵀ is its transpose.
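
A minimal NumPy sketch (the example matrix is my own choice of a positive definite matrix):

```python
import numpy as np

# A symmetric positive definite matrix.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# NumPy returns the lower-triangular factor L with A = L @ L.T.
L = np.linalg.cholesky(A)
assert np.allclose(A, L @ L.T)
```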

Eigendecomposition

Eigendecomposition is one of the most widely used kinds of matrix decomposition: it decomposes a square matrix into a set of eigenvectors and eigenvalues.

Not all square matrices can be decomposed into eigenvectors and eigenvalues.

The parent matrix can be shown to be a product of the eigenvectors and eigenvalues:

A = Q · Λ · Q⁻¹

where Q is the matrix whose columns are the eigenvectors and Λ is the diagonal matrix of eigenvalues.

Almost all vectors change direction when they are multiplied by A. Certain exceptional vectors x are in the same direction as Ax; those are the eigenvectors, and for them Ax = λx, where the scalar λ is the corresponding eigenvalue.
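
A minimal NumPy sketch that recovers the eigenvectors and eigenvalues, then verifies both Ax = λx and the reconstruction A = Q · Λ · Q⁻¹:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 1.0]])

# Columns of Q are eigenvectors; w holds the eigenvalues.
w, Q = np.linalg.eig(A)

# Each eigenvector is only scaled by A: A @ v == lambda * v.
for lam, v in zip(w, Q.T):
    assert np.allclose(A @ v, lam * v)

# Reconstruct the parent matrix from the decomposition.
assert np.allclose(A, Q @ np.diag(w) @ np.linalg.inv(Q))
```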

Singular Value Decomposition (SVD)

The Singular Value Decomposition is a highlight of linear algebra.

The singular value decomposition (SVD) provides another way to factorize a matrix, into singular vectors and singular values. The SVD allows us to discover some of the same kind of information as the eigendecomposition. However, the SVD is more generally applicable.
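
The SVD factors any m × n matrix as A = U · Σ · Vᵀ, where U and V are orthogonal and Σ carries the singular values on its diagonal. A minimal NumPy sketch:

```python
import numpy as np

# SVD works for any m x n matrix, not just square ones.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, s, Vt = np.linalg.svd(A)     # s holds the singular values

# Rebuild A: place s on the diagonal of an m x n Sigma.
Sigma = np.zeros_like(A)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)
```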

Pseudoinverse
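
The pseudoinverse (also called the Moore-Penrose inverse) generalizes the matrix inverse to rectangular and singular matrices. It is computed from the SVD as A⁺ = V · Σ⁺ · Uᵀ, where Σ⁺ inverts the non-zero singular values. A minimal NumPy sketch:

```python
import numpy as np

# A tall (non-square) matrix has no ordinary inverse,
# but it does have a Moore-Penrose pseudoinverse A+.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

A_pinv = np.linalg.pinv(A)      # computed internally via the SVD

# For a full-column-rank A, A+ @ A recovers the identity.
assert np.allclose(A_pinv @ A, np.eye(2))
```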

Dimensionality Reduction
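
A common application of the SVD is dimensionality reduction: project the data onto the directions given by the top k singular vectors, which is the core idea behind PCA. A minimal sketch (my own, assuming mean-centered data and plain NumPy):

```python
import numpy as np

# Reduce 3-dimensional data to its top-2 singular directions.
rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))
Xc = X - X.mean(axis=0)          # center the data, as PCA does

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
T = Xc @ Vt[:k].T                # 10 x 2 reduced representation
print(T.shape)                   # (10, 2)
```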

