Singular Value Decomposition (SVD)

Kien Duong
4 min read · Aug 22, 2023

My blog: https://ai-research.dev/singular-value-decomposition/

1. What is SVD?

The SVD of a matrix A is the factorization of A into the product of three matrices.
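Written out, with Sigma holding the singular values on its diagonal:

A = U Sigma V^T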

U and V are orthogonal matrices, and Sigma is a diagonal matrix. Let's look at a simple example.

Based on the graph, we can see that the transformation from vector x to vector y includes two actions: “rotating” and “stretching”.

For the next step, we have an orthogonal matrix V(2x2) and a matrix A(mxn). The transformation should be:
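Applying A to the orthonormal vectors v1 and v2 rotates them onto the directions u1, u2 and stretches them by the factors sigma1, sigma2:

A v1 = sigma1 u1
A v2 = sigma2 u2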

u1, u2 are unit orthonormal vectors, and sigma1, sigma2 are the “stretching” factors. Collecting the relations A v1 = sigma1 u1 and A v2 = sigma2 u2 in matrix form, the formula should be:
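A V = U Sigma

where V = [v1 v2], U = [u1 u2] and Sigma is the diagonal matrix with sigma1, sigma2 on its diagonal.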

Because V is an orthogonal matrix, V^-1 = V^T. Therefore, the formula is:
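A = U Sigma V^T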

Theorem: Every matrix A has a Singular Value Decomposition (SVD)

Reference: https://www.cs.princeton.edu/courses/archive/spring12/cos598C/svdchapter.pdf

2. Eigen Decomposition

Given a matrix A, if we can find a scalar lambda and a nonzero vector v satisfying the equation below, then v is an eigenvector and lambda is an eigenvalue of A:
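A v = lambda v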

Back to the SVD, we have:
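Substituting A = U Sigma V^T (and using U^T U = I, V^T V = I):

A^T A = V Sigma^T Sigma V^T
A A^T = U Sigma Sigma^T U^T

So the columns of V are eigenvectors of A^T A, the columns of U are eigenvectors of A A^T, and the nonzero eigenvalues in both cases are the squared singular values sigma_i^2. This is what we use to compute the SVD by hand in the next section.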

3. Find SVD manually

3.1. Find left singular vector U
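The columns of U are the eigenvectors of A A^T. Here is a worked sketch with the example matrix A = [[3, 2, 2], [2, 3, -2]] (the same matrix used in the PyTorch example below):

A A^T = [[17, 8], [8, 17]]

det(A A^T - lambda I) = (17 - lambda)^2 - 64 = lambda^2 - 34 lambda + 225 = 0   (3.1)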

Equation (3.1) has two solutions: lambda = 25 and lambda = 9.

3.1.1. lambda = 25

Substituting lambda = 25 into A A^T - lambda I, we use the elimination method to row reduce that matrix:
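A A^T - 25 I = [[-8, 8], [8, -8]]  ->  [[1, -1], [0, 0]]

so x = y, and one eigenvector is (1, 1).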

For the next step, we will find the unit vector. A unit vector is a vector whose magnitude is exactly 1.
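The eigenvector (1, 1) has magnitude sqrt(1^2 + 1^2) = sqrt(2), so we divide it by sqrt(2).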

So the first vector of U should be
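u1 = (1/sqrt(2), 1/sqrt(2)) ≈ (0.7071, 0.7071)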

3.1.2. lambda = 9

Similarly, substituting lambda = 9 and repeating the steps above, the second vector of U should be:
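A A^T - 9 I = [[8, 8], [8, 8]] row reduces to [[1, 1], [0, 0]], so y = -x and the eigenvector direction is (1, -1). Normalizing gives:

u2 = (1/sqrt(2), -1/sqrt(2)) ≈ (0.7071, -0.7071)

(The sign of an eigenvector is a free choice; we keep this one so the final factorization comes out consistent.)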

So
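U = [[1/sqrt(2), 1/sqrt(2)], [1/sqrt(2), -1/sqrt(2)]]

The singular values are the square roots of the eigenvalues: sigma1 = sqrt(25) = 5 and sigma2 = sqrt(9) = 3, so Sigma = [[5, 0, 0], [0, 3, 0]].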

3.2. Find right singular vector V
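The columns of V are the eigenvectors of A^T A. For our example:

A^T A = [[13, 12, 2], [12, 13, -2], [2, -2, 8]]

Its nonzero eigenvalues are again 25 and 9, plus an extra eigenvalue 0, because A^T A is 3x3 while A only has rank 2.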

3.2.1. lambda = 25

Subtracting 25 I from A^T A, the matrix row reduces to:
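A^T A - 25 I = [[-12, 12, 2], [12, -12, -2], [2, -2, -17]]  ->  [[1, -1, 0], [0, 0, 1], [0, 0, 0]]

so x = y and z = 0, giving the eigenvector direction (1, 1, 0).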

So the first vector of V should be
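v1 = (1/sqrt(2), 1/sqrt(2), 0)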

3.2.2. lambda = 9
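Subtracting 9 I this time gives A^T A - 9 I = [[4, 12, 2], [12, 4, -2], [2, -2, -1]]. Row reduction leaves the relations y = -x and z = 4x, so the eigenvector direction is (1, -1, 4), with magnitude sqrt(1 + 1 + 16) = sqrt(18).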

So the second vector of V should be
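v2 = (1/sqrt(18), -1/sqrt(18), 4/sqrt(18)) ≈ (0.2357, -0.2357, 0.9428)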

3.2.3. Find last eigenvector

As above, we have already found v1 and v2, the first two columns of the orthogonal matrix V. Now we need to find the last column v3. Based on the properties of orthonormal vectors, we have:
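v1^T v3 = 0 and v2^T v3 = 0

Writing v3 = (x, y, z), these conditions give x + y = 0 and x - y + 4z = 0, so y = -x and z = -x/2. One solution is (2, -2, -1).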

The next step is to find the unit vector:
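The vector (2, -2, -1) has magnitude sqrt(4 + 4 + 1) = 3, so v3 = (2/3, -2/3, -1/3). (Since the third singular value is 0, the opposite sign would work just as well.)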

So
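V = [[1/sqrt(2), 1/sqrt(18), 2/3], [1/sqrt(2), -1/sqrt(18), -2/3], [0, 4/sqrt(18), -1/3]]

You can verify the result by multiplying the factors back together: U Sigma V^T = [[3, 2, 2], [2, 3, -2]] = A.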

4. Compute SVD using PyTorch

PyTorch is a machine learning framework based on the Torch library, used for applications such as computer vision and natural language processing, originally developed by Meta AI.

Here is the PyTorch method to compute SVD: https://pytorch.org/docs/stable/generated/torch.svd.html (the code below uses the newer torch.linalg.svd interface). Let's write some basic code to compute the SVD of the matrix from the example above:

import torch

# Create the example matrix A (2 x 3), the same matrix used in section 3
A = torch.tensor([[3., 2., 2.],
                  [2., 3., -2.]])
print('A matrix: ', A)

# Compute the SVD: U is 2x2, S holds the singular values, Vh is V^T (3x3)
U, S, Vh = torch.linalg.svd(A)

print('U: ', U)
print('S: ', S)
print('Vh: ', Vh)

The result:
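With this matrix, S should come out as the singular values [5, 3], and U and Vh should match the hand-computed U and V^T from section 3, except that PyTorch may flip the sign of a column of U together with the corresponding row of Vh (the SVD is only unique up to such sign changes).

As a quick sanity check, you can also rebuild A from the factors:

# Vh is 3x3, so keep only its first two rows to match the 2x3 shape of A
A_rebuilt = U @ torch.diag(S) @ Vh[:2, :]
print('Reconstructed A: ', A_rebuilt)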
