# Linear Algebra 101 — Part 9: Singular Value Decomposition (SVD)

Feb 10, 2019 · 5 min read

Today, we are going to study another famous decomposition called “Singular Value Decomposition (SVD)”.

We covered eigendecomposition in the past, so in case you missed it, check out the topic from the links below:

You could also learn about today’s topic from the famous and wonderful lectures by Dr. Gilbert Strang of MIT.

I strongly recommend you actually watch his videos to understand the topic in depth.

# Materials covered in this story

• The definition of SVD
• Unitary matrix
• An intuitive explanation of SVD
• How to calculate SVD

# The definition of SVD

Singular Value Decomposition (SVD) is another type of decomposition. Unlike eigendecomposition, where the matrix you want to decompose has to be square, SVD lets you decompose a rectangular matrix (a matrix with different numbers of rows and columns). This is often more useful in real-life scenarios, since rectangular matrices can represent a wide variety of data that a square matrix cannot.

First, let’s look at the definition itself. For an m × n matrix A, the SVD is

A = UΣV*

where U is an m × m unitary matrix, Σ is an m × n diagonal matrix with non-negative real entries (the singular values), and V* is the conjugate transpose of an n × n unitary matrix V.

As you can see, SVD decomposes the matrix into 3 different matrices. Two of them, U and V, are unitary matrices, which I’m going to explain in a few minutes, and the middle matrix Σ is a diagonal matrix. Note that the entries may be “real” or “complex”. We already covered the “real” case in eigendecomposition, but what happens when the matrix being decomposed contains complex numbers?
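To see the definition in action, here is a minimal NumPy sketch (assuming NumPy is available) that decomposes a small rectangular matrix and rebuilds it from the three factors:

```python
import numpy as np

# A rectangular (3 x 2) matrix -- SVD works even when A is not square.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# full_matrices=False returns the "thin" SVD: U is 3x2, s holds the
# 2 singular values, and Vt is V* (here just the transpose, since A is real).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A from the three factors: A = U @ diag(s) @ V*
A_rebuilt = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rebuilt))  # True
```

Note that NumPy returns the singular values as a 1-D array `s` rather than the full diagonal matrix Σ, so we expand it with `np.diag` before multiplying.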

Don’t worry if you are not familiar with complex numbers. They are just a way to expand the space a matrix can represent, so that it can express things real numbers cannot. We do this by introducing an imaginary number called “i” where i² = -1. There’s no “real” number that equals -1 when you multiply it by itself, right? That’s where “complex” numbers come in handy. If you want to know more about complex numbers, check the link below for more information.

Now, let’s talk about the “unitary matrix”. A unitary matrix U is, by definition, a square complex matrix satisfying

U*U = UU* = I

where U* is the conjugate transpose of U and I is the identity matrix.

If you remember the orthogonal matrix we covered before, think of a unitary matrix as the “complex number” version of an orthogonal matrix. So when you are dealing with the “real number” case, the unitary matrices resulting from SVD are nothing but ordinary orthogonal matrices.
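A quick NumPy check of that real-number case (a sketch with a randomly generated real matrix):

```python
import numpy as np

# For a real matrix, the U and V produced by SVD are orthogonal matrices:
# multiplying each by its transpose gives the identity.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)  # U is 4x4, Vt (= V transposed) is 3x3

ok_U = np.allclose(U.T @ U, np.eye(4))
ok_V = np.allclose(Vt @ Vt.T, np.eye(3))
print(ok_U, ok_V)  # True True
```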

OK, I hope you now have some idea of what the decomposed matrices are. Make sure to check multiple sources so that the concept becomes easier to grasp. I personally don’t find any single source to be the best, as it only gives you one perspective. The same goes for this article: it provides just one perspective, so combined with other materials it will hopefully cover enough angles for you to fully understand the concept.

OK, that being said, let’s dive a little bit more deeply into understanding SVD.

Since U and V are unitary, U*U = I and V*V = I, where “I” is the identity matrix. So the decomposed matrices have nice properties: “U” and “V” can each be converted into the most easily handled matrix of all, the identity matrix, simply by multiplying by its conjugate transpose. Also, as we saw, the middle matrix “Σ” only has elements on its diagonal, so it is very easy to handle too.

Maybe by now, you are thinking “Yeah, right. I kind of get the math, but what is SVD actually doing? Can you explain it more intuitively?”.

Sure thing!

Below is a simple explanation of what SVD is actually doing. You have this matrix “A”, the matrix you were decomposing using SVD. It is a transformation matrix: it maps a set of vectors into a new space. Let’s call the original orthonormal vectors the “v”s, and the transformed vectors, once normalized, the “u”s, with the stretch factors collected as the singular values σ, so that Av = σu for each pair. As a matter of fact, this is what SVD is doing: it divides the overall transformation into a transformation (“V*”), a scaling (“Σ”), and another transformation (“U”).
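Here is a small NumPy sketch of that picture: each original direction v is mapped by A onto a stretched copy of the corresponding new direction u, i.e. Av = σu holds for every singular vector pair:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(A)

# Each right singular vector v_i (a row of V*) is sent by A onto
# s_i times the left singular vector u_i (a column of U).
checks = [np.allclose(A @ Vt[i], s[i] * U[:, i]) for i in range(len(s))]
print(all(checks))  # True
```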

Finally, how do you obtain each decomposed matrix in practice? Of course, you would usually use a pre-implemented version of SVD, but if you wanted to implement it yourself, how would you do that?

Basically, the calculation below turns the problem into solving an eigendecomposition:

A*A = (UΣV*)*(UΣV*) = VΣ*ΣV*
AA* = (UΣV*)(UΣV*)* = UΣΣ*U*

So V holds the eigenvectors of A*A, U holds the eigenvectors of AA*, and the singular values are the square roots of the shared non-zero eigenvalues. Now you can just reuse the methods you used to solve the eigendecomposition to solve the problem.
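As a sketch of that route (not how production libraries actually compute SVD; they use more numerically stable algorithms), here is one way it could look in NumPy for a real matrix with full column rank. The function name `svd_via_eig` is my own for illustration:

```python
import numpy as np

def svd_via_eig(A):
    """Sketch: SVD of a real, full-column-rank matrix A via the
    eigendecomposition of A^T A."""
    # A^T A is symmetric, so eigh returns real eigenvalues and
    # orthonormal eigenvectors.
    eigvals, V = np.linalg.eigh(A.T @ A)
    # eigh returns eigenvalues in ascending order; sort descending.
    order = np.argsort(eigvals)[::-1]
    eigvals, V = eigvals[order], V[:, order]
    # Singular values are the square roots of the eigenvalues of A^T A.
    s = np.sqrt(np.clip(eigvals, 0.0, None))
    # Left singular vectors follow from A v_i = s_i u_i
    # (assumes every s_i is non-zero).
    U = (A @ V) / s
    return U, s, V

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
U, s, V = svd_via_eig(A)
print(np.allclose(A, U @ np.diag(s) @ V.T))  # True
```

Note the division `(A @ V) / s` rescales each column of AV by the matching singular value, which is exactly the normalization step described in the intuitive explanation above.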

# Summary

• The definition of SVD

A = UΣV*, where U and V are orthogonal (or, in the complex case, unitary) matrices and Σ is a diagonal matrix.

• Unitary matrix

A complex version of the orthogonal matrix.

• An intuitive explanation of SVD

A transformation, a scaling, and another transformation, divided into the three matrices U, Σ, and V*.

I hope this helps! See you next time!

## sho.jp

Let's make the cyberbrain system from Ghost in the Shell


Written by

## Sho Nakagome

A Neuroengineer and Ph.D. candidate researching Brain Computer Interface (BCI). I want to build a cyberbrain system in the future. Nice meeting you!
