My Machine Learning notes — Week 2

Kalyan Dechiraju
3 min read · Aug 5, 2018


Follow the series here.

It’s already been a week and a lot has happened. I started learning Machine Learning from the famous course by Andrew Ng on Coursera. It’s a fantastic course, but it’s lengthy and taught using Matlab. Then I stumbled upon this video by Siraj Raval, and it inspired me to change my plan to follow that curriculum instead. Have a look at the video for more details.

Let me take a step back and make my mathematical foundations strong before jumping into Machine Learning concepts.

Linear Algebra — Solving Linear Equations

Let’s get clear on the notation and representations. Our goal is to solve for n unknown values given n linear equations.

We need to find the unknowns, x and y, in the equations above.
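The post’s equations are shown as images, so as a stand-in example, assume the system x + 2y = 5 and 3x + 4y = 6. NumPy can solve such a system directly, which is handy for checking the hand-worked algorithms later in this post:

```python
import numpy as np

# Assumed example system (the post's own equations are in images):
#   x + 2y = 5
#  3x + 4y = 6
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # coefficients of x and y
b = np.array([5.0, 6.0])     # right-hand sides

solution = np.linalg.solve(A, b)  # [x, y]
```

For this assumed system the solution is x = -4, y = 4.5.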

Coefficient Matrix

In linear algebra, a coefficient matrix is a matrix consisting of the coefficients of the variables in a set of linear equations. For the equations above, the coefficient matrix looks like this:

[Image: the coefficient matrix]

Augmented Matrix

An augmented matrix is a matrix obtained by appending the columns of two given matrices.

For our example above, the augmented matrix is:

[Image: the augmented matrix]
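In code, building the two matrices is just stacking arrays. A minimal sketch, again assuming the example system x + 2y = 5, 3x + 4y = 6:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])          # coefficient matrix of the assumed system
b = np.array([[5],
              [6]])             # constants as a column vector

augmented = np.hstack([A, b])   # append b's column to A: [A | b]
# augmented:
# [[1 2 5]
#  [3 4 6]]
```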

Permutation Matrix

A permutation matrix exchanges the rows or columns of any given matrix when multiplied with it: multiplying from the left exchanges rows, and multiplying from the right exchanges columns.

[Image: a permutation matrix exchanging rows]
[Image: a permutation matrix exchanging columns]

For an n×n matrix, there are n! (n factorial) permutation matrices.
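A quick sketch of both directions with an assumed 2×2 example (P is the only non-identity 2×2 permutation matrix):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
P = np.array([[0, 1],
              [1, 0]])   # permutation matrix swapping positions 1 and 2

rows_swapped = P @ A     # left-multiplying exchanges rows:    [[3, 4], [1, 2]]
cols_swapped = A @ P     # right-multiplying exchanges columns: [[2, 1], [4, 3]]
```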

Inverse Matrix

A matrix that, when multiplied with a matrix E, results in the identity matrix I is called the inverse of E.

The inverse of a matrix E exists only if E is a non-singular matrix (i.e., its determinant is non-zero). More on singular matrices here.
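A short sketch with an assumed non-singular 2×2 matrix, checking that the product with its inverse really gives the identity:

```python
import numpy as np

E = np.array([[2.0, 1.0],
              [1.0, 1.0]])      # non-singular: det(E) = 1

E_inv = np.linalg.inv(E)        # inverse of E
identity = E @ E_inv            # multiplying E by its inverse gives I

# For a singular matrix (det == 0), np.linalg.inv raises LinAlgError
# instead of returning a result.
```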

The point of all these concepts is to arrive at an algorithm that can solve a set of linear equations.

Gaussian Elimination

This is the most widely used algorithm for solving sets of linear equations in software tools. It is quite hard to explain fully in a Medium post, but to put it simply, the process involves two steps.

  1. Forward elimination: reduce the system to row echelon form. From this form one can tell whether the system has no solution, a unique solution, or infinitely many solutions.
  2. Back substitution: solve for the unknowns from the bottom row up (equivalently, continue the reduction to reduced row echelon form).

Learn more about the algorithm here.
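The two steps above can be sketched directly in NumPy. This is a minimal illustration, assuming a square, non-singular system, with partial pivoting added for numerical stability:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by forward elimination, then back substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)

    # Step 1 - forward elimination: reduce A to row echelon form.
    for k in range(n - 1):
        # Partial pivoting: move the largest entry in the column to row k.
        pivot = k + np.argmax(np.abs(A[k:, k]))
        A[[k, pivot]] = A[[pivot, k]]
        b[[k, pivot]] = b[[pivot, k]]
        for i in range(k + 1, n):
            factor = A[i, k] / A[k, k]
            A[i, k:] -= factor * A[k, k:]
            b[i] -= factor * b[k]

    # Step 2 - back substitution: solve the triangular system bottom-up.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Assumed example system: x + 2y = 5, 3x + 4y = 6.
x = gaussian_elimination(np.array([[1.0, 2.0], [3.0, 4.0]]),
                         np.array([5.0, 6.0]))
```

For the assumed system this returns x = -4, y = 4.5, matching what `np.linalg.solve` gives.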

Gauss-Jordan Elimination

This is a modified version of Gaussian Elimination that continues the reduction all the way to reduced row echelon form, so the solutions can be read off directly without a separate back-substitution step.

More about the algorithm here.
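A sketch of the difference: instead of stopping at a triangular form, Gauss-Jordan normalizes each pivot to 1 and clears its entire column, leaving [I | x]. Again this assumes a square, non-singular system:

```python
import numpy as np

def gauss_jordan(A, b):
    """Solve Ax = b by reducing the augmented matrix [A | b]
    all the way to reduced row echelon form."""
    M = np.hstack([A.astype(float), b.astype(float).reshape(-1, 1)])
    n = A.shape[0]
    for k in range(n):
        # Partial pivoting: move the largest entry in the column to row k.
        pivot = k + np.argmax(np.abs(M[k:, k]))
        M[[k, pivot]] = M[[pivot, k]]
        M[k] /= M[k, k]                 # scale the row so the pivot is 1
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]  # clear every other entry in column k
    return M[:, -1]                      # M is now [I | x]; read off x

# Assumed example system: x + 2y = 5, 3x + 4y = 6.
x = gauss_jordan(np.array([[1.0, 2.0], [3.0, 4.0]]),
                 np.array([5.0, 6.0]))
```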

That’s it for this week. To be continued…

Follow this series here: My Machine Learning Notes
