# Index:

• Basic Rules Of Derivation.
• Gradient Descent With One Variable.
• Gradient Descent With Two Variables.
• Gradient Descent For Mean Squared Error Function.

Gradient Descent is a machine learning algorithm that iteratively finds the optimal values for its parameters. It takes into account a user-defined learning rate and initial parameter values.

• Calculate the cost.
• Update the parameter values using the update rule.
• Repeat until the cost function is minimized.

Generally, we find a formula that gives us the optimal values for our parameters directly. But this algorithm finds the values by itself! …
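As a concrete illustration of the steps above, here is a minimal one-variable sketch; the cost function f(x) = x², the learning rate, and the starting point are assumptions chosen purely for the example:

```python
# Minimal gradient descent for one variable, minimizing f(x) = x**2.
# The cost function, learning rate, and starting point are illustrative
# assumptions, not values from the article.

def gradient_descent(start, learning_rate=0.1, iterations=100):
    x = start
    for _ in range(iterations):
        grad = 2 * x                   # derivative of f(x) = x**2
        x = x - learning_rate * grad   # update rule: step against the gradient
    return x

x_min = gradient_descent(start=5.0)
# x_min converges toward the true minimum at x = 0
```

Each iteration moves x a little downhill; the learning rate controls how big that step is.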

# Definition :

A sine wave, or sinusoidal wave, is a mathematical curve that describes a smooth periodic oscillation. A sine wave is continuous and completes one full cycle every 360 degrees.
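This periodicity is easy to check numerically; the 30° sample angle below is arbitrary:

```python
import math

# sin repeats every 360 degrees: sin(x) == sin(x + 360 degrees)
angle = math.radians(30)
full_turn = math.radians(360)
same = math.isclose(math.sin(angle), math.sin(angle + full_turn))
# same is True: one full turn brings the wave back to the same value
```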

# A Comprehensive Guide To Logarithmic Regression

A function that increases or decreases rapidly at first, but then steadily slows as time goes on, can be called a logarithmic function.

For example, we can say that the number of cases in the ongoing COVID-19 pandemic follows a roughly logarithmic pattern: the number of cases grew very fast in the beginning and is now slowing somewhat.

The logarithmic function is defined as…

Where:

Y = Output feature

X = Input feature

a = Vertical shift; the curve always passes through (1, a)

b = Controls the rate of growth/decay
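Putting the parameter descriptions together, a common form of the logarithmic model is Y = a + b·ln(X); this exact form is my assumption, since the equation itself is not shown here, but it matches the descriptions: since ln(1) = 0, the curve always passes through (1, a).

```python
import math

# Logarithmic model Y = a + b * ln(X) -- this form is assumed here;
# a shifts the curve vertically, b controls the rate of growth/decay.
def log_model(x, a, b):
    return a + b * math.log(x)

# The curve passes through (1, a), because ln(1) = 0:
y_at_one = log_model(1, a=100, b=5)   # equals 100.0 regardless of b
```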

# Features of the Logarithmic Function:

Example:

Here you can see that our graph passes through the point (1, a), that is, (1, 100). …

# Complete Guide On Linear Regression Vs. Polynomial Regression With Implementation In Python

In my previous articles I wrote about how we can plot a regression line for our dataset. That was cool, right! But there was also a problem: we were getting lower accuracy for our model. Our ultimate goal is always to build a model with maximum accuracy and minimum error. So in this article we'll see how we can implement polynomial regression that best fits our data by using curves.

Before going there, here are some basic polynomial functions with their graphs plotted. This will help you decide which polynomial to use for a specific dataset.
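To make the linear-vs-polynomial comparison concrete, here is a small sketch that fits both a straight line and a quadratic to the same points with NumPy; the sample data is made up for illustration:

```python
import numpy as np

# Toy data that follows a curve, so a straight line underfits it.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = x ** 2  # [0, 1, 4, 9, 16]

linear = np.polyfit(x, y, deg=1)     # degree-1 (straight line) fit
quadratic = np.polyfit(x, y, deg=2)  # degree-2 (polynomial) fit

# The quadratic recovers y = x**2 almost exactly; the line does not.
pred_quadratic = np.polyval(quadratic, 5.0)  # close to 25.0
pred_linear = np.polyval(linear, 5.0)        # noticeably off from 25.0
```

Comparing the two predictions on a new point shows why matching the polynomial degree to the shape of the data matters.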

Enjoy the article!

# Multivariable Linear Regression Using Normal Equation

## Prediction based on more than one feature.

Hey guys, in my previous articles I showed you how we can implement Simple Linear Regression using the “Normal Equation” in Python. Now what does Simple Linear Regression mean? It means that we predict the value of the output using only one feature from our dataset. It's sometimes hard to predict the value from only one feature, right? That's why we need Multivariable Linear Regression. Now what happens in Multivariable Linear Regression? In MLR we predict the output value from more than one input feature. For example, we can predict the price of a house using its area, neighbourhood, number of bedrooms, etc. …
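As a sketch of how the normal equation handles several input features at once (the toy house-price numbers below are invented purely for illustration):

```python
import numpy as np

# Toy multivariable data: each row is [1, area, bedrooms] for one house.
# The leading column of 1s is the intercept term. All numbers are made up.
X = np.array([[1.0,  50.0, 1.0],
              [1.0,  80.0, 2.0],
              [1.0, 120.0, 3.0],
              [1.0, 200.0, 4.0]])
# Prices generated from theta = [10, 2, 40], so the fit can recover it exactly.
y = np.array([150.0, 250.0, 370.0, 570.0])

# Normal equation: theta = (X^T X)^(-1) X^T y
theta = np.linalg.inv(X.T @ X) @ X.T @ y

# Predict the price of a new house with area 100 and 2 bedrooms.
prediction = np.array([1.0, 100.0, 2.0]) @ theta
```

One matrix expression produces all the parameters at once, no matter how many input features there are.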

# Implementation of Simple Linear Regression Using Normal Equation(Matrices)

In the last article we saw how we can derive the Normal Equation. So in this article we are going to solve the Simple Linear Regression problem using Normal Equation.

The Normal Equation uses matrices to find the slope and intercept of the best-fit line. If you have read my previous articles, then you might know that we have already implemented Simple Linear Regression using the sklearn library in Python and by building a function from scratch. …

# Linear Regression With Normal Equation Complete Derivation (Matrices)

The Normal Equation is an analytic approach to Linear Regression with a least-squares cost function. We can directly find the value of θ without using Gradient Descent. This approach is an effective and time-saving option when we are working with a dataset with a small number of features.

The Normal Equation is as follows:

In the above equation :

θ : hypothesis parameters that best define the model

X : input feature value of each instance

Y : output value of each instance
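For reference, the standard least-squares solution that these symbols describe is:

```latex
\theta = (X^{T} X)^{-1} X^{T} Y
```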

# Derivation Of Normal Equation:

(1) Hypothesis function :

Where,

n : number of features in the dataset.

X0 = 1 (for vector multiplication)
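Written out, the hypothesis function described above, with X0 = 1 so that it can be expressed as a vector product, takes the standard linear form:

```latex
h_{\theta}(x) = \theta_{0}x_{0} + \theta_{1}x_{1} + \dots + \theta_{n}x_{n} = \theta^{T}x
```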

(2) Vector θ…

# Introduction To Matrices (For Machine Learning)

In this article I’m going to show some basic operations that we can perform on matrices like addition, multiplication, adjoint and so on.

It’s very important for us to know how these operations work, as we’re going to use matrices and their concepts a lot in machine learning algorithms.

These are just basic concepts that you might already have learned in high school. So let’s get started.

# PART : 1

An m-by-n matrix is a rectangular array of numbers with m rows and n columns.

(1) A 2×2 matrix with 2 rows and 2 columns:

(2) A 2×3 matrix with 2 rows and 3 columns…
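The basic operations mentioned above can be tried directly in Python; here is a small sketch with NumPy, using arbitrary example matrices:

```python
import numpy as np

# Two 2x2 matrices (2 rows, 2 columns each); the values are arbitrary.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Element-wise addition: both matrices must have the same shape.
addition = A + B        # [[6, 8], [10, 12]]

# Matrix multiplication: rows of A times columns of B.
product = A @ B         # [[19, 22], [43, 50]]
```

Addition works entry by entry, while multiplication combines whole rows with whole columns, which is why the shapes must be compatible.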

# Error Calculation Techniques For Linear Regression

## I hope you and your model are doing well!

There are various methods to calculate the accuracy of our model, and in this article I’m going to cover a few of them. We are not going to use Python libraries to calculate the accuracy of our models; instead, we are going to do it from scratch. I hope you guys enjoy it.
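One such error measure is Mean Squared Error, written here from scratch without any libraries; the sample values are invented for the example:

```python
# Mean Squared Error (MSE) from scratch: the average of squared residuals
# between the actual values and the model's predictions.
def mean_squared_error(actual, predicted):
    residuals = [(a - p) ** 2 for a, p in zip(actual, predicted)]
    return sum(residuals) / len(residuals)

actual = [3.0, 5.0, 7.0]
predicted = [2.5, 5.0, 8.0]
error = mean_squared_error(actual, predicted)  # (0.25 + 0 + 1) / 3
```

Squaring the residuals keeps positive and negative errors from cancelling out and penalizes large misses more heavily.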

In this article I’m going to use a user-defined function to calculate the slope and intercept of a regression line. So if you haven’t read my previous article about its derivation, then I think it’ll be best for you to start with that. …

# Linear Regression With Gradient Descent From Scratch

In the last article we saw how the formula for finding the regression line with gradient descent works. In that article we started with a basic cost function and then made our way to our actual cost function, Mean Squared Error (MSE). I recommend reading that article first, if you haven’t already!

All my articles are available on my blog : patrickstar0110.blogspot.com

Here we will use a slightly different cost function than MSE: the basic Sum of Squared Residuals (SSR) function, to find the optimal parameter values. First we’ll compute the parameters for one iteration by hand. …
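To show what one iteration looks like with the SSR cost, here is a minimal sketch; the data points, starting values, and learning rate are all assumptions chosen for illustration:

```python
# One gradient-descent step on the Sum of Squared Residuals (SSR) cost
# for a line y = slope * x + intercept. The data and learning rate are
# illustrative assumptions, not values from the article.
x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]

slope, intercept = 0.0, 0.0   # initial parameter values
learning_rate = 0.01

# Partial derivatives of SSR = sum((y_i - (slope * x_i + intercept))**2)
residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
d_slope = sum(-2 * xi * r for xi, r in zip(x, residuals))
d_intercept = sum(-2 * r for r in residuals)

# Update rule: step each parameter against its gradient.
slope -= learning_rate * d_slope
intercept -= learning_rate * d_intercept
# After one step: slope = 0.56, intercept = 0.24
```

Repeating these two steps, computing the gradients and updating the parameters, is all the full algorithm does.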