Linear Regression, Gradient Descent, Model Regularization

Ibrahim Kovan
Published in Geek Culture · Jul 12, 2021 · 11 min read


This article breaks down linear models for regression deductively, starting from the most basic approach and working through the underlying mathematics. The core techniques and their main alternatives are explained mathematically and implemented in code. In short, the goal is to show how to design the optimal linear model for a given project. The article is supported by examples throughout, so readers can build their own approach to the subject regardless of their level of knowledge.

Table of Contents
- Linear Regression
  - Mean Square Error
  - Singular Value Decomposition
- Gradient Descent
  - Batch Gradient Descent
  - Stochastic Gradient Descent
  - Mini-Batch Gradient Descent
- Model Regularization
  - Ridge Regression (L2 Regularization)
  - Lasso Regression (L1 Regularization)
  - ElasticNet
- Polynomial Regression

Linear Regression

Linear models are among the models most frequently used in practice. Although they are mostly used for regression, they can also be used for classification. This article focuses on linear models for regression. A linear model predicts the output as a linear function of the input features. Let's start the topic with a question from high school: what is the relationship between the following points?

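Before getting into the math, here is a minimal sketch of answering that kind of question with scikit-learn. The sample points below are made up for illustration (the original figure is not reproduced here); they roughly follow the line y = 2x + 1, and fitting a linear model recovers that slope and intercept:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical sample points, roughly following y = 2x + 1 with some noise
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([3.1, 4.9, 7.2, 8.8, 11.1])

# Fit a straight line through the points by ordinary least squares
model = LinearRegression()
model.fit(X, y)

print(f"slope:     {model.coef_[0]:.2f}")    # ~2.0
print(f"intercept: {model.intercept_:.2f}")  # ~1.0
```

The fitted slope and intercept are exactly the "relationship between the points" the question asks for; the rest of the article unpacks how such a fit is computed.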