
Animations of Multiple Linear Regression with Python

Tobias Roeschl
Towards Data Science
7 min read · Oct 7, 2020

In this article, we aim to extend our gradient descent visualizations to multiple linear regression. This is the follow-up to “Gradient Descent Animation: 1. Simple linear regression”. As before, our goal is to set up a model and fit it to our training data using batch gradient descent, storing the parameter values for each epoch. Afterwards, we can use the stored data to create animations with Python’s celluloid module.

This is the revised version of an article about the same topic I uploaded on July 20th. Key improvements include the cover photo and some of the animations.

Setting up the model

Multiple linear regression is an extended version of simple linear regression in which more than one predictor variable X is used to predict a single dependent variable Y. With n predictor variables, this can be mathematically expressed as

ŷ = w₁x₁ + w₂x₂ + … + wₙxₙ + b

with the weights w₁, …, wₙ and the predictor variables x₁, …, xₙ, and b representing the y-intercept (‘bias’) of our regression plane. Our objective is to find the hyperplane which minimizes the mean squared distance between the training data points and that hyperplane. Gradient descent enables us to determine the optimal values for our model parameters θ, consisting of our weights w and our bias term b, to minimize the mean squared error between observed data points y and data points we predicted with our regression model (ŷ). During training, we aim to update our parameter values according to the…
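Batch gradient descent on this model can be sketched as follows; the synthetic data, learning rate, and epoch count are illustrative choices for this sketch, not taken from the article:

```python
import numpy as np

# Synthetic training data: 100 samples, 2 predictor variables,
# generated from true weights (3, -2) and bias 1 (no noise).
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 1.0

w = np.zeros(X.shape[1])  # weights w
b = 0.0                   # bias term b
lr = 0.1                  # learning rate
history = []              # parameter values per epoch, for the animation

for epoch in range(200):
    y_hat = X @ w + b                   # model predictions ŷ
    error = y_hat - y
    grad_w = 2 / len(y) * X.T @ error   # ∂MSE/∂w
    grad_b = 2 / len(y) * error.sum()   # ∂MSE/∂b
    w -= lr * grad_w                    # update step for the weights
    b -= lr * grad_b                    # update step for the bias
    history.append((w.copy(), b))

print(np.round(w, 2), round(b, 2))      # ≈ [ 3. -2.] 1.0
```

Storing a copy of the parameters in `history` at every epoch is what later allows each gradient descent step to become one frame of the animation.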
