How to Implement Logistic Regression with PyTorch
Understand Logistic Regression and sharpen your PyTorch skills
To better understand what we are going to do next, you can read my previous article about logistic regression.
So, what’s our plan for implementing Logistic Regression with PyTorch?
Let’s first think about the underlying math we want to use.
There are many ways to define a loss function and then find its optimal parameters. Among them, our LogisticRegression class will implement the following three ways of learning the parameters:
- We will rewrite the logistic regression equation so that it becomes a least-squares linear regression problem with different labels, and then use the closed-form formula to find the weights (sketched in the code below).
- As above, we turn logistic regression into least-squares linear regression, but instead of the closed-form formula we use stochastic gradient descent (SGD) to minimize the resulting least-squares loss (also sketched below).
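To make these two least-squares variants concrete before building the full class, here is a minimal sketch, not the article’s final implementation: it assumes the {0, 1} labels are clipped away from 0 and 1 and mapped to log-odds targets (one common way to obtain the “different labels” mentioned above), and the helper names to_logit_targets, closed_form_weights, and sgd_weights are illustrative only.

```python
import torch

# Hypothetical sketch: turn logistic regression into least-squares linear
# regression by regressing on clipped log-odds targets instead of {0, 1}.

def to_logit_targets(y, eps=0.1):
    # Map labels {0, 1} strictly inside (0, 1), then take the log-odds,
    # so that the sigmoid is "inverted" and the model becomes linear.
    p = y.float().clamp(eps, 1 - eps)
    return torch.log(p / (1 - p))

def closed_form_weights(X, y):
    # Way 1: closed-form least-squares solution, w = (X^T X)^{-1} X^T y_hat.
    y_hat = to_logit_targets(y)
    return torch.linalg.lstsq(X, y_hat.unsqueeze(1)).solution.squeeze(1)

def sgd_weights(X, y, lr=0.01, epochs=100):
    # Way 2: minimize the mean squared error (1/n) * ||X w - y_hat||^2 with SGD.
    y_hat = to_logit_targets(y)
    w = torch.zeros(X.shape[1], requires_grad=True)
    optimizer = torch.optim.SGD([w], lr=lr)
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = torch.mean((X @ w - y_hat) ** 2)
        loss.backward()
        optimizer.step()
    return w.detach()
```

With a float feature matrix X of shape (n_samples, n_features) and binary labels y, both helpers return a weight vector of length n_features; the LogisticRegression class we build will wrap the same ideas into its methods.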