# Data Science (Python) :: Polynomial Linear Regression

The intention of this post is to give a quick refresher on Polynomial Linear Regression (using Python), so it assumes you are already familiar with the topic. You can treat this as an FAQ as well.

What kind of linear regression is Polynomial Linear Regression?
It is still a linear regression: the model remains linear in its coefficients (b0, b1, …). Only the features are non-linear (powers of x1), which is why it can fit curved, non-linear data.

*************************************************

What kind of datasets are suited for Polynomial Linear Regression?
Datasets whose data points don't appear to fit a linear model. For example, when the dataset is plotted and the graph looks more like a curved hockey stick than a straight line, polynomial regression is recommended.
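As a rough sketch of what such "curved" data looks like in practice (the synthetic data and variable names below are my own assumptions, not from the post), a degree-3 polynomial fit explains the variance far better than a straight line:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.preprocessing import PolynomialFeatures

# Synthetic "hockey stick" data: cubic growth plus a little noise (assumption)
rng = np.random.default_rng(0)
X = np.linspace(1, 10, 50).reshape(-1, 1)
y = 0.5 * X.ravel() ** 3 + rng.normal(scale=5.0, size=50)

# Compare a plain straight-line fit against a degree-3 polynomial fit
r2_line = r2_score(y, LinearRegression().fit(X, y).predict(X))
X_poly = PolynomialFeatures(degree=3).fit_transform(X)
r2_poly = r2_score(y, LinearRegression().fit(X_poly, y).predict(X_poly))
print(r2_line, r2_poly)  # the polynomial fit scores noticeably higher
```

Whenever the straight-line R² lags well behind the polynomial one like this, the curve-shaped trend is a good hint that polynomial features are worth trying.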

*************************************************

Sample code for adding polynomial features of degree 3 and coming up with a regression model?
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
var_poly_Reg = PolynomialFeatures(degree = 3)
var_X_poly = var_poly_Reg.fit_transform(var_X) # var_X is the original matrix with just 1 column
lin_Reg_PolyModel = LinearRegression()
lin_Reg_PolyModel.fit(var_X_poly, var_y)
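A quick usage sketch of the same pattern (the toy data and the 6.5 query value are assumptions for illustration): the key point is that a new feature value must pass through the same fitted PolynomialFeatures transformer before calling predict.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Toy data standing in for var_X / var_y: a perfectly cubic relationship
var_X = np.arange(1, 11).reshape(-1, 1)
var_y = var_X.ravel() ** 3

var_poly_Reg = PolynomialFeatures(degree=3)
var_X_poly = var_poly_Reg.fit_transform(var_X)
lin_Reg_PolyModel = LinearRegression()
lin_Reg_PolyModel.fit(var_X_poly, var_y)

# New values go through the SAME fitted transformer, then predict
new_x = np.array([[6.5]])
pred = lin_Reg_PolyModel.predict(var_poly_Reg.transform(new_x))
print(pred[0])  # close to 6.5 ** 3 = 274.625, since the data is exactly cubic
```

Passing the raw `new_x` straight to `predict` would fail (wrong number of columns), which is a common first-timer mistake with this API.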

*************************************************

Why do we need to plot a polynomial regression model with feature values that differ only by a small margin? (For example, why is it better to plot for 1, 1.1, 1.2 than for 1, 2, 4, …?)
When we predict over a finer grid of values, the plotted curve follows the model's actual shape instead of straight-line segments between the original points. This makes it easier to judge whether we need to raise or lower the degree of the polynomial. In technical terms, we get a higher-resolution, much smoother graph.
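A minimal sketch of building such a fine-grained grid (the step of 0.1 and the range are assumptions for illustration); the commented lines show where a fitted model and matplotlib would come in:

```python
import numpy as np

# Fine-grained grid: points 0.1 apart instead of 1.0 apart
X_grid = np.arange(1.0, 10.0, 0.1).reshape(-1, 1)

# With a fitted model (variable names assumed from the earlier snippet):
# y_grid = lin_Reg_PolyModel.predict(var_poly_Reg.transform(X_grid))
# plt.plot(X_grid, y_grid)  # renders a smooth curve, not straight segments
print(X_grid.shape)
```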

*************************************************

Sample Polynomial Expression?
y = b0 + b1x1 + b2x1² + b3x1³ + … + bnx1ⁿ
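That expression is exactly what PolynomialFeatures materialises column by column, which is easy to verify (degree 3 and the input value 2 are chosen here purely for illustration):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# For x1 = 2 and degree 3, the generated columns are [1, x1, x1^2, x1^3]
cols = PolynomialFeatures(degree=3).fit_transform(np.array([[2.0]]))
print(cols)  # [[1. 2. 4. 8.]]
```

LinearRegression then learns one coefficient (b0 … b3) per column, which is why the overall model stays linear in its coefficients even though the curve it draws is not a straight line.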