Bias Variance Trade-off in Machine Learning — Explained

Priya Varshini G
Published in Analytics Vidhya
4 min read · Jun 29, 2020


A model's performance is judged by its ability to make predictions on unseen data, i.e. the test data. One way to assess a model's accuracy is to account for the bias and variance in the model. In this article, we will learn how the bias-variance trade-off plays a vital role in determining a model's prediction accuracy. Before diving into prediction accuracy, it is vital to understand what an error is in the context of model performance, and what its various types are.

Prediction Error (Total Error):

The prediction error helps in assessing a model's future performance. Error is defined as the difference between the actual output and the predicted output, and it can be decomposed into the sum of two fundamental quantities:

Total Error = Irreducible Error + Reducible Error
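The total prediction error is typically measured on held-out test data. A minimal sketch, assuming scikit-learn and a toy regression task (the dataset, noise level, and model choice here are all illustrative, not from the article):

```python
# A minimal sketch of measuring total prediction error (MSE) on test data.
# The synthetic dataset and the linear model are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
# Added noise is the source of the irreducible part of the error.
y = np.sin(X).ravel() + rng.normal(0, 0.3, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)

# Test MSE approximates the total error: irreducible + reducible.
test_error = mean_squared_error(y_test, model.predict(X_test))
print(f"Test MSE (total prediction error): {test_error:.3f}")
```

Even a perfect model would not drive this test MSE to zero, because of the noise term baked into `y` — that floor is the irreducible error discussed next.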

Irreducible Error:

As the name suggests, the irreducible error cannot be reduced regardless of which algorithm we choose. It is inherent to the problem itself rather than to the model, and stems from factors such as noise in the data and unknown variables that influence the mapping from the input variables to the output.

Reducible Error:

This error can be further broken down into bias error and variance error, which are the key topics of this article.
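The split of reducible error into bias² and variance can be estimated empirically by refitting the same model on many independently drawn training sets and comparing its average prediction to the true function. A hedged sketch, assuming a known synthetic ground truth (the true function, sample sizes, and depth-2 tree below are illustrative choices, not the article's):

```python
# Empirical bias^2 / variance decomposition by repeated refitting.
# All dataset and model parameters here are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)

def true_fn(x):
    """Known ground-truth function (only possible with synthetic data)."""
    return np.sin(x)

x_test = np.linspace(-3, 3, 50)
n_rounds, n_train, noise_sd = 200, 60, 0.3

# Fit the same model class on many freshly sampled training sets.
preds = np.empty((n_rounds, x_test.size))
for i in range(n_rounds):
    x_tr = rng.uniform(-3, 3, n_train)
    y_tr = true_fn(x_tr) + rng.normal(0, noise_sd, n_train)
    model = DecisionTreeRegressor(max_depth=2).fit(x_tr.reshape(-1, 1), y_tr)
    preds[i] = model.predict(x_test.reshape(-1, 1))

# Bias^2: squared gap between the average prediction and the truth.
bias_sq = np.mean((preds.mean(axis=0) - true_fn(x_test)) ** 2)
# Variance: how much predictions fluctuate across training sets.
variance = np.mean(preds.var(axis=0))
print(f"bias^2 ≈ {bias_sq:.3f}, variance ≈ {variance:.3f}")
```

Increasing `max_depth` in this sketch would typically shrink the bias² term while inflating the variance term, which is exactly the trade-off the rest of the article explores.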
