Bias-Variance Trade-Off


Overview:

In supervised machine learning, an algorithm learns a model from training data. The prediction error of any such model can be broken down into three components:

  1. Bias Error
  2. Variance Error
  3. Irreducible Error
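For a model trained with squared-error loss, these three components combine additively. A standard way to write the decomposition of the expected error at a point x is:

E[(y - \hat{f}(x))^2] = \mathrm{Bias}[\hat{f}(x)]^2 + \mathrm{Var}[\hat{f}(x)] + \sigma^2

where \hat{f} is the learned model and \sigma^2 is the irreducible noise in the data. A better model or more data can shrink the first two terms; nothing can remove \sigma^2.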

Bias Error:

Bias is the set of simplifying assumptions made by a model to make the target function easier to learn. A high-bias model oversimplifies the problem and tends to underfit.

Variance Error:

Variance is the amount by which the estimate of the target function would change if a different training dataset were used. A high-variance model is very sensitive to the specifics of the training data and tends to overfit.

Trade-Off:

The goal of a good machine learning algorithm is to achieve both low bias and low variance, and hence good prediction performance.

  1. Linear models (e.g. linear regression) typically have high bias and low variance.
  2. Non-linear models (e.g. high-degree polynomials or deep decision trees) typically have low bias but high variance, as the sketch below illustrates.
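As a rough illustration (not from the original post), the following sketch uses scikit-learn to estimate bias and variance empirically: it repeatedly fits a degree-1 (linear) and a degree-12 polynomial to freshly resampled noisy data and measures how much the predictions move around. The target function, noise level, degrees and sample sizes are arbitrary choices for the demo.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def true_f(x):
    # Arbitrary non-linear target function for the demo.
    return np.sin(2 * np.pi * x)

x_test = np.linspace(0, 1, 50)
n_repeats, n_train, noise = 200, 30, 0.3

for degree in (1, 12):  # degree 1 ~ high bias, degree 12 ~ high variance
    preds = np.empty((n_repeats, x_test.size))
    for i in range(n_repeats):
        # Draw a fresh training sample each repetition.
        x_train = rng.uniform(0, 1, n_train)
        y_train = true_f(x_train) + rng.normal(0, noise, n_train)
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        model.fit(x_train.reshape(-1, 1), y_train)
        preds[i] = model.predict(x_test.reshape(-1, 1))

    avg_pred = preds.mean(axis=0)
    bias_sq = np.mean((avg_pred - true_f(x_test)) ** 2)   # squared bias of the average fit
    variance = np.mean(preds.var(axis=0))                 # spread of fits across datasets
    print(f"degree={degree:2d}  bias^2={bias_sq:.3f}  variance={variance:.3f}")

On a typical run the linear fit shows a large bias term and a tiny variance term, while the high-degree fit shows the opposite, which is exactly the trade-off described above.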

Solutions for Treating the Errors:

High Bias:

  • Use a more complex model (e.g. kernelize, or switch to a non-linear model).
  • Add more features to your model.
  • Perform boosting (see the sketch after these lists).

High Variance:

  • Add more training data.
  • Reduce model complexity (or add regularization).
  • Perform bagging (see the sketch after these lists).
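As one possible illustration of the last bullet in each list (again not from the original post), scikit-learn's GradientBoostingRegressor and BaggingRegressor can stand in for boosting and bagging. The dataset and hyperparameters below are placeholder choices for the demo.

import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor
from sklearn.linear_model import LinearRegression

# Synthetic non-linear regression problem.
X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)

models = {
    # High-bias baseline vs. a boosted tree ensemble.
    "linear (high bias)": LinearRegression(),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
    # High-variance baseline vs. a bagged ensemble of the same trees.
    "deep tree (high variance)": DecisionTreeRegressor(random_state=0),
    "bagged trees": BaggingRegressor(DecisionTreeRegressor(random_state=0),
                                     n_estimators=100, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name:28s} mean R^2 = {scores.mean():.3f}")

On this kind of problem the boosted ensemble typically outperforms the high-bias linear baseline, and bagging smooths out the deep tree's variance; on your own data the right remedy depends on which error term dominates.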

Summary

In this post, you discovered bias, variance and the bias-variance trade-off for machine learning algorithms.

  • Bias is the simplifying assumptions made by the model to make the target function easier to approximate.
  • Variance is the amount that the estimate of the target function will change given different training data.
  • The trade-off is the tension between the error introduced by bias and the error introduced by variance.
  • How to deal with high bias and high variance in practice.
