Linear Discriminant Analysis

Srishti Sawla
3 min read · Jun 5, 2018


Linear Discriminant Analysis (LDA) is a very common technique for supervised classification problems. Let's understand together what LDA is and how it works.

What is Linear Discriminant Analysis?

Linear Discriminant Analysis is a dimensionality reduction technique used as a preprocessing step in Machine Learning and pattern classification applications.

The main goal of dimensionality reduction techniques is to reduce the number of dimensions by removing redundant and dependent features, transforming the data from a higher-dimensional space to a lower-dimensional one.

Linear Discriminant Analysis is a supervised technique: it takes class labels into consideration. This category of dimensionality reduction is used in biometrics, bioinformatics and chemistry.

How does Linear Discriminant Analysis Work?

The goal of Linear Discriminant Analysis is to project the features from a higher-dimensional space onto a lower-dimensional space.

This can be achieved in three steps:

The first step is to calculate the separability between the different classes (i.e. the distance between the means of the different classes), also called the between-class variance.

The second step is to calculate the distance between the mean and the samples of each class, which is called the within-class variance.

The third step is to construct the lower-dimensional space that maximizes the between-class variance and minimizes the within-class variance. Let P denote this lower-dimensional projection; the ratio of between-class to within-class variance that P maximizes is called Fisher's criterion. A minimal sketch of these three steps is shown below.
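To make the three steps concrete, here is a minimal NumPy sketch of the projection. The names (X, y, n_components, lda_projection) are illustrative choices, not part of the original article, and the code is a from-scratch sketch rather than a production implementation.

```python
import numpy as np

def lda_projection(X, y, n_components=1):
    """Project X onto the directions that maximize between-class
    variance relative to within-class variance (Fisher's criterion)."""
    n_features = X.shape[1]
    overall_mean = X.mean(axis=0)

    # Step 1: between-class scatter (separability of the class means)
    S_b = np.zeros((n_features, n_features))
    # Step 2: within-class scatter (spread of samples around their class mean)
    S_w = np.zeros((n_features, n_features))
    for c in np.unique(y):
        X_c = X[y == c]
        mean_c = X_c.mean(axis=0)
        diff = (mean_c - overall_mean).reshape(-1, 1)
        S_b += len(X_c) * diff @ diff.T
        S_w += (X_c - mean_c).T @ (X_c - mean_c)

    # Step 3: directions that maximize between-class variance
    # relative to within-class variance (eigenvectors of S_w^-1 S_b)
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(eigvals.real)[::-1]
    P = eigvecs[:, order[:n_components]].real  # projection matrix P

    return X @ P  # data in the lower-dimensional space
```

The eigenvectors with the largest eigenvalues are the directions along which the classes are best separated, so keeping the top few of them gives the lower-dimensional space described above.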

Extensions to LDA:

Linear Discriminant Analysis is a simple and effective method for classification. Because it is simple and so well understood, there are many extensions and variations to the method. Some popular extensions include the following (a short usage sketch follows the list):

  • Quadratic Discriminant Analysis (QDA): Each class uses its own estimate of variance (or covariance when there are multiple input variables).
  • Flexible Discriminant Analysis (FDA): Non-linear combinations of the inputs are used, such as splines.
  • Regularized Discriminant Analysis (RDA): Introduces regularization into the estimate of the variance (actually covariance), moderating the influence of different variables on LDA.
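As a hedged sketch of how LDA and one of these extensions look in practice, the snippet below fits scikit-learn's LDA and QDA classifiers; the iris dataset and the train/test split are illustrative choices, not from the original article.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# LDA: one shared covariance estimate; QDA: one covariance per class
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
qda = QuadraticDiscriminantAnalysis().fit(X_train, y_train)

print("LDA accuracy:", lda.score(X_test, y_test))
print("QDA accuracy:", qda.score(X_test, y_test))

# LDA can also act as a supervised dimensionality reducer,
# projecting the 4 iris features onto at most (n_classes - 1) = 2 discriminants
X_2d = lda.transform(X_train)
```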

The original development was called the Linear Discriminant or Fisher's Discriminant Analysis. The multi-class version was referred to as Multiple Discriminant Analysis. These are all simply referred to as Linear Discriminant Analysis now.

Reference: https://machinelearningmastery.com/linear-discriminant-analysis-for-machine-learning/

Happy Learning!
