TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.


Dimensionality Reduction for Linearly Inseparable Data

7 min read · Dec 20, 2022


Photo by Steve Johnson on Unsplash

Standard PCA is suitable for linear dimensionality reduction because it applies a linear transformation when reducing the number of features in the data. In other words, standard PCA works well with linearly separable data, in which the different classes can be clearly separated by drawing a straight line (in the case of 2D data) or a hyperplane (in the case of 3D and higher-dimensional data).

Standard PCA will not work well with linearly inseparable data, in which the different classes cannot be clearly separated by a straight line or a hyperplane but only by a curved decision boundary.

For nonlinear dimensionality reduction, we can use kernel PCA, the nonlinear form of standard PCA.

Both standard PCA and kernel PCA reduce the dimensionality (number of features) of the data, but only kernel PCA can make the data linearly separable while still reducing its dimensionality (image by author)

Kernels and the kernel trick




Written by Rukshan Pramoditha

