Dimensionality Reduction for Linearly Inseparable Data
Non-linear dimensionality reduction using kernel PCA
Standard PCA is suitable for linear dimensionality reduction because it applies a linear transformation when reducing the number of features in the data. In other words, standard PCA works well with linearly separable data, in which the different classes can be clearly separated by a straight line (for 2D data) or a hyperplane (for 3D and higher-dimensional data).
Standard PCA will not work well with linearly inseparable data, in which the different classes cannot be separated by a straight line or a hyperplane but only by a curved decision boundary.
For non-linear dimensionality reduction, we can use kernel PCA, the non-linear form of standard PCA.
Both standard PCA and kernel PCA reduce the dimensionality (number of features) of the data, but only kernel PCA can make the data linearly separable while doing so.