[ Archived Post ] A collection of useful presentations on Independent Component Analysis

Jae Duk Seo
6 min read · Sep 4, 2018



Please note that this post is for my future self and to store some great presentations.

Fundamentals of Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Independent Vector Analysis (IVA)

PPT from Dr Mohsen Naqvi

Basically, the mixing matrix H and the un-mixing matrix W are opposites of one another: one mixes the sources while the other un-mixes them.
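
As a quick illustration of that relationship, here is a minimal NumPy sketch; the sources, the mixing matrix H, and all names are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two made-up independent sources, shape (n_sources, n_samples).
s = rng.laplace(size=(2, 1000))
H = np.array([[1.0, 0.5],                 # mixing matrix H
              [0.3, 1.0]])
x = H @ s                                 # observed mixtures

# The un-mixing matrix W inverts the mixing: W = H^{-1}, so W @ x recovers s.
W = np.linalg.inv(H)
s_hat = W @ x
print(np.allclose(s_hat, s))              # True
```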

PCA projects the data onto the vectors of greatest variance; it relies on second-order statistics and orthogonal projections.

Eigenvectors give the directions of the principal components; the eigenvalues give the variance along those directions.

Data whitening is another form of PCA, and the principal components are not independent.
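
Here is a minimal sketch of PCA whitening via an eigendecomposition of the covariance matrix; the toy data and all parameter choices are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.multivariate_normal([0, 0], [[3.0, 1.5], [1.5, 1.0]], size=2000).T

# PCA: eigendecomposition of the covariance matrix.
xc = x - x.mean(axis=1, keepdims=True)
cov = xc @ xc.T / xc.shape[1]
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvectors = principal directions

# Whitening: rotate onto the principal components and rescale each
# component to unit variance, giving an identity covariance matrix.
z = np.diag(eigvals ** -0.5) @ eigvecs.T @ xc
print(np.round(z @ z.T / z.shape[1], 2))  # approximately the identity
```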

Independent variables are uncorrelated, but uncorrelated variables are not necessarily independent.
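
A tiny demo of that distinction, using the classic made-up example of a variable and its square:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x ** 2                         # fully determined by x, so dependent

# Correlation is ~0 because cov(x, x^2) = E[x^3] = 0 for a symmetric
# distribution...
print(np.corrcoef(x, y)[0, 1])     # close to 0
# ...yet y is clearly not independent of x: knowing x fixes y exactly.
```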

ICA assumes that at least one of the signals is non-Gaussian (strictly, at most one source may be Gaussian), and the un-mixing matrix is usually square.

ICA has two steps: a measure of non-Gaussianity, and an optimization of that measure.

Identical solutions, examples of cost functions.

As random variables are mixed together, their distribution becomes more and more Gaussian (the central limit theorem).
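
A short sketch of that effect; the choice of uniform variables and of `scipy.stats.kurtosis` as the Gaussianity check are mine:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Excess kurtosis of a sum of more and more uniform variables: it drifts
# toward 0, the Gaussian value, as more variables are mixed in.
for n in (1, 2, 8, 32):
    mix = rng.uniform(-1, 1, size=(n, 100_000)).sum(axis=0)
    print(n, round(stats.kurtosis(mix), 3))   # -1.2 -> ~0 as n grows
```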

We need to measure non-Gaussianity, and one way to do that is kurtosis; if the data is whitened, the kurtosis formula simplifies.

Kurtosis can be either positive or negative, so we take the absolute value as the measure: super-Gaussian distributions have positive kurtosis, sub-Gaussian distributions have negative kurtosis.
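
A minimal sketch of kurtosis as a non-Gaussianity measure, using the simplified formula for whitened data; the example distributions are my choice:

```python
import numpy as np

rng = np.random.default_rng(0)

def kurt(y):
    # For zero-mean, unit-variance (whitened) data the kurtosis
    # simplifies to E[y^4] - 3.
    y = (y - y.mean()) / y.std()
    return np.mean(y ** 4) - 3

print(kurt(rng.laplace(size=100_000)))       # ~ +3: super-Gaussian
print(kurt(rng.uniform(-1, 1, 100_000)))     # ~ -1.2: sub-Gaussian
print(kurt(rng.standard_normal(100_000)))    # ~ 0: Gaussian
```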

The problem is that kurtosis is not robust to outliers; negentropy is a better measure, but it is computationally expensive.

Negentropy J(y) = H(y_gauss) - H(y), where H is differential entropy and y_gauss is a Gaussian with the same variance as y, is zero for a Gaussian and non-negative otherwise.

The negentropy approximation gives a cost function built from a non-quadratic function G.
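
A sketch of that approximation, assuming the common contrast choice G(u) = log cosh(u); the function name and sample sizes are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

def negentropy_approx(y, n_gauss=1_000_000):
    # Approximation J(y) ~ (E[G(y)] - E[G(v)])^2 with the non-quadratic
    # contrast G(u) = log cosh(u); v is a standard Gaussian sample.
    y = (y - y.mean()) / y.std()
    v = rng.standard_normal(n_gauss)
    G = lambda u: np.log(np.cosh(u))
    return (G(y).mean() - G(v).mean()) ** 2

print(negentropy_approx(rng.standard_normal(100_000)))  # ~ 0 for Gaussian
print(negentropy_approx(rng.laplace(size=100_000)))     # > 0 for non-Gaussian
```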

The ICA learning rule is derived via gradient descent, but we need to keep the rows of W decorrelated (i.e., have the identity matrix as the covariance matrix).

ICA summarized step by step; the right-hand side of the slide is for FastICA. A minimal sketch follows.
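
Here is a minimal sketch of symmetric FastICA, assuming the input has already been centered and whitened; the logcosh contrast and the iteration count are my choices:

```python
import numpy as np

def fastica(z, n_iter=200, seed=0):
    """Minimal symmetric FastICA sketch; z must already be whitened,
    shape (n_components, n_samples)."""
    rng = np.random.default_rng(seed)
    n, m = z.shape
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        y = W @ z
        g, g_prime = np.tanh(y), 1 - np.tanh(y) ** 2   # logcosh contrast
        # Parallel update: W <- E[g(Wz) z^T] - diag(E[g'(w_i^T z)]) W
        W = (g @ z.T) / m - np.diag(g_prime.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^{-1/2} W keeps rows orthonormal.
        d, E = np.linalg.eigh(W @ W.T)
        W = E @ np.diag(d ** -0.5) @ E.T @ W
    return W
```

Feeding it the whitened `z` from the earlier whitening sketch would recover the sources up to sign and permutation.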

The limitations of ICA are the permutation and scale ambiguities.

IVA models the statistical independence between sources while maintaining the dependency between the frequency bins of each source.

Conclusion: IVA is the more powerful method.

Introduction to Independent Component Analysis

PPT from Barnabás Póczos

Differences between ICA and PCA: ICA works at full rank, while PCA can do compression. PCA can also be used before ICA, so dimensionality reduction is possible as well (a sketch of that pipeline follows).
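
A sketch of the PCA-then-ICA pipeline using scikit-learn; the data, shapes, and parameter choices are made up, and `whiten="unit-variance"` assumes a recent scikit-learn version:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)

# Made-up data: 3 independent sources mixed into 10 observed channels.
s = rng.laplace(size=(5000, 3))
A = rng.standard_normal((3, 10))
x = s @ A

# PCA first for dimensionality reduction, then ICA on the reduced data.
x_reduced = PCA(n_components=3).fit_transform(x)
s_hat = FastICA(whiten="unit-variance", random_state=0).fit_transform(x_reduced)
print(s_hat.shape)   # (5000, 3)
```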

Good summary of the differences.

Uses of ICA on EEG signals here and there.

Different measures of entropy.

ICA can only estimate the sources up to sign, scale, and permutation. (IVA can help.)

Whitening the data before ICA is a very good idea.

Again, as random variables are summed, the distribution becomes Gaussian.

holy shit…….

Maximum likelihood, kurtosis, and FastICA.

Kernel ICA Algorithm.

Independent subspace analysis, super sexy stuff.

PCA and ICA

PPT from this website (reference 4)

Reviewing the fact that for PCA we are maximizing the variance, hence the projection vectors are orthogonal to one another. (PCA goes by many other names.)

Limitations of PCA and what the authors did to overcome these types of problems.

What PCA is doing at the end of the day can be looked at in two equivalent ways: the maximize-variance formulation or the minimize-error formulation.
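
A small numerical check of that equivalence: for centered data and any unit direction, the captured variance and the mean squared reconstruction error sum to the total variance, so maximizing one minimizes the other. The data and direction below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.multivariate_normal([0, 0], [[3.0, 1.5], [1.5, 1.0]], size=5000)
x -= x.mean(axis=0)
total_var = np.sum(x.var(axis=0))

w = np.array([np.cos(0.7), np.sin(0.7)])   # arbitrary unit direction
proj = x @ w
captured = proj.var()                      # variance along w
recon_err = np.mean(np.sum((x - np.outer(proj, w)) ** 2, axis=1))
print(round(captured + recon_err, 6), round(total_var, 6))  # equal
```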

Methods for calculating the principal components, and their big-O running times. There also exists a method called probabilistic PCA.

What ICA is doing at the end of the day, and the limitations of ICA: the permutation, scale, and order ambiguities.

ICA via gradient ascent maximizes the log-likelihood of the data with respect to the matrix W.
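
A minimal sketch of that idea using the natural-gradient (infomax-style) update, assuming a super-Gaussian source prior via tanh; the learning rate and initialization are my choices:

```python
import numpy as np

def ml_ica(x, lr=0.1, n_iter=500, seed=0):
    """Gradient-ascent ICA sketch: climb the log-likelihood over W using
    the natural-gradient update W += lr * (I - E[g(y) y^T]) W, with
    g = tanh (a super-Gaussian source prior). x should be centered,
    shape (n_channels, n_samples)."""
    rng = np.random.default_rng(seed)
    n, m = x.shape
    # Initialize near the identity to avoid wild first steps.
    W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    for _ in range(n_iter):
        y = W @ x
        W += lr * (np.eye(n) - np.tanh(y) @ y.T / m) @ W
    return W
```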

A good visual for seeing what is statistically independent of what.

A Tutorial on Data Reduction

PPT from Shireen Elhabian

Independent Component Analysis — A Gentle Introduction

Image from this website (reference 5)

The gradient ascent method maximizes the likelihood of the data.

Reference

  1. (2018). Pdfs.semanticscholar.org. Retrieved 4 September 2018, from https://pdfs.semanticscholar.org/presentation/c697/99b81650966f78ce432c7d100d745c967372.pdf
  2. (2018). Cs.cmu.edu. Retrieved 4 September 2018, from https://www.cs.cmu.edu/~bapoczos/other_presentations/ICA_26_10_2009.pdf
  3. (2018). Sci.utah.edu. Retrieved 4 September 2018, from http://www.sci.utah.edu/~shireen/pdfs/tutorials/Elhabian_ICA09.pdf
  4. (2018). Cs.ubc.ca. Retrieved 9 September 2018, from http://www.cs.ubc.ca/~jnutini/documents/mlrg_pca.pdf
  5. Independent Component Analysis — A Gentle Introduction. (2018). Danieltakeshi.github.io. Retrieved 9 September 2018, from https://danieltakeshi.github.io/2015/01/03/independent-component-analysis-a-gentle-introduction/
