# Independent Component Analysis Algorithm

An Independent Component Analysis Algorithm is a dimensionality reduction algorithm that finds the independent components of a multivariate signal by maximizing the statistical independence of the estimated components.

**AKA:** ICA.

**Context:**
- It can range from being a Linear Independent Component Analysis Algorithm to being a Non-Linear Independent Component Analysis Algorithm.
- It can be used in optical imaging of neurons, face recognition, predicting stock market prices, color-based detection of the ripeness of tomatoes, etc.
- …

**Counter-Example(s):**

**See:** Component Analysis, Signal Processing, Multivariate Statistics, Statistical Independence, Blind Source Separation, Cocktail Party Problem.

## References

### 2015

- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/independent_component_analysis Retrieved:2015-1-21.
- QUOTE: In signal processing, **independent component analysis** (ICA) is a computational method for separating a multivariate signal into additive subcomponents. This is done by assuming that the subcomponents are non-Gaussian signals and that they are statistically independent from each other. ICA is a special case of blind source separation. A common example application is the “cocktail party problem” of listening in on one person's speech in a noisy room.
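The blind-source-separation idea in the quote can be sketched end to end with a minimal, NumPy-only FastICA (deflationary fixed point with a tanh nonlinearity). The sources, mixing matrix, and iteration limits below are illustrative assumptions, not details from the cited article:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8, n)

# Two independent non-Gaussian sources (the "speakers" at the party).
s1 = np.sign(np.sin(3 * t))        # square wave
s2 = rng.uniform(-1, 1, n)         # uniform noise
S = np.vstack([s1, s2])

# Mix them with an (in practice unknown) matrix A: the "microphone" signals.
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])
X = A @ S

# Whiten: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
X_w = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA fixed-point iteration, extracting components one by one (deflation).
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        g = np.tanh(X_w.T @ w)
        g_prime = 1 - g ** 2
        w_new = (X_w @ g) / n - g_prime.mean() * w
        for j in range(i):                       # decorrelate from found rows
            w_new -= (w_new @ W[j]) * W[j]
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-8
        w = w_new
        if converged:
            break
    W[i] = w

S_est = W @ X_w

# Each recovered component should match one true source up to sign and scale.
corr = np.abs(np.corrcoef(np.vstack([S, S_est]))[:2, 2:])
print(corr.max(axis=1))  # each entry close to 1
```

ICA only identifies the sources up to permutation, sign, and scale, which is why the check above uses the absolute correlation of each true source against its best-matching estimate.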

### 2008

- (Argyriou et al., 2008) ⇒ Andreas Argyriou, Theodoros Evgeniou, and Massimiliano Pontil. (2008). “Convex Multi-task Feature Learning.” In: Machine Learning Journal, 73(3). doi:10.1007/s10994-007-5040-8
- QUOTE: We present a method for learning sparse representations shared across multiple tasks. … Learning common sparse representations across multiple tasks or datasets may also be of interest for example for data compression. While the problem of learning (or selecting) sparse representations has been extensively studied either for single-task supervised learning (e.g., using 1-norm regularization) or for unsupervised learning (e.g., using principal component analysis (PCA) or independent component analysis (ICA)), there has been only limited work [3, 9, 31, 48] in the multi-task supervised learning setting.

### 2005

- (Zhang, Ghahramani & Yang, 2005) ⇒ J. Zhang, Zoubin Ghahramani, and Y. Yang. (2005). “Learning Multiple Related Tasks using Latent Independent Component Analysis.” In: Advances in Neural Information Processing Systems, 18 (NIPS 2005).

### 2002

- (Fodor, 2002) ⇒ Imola K. Fodor. (2002). “A Survey of Dimension Reduction Techniques.” LLNL technical report, UCRL-ID-148494
- QUOTE: This section is based on [22], a recent survey on independent component analysis (ICA). More information (and software) on this currently very popular method can be found at various websites, including [6, 24, 49]. Books summarizing the recent advances in the theory and application of ICA include [1, 48, 15, 38].
ICA is a higher-order method that seeks linear projections, not necessarily orthogonal to each other, that are as nearly statistically independent as possible. Statistical independence is a much stronger condition than uncorrelatedness. While the latter only involves the second-order statistics, the former depends on all the higher-order statistics.
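The distinction the survey draws between uncorrelatedness and independence can be made concrete with a standard toy example (my construction, not Fodor's): for a standard normal x, the variable y = x² − 1 is fully determined by x yet uncorrelated with it, and only a higher-order statistic reveals the dependence.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x ** 2 - 1          # completely determined by x, yet uncorrelated with it

# Second-order statistics see nothing: Cov(x, y) = E[x^3] - E[x]E[y] = 0.
corr = np.corrcoef(x, y)[0, 1]
print(f"correlation:       {corr:+.3f}")    # approximately 0

# A higher-order statistic exposes the dependence:
# E[x^2 y] = E[x^4] - E[x^2] = 3 - 1 = 2, while E[x^2] E[y] = 0.
higher = np.mean(x ** 2 * y) - np.mean(x ** 2) * np.mean(y)
print(f"higher-order stat: {higher:+.3f}")  # approximately 2
```

This is why PCA, which only decorrelates (second-order), cannot separate such variables, while ICA's higher-order contrast functions can.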

### 2000

- (Hyvärinen & Oja, 2000) ⇒ Aapo Hyvärinen, and Erkki Oja. (2000). “Independent Component Analysis: Algorithms and Applications.” In: Neural Networks, 13(4-5). doi:10.1016/S0893-6080(00)00026-5.
- QUOTE: … **independent component analysis** (ICA) is a recently developed **method** in which the goal is to find a linear representation of non-Gaussian data so that the components are statistically independent, or as independent as possible. Such a representation seems to capture the essential structure of the data in many applications, including feature extraction and signal separation. …
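Hyvärinen and Oja's emphasis on non-Gaussianity rests on a central-limit-theorem intuition: a mixture of independent sources is "more Gaussian" than the sources themselves, so unmixing can proceed by maximizing a non-Gaussianity measure such as excess kurtosis. A small numerical illustration (sources and weights are my own assumptions):

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3

rng = np.random.default_rng(0)
n = 200_000
s1 = rng.uniform(-1, 1, n)     # non-Gaussian source, excess kurtosis about -1.2
s2 = rng.uniform(-1, 1, n)

# An equal-weight mixture is closer to Gaussian (kurtosis nearer 0).
mix = (s1 + s2) / np.sqrt(2)

k_src = excess_kurtosis(s1)
k_mix = excess_kurtosis(mix)
print(f"source:  {k_src:+.2f}")   # about -1.20
print(f"mixture: {k_mix:+.2f}")   # about -0.60
```

FastICA inverts this observation: among unit-norm projections of the whitened data, the ones with the most extreme non-Gaussianity correspond to the original sources.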
