2000 IndependentComponentAnalysis
- (Hyvärinen & Oja, 2000) ⇒ Aapo Hyvärinen, Erkki Oja. (2000). “Independent Component Analysis: Algorithms and Applications.” In: Neural Networks, 13(4-5). doi:10.1016/S0893-6080(00)00026-5.
Subject Headings: Independent Component Analysis, Projection Pursuit, Factor Analysis.
Notes
Cited By
- Cited by ~6522 http://scholar.google.com/scholar?cites=771289760838077176
Quotes
- Author Keywords: Independent component analysis; Projection pursuit; Blind signal separation; Source separation; Factor analysis; Representation.
Abstract
- A fundamental problem in neural network research, as well as in many other disciplines, is finding a suitable representation of multivariate data, i.e. random vectors. For reasons of computational and conceptual simplicity, the representation is often sought as a linear transformation of the original data. In other words, each component of the representation is a linear combination of the original variables. Well-known linear transformation methods include principal component analysis, factor analysis, and projection pursuit. Independent component analysis (ICA) is a recently developed method in which the goal is to find a linear representation of non-Gaussian data so that the components are statistically independent, or as independent as possible. Such a representation seems to capture the essential structure of the data in many applications, including feature extraction and signal separation. In this paper, we present the basic theory and applications of ICA, and our recent work on the subject.
1. Motivation
- Another, very different application of ICA is on feature extraction. A fundamental problem in digital signal processing is to find suitable representations for image, audio or other kind of data for tasks like compression and denoising. Data representations are often based on (discrete) linear transformations. Standard linear transformations widely used in image processing are the Fourier, Haar, cosine transforms etc. Each of them has its own favorable properties (Gonzales and Wintz, 1987).
- It would be most useful to estimate the linear transformation from the data itself, in which case the transform could be ideally adapted to the kind of data that is being processed. Figure 4 shows the basis functions obtained by ICA from patches of natural images. Each image window in the set of training images would be a superposition of these windows so that the coefficients in the superposition are independent. Feature extraction by ICA will be explained in more detail later on.
- All of the applications described above can actually be formulated in a unified mathematical framework, that of ICA. This is a very general-purpose method of signal processing and data analysis.
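The separation idea quoted above can be sketched with a minimal NumPy implementation of the FastICA fixed-point iteration (the algorithm the paper develops), applied to a toy blind source separation problem. The two synthetic sources, the mixing matrix, and the use of the tanh nonlinearity with a deflation scheme are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two non-Gaussian sources: a sinusoid and a uniform-noise signal.
n = 2000
t = np.linspace(0, 8, n)
S = np.c_[np.sin(3 * t), rng.uniform(-1, 1, n)]
S -= S.mean(axis=0)

# Observe linear mixtures X = S A^T with an (assumed) mixing matrix A.
A = np.array([[1.0, 0.5], [0.4, 1.2]])
X = S @ A.T

# Whitening: decorrelate the mixtures and scale them to unit variance.
Xc = X - X.mean(axis=0)
d, E = np.linalg.eigh(Xc.T @ Xc / n)
Z = Xc @ (E @ np.diag(d ** -0.5) @ E.T).T

def fastica(Z, n_comp=2, iters=200, tol=1e-9):
    """One-unit FastICA with tanh nonlinearity, deflation orthogonalization."""
    W = np.zeros((n_comp, n_comp))
    for i in range(n_comp):
        w = rng.standard_normal(n_comp)
        w /= np.linalg.norm(w)
        for _ in range(iters):
            wz = Z @ w
            g, gp = np.tanh(wz), 1 - np.tanh(wz) ** 2
            # Fixed-point update: w+ = E{z g(w^T z)} - E{g'(w^T z)} w
            w_new = (Z * g[:, None]).mean(axis=0) - gp.mean() * w
            # Deflation: project out previously estimated components.
            w_new -= W[:i].T @ (W[:i] @ w_new)
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1) < tol
            w = w_new
            if converged:
                break
        W[i] = w
    return W

W = fastica(Z)
S_est = Z @ W.T  # estimated sources, up to permutation and sign
```

Up to the permutation and sign ambiguities inherent in ICA, each column of `S_est` should correlate strongly with one of the original sources, which is the standard way to check a separation result on synthetic data.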
Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year
---|---|---|---|---|---|---|---|---|---
Aapo Hyvärinen; Erkki Oja | | | Independent Component Analysis: Algorithms and Applications | Neural Network Model (NNet) Training Algorithm | | http://amber.feld.cvut.cz/bio/konopka/file/1.pdf | 10.1016/S0893-6080(00)00026-5 | 2000 IndependentComponentAnalysis | 2000