# Projection Pursuit Algorithm

A Projection Pursuit Algorithm is a dimensionality reduction algorithm that seeks the low-dimensional linear projections of a multidimensional dataset that optimize a projection index, typically a measure of non-Gaussianity.

**AKA:** PP.

**Context:**
- It can incorporate higher than second-order information (Fodor, 2002).
- …

**Counter-Example(s):**
- …

**See:** Independent Component Analysis, Projection (Linear Algebra), Normal Distribution, Matching Pursuit, Blind Source Separation, Projection Pursuit Regression.

## References

### 2015

- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/projection_pursuit Retrieved: 2015-1-8.
**Projection pursuit (PP)** is a type of statistical technique which involves finding the most "interesting" possible projections in multidimensional data. Often, projections which deviate more from a normal distribution are considered to be more interesting. As each projection is found, the data are reduced by removing the component along that projection, and the process is repeated to find new projections; this is the "pursuit" aspect that motivated the technique known as matching pursuit. The idea of projection pursuit is to locate the projection or projections from high-dimensional space to low-dimensional space that reveal the most details about the structure of the data set. Once an interesting set of projections has been found, existing structures (clusters, surfaces, etc.) can be extracted and analyzed separately. Projection pursuit has been widely used for blind source separation, so it is very important in independent component analysis. Projection pursuit seeks one projection at a time such that the extracted signal is as non-Gaussian as possible.^{[1]}
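The pursuit loop described above (find the least-Gaussian projection, deflate the data, repeat) can be sketched in Python. The kurtosis-based projection index and the `projection_pursuit` helper below are illustrative assumptions, not a reference implementation; real systems typically use more robust indices.

```python
import numpy as np
from scipy.optimize import minimize

def projection_pursuit(X, n_components=2, seed=0):
    """Illustrative sketch: find unit directions maximizing |excess kurtosis|
    (a simple non-Gaussianity index), deflating the data after each one."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)                  # center the data
    directions = []
    for _ in range(n_components):
        def neg_index(w):
            w = w / np.linalg.norm(w)
            z = X @ w
            z = (z - z.mean()) / z.std()    # standardize the projection
            kurt = np.mean(z**4) - 3.0      # excess kurtosis of the projection
            return -abs(kurt)               # minimize the negative -> maximize non-Gaussianity
        w0 = rng.standard_normal(X.shape[1])
        res = minimize(neg_index, w0, method="Nelder-Mead")
        w = res.x / np.linalg.norm(res.x)
        directions.append(w)
        X = X - np.outer(X @ w, w)          # "pursuit": remove the found component
    return np.array(directions)
```

For example, on data whose first coordinate is Laplace-distributed (heavy-tailed) and second is Gaussian, the first recovered direction should align with the Laplace axis, since that projection deviates most from normality.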

- ↑ James V. Stone (2004). "Independent Component Analysis: A Tutorial Introduction." The MIT Press, Cambridge, Massachusetts; London, England. ISBN 0-262-69315-1.

### 2002

- (Fodor, 2002) ⇒ Imola K. Fodor. (2002). “A Survey of Dimension Reduction Techniques.” LLNL technical report, UCRL-ID-148494.
**Projection pursuit (PP)** is a linear method that, unlike PCA and FA, can incorporate higher than second-order information, and thus is useful for non-Gaussian datasets. It is more computationally intensive than second-order methods.

Given a projection index that defines the "interestingness" of a direction, PP looks for the directions that optimize that index. As the Gaussian distribution is the least interesting distribution (having the least structure), projection indices usually measure some aspect of non-Gaussianity. If, however, one uses the second-order maximum variance, subject to the constraint that the projections be orthogonal, as the projection index, PP yields the familiar PCA.
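As a concrete check of that last point, the sketch below (the `variance_pursuit` name and power-iteration approach are assumptions for illustration) uses projected variance as the projection index and enforces orthogonality by deflation; the directions it finds match the leading PCA eigenvectors.

```python
import numpy as np

def variance_pursuit(X, n_components):
    """Sketch: projection pursuit with variance as the projection index.
    Each direction maximizes w^T C w over unit vectors w (power iteration);
    deflation makes subsequent directions orthogonal. This recovers PCA."""
    X = X - X.mean(axis=0)                  # center the data
    directions = []
    for _ in range(n_components):
        C = X.T @ X / len(X)                # covariance of the (deflated) data
        w = np.ones(X.shape[1])
        for _ in range(200):                # power iteration -> top eigenvector
            w = C @ w
            w /= np.linalg.norm(w)
        directions.append(w)
        X = X - np.outer(X @ w, w)          # deflate: next direction is orthogonal
    return np.array(directions)
```

On data with well-separated variances per axis, each pursued direction agrees (up to sign) with the corresponding eigenvector of the sample covariance matrix, i.e., the PCA solution.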

### 1992

- (Scott, 1992) ⇒ David W. Scott. (1992). “Multivariate Density Estimation: Theory, Practice, and Visualization.” Wiley. ISBN 0471547700.