# Non-Negative Matrix Factorization (NNMF) Task


A Non-Negative Matrix Factorization (NNMF) Task is a matrix factorization task that decomposes an [math]m \times n[/math] non-negative matrix [math]X[/math] into [math]m \times k[/math] and [math]k \times n[/math] non-negative matrices (where typically [math]k \ll n[/math] and [math]k \ll m[/math]).
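As a minimal sketch of this setup (assuming NumPy; all variable names here are illustrative), the shapes involved can be shown by constructing an [math]X[/math] that admits an exact rank-[math]k[/math] non-negative factorization:

```python
import numpy as np

# Decompose a non-negative m x n matrix X into non-negative factors
# W (m x k) and H (k x n), with k much smaller than m and n.
rng = np.random.default_rng(0)
m, n, k = 6, 5, 2

# Build an X that admits an exact rank-k non-negative factorization.
W_true = rng.random((m, k))   # m x k, entries in [0, 1)
H_true = rng.random((k, n))   # k x n, entries in [0, 1)
X = W_true @ H_true           # m x n, non-negative by construction

assert X.shape == (m, n)
assert (X >= 0).all()
```

In practice the factors are not known in advance and must be recovered (exactly or approximately) from [math]X[/math] alone, which is what NNMF algorithms do.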

**Context:**
- It can be solved by a Non-Negative Matrix Factorization System (that implements a non-negative matrix factorization algorithm).
- It can range from being an Exact NNMF Task to being an Approximate NNMF Task.
- It can be a Non-Negative Rank Factorization Task.
- It can range from being a Non-Convex NNMF Task to being a Convex NNMF Task.
- It can range from being a Non-Negative Integer Matrix Decomposition Task to being a Non-Negative Real Matrix Decomposition Task.

**Example(s):**

**Counter-Example(s):**

**See:** Tensor Factorization Task, Feature Extraction Task.

## References

### 2017

- (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/Non-negative_matrix_factorization Retrieved:2017-9-11.
**Non-negative matrix factorization** (**NMF** or **NNMF**), also **non-negative matrix approximation**,^{[1]} is a group of algorithms in multivariate analysis and linear algebra where a matrix **V** is factorized into (usually) two matrices **W** and **H**, with the property that all three matrices have no negative elements. This non-negativity makes the resulting matrices easier to inspect. Also, in applications such as processing of audio spectrograms or muscular activity, non-negativity is inherent to the data being considered. Since the problem is not exactly solvable in general, it is commonly approximated numerically. NMF finds applications in such fields as computer vision, document clustering,^{[1]} chemometrics, audio signal processing^{[2]} and recommender systems.^{[3]}
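The numerical approximation mentioned above can be sketched with the classic multiplicative-update rules of Lee & Seung for minimizing the Frobenius error of **V** ≈ **WH** (a simplified NumPy sketch; the variable names and the `eps` guard are illustrative choices, not from any particular library):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 20, 15, 3
V = rng.random((m, n))            # non-negative data matrix

W = rng.random((m, k)) + 1e-3     # random non-negative initialization
H = rng.random((k, n)) + 1e-3
eps = 1e-10                       # guard against division by zero

err0 = np.linalg.norm(V - W @ H)  # initial Frobenius reconstruction error
for _ in range(200):
    # Each update multiplies by a non-negative ratio, so W and H stay >= 0.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H)

assert (W >= 0).all() and (H >= 0).all()
assert err <= err0                # the error is non-increasing under these updates
```

The multiplicative form is what preserves non-negativity without explicit projection, which is one reason these updates remain a common baseline for NMF.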

### 2006

- (Ding et al., 2006) ⇒ Chris Ding, Tao Li, and Wei Peng. (2006). “Nonnegative Matrix Factorization and Probabilistic Latent Semantic Indexing: Equivalence, Chi-Square Statistic, and a Hybrid Method.” In: Proceedings of AAAI 2006 (AAAI 2006).

### 2005

- (Shashua & Hazan, 2005) ⇒ Amnon Shashua, and Tamir Hazan. (2005). “Non-Negative Tensor Factorization with Applications to Statistics and Computer Vision.” In: Proceedings of the 22nd International Conference on Machine learning (ICML 2005).
- QUOTE: We derive algorithms for finding a non-negative *n*-dimensional tensor factorization (*n*-NTF) which includes the non-negative matrix factorization (NMF) as a particular case when *n* = 2.
### 1999

- (Lee & Seung, 1999) ⇒ Daniel D. Lee, and H. Sebastian Seung. (1999). “Learning the Parts of Objects by Non-negative Matrix Factorization.” In: Nature, 401(6755). doi:10.1038/44565
- ABSTRACT: Is perception of the whole based on perception of its parts? There is psychological^{[1]} and physiological^{[2,3]} evidence for parts-based representations in the brain, and certain computational theories of object recognition rely on such representations.^{[4,5]} But little is known about how brains or computers might learn the parts of objects. Here we demonstrate an algorithm for non-negative matrix factorization that is able to learn parts of faces and semantic features of text. This is in contrast to other methods, such as principal components analysis and vector quantization, that learn holistic, not parts-based, representations. Non-negative matrix factorization is distinguished from the other methods by its use of non-negativity constraints. These constraints lead to a parts-based representation because they allow only additive, not subtractive, combinations. When non-negative matrix factorization is implemented as a neural network, parts-based representations emerge by virtue of two properties: the firing rates of neurons are never negative and synaptic strengths do not change sign.