# Matrix Factorization System


A Matrix Factorization System is a matrix processing system that applies a matrix factorization algorithm to solve a matrix factorization task.

**AKA:** Matrix Decomposer.

**Context:**
- It can range from being an Exact Matrix Decomposition System to being an Approximate Matrix Decomposition System (such as a regularized matrix factorization system).
- It can range from being a Global Matrix Decomposition System to being a Local Matrix Decomposition System.
- It can range from being a Nonnegative Matrix Factorization System to being a Positive Matrix Factorization System to being ...
- It can range from being a Weighted Matrix Decomposition System to being ...
- It can range from being a Boolean Matrix Decomposition System to being an Integer Matrix Decomposition System to being a Real Matrix Decomposition System.
- It can support a Matrix Compression System.
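The matrix-compression use case above can be sketched with a truncated SVD, which keeps only the top singular values and vectors. This is a minimal NumPy illustration; the rank `k` is arbitrary here, whereas a real compression system would choose it from the singular-value spectrum.

```python
# Minimal sketch: compressing a matrix with a truncated (rank-k) SVD.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 80))

# Thin SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 10  # illustrative rank, not a tuned choice
A_k = U[:, :k] * s[:k] @ Vt[:k, :]  # best rank-k approximation of A

# Storage drops from 100*80 values to k*(100 + 80 + 1).
original = A.size                      # 8000
compressed = k * (sum(A.shape) + 1)    # 1810
print(original, compressed)
```

The factorized form stores roughly `k*(m + n + 1)` numbers instead of `m*n`, which is the sense in which a matrix factorization system can support a matrix compression system.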

**Example(s):**
- an SVD Decomposition System, such as numpy.linalg.svd().
- a QR Decomposition System, such as numpy.linalg.qr().
- a Cholesky Decomposition System, such as numpy.linalg.cholesky().
- a Matrix Factorization-based Recommender System.
- a Python-based Matrix Factorization System, such as `sklearn.decomposition`.
- a Principal Components Analysis (PCA) System.
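The first three example systems above can be demonstrated in a few lines of NumPy, with each factorization verified by reconstructing its input. Note that `numpy.linalg.cholesky` requires a symmetric positive-definite matrix, so one is constructed for that case.

```python
# The three NumPy factorizations named above, each checked by reconstruction.
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))

# SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# QR: A = Q @ R, with Q orthogonal and R upper-triangular
Q, R = np.linalg.qr(A)
assert np.allclose(A, Q @ R)

# Cholesky: S = L @ L.T, for symmetric positive-definite S
S = A @ A.T + 4 * np.eye(4)  # shift to guarantee positive-definiteness
L = np.linalg.cholesky(S)
assert np.allclose(S, L @ L.T)
```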

**Counter-Example(s):**
- …

**See:** Linear Programming System.

## References

### 2019

- "2.5. Decomposing signals in components (matrix factorization problems)." In: scikit-learn documentation.

- QUOTE: 2.5. Decomposing signals in components (matrix factorization problems)
  - 2.5.1. Principal component analysis (PCA): 2.5.1.1. Exact PCA and probabilistic interpretation; 2.5.1.2. Incremental PCA; 2.5.1.3. PCA using randomized SVD; 2.5.1.4. Kernel PCA; 2.5.1.5. Sparse principal components analysis (SparsePCA and MiniBatchSparsePCA)
  - 2.5.2. Truncated singular value decomposition and latent semantic analysis
  - 2.5.3. Dictionary Learning: 2.5.3.1. Sparse coding with a precomputed dictionary; 2.5.3.2. Generic dictionary learning; 2.5.3.3. Mini-batch dictionary learning
  - 2.5.4. Factor Analysis
  - 2.5.5. Independent component analysis (ICA)
  - 2.5.6. Non-negative matrix factorization (NMF or NNMF): 2.5.6.1. NMF with the Frobenius norm; 2.5.6.2. NMF with a beta-divergence
  - 2.5.7. Latent Dirichlet Allocation (LDA)
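As one concrete instance from the `sklearn.decomposition` module referenced above, an approximate nonnegative matrix factorization (NMF) fits X ≈ W @ H with both factors constrained to be nonnegative. The parameter values below are illustrative, not recommendations.

```python
# Illustrative sklearn.decomposition usage: rank-2 NMF of a nonnegative matrix.
import numpy as np
from sklearn.decomposition import NMF

# NMF requires nonnegative input, so take absolute values of random data.
X = np.abs(np.random.default_rng(0).standard_normal((6, 5)))

model = NMF(n_components=2, init="random", random_state=0, max_iter=500)
W = model.fit_transform(X)  # (6, 2) nonnegative factor
H = model.components_       # (2, 5) nonnegative factor

# Frobenius-norm reconstruction error of the rank-2 approximation.
print(np.linalg.norm(X - W @ H))
```

This is an approximate (regularizable) factorization in the sense of the Context section: W @ H only approximates X, trading exactness for the nonnegativity and low-rank constraints.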

### 2016

- (Bayer, 2016) ⇒ Immanuel Bayer. (2016). “fastFM: A Library for Factorization Machines.” In: The Journal of Machine Learning Research, 17(1).
  - QUOTE: Factorization Machines (FM) are currently only used in a narrow range of applications and are not yet part of the standard machine learning toolbox, despite their great success in collaborative filtering and click-through rate prediction. However, Factorization Machines are a general model to deal with sparse and high dimensional features. Our Factorization Machine implementation (*fast*FM) provides easy access to many solvers and supports regression, classification and ranking tasks. Such an implementation simplifies the use of FM for a wide range of applications. Therefore, our implementation has the potential to improve understanding of the FM model and drive new development.
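The collaborative-filtering success the quote mentions rests on the same low-rank idea: approximate a ratings matrix R by P @ Q.T and fit the factors on observed entries only. The sketch below uses plain NumPy and stochastic gradient descent; it illustrates the general technique, not fastFM's actual API, and all hyperparameters are arbitrary.

```python
# Hypothetical sketch: low-rank matrix factorization for recommendation,
# fit by SGD on observed ratings only (0 marks an unobserved entry).
import numpy as np

rng = np.random.default_rng(1)
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4.]])

k, lr, reg = 2, 0.01, 0.02          # illustrative rank, step size, L2 penalty
P = 0.1 * rng.standard_normal((R.shape[0], k))  # user factors
Q = 0.1 * rng.standard_normal((R.shape[1], k))  # item factors

for _ in range(2000):
    for u, i in zip(*np.nonzero(R)):            # loop over observed entries
        err = R[u, i] - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

# P @ Q.T now approximates R on the observed entries and fills in the rest.
```

A factorization machine generalizes this by modeling pairwise feature interactions with such low-rank factors over arbitrary sparse feature vectors, not just user/item IDs.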
