Tensor Decomposition Task
(Redirected from Tensor Decomposition)
- AKA: Tensor Factorization.
- See: Matrix Decomposition, Tensor Rank Decomposition, Tensor Field, High-Order SVD.
- (Rendle et al., 2009) ⇒ Steffen Rendle, Leandro Balby Marinho, Alexandros Nanopoulos, and Lars Schmidt-Thieme. (2009). “Learning Optimal Ranking with Tensor Factorization for Tag Recommendation.” In: Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-2009). doi:10.1145/1557019.1557100
- (Kolda & Bader, 2009) ⇒ Tamara G. Kolda, and Brett W. Bader. (2009). “Tensor Decompositions and Applications.” In: SIAM Review, 51(3). doi:10.1137/07070111X
- ABSTRACT: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array. Decompositions of higher-order tensors (i.e., N-way arrays with [math]N \geq 3[/math]) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2 as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
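- The CP (CANDECOMP/PARAFAC) decomposition mentioned in the abstract can be sketched in plain NumPy. The following is a minimal, illustrative alternating-least-squares (ALS) routine for a 3-way tensor, not the software packages the survey names; the function names `cp_als`, `unfold`, and `khatri_rao` are hypothetical helpers introduced here, and the unfolding follows the Kolda & Bader convention (earlier indices vary fastest).

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding (Kolda & Bader convention): move the chosen axis
    # to the front, then flatten the rest with earlier indices fastest.
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1), order='F')

def khatri_rao(U, V):
    # Column-wise Kronecker product: (I x R) and (J x R) -> (I*J x R),
    # where result[:, r] = kron(U[:, r], V[:, r]).
    R = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, R)

def cp_als(T, rank, n_iter=200, seed=0):
    # Sketch of CP via alternating least squares for a 3-way tensor:
    # T is approximated as a sum of `rank` rank-one outer products
    # a_r (outer) b_r (outer) c_r.
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((dim, rank)) for dim in T.shape)
    for _ in range(n_iter):
        # Solve the linear least-squares problem for each factor
        # in turn, holding the other two fixed.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(C, B)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(C, A)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(B, A)).T
    return A, B, C

# Usage: recover a synthetic exact rank-2 tensor built from known factors.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

On an exactly low-rank tensor like this, the relative reconstruction error `err` is typically near machine precision; for noisy data, CP-ALS only approximates the tensor and the appropriate rank must be chosen by the user.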