Singular Value Decomposition Structure


A Singular Value Decomposition Structure is a matrix decomposition [math]\displaystyle{ M = U\Sigma V^T }[/math] of an [math]\displaystyle{ m \times n }[/math] matrix [math]\displaystyle{ M }[/math], where [math]\displaystyle{ U }[/math] is an [math]\displaystyle{ m \times k }[/math] matrix with orthonormal columns (the left singular vectors), [math]\displaystyle{ \Sigma }[/math] is a [math]\displaystyle{ k \times k }[/math] nonnegative diagonal matrix (of singular values), and [math]\displaystyle{ V }[/math] is an [math]\displaystyle{ n \times k }[/math] matrix with orthonormal columns (the right singular vectors), so that [math]\displaystyle{ V^T }[/math] is [math]\displaystyle{ k \times n }[/math].
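
A minimal sketch of this structure, assuming NumPy (the page itself does not prescribe any library) and taking k = min(m, n) as in the thin/compact form:

  import numpy as np

  # An arbitrary m-by-n example matrix (m = 3, n = 2); any real matrix works.
  M = np.array([[4.0, 3.0],
                [8.0, 6.0],
                [0.0, 1.0]])

  U, s, Vt = np.linalg.svd(M, full_matrices=False)   # thin/compact SVD

  m, n = M.shape
  k = min(m, n)
  assert U.shape == (m, k)      # columns of U are the left singular vectors
  assert s.shape == (k,)        # nonnegative singular values, in descending order
  assert Vt.shape == (k, n)     # rows of Vt are the right singular vectors

  # U and V have orthonormal columns, and U diag(s) V^T reconstructs M.
  assert np.allclose(U.T @ U, np.eye(k))
  assert np.allclose(Vt @ Vt.T, np.eye(k))
  assert np.allclose(U @ np.diag(s) @ Vt, M)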

  • Context:
  • Example(s):
    • [math]\displaystyle{ \begin{bmatrix}\frac{1}{\sqrt{5}} & \frac{2}{\sqrt{5}}\\ \frac{2}{\sqrt{5}} & \frac{-1}{\sqrt{5}}\end{bmatrix} \begin{bmatrix}\sqrt{125} & 0\\0 & 0\end{bmatrix} \begin{bmatrix}0.8 & 0.6\\0.6 & -0.8\end{bmatrix} }[/math], for [math]\displaystyle{ \operatorname{SVD}\left(\begin{bmatrix}4 & 3\\8 & 6\end{bmatrix}\right) }[/math].

    • [math]\displaystyle{ \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 1 & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 4 & 0 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 & 0 \\ 0 & 0 & \sqrt{5} & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ \sqrt{0.2} & 0 & 0 & 0 & \sqrt{0.8} \\ 0 & 0 & 0 & 1 & 0 \\ -\sqrt{0.8} & 0 & 0 & 0 & \sqrt{0.2} \end{bmatrix} \ , \text{for} \ \operatorname{SVD}\left(\begin{bmatrix} 1 & 0 & 0 & 0 & 2 \\ 0 & 0 & 3 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 & 0 \end{bmatrix} \right) }[/math]. Both example factorizations are checked numerically in the sketch after this list.
  • Counter-Example(s):
  • See: Singular Value Decomposition Task.
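
A minimal sketch checking both example factorizations above by multiplying the factors back together; NumPy is assumed purely for illustration and is not prescribed by the page:

  import numpy as np

  # First example: SVD([[4, 3], [8, 6]]) with singular values sqrt(125) and 0.
  U1 = np.array([[1.0, 2.0],
                 [2.0, -1.0]]) / np.sqrt(5)
  S1 = np.diag([np.sqrt(125), 0.0])
  Vt1 = np.array([[0.8, 0.6],
                  [0.6, -0.8]])
  assert np.allclose(U1 @ S1 @ Vt1, [[4, 3], [8, 6]])

  # Second example: 4-by-5 matrix with singular values 4, 3, sqrt(5), 0.
  U2 = np.array([[0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, -1],
                 [1, 0, 0, 0]], dtype=float)
  S2 = np.zeros((4, 5))
  S2[[0, 1, 2], [0, 1, 2]] = [4.0, 3.0, np.sqrt(5)]
  Vt2 = np.array([[0, 1, 0, 0, 0],
                  [0, 0, 1, 0, 0],
                  [np.sqrt(0.2), 0, 0, 0, np.sqrt(0.8)],
                  [0, 0, 0, 1, 0],
                  [-np.sqrt(0.8), 0, 0, 0, np.sqrt(0.2)]])
  M2 = np.array([[1, 0, 0, 0, 2],
                 [0, 0, 3, 0, 0],
                 [0, 0, 0, 0, 0],
                 [0, 4, 0, 0, 0]], dtype=float)
  assert np.allclose(U2 @ S2 @ Vt2, M2)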


References


2012

  • (Golub & Van Loan, 2012) ⇒ Gene H. Golub, and Charles F. Van Loan. (2012). “Matrix Computations (4th Ed.)." Johns Hopkins University Press. ISBN:1421408597
    • QUOTE: If [math]\displaystyle{ A }[/math] is a real m-by-n matrix, then there exist orthogonal matrices
      [math]\displaystyle{ U = \bigl[ u_1, \ldots, u_m \bigr] \in \mathbb{R}^{m \times m} \ \text{ and } \ V = \bigl[ v_1, \ldots, v_n \bigr] \in \mathbb{R}^{n \times n} }[/math]
      such that
      [math]\displaystyle{ U^TAV = \Sigma = \operatorname{diag}(\sigma_1, \ldots, \sigma_p) \in \mathbb{R}^{m \times n}, \qquad p = \operatorname{min}\{m,n\}, }[/math]
      where [math]\displaystyle{ \sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_p \geq 0 }[/math]. ...

      ... The [math]\displaystyle{ \sigma_i }[/math] are the singular values of [math]\displaystyle{ A }[/math], the [math]\displaystyle{ u_i }[/math] are the left singular vectors of [math]\displaystyle{ A }[/math], and the [math]\displaystyle{ v_i }[/math] are right singular vectors of [math]\displaystyle{ A }[/math]. Separate visualizations of the SVD are required depending upon whether [math]\displaystyle{ A }[/math] has more rows or columns. Here are the 3-by-2 and 2-by-3 examples:
      [math]\displaystyle{ \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ u_{21} & u_{22} & u_{23} \\ u_{31} & u_{32} & u_{33} \end{bmatrix}^T \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{bmatrix} \begin{bmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{bmatrix} = \begin{bmatrix} \sigma_{1} & 0 \\ 0 & \sigma_{2} \\ 0 & 0 \end{bmatrix}, }[/math]
      [math]\displaystyle{ \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^T \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{bmatrix} \begin{bmatrix} v_{11} & v_{12} & v_{13} \\ v_{21} & v_{22} & v_{23} \\ v_{31} & v_{32} & v_{33} \end{bmatrix} = \begin{bmatrix} \sigma_{1} & 0 & 0 \\ 0 & \sigma_{2} & 0 \end{bmatrix}. }[/math]
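
      The full factorization quoted above (square [math]\displaystyle{ U }[/math] and [math]\displaystyle{ V }[/math], rectangular diagonal [math]\displaystyle{ \Sigma }[/math]) can likewise be illustrated for the 3-by-2 and 2-by-3 shapes with a short sketch; NumPy is assumed here and is not part of the quoted text:

        import numpy as np

        rng = np.random.default_rng(0)
        for m, n in [(3, 2), (2, 3)]:              # the two shapes visualized above
            A = rng.standard_normal((m, n))
            U, s, Vt = np.linalg.svd(A)            # full form: U is m-by-m, Vt is n-by-n
            assert U.shape == (m, m) and Vt.shape == (n, n)

            p = min(m, n)
            Sigma = np.zeros((m, n))
            Sigma[:p, :p] = np.diag(s)
            assert np.allclose(U.T @ A @ Vt.T, Sigma)            # U^T A V = Sigma
            assert np.all(s[:-1] >= s[1:]) and np.all(s >= 0)    # sigma_1 >= ... >= sigma_p >= 0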