Eigenvector

From GM-RKB

An eigenvector is a non-zero vector [math]\displaystyle{ \mathbf{x} }[/math] that, when multiplied by a square matrix [math]\displaystyle{ A }[/math], either remains proportional to [math]\displaystyle{ \mathbf{x} }[/math] (i.e., changes only in vector magnitude, not in vector direction) or becomes the zero vector (when the corresponding eigenvalue is zero).
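This defining property [math]\displaystyle{ A\mathbf{x} = \lambda\mathbf{x} }[/math] can be checked numerically. The sketch below (an illustration, not part of the original article) uses NumPy's `numpy.linalg.eig` on the [math]\displaystyle{ 2\times 2 }[/math] matrix that appears in the examples further down:

```python
import numpy as np

# The 2x2 matrix used in the examples below.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (unit-length) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v satisfies the defining property A @ v == lambda * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues))  # the eigenvalues of A are 1 and 3
```

Note that `eig` returns eigenvectors normalized to unit length, so any non-zero scalar multiple of a returned column (such as [math]\displaystyle{ (3, -3) }[/math] for eigenvalue 1) is an equally valid eigenvector.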

  • AKA: Latent Vector, Characteristic Vector.
  • Context:
    • It can be used to transform the coordinate system to a new system, called the eigenspace, whose axes are linear combinations of the eigenvectors.
    • It cannot change direction under the Linear Transformation; it can only be scaled (or reversed, when its eigenvalue is negative).
    • Let [math]\displaystyle{ T }[/math] be a linear operator on a finite-dimensional vector space [math]\displaystyle{ V }[/math]. Then [math]\displaystyle{ T }[/math] is diagonalizable if and only if the eigenvectors of [math]\displaystyle{ T }[/math] span [math]\displaystyle{ V }[/math].
    • It can be used to diagonalize a matrix: a square matrix [math]\displaystyle{ A }[/math] of order [math]\displaystyle{ n }[/math] is diagonalizable if and only if it has [math]\displaystyle{ n }[/math] linearly independent eigenvectors.
    • It can, for a symmetric matrix [math]\displaystyle{ A }[/math], be used to find an Orthonormal Basis of the space spanned by A's Matrix Column Vectors.
  • Example(s):
    • [math]\displaystyle{ \mathbf x = \begin{bmatrix} 3 \\ -3 \end{bmatrix} }[/math] is an eigenvector (with eigenvalue 1) for the square matrix [math]\displaystyle{ A = \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} }[/math].
    • [math]\displaystyle{ \begin{bmatrix} 2 & 2 & 2 \\ 2 & 6 & 2\\ 2 & 2 & 2\end{bmatrix} }[/math] has the eigenvectors [math]\displaystyle{ \begin{bmatrix} \frac{1}{\sqrt{6}} \\ \frac{2}{\sqrt{6}} \\ \frac{1}{\sqrt{6}} \end{bmatrix},\begin{bmatrix} \frac{1}{\sqrt{3}} \\ \frac{-1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}}\end{bmatrix} }[/math] and [math]\displaystyle{ \begin{bmatrix} \frac{-1}{\sqrt{2}} \\ 0 \\ \frac{1}{\sqrt{2}} \end{bmatrix} }[/math] for the eigenvalues 8, 2, and 0, respectively. Here the eigenvectors are orthonormal to each other; together they form the [math]\displaystyle{ 3\times 3 }[/math] orthogonal matrix [math]\displaystyle{ \begin{bmatrix} \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} & \frac{-1}{\sqrt{2}} \\ \frac{2}{\sqrt{6}} & \frac{-1}{\sqrt{3}} & 0 \\ \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} \end{bmatrix} }[/math].
    • a Right Singular Vector of a matrix [math]\displaystyle{ M }[/math] (an eigenvector of [math]\displaystyle{ M^TM }[/math]), or a Left Singular Vector (an eigenvector of [math]\displaystyle{ MM^T }[/math]).
  • Counter-Example(s):
    • [math]\displaystyle{ \mathbf x = \begin{bmatrix} 0 \\ 1 \end{bmatrix} }[/math] is not an eigenvector for the square matrix [math]\displaystyle{ A = \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} }[/math].
  • See: Singular Matrix, Eigenvalue, Matrix Determinant.
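The diagonalization and orthonormal-basis points above can be illustrated with the symmetric [math]\displaystyle{ 3\times 3 }[/math] matrix from the examples. This sketch (an illustration added here, using NumPy's `numpy.linalg.eigh` for symmetric matrices) shows that the orthonormal eigenvector matrix [math]\displaystyle{ P }[/math] diagonalizes [math]\displaystyle{ A }[/math]:

```python
import numpy as np

# Symmetric matrix from the 3x3 example above.
A = np.array([[2.0, 2.0, 2.0],
              [2.0, 6.0, 2.0],
              [2.0, 2.0, 2.0]])

# For symmetric matrices, numpy.linalg.eigh returns the eigenvalues
# in ascending order and orthonormal eigenvectors as columns of P.
eigenvalues, P = np.linalg.eigh(A)

# P is orthogonal: its transpose is its inverse.
assert np.allclose(P.T @ P, np.eye(3))

# Diagonalization: P^T A P is the diagonal matrix of eigenvalues.
D = P.T @ A @ P
assert np.allclose(D, np.diag(eigenvalues))

print(np.round(eigenvalues))  # [0. 2. 8.]
```

Because [math]\displaystyle{ A }[/math] has 3 linearly independent eigenvectors, it is diagonalizable, matching the criterion stated in the Context section.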


References

2011

  • http://en.wikipedia.org/wiki/Eigenvector
    • The eigenvectors of a square matrix are the non-zero vectors that, after being multiplied by the matrix, either remain proportional to the original vector (i.e., change only in magnitude, not in direction) or become zero. For each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector changes when multiplied by the matrix. The prefix eigen- is adopted from the German word "eigen" for "own"[1] in the sense of a characteristic description. The eigenvectors are sometimes also called characteristic vectors. Similarly, the eigenvalues are also known as characteristic values.

      The mathematical expression of this idea is as follows: if A is a square matrix, a non-zero vector v is an eigenvector of A if there is a scalar λ (lambda) such that

      [math]\displaystyle{ A\mathbf{v} = \lambda \mathbf{v} \, . }[/math]

      The scalar λ (lambda) is said to be the eigenvalue of A corresponding to v. An eigenspace of A is the set of all eigenvectors with the same eigenvalue together with the zero vector. However, the zero vector is not an eigenvector.

      For the matrix [math]\displaystyle{ A = \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} }[/math] the vector [math]\displaystyle{ \mathbf x = \begin{bmatrix} 3 \\ -3 \end{bmatrix} }[/math] is an eigenvector with eigenvalue 1. Indeed, [math]\displaystyle{ A \mathbf x = \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} \begin{bmatrix} 3 \\ -3 \end{bmatrix} = \begin{bmatrix} 2 \cdot 3 + 1 \cdot (-3) \\ 1 \cdot 3 + 2 \cdot (-3) \end{bmatrix} = \begin{bmatrix} 3 \\ -3 \end{bmatrix} = 1 \cdot \begin{bmatrix} 3 \\ -3 \end{bmatrix}. }[/math]

      On the other hand, the vector [math]\displaystyle{ \mathbf x = \begin{bmatrix} 0 \\ 1 \end{bmatrix} }[/math] is not an eigenvector, since [math]\displaystyle{ \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \cdot 0 + 1 \cdot 1 \\ 1 \cdot 0 + 2 \cdot 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \end{bmatrix} }[/math], and this vector is not a multiple of the original vector x.
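Both of these worked calculations can be reproduced in a few lines. This sketch (an illustration added here, using NumPy) checks the example and the counter-example from the quoted passage:

```python
import numpy as np

A = np.array([[2, 1],
              [1, 2]])

# x = (3, -3) is an eigenvector with eigenvalue 1: A x equals 1 * x.
x = np.array([3, -3])
assert np.array_equal(A @ x, 1 * x)

# y = (0, 1) is not an eigenvector: A y = (1, 2), which is not a
# scalar multiple of y (the cross-product test below is non-zero).
y = np.array([0, 1])
Ay = A @ y
assert Ay[0] * y[1] - Ay[1] * y[0] != 0

print(Ay)  # [1 2]
```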

  1. See also: eigen or eigenvalue at Wiktionary.

2009