Eigenvalue


An Eigenvalue is a scalar, [math]\displaystyle{ \lambda }[/math], such that applying a linear transformation (or multiplying by a matrix) to an associated eigenvector yields that same eigenvector scaled by [math]\displaystyle{ \lambda }[/math].

  • AKA: Characteristic Root, Latent Value, Proper Value, Spectral Value.
  • Context:
    • It can be produced by an Eigenvalue Finding Task.
    • It can be represented as:
      Let [math]\displaystyle{ V }[/math] be a vector space over the field [math]\displaystyle{ F }[/math] and let [math]\displaystyle{ T }[/math] be a linear operator on [math]\displaystyle{ V }[/math]. An eigenvalue of [math]\displaystyle{ T }[/math] is a scalar [math]\displaystyle{ \lambda }[/math] in [math]\displaystyle{ F }[/math] such that there is a non-zero vector [math]\displaystyle{ X }[/math] in [math]\displaystyle{ V }[/math] with [math]\displaystyle{ TX=\lambda X }[/math]. Here [math]\displaystyle{ X }[/math] is called an eigenvector of [math]\displaystyle{ T }[/math] for the corresponding eigenvalue [math]\displaystyle{ \lambda }[/math].
    • It can be used in a Dimension-Compression Task (along with its eigenvectors).
    • When a beam is struck, its natural frequencies (the eigenvalues of the underlying vibration model) can be measured, so eigenvalues can be used to test for cracks and deformities in structural components used for construction.
    • If [math]\displaystyle{ \lambda }[/math] is an eigenvalue of a matrix [math]\displaystyle{ A }[/math] and [math]\displaystyle{ \alpha }[/math] is any scalar, then (these properties are checked in the code sketch after this list):
      • The matrix [math]\displaystyle{ \alpha A }[/math] has eigenvalue [math]\displaystyle{ \alpha \lambda }[/math].
      • The matrix [math]\displaystyle{ A^m }[/math] has eigenvalue [math]\displaystyle{ \lambda^m }[/math].
      • The matrix [math]\displaystyle{ A-kI }[/math] has eigenvalue [math]\displaystyle{ \lambda-k }[/math].
      • The matrix [math]\displaystyle{ A^{-1} }[/math] (when [math]\displaystyle{ A }[/math] is invertible) has eigenvalue [math]\displaystyle{ \frac{1}{\lambda} }[/math].
      • The matrices [math]\displaystyle{ A }[/math] and [math]\displaystyle{ A^T }[/math] have the same eigenvalues.
  • Example(s):
    • [math]\displaystyle{ \operatorname{Eigenvalues} \Bigl(\begin{bmatrix} 5 & 4 \\ 1 & 2 \end{bmatrix} \Bigr) = \{6,1\} }[/math].
    • [math]\displaystyle{ \operatorname{Eigenvalues} \Bigl(\begin{bmatrix} 1 & 2 \\ -2 & 1 \end{bmatrix} \Bigr) = \{ 1+2i , 1-2i \} }[/math].
    • [math]\displaystyle{ \operatorname{Eigenvalues} \Bigl(\begin{bmatrix} -2 & 2 & -3 \\ 2 & 1 & -6 \\ -1 & -2 & 0 \end{bmatrix}\Bigr) = \{ 5, -3 , -3 \} }[/math].
  • See: Matrix Equation, Square Matrix.
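
The following is a minimal sketch (assuming NumPy is available; it is not mentioned above) that checks the defining equation [math]\displaystyle{ Av = \lambda v }[/math] and the scalar, power, shift, inverse, and transpose properties listed above, using the first example matrix.

```python
import numpy as np

# First example matrix from above; its eigenvalues are {6, 1}.
A = np.array([[5.0, 4.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues w and a matrix V whose
# columns are the corresponding (right) eigenvectors.
w, V = np.linalg.eig(A)
print(np.sort(w))  # -> [1. 6.]

# Defining equation: A v = lambda v for every eigenpair.
for lam, v in zip(w, V.T):
    assert np.allclose(A @ v, lam * v)

alpha, k, m = 2.0, 3.0, 4  # arbitrary scalar, shift, and power for illustration

# alpha*A has eigenvalues alpha*lambda.
assert np.allclose(np.sort(np.linalg.eigvals(alpha * A)), np.sort(alpha * w))

# A^m has eigenvalues lambda^m.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, m))),
                   np.sort(w ** m))

# A - k*I has eigenvalues lambda - k.
assert np.allclose(np.sort(np.linalg.eigvals(A - k * np.eye(2))), np.sort(w - k))

# A^{-1} has eigenvalues 1/lambda (A is invertible here).
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1.0 / w))

# A and its transpose have the same eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(w))
```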


References

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors Retrieved:2015-2-16.
    • An eigenvector or characteristic vector of a linear transformation defines a direction that is invariant under the transformation. Let the transformation be defined by the square matrix A; then an invariant direction of A is a non-zero vector v with the property that the product Av is a scalar multiple of v. This is written as the equation: [math]\displaystyle{ A\mathbf{v} = \lambda \mathbf{v}, }[/math] where λ is known as the eigenvalue associated with the eigenvector v.

      (Because this equation uses post-multiplication of the matrix A by the vector v, it describes a right eigenvector.) The number λ is called the eigenvalue or characteristic value of A corresponding to v.[1]

  1. Wolfram Research, Inc. (2010) Eigenvector. Accessed on 2010-01-29.
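
The quoted passage distinguishes right eigenvectors (Av = λv) from left eigenvectors (uᵀA = λuᵀ). Below is a small illustrative sketch, assuming NumPy and reusing the 2x2 example matrix from the list above (it is not taken from this reference), showing that both kinds of eigenvectors share the same eigenvalues.

```python
import numpy as np

A = np.array([[5.0, 4.0],
              [1.0, 2.0]])

# Right eigenvectors: A v = lambda v (post-multiplication, as in the quote).
w_right, V_right = np.linalg.eig(A)

# Left eigenvectors: u^T A = lambda u^T, i.e. ordinary eigenvectors of A^T.
w_left, U_left = np.linalg.eig(A.T)

# Both sides yield the same eigenvalues, although the vectors differ.
assert np.allclose(np.sort(w_right), np.sort(w_left))

# Check the left-eigenvector equation for the first left eigenvector.
u = U_left[:, 0]
assert np.allclose(u @ A, w_left[0] * u)
```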

2013

  • http://en.wikipedia.org/wiki/Eigenvalue
    • An eigenvector of a square matrix [math]\displaystyle{ A }[/math] is a non-zero vector [math]\displaystyle{ v }[/math] that, when multiplied by [math]\displaystyle{ A }[/math], yields the original vector multiplied by a single number [math]\displaystyle{ \lambda }[/math]; that is: [math]\displaystyle{ A v = \lambda v }[/math]. The number [math]\displaystyle{ \lambda }[/math] is called the eigenvalue of [math]\displaystyle{ A }[/math] corresponding to [math]\displaystyle{ v }[/math]. … Thus, for example, the exponential function [math]\displaystyle{ f(x) = a^x }[/math] is an eigenfunction of the derivative operator " [math]\displaystyle{ {}' }[/math] ", with eigenvalue [math]\displaystyle{ \lambda = \ln a }[/math], since its derivative is [math]\displaystyle{ f'(x) = (\ln a)a^x = \lambda f(x) }[/math].

      The set of all eigenvectors of a matrix (or linear operator), each paired with its corresponding eigenvalue, is called the eigensystem of that matrix. An eigenspace of a matrix [math]\displaystyle{ A }[/math] is the set of all eigenvectors with the same eigenvalue, together with the zero vector. An eigenbasis for [math]\displaystyle{ A }[/math] is any basis for the set of all vectors that consists of linearly independent eigenvectors of [math]\displaystyle{ A }[/math]. Not every real matrix has real eigenvalues, but every complex matrix has at least one complex eigenvalue.
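
As a complement to the eigenfunction remark in the quote above, here is a minimal symbolic sketch (assuming SymPy is available) confirming that [math]\displaystyle{ f(x) = a^x }[/math] is an eigenfunction of the derivative operator with eigenvalue [math]\displaystyle{ \ln a }[/math].

```python
import sympy as sp

# Symbolic check that f(x) = a**x satisfies f'(x) = ln(a) * f(x),
# assuming a > 0 so that a**x = exp(x*ln(a)) is well defined.
x = sp.symbols('x')
a = sp.symbols('a', positive=True)

f = a**x
lam = sp.log(a)

# d/dx a**x equals ln(a) * a**x, so the difference simplifies to zero.
assert sp.simplify(sp.diff(f, x) - lam * f) == 0
```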
