# Eigenvalue


An Eigenvalue is a scalar, $\displaystyle{ \lambda }$, such that applying the associated linear transformation to one of its eigenvectors yields that same eigenvector scaled by $\displaystyle{ \lambda }$.

• AKA: Characteristic Root, Latent Value, Proper Value, Spectral Value.
• Context:
• It can be produced by an Eigenvalue Finding Task.
• It can be represented as:
Let $\displaystyle{ V }$ be a vector space over the field $\displaystyle{ F }$ and let $\displaystyle{ T }$ be a linear operator on $\displaystyle{ V }$. An eigenvalue of $\displaystyle{ T }$ is a scalar $\displaystyle{ \lambda }$ in $\displaystyle{ F }$ such that there is a non-zero vector $\displaystyle{ X }$ in $\displaystyle{ V }$ with $\displaystyle{ TX=\lambda X }$. Here $\displaystyle{ X }$ is called the eigenvector of $\displaystyle{ T }$ for the respective eigenvalue $\displaystyle{ \lambda }$.
• It can be used in a Dimension-Compression Task (along with its eigenvectors).
• When a beam is struck, its natural frequencies (eigenvalues) can be measured, so eigenvalues can be used to test for cracks and deformities in structural components used for construction.
• If $\displaystyle{ \lambda }$ is an eigenvalue of a matrix $\displaystyle{ A }$ and $\displaystyle{ \alpha }$ is any scalar, then:
• The matrix $\displaystyle{ \alpha A }$ has eigenvalue $\displaystyle{ \alpha \lambda }$.
• The matrix $\displaystyle{ A^m }$ has eigenvalue $\displaystyle{ \lambda^m }$.
• The matrix $\displaystyle{ A-kI }$ has eigenvalue $\displaystyle{ \lambda-k }$.
• The matrix $\displaystyle{ A^{-1} }$ (when $\displaystyle{ A }$ is invertible, so $\displaystyle{ \lambda \neq 0 }$) has eigenvalue $\displaystyle{ \frac{1}{\lambda} }$.
• The matrices $\displaystyle{ A }$ and $\displaystyle{ A^T }$ have the same eigenvalues.
• Example(s):
• $\displaystyle{ \operatorname{Eigenvalues} \Bigl(\begin{bmatrix} 5 & 4 \\ 1 & 2 \end{bmatrix} \Bigr) = \{6,1\} }$.
• $\displaystyle{ \operatorname{Eigenvalues} \Bigl(\begin{bmatrix} 1 & 2 \\ -2 & 1 \end{bmatrix} \Bigr) = \{ 1+2i , 1-2i \} }$.
• $\displaystyle{ \operatorname{Eigenvalues} \Bigl(\begin{bmatrix} -2 & 2 & 3 \\ 2 & 1 & -6 \\ -1 & -2 & 0 \end{bmatrix}\Bigr) = \{ 5, -3 , -3 \} }$ .
• See: Matrix Equation, Square Matrix.
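The example eigenvalues and the scalar/power/shift/inverse/transpose properties listed above can be checked numerically. Below is a minimal sketch using NumPy; the matrix is the article's first example, and the scalar $\alpha$, shift $k$, and power $m$ are illustrative choices, not from the source.

```python
import numpy as np

# First example matrix from the article; its eigenvalues should be {6, 1}.
A = np.array([[5.0, 4.0],
              [1.0, 2.0]])

vals = np.sort(np.linalg.eigvals(A).real)  # -> [1., 6.]

# Illustrative choices of scalar, shift, and power (not from the source).
alpha, k, m = 3.0, 2.0, 2

# alpha*A has eigenvalues alpha*lambda.
assert np.allclose(np.sort(np.linalg.eigvals(alpha * A).real), alpha * vals)
# A^m has eigenvalues lambda^m.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, m)).real), vals ** m)
# A - k*I has eigenvalues lambda - k.
assert np.allclose(np.sort(np.linalg.eigvals(A - k * np.eye(2)).real), vals - k)
# A^{-1} has eigenvalues 1/lambda (A is invertible here).
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A)).real), np.sort(1.0 / vals))
# A and A^T have the same eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(A.T).real), vals)
```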

## References

### 2015

• (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors Retrieved:2015-2-16.
• An eigenvector or characteristic vector of a linear transformation defines a direction that is invariant under the transformation. Let the transformation be defined by the square matrix A; then an invariant direction of A is a non-zero vector v with the property that the product Av is a scalar multiple of v. This is written as the equation $\displaystyle{ A\mathbf{v} = \lambda \mathbf{v}, }$ where λ is known as the eigenvalue associated with the eigenvector v.

(Because this equation uses post-multiplication of the matrix A by the vector v, it describes a right eigenvector.) The number λ is called the eigenvalue or characteristic value of A corresponding to v.[1]

1. Wolfram Research, Inc. (2010) Eigenvector. Accessed on 2010-01-29.
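The right-eigenvector relation $A\mathbf{v} = \lambda \mathbf{v}$ above can be illustrated directly. A minimal sketch (not part of the cited reference), using the article's complex-eigenvalue example matrix; `np.linalg.eig` returns right eigenvectors as the columns of its second result.

```python
import numpy as np

# Example matrix from the article; its eigenvalues are 1 + 2i and 1 - 2i.
A = np.array([[1.0, 2.0],
              [-2.0, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v is a right eigenvector: A v equals lambda * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```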

### 2013

• http://en.wikipedia.org/wiki/Eigenvalue
• An eigenvector of a square matrix $\displaystyle{ A }$ is a non-zero vector $\displaystyle{ v }$ that, when multiplied by $\displaystyle{ A }$, yields the original vector multiplied by a single number $\displaystyle{ \lambda }$; that is: $\displaystyle{ A v = \lambda v }$ The number $\displaystyle{ \lambda }$ is called the eigenvalue of $\displaystyle{ A }$ corresponding to $\displaystyle{ v }$. … Thus, for example, the exponential function $\displaystyle{ f(x) = a^x }$ is an eigenfunction of the derivative operator " $\displaystyle{ {}' }$ ", with eigenvalue $\displaystyle{ \lambda = \ln a }$, since its derivative is $\displaystyle{ f'(x) = (\ln a)a^x = \lambda f(x) }$.
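The eigenfunction claim above says that for $f(x) = a^x$ the ratio $f'(x)/f(x)$ equals $\ln a$ at every point. A quick numeric sketch with a central finite difference; the base $a$ and sample points are illustrative choices, not from the source.

```python
import math

a = 3.0                  # illustrative base
lam = math.log(a)        # the claimed eigenvalue, ln(a)

def f(x):
    return a ** x

def derivative(g, x, h=1e-6):
    # Central finite-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

# f'(x)/f(x) should approximate ln(a) regardless of x.
for x in (-1.0, 0.0, 0.5, 2.0):
    assert abs(derivative(f, x) / f(x) - lam) < 1e-6
```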

The set of all eigenvectors of a matrix (or linear operator), each paired with its corresponding eigenvalue, is called the eigensystem of that matrix. An eigenspace of a matrix $\displaystyle{ A }$ is the set of all eigenvectors with the same eigenvalue, together with the zero vector. An eigenbasis for $\displaystyle{ A }$ is a basis of the whole vector space consisting of linearly independent eigenvectors of $\displaystyle{ A }$. Not every real matrix has real eigenvalues, but every complex matrix has at least one complex eigenvalue.
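The eigenspace notion above can be seen numerically: a repeated eigenvalue may carry more than one linearly independent eigenvector. A minimal sketch with an illustrative diagonal matrix (not from the source), where $\lambda = 2$ has a two-dimensional eigenspace.

```python
import numpy as np

# Illustrative matrix: eigenvalue 2 appears twice, eigenvalue 5 once.
A = np.diag([2.0, 2.0, 5.0])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Columns whose eigenvalue is (numerically) 2 span that eigenspace.
basis = eigenvectors[:, np.isclose(eigenvalues, 2.0)]

# Two linearly independent eigenvectors share lambda = 2.
print(np.linalg.matrix_rank(basis))  # -> 2
```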