Eigenvalues and eigenvectors


File:Mona Lisa with eigenvector.png
Illustration of a transformation (of the Mona Lisa): the image is changed in such a way that the red arrow (vector) does not change its direction, but the blue one does. The red vector is therefore an eigenvector of this transformation, while the blue one is not. Since the red vector does not change its length either, its eigenvalue is 1. The transformation used here is called a shear mapping.

Linear algebra studies linear functions, which are often called linear transformations. In that context, an eigenvector of a transformation is a vector, different from the null vector, whose direction the transformation does not change (except that it may turn the vector exactly the other way around). The vector may still change its length, or even be sent to the null vector. The factor by which the transformation scales the vector's length is called the eigenvalue.

Basics

Given a square matrix A, a scalar λ is an eigenvalue of A and a non-zero vector v is a corresponding eigenvector if the following equation is satisfied:

[math]\displaystyle{ A\mathbf{v} = \lambda \mathbf{v} \, . }[/math]

In other words, if the matrix A times the vector v equals the scalar λ times the same vector v, then λ is an eigenvalue of A and v is an eigenvector of A corresponding to λ.

An eigenspace of A is the set of all eigenvectors with the same eigenvalue together with the zero vector. However, the zero vector is not an eigenvector.[1]
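
In practice, eigenvalues and eigenvectors of a concrete matrix are usually found with numerical software. As a minimal sketch, assuming the NumPy library is available in Python, numpy.linalg.eig returns the eigenvalues and eigenvectors of a matrix, and the defining equation above can then be checked directly (the 2×2 matrix used here is just an arbitrary example):

<syntaxhighlight lang="python">
# A minimal sketch: compute eigenvalues and eigenvectors of a small matrix
# with NumPy, then check the defining equation A v = lambda v.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # an arbitrary example matrix

# eig returns the eigenvalues and a matrix whose columns are
# the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # A v should equal lambda v, up to floating-point rounding.
    print(lam, v, np.allclose(A @ v, lam * v))
</syntaxhighlight>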

These ideas often are extended to more general situations, where scalars are elements of any field, vectors are elements of any vector space, and linear transformations may or may not be represented by matrix multiplication. For example, instead of real numbers, scalars may be complex numbers; instead of arrows, vectors may be functions or frequencies; instead of matrix multiplication, linear transformations may be operators such as the derivative from calculus. These are only a few of countless examples where eigenvectors and eigenvalues are important.

In cases like these, the idea of direction loses its ordinary meaning, and has a more abstract definition instead. But even in this case, if that abstract direction is unchanged by a given linear transformation, the prefix "eigen" is used, as in eigenfunction, eigenmode, eigenface, eigenstate, and eigenfrequency.
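
For instance, the derivative operator mentioned above has eigenfunctions: for any number [math]\displaystyle{ \lambda }[/math], the exponential function [math]\displaystyle{ e^{\lambda x} }[/math] satisfies

[math]\displaystyle{ \frac{d}{dx}\, e^{\lambda x} = \lambda\, e^{\lambda x} , }[/math]

so [math]\displaystyle{ e^{\lambda x} }[/math] is an eigenfunction of the derivative operator, with eigenvalue [math]\displaystyle{ \lambda }[/math].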

Eigenvalues and eigenvectors have many applications in both pure and applied mathematics. They are used in matrix factorization, in quantum mechanics, in facial recognition systems, and in many other areas.

Example

For the matrix A

[math]\displaystyle{ A = \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix}, }[/math]

the vector

[math]\displaystyle{ \mathbf x = \begin{bmatrix} 3 \\ -3 \end{bmatrix} }[/math]

is an eigenvector with eigenvalue 1. Indeed,

[math]\displaystyle{ A \mathbf x = \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} \begin{bmatrix} 3 \\ -3 \end{bmatrix} = \begin{bmatrix} (2 \cdot 3) + (1 \cdot (-3)) \\ (1 \cdot 3) + (2 \cdot (-3)) \end{bmatrix} = \begin{bmatrix} 3 \\ -3 \end{bmatrix} = 1 \cdot \begin{bmatrix} 3 \\ -3 \end{bmatrix}. }[/math]

On the other hand, the vector

[math]\displaystyle{ \mathbf x = \begin{bmatrix} 0 \\ 1 \end{bmatrix} }[/math]

is not an eigenvector, since

[math]\displaystyle{ \begin{bmatrix} 2 & 1\\1 & 2 \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} (2 \cdot 0) + (1 \cdot 1) \\ (1 \cdot 0) + (2 \cdot 1) \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, }[/math]

and the resulting vector is not a scalar multiple of the original vector x.
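
Both checks can also be reproduced numerically. As a minimal sketch, assuming the NumPy library is available in Python:

<syntaxhighlight lang="python">
# Minimal numerical check of the worked example above (assumes NumPy).
import numpy as np

A = np.array([[2, 1],
              [1, 2]])

x = np.array([3, -3])
y = np.array([0, 1])

print(A @ x)   # [ 3 -3], which equals 1 * x, so x is an eigenvector with eigenvalue 1
print(A @ y)   # [1 2], which is not a scalar multiple of y, so y is not an eigenvector
</syntaxhighlight>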


Notes

Template:Reflist
