In linear algebra, eigendecomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices can be factorized in this way. When the matrix being factorized is a normal or real symmetric matrix, the decomposition is called "spectral decomposition", derived from the spectral theorem.
Fundamental theory of matrix eigenvectors and eigenvalues
A (nonzero) vector v of dimension N is an eigenvector of a square N × N matrix A if it satisfies a linear equation of the form

    A v = λ v

for some scalar λ. Then λ is called the eigenvalue corresponding to v. Geometrically speaking, the eigenvectors of A are the vectors that A merely elongates or shrinks, and the amount that they elongate/shrink by is the eigenvalue. The above equation is called the eigenvalue equation or the eigenvalue problem.
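The eigenvalue equation can be checked numerically. The following is a minimal sketch (the matrix and the helper `mat_vec` are illustrative choices, not from the original article): the diagonal matrix A = [[2, 0], [0, 3]] stretches the x-axis by 2 and the y-axis by 3, so the standard basis vectors are eigenvectors.

```python
# Verify the eigenvalue equation A v = lambda v for a small example matrix.
# A = [[2, 0], [0, 3]] stretches the x-axis by 2 and the y-axis by 3,
# so e1 = (1, 0) and e2 = (0, 1) are eigenvectors with eigenvalues 2 and 3.

def mat_vec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2, 0], [0, 3]]

for v, lam in [([1, 0], 2), ([0, 1], 3)]:
    Av = mat_vec(A, v)
    lam_v = [lam * v[0], lam * v[1]]
    assert Av == lam_v  # A v equals lambda v, so v is an eigenvector
    print(v, "is an eigenvector with eigenvalue", lam)
```

Any nonzero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue, which is why eigenvectors are only determined up to scale.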
This yields an equation for the eigenvalues

    p(λ) = det(A − λI) = 0.

We call p(λ) the characteristic polynomial, and the equation, called the characteristic equation, is an Nth-order polynomial equation in the unknown λ. This equation will have N_λ distinct solutions, where 1 ≤ N_λ ≤ N. The set of solutions, that is, the eigenvalues, is called the spectrum of A.
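For a 2 × 2 matrix this is concrete enough to solve by hand: det(A − λI) expands to λ² − tr(A)·λ + det(A), and the spectrum is the set of roots of that quadratic. A small sketch (the example matrix and the function name are illustrative assumptions):

```python
import math

# Characteristic polynomial of a 2x2 matrix A = [[a, b], [c, d]]:
#   det(A - lambda I) = lambda^2 - (a + d) lambda + (ad - bc),
# so the eigenvalues are the roots of this quadratic in lambda.

def eigenvalues_2x2(A):
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    disc = trace * trace - 4 * det      # discriminant of the quadratic
    if disc < 0:
        # Real roots do not exist: the reals are not algebraically closed.
        raise ValueError("complex eigenvalues")
    r = math.sqrt(disc)
    return sorted([(trace - r) / 2, (trace + r) / 2])

# Example: A = [[4, 1], [2, 3]] has characteristic equation
#   lambda^2 - 7 lambda + 10 = 0, i.e. (lambda - 2)(lambda - 5) = 0.
print(eigenvalues_2x2([[4, 1], [2, 3]]))   # spectrum {2, 5}
```

For N > 4 no general closed-form root formula exists, which is why practical eigenvalue computation uses iterative numerical methods instead of solving the characteristic equation directly.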
If the field of scalars is algebraically closed, then we can factor p as

    p(λ) = (λ − λ_1)^{n_1} (λ − λ_2)^{n_2} ⋯ (λ − λ_{N_λ})^{n_{N_λ}} = 0.

The integer n_i is termed the algebraic multiplicity of eigenvalue λ_i. The algebraic multiplicities sum to N:

    n_1 + n_2 + ⋯ + n_{N_λ} = N.
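As a small sketch of this bookkeeping (the example matrix is an illustrative choice): for the upper-triangular matrix A = [[2, 1], [0, 2]], the characteristic polynomial is (λ − 2)², so λ = 2 is the only eigenvalue and its algebraic multiplicity is n_1 = 2, which already sums to N = 2.

```python
from collections import Counter

# Algebraic multiplicity: how often each root repeats in the factored
# characteristic polynomial.  For a triangular matrix, det(A - lambda I)
# factors over the diagonal entries, so the roots (with repetition)
# are simply the diagonal of A.
A = [[2, 1], [0, 2]]            # upper triangular: p(lambda) = (lambda - 2)^2
roots = [A[0][0], A[1][1]]      # diagonal entries, listed with repetition

multiplicity = Counter(roots)   # algebraic multiplicity of each eigenvalue
print(dict(multiplicity))       # {2: 2}, i.e. n_1 = 2

# The algebraic multiplicities always sum to N, the size of the matrix.
assert sum(multiplicity.values()) == len(A)
```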
For each eigenvalue λ_i, we have a specific eigenvalue equation

    (A − λ_i I) v = 0.

There will be 1 ≤ m_i ≤ n_i linearly independent solutions to each eigenvalue equation. The linear combinations of the m_i solutions (excluding the combination that gives the zero vector) are the eigenvectors associated with the eigenvalue λ_i. The integer m_i is termed the geometric multiplicity of λ_i. It is important to keep in mind that the algebraic multiplicity n_i and the geometric multiplicity m_i may or may not be equal, but we always have m_i ≤ n_i. The simplest case is of course when m_i = n_i = 1. The total number of linearly independent eigenvectors, N_v, can be calculated by summing the geometric multiplicities:

    m_1 + m_2 + ⋯ + m_{N_λ} = N_v.

The eigenvectors can be indexed by eigenvalues, using a double index, with v_{ij} being the jth eigenvector for the ith eigenvalue. The eigenvectors can also be indexed using the simpler notation of a single index v_k, with k = 1, 2, …, N_v.
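The gap between the two multiplicities can be seen in a sketch (the example matrix and the helper name are illustrative assumptions): for A = [[2, 1], [0, 2]], the eigenvalue λ = 2 has algebraic multiplicity n_1 = 2, but A − 2I = [[0, 1], [0, 0]] forces the second component of any solution to zero, leaving only one independent eigenvector, so m_1 = 1 < n_1.

```python
# Geometric multiplicity: the dimension of the null space of (A - lambda I).
# For A = [[2, 1], [0, 2]] and lambda = 2:
#   A - 2I = [[0, 1], [0, 0]]
# (A - 2I) v = 0 forces v[1] = 0, so solutions are multiples of (1, 0):
# one independent eigenvector, hence m_1 = 1 even though n_1 = 2.

def null_space_dim_2x2(M, tol=1e-12):
    """Nullity of a 2x2 matrix, via nullity = 2 - rank."""
    (a, b), (c, d) = M
    if all(abs(x) < tol for x in (a, b, c, d)):
        return 2                    # zero matrix: rank 0
    if abs(a * d - b * c) < tol:
        return 1                    # singular but nonzero: rank 1
    return 0                        # invertible: rank 2

A_minus_2I = [[0, 1], [0, 0]]
m1 = null_space_dim_2x2(A_minus_2I)
print("geometric multiplicity m_1 =", m1)   # 1, strictly less than n_1 = 2
```

A matrix with m_i < n_i for some eigenvalue has fewer than N independent eigenvectors in total, which is exactly the defective case that blocks the eigendecomposition discussed in the next section.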
Eigendecomposition of a matrix
Let A be a square n × n matrix with n linearly independent eigenvectors q_i (where i = 1, …, n). Then A can be factored as

    A = Q Λ Q⁻¹

where Q is the square n × n matrix whose ith column is the eigenvector q_i of A, and Λ is the diagonal matrix
whose diagonal elements are the corresponding eigenvalues, Λ_ii = λ_i. Note that only diagonalizable matrices can be factorized in this way. For example, the
defective matrix