In linear algebra, eigendecomposition is the factorization of a matrix
into a
canonical form, whereby the matrix is represented in terms of its
eigenvalues and eigenvectors. Only
diagonalizable matrices can be factorized in this way. When the matrix being factorized is a
normal or real
symmetric matrix, the decomposition is called "spectral decomposition", derived from the
spectral theorem.
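For the symmetric case just mentioned, the spectral decomposition can be sketched with NumPy; the matrix below is a hypothetical example, not one from the text:

```python
import numpy as np

# Hypothetical real symmetric matrix used only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eigh is specialized for symmetric/Hermitian matrices:
# it returns real eigenvalues and an orthonormal set of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

# Spectral decomposition: A = Q diag(eigenvalues) Q^T with Q orthogonal,
# so the inverse of Q is simply its transpose.
assert np.allclose(Q @ np.diag(eigenvalues) @ Q.T, A)
assert np.allclose(Q.T @ Q, np.eye(2))
```

Because Q is orthogonal here, no matrix inversion is needed, which is one reason the symmetric case is numerically convenient.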
Fundamental theory of matrix eigenvectors and eigenvalues
A (nonzero) vector \mathbf{v} of dimension N is an eigenvector of a square N \times N matrix \mathbf{A} if it satisfies a linear equation of the form
:\mathbf{A}\mathbf{v} = \lambda\mathbf{v}
for some scalar \lambda. Then \lambda is called the eigenvalue corresponding to \mathbf{v}. Geometrically speaking, the eigenvectors of \mathbf{A} are the vectors that \mathbf{A} merely elongates or shrinks, and the amount that they elongate/shrink by is the eigenvalue. The above equation is called the eigenvalue equation or the eigenvalue problem.
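As a quick numerical check of the eigenvalue equation, one can compute eigenpairs with NumPy and verify that each one satisfies it; the matrix here is a hypothetical example:

```python
import numpy as np

# Hypothetical 2x2 matrix for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# numpy.linalg.eig returns the eigenvalues w and a matrix v whose
# columns are the corresponding (unit-norm) eigenvectors.
w, v = np.linalg.eig(A)

# Verify the eigenvalue equation A v = lambda v for every pair.
for lam, vec in zip(w, v.T):
    assert np.allclose(A @ vec, lam * vec)
```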
This yields an equation for the eigenvalues
:p(\lambda) = \det\left(\mathbf{A} - \lambda\mathbf{I}\right) = 0.
We call p(\lambda) the
characteristic polynomial, and the equation, called the characteristic equation, is an Nth-order polynomial equation in the unknown \lambda. This equation will have N_\lambda distinct solutions, where 1 \le N_\lambda \le N. The set of solutions, that is, the eigenvalues, is called the
spectrum of \mathbf{A}.
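The characteristic polynomial can be formed explicitly with NumPy, whose np.poly routine returns the coefficients of \det(\lambda\mathbf{I} - \mathbf{A}); its roots are the spectrum. The matrix below is a hypothetical example:

```python
import numpy as np

# Hypothetical 2x2 matrix; its characteristic polynomial works out
# to lambda^2 - 7*lambda + 10.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.poly(A) gives the coefficients of det(lambda*I - A),
# highest degree first: [1., -7., 10.] here.
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues.
spectrum = np.roots(coeffs)
assert np.allclose(np.sort(spectrum), np.sort(np.linalg.eig(A)[0]))
```

In practice eigenvalues are computed directly with np.linalg.eig rather than via polynomial roots, since root-finding on the characteristic polynomial is numerically ill-conditioned for larger N.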
If the field of scalars is
algebraically closed, then we can
factor p as
:p(\lambda) = (\lambda - \lambda_1)^{n_1}(\lambda - \lambda_2)^{n_2} \cdots (\lambda - \lambda_{N_\lambda})^{n_{N_\lambda}} = 0.
The integer n_i is termed the
algebraic multiplicity of eigenvalue \lambda_i. The algebraic multiplicities sum to N:
:\sum_{i=1}^{N_\lambda} n_i = N.
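A minimal sketch of reading off algebraic multiplicities numerically, using a hypothetical matrix chosen so that the characteristic polynomial factors as (\lambda - 2)^2(\lambda - 5):

```python
import numpy as np
from collections import Counter

# Hypothetical upper-triangular matrix: its characteristic polynomial is
# (lambda - 2)^2 (lambda - 5), so eigenvalue 2 has algebraic multiplicity 2
# and eigenvalue 5 has algebraic multiplicity 1.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

w = np.linalg.eig(A)[0]

# Group numerically equal eigenvalues; the counts are the algebraic multiplicities.
mult = Counter(np.round(w.real, 6))
assert mult[2.0] == 2 and mult[5.0] == 1

# The algebraic multiplicities sum to N.
assert sum(mult.values()) == A.shape[0]
```

Grouping floating-point eigenvalues by rounding is a heuristic; for matrices with tightly clustered eigenvalues a tolerance-based grouping is needed instead.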
For each eigenvalue \lambda_i, we have a specific eigenvalue equation
:\left(\mathbf{A} - \lambda_i\mathbf{I}\right)\mathbf{v} = \mathbf{0}.
There will be 1 \le m_i \le n_i linearly independent solutions to each eigenvalue equation. The linear combinations of the m_i solutions (except the one which gives the zero vector) are the eigenvectors associated with the eigenvalue \lambda_i. The integer m_i is termed the
geometric multiplicity
of \lambda_i. It is important to keep in mind that the algebraic multiplicity n_i and geometric multiplicity m_i may or may not be equal, but we always have m_i \le n_i. The simplest case is of course when m_i = n_i = 1. The total number of linearly independent eigenvectors, N_\mathbf{v}, can be calculated by summing the geometric multiplicities
:\sum_{i=1}^{N_\lambda} m_i = N_\mathbf{v}.
The eigenvectors can be indexed by eigenvalues, using a double index, with \mathbf{v}_{ij} being the jth eigenvector for the ith eigenvalue. The eigenvectors can also be indexed using the simpler notation of a single index \mathbf{v}_k, with k = 1, 2, \ldots, N_\mathbf{v}.
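The geometric multiplicity can be computed as the dimension of the null space of \mathbf{A} - \lambda_i\mathbf{I}, that is, N minus its rank. A minimal sketch, using a hypothetical matrix whose single eigenvalue has algebraic multiplicity 2 but geometric multiplicity 1:

```python
import numpy as np

# Hypothetical matrix: characteristic polynomial (lambda - 2)^2, so the
# algebraic multiplicity of lambda = 2 is 2, but there is only one
# linearly independent eigenvector.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

lam = 2.0
N = A.shape[0]

# Geometric multiplicity m = dim null(A - lam*I) = N - rank(A - lam*I).
m = N - np.linalg.matrix_rank(A - lam * np.eye(N))
assert m == 1  # m = 1 < n = 2, so this matrix is not diagonalizable
```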
Eigendecomposition of a matrix
Let \mathbf{A} be a square n \times n matrix with n linearly independent eigenvectors \mathbf{q}_i (where i = 1, \ldots, n). Then \mathbf{A} can be
factorized as
:\mathbf{A} = \mathbf{Q}\mathbf{\Lambda}\mathbf{Q}^{-1}
where \mathbf{Q} is the square n \times n matrix whose ith column is the eigenvector \mathbf{q}_i of \mathbf{A}, and \mathbf{\Lambda} is the
diagonal matrix whose diagonal elements are the corresponding eigenvalues, \Lambda_{ii} = \lambda_i. Note that only
diagonalizable matrices can be factorized in this way. For example, the
defective matrix
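Under the assumption of a diagonalizable matrix (here a hypothetical 2×2 example), the factorization \mathbf{A} = \mathbf{Q}\mathbf{\Lambda}\mathbf{Q}^{-1} can be sketched with NumPy:

```python
import numpy as np

# Hypothetical diagonalizable 2x2 matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of Q are the eigenvectors q_i; Lam is the diagonal matrix
# whose entry Lam[i, i] is the corresponding eigenvalue.
w, Q = np.linalg.eig(A)
Lam = np.diag(w)

# Eigendecomposition: A = Q Lam Q^{-1}.
assert np.allclose(Q @ Lam @ np.linalg.inv(Q), A)
```

For a defective matrix this reconstruction fails: there are fewer than n linearly independent eigenvectors, so Q is singular and cannot be inverted.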