In linear algebra, eigendecomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices can be factorized in this way. When the matrix being factorized is a normal or real symmetric matrix, the decomposition is called "spectral decomposition", derived from the spectral theorem.
Fundamental theory of matrix eigenvectors and eigenvalues
A (nonzero) vector v of dimension N is an eigenvector of a square N × N matrix A if it satisfies a linear equation of the form
:A\mathbf{v} = \lambda \mathbf{v}
for some scalar λ. Then λ is called the eigenvalue corresponding to v. Geometrically speaking, the eigenvectors of A are the vectors that A merely elongates or shrinks, and the amount by which they elongate or shrink is the eigenvalue. The above equation is called the eigenvalue equation or the eigenvalue problem.
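As a concrete numerical check (a minimal sketch using NumPy, not part of the original article; the matrix here is an arbitrary symmetric example), the eigenpairs returned by numpy.linalg.eig satisfy the eigenvalue equation up to floating-point error:

 import numpy as np
 # An arbitrary symmetric 2 x 2 example; eig returns the eigenvalues and
 # a matrix whose columns are the corresponding eigenvectors.
 A = np.array([[2.0, 1.0],
               [1.0, 2.0]])
 eigenvalues, eigenvectors = np.linalg.eig(A)
 for k in range(A.shape[0]):
     lam = eigenvalues[k]
     v = eigenvectors[:, k]
     # Verify A v = lambda v up to floating-point tolerance.
     assert np.allclose(A @ v, lam * v)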
Rewriting the eigenvalue equation as (A − λI)v = 0, a nonzero solution v exists precisely when the matrix A − λI is singular, that is, when its determinant vanishes. This yields an equation for the eigenvalues
:p(\lambda) = \det(A - \lambda I) = 0
We call p(λ) the characteristic polynomial, and the equation, called the characteristic equation, is an Nth-order polynomial equation in the unknown λ. This equation will have N_λ distinct solutions, where 1 ≤ N_λ ≤ N. The set of solutions, that is, the eigenvalues, is called the spectrum of A.
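As a sketch of this connection (assuming NumPy; np.poly applied to a square array returns the coefficients of its characteristic polynomial), the roots of p(λ) coincide with the eigenvalues computed directly:

 import numpy as np
 A = np.array([[2.0, 1.0],
               [1.0, 2.0]])
 # Monic characteristic polynomial coefficients, highest degree first.
 coeffs = np.poly(A)
 # The roots of p are the eigenvalues of A.
 roots = np.roots(coeffs)
 assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))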
If the field of scalars is algebraically closed, then we can factor the characteristic polynomial as
:p(\lambda) = (\lambda - \lambda_1)^{n_1} (\lambda - \lambda_2)^{n_2} \cdots (\lambda - \lambda_{N_\lambda})^{n_{N_\lambda}}
The integer n_i is termed the algebraic multiplicity of eigenvalue λ_i. The algebraic multiplicities sum to N:
:\sum_{i=1}^{N_\lambda} n_i = N
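For instance (a hedged sketch; the matrix is chosen so its characteristic polynomial is (λ − 1)²(λ − 2)), the eigenvalue 1 appears with algebraic multiplicity 2, and the multiplicities sum to N = 3:

 import numpy as np
 A = np.array([[1.0, 1.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 2.0]])
 eigenvalues = np.linalg.eigvals(A)
 # Round before counting so repeated eigenvalues that differ only by
 # floating-point noise are grouped together (fragile in general).
 values, counts = np.unique(np.round(eigenvalues, 8), return_counts=True)
 print(dict(zip(values.tolist(), counts.tolist())))  # {1.0: 2, 2.0: 1}
 assert counts.sum() == A.shape[0]  # multiplicities sum to N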
For each eigenvalue λ_i, we have a specific eigenvalue equation
:(A - \lambda_i I)\mathbf{v} = \mathbf{0}
There will be 1 ≤ m_i ≤ n_i linearly independent solutions to each eigenvalue equation. The linear combinations of the m_i solutions (excluding the trivial combination that gives the zero vector) are the eigenvectors associated with the eigenvalue λ_i. The integer m_i is termed the geometric multiplicity of λ_i. It is important to keep in mind that the algebraic multiplicity n_i and the geometric multiplicity m_i may or may not be equal, but we always have m_i ≤ n_i. The simplest case is of course when m_i = n_i = 1. The total number of linearly independent eigenvectors, N_v, can be calculated by summing the geometric multiplicities:
:\sum_{i=1}^{N_\lambda} m_i = N_\mathbf{v}
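The geometric multiplicity m_i equals the dimension of the null space of A − λ_i I, i.e. N minus its rank. A minimal sketch (assuming NumPy; the shear matrix below is an illustrative choice) of a case where m_i < n_i:

 import numpy as np
 # Shear matrix: eigenvalue 1 has algebraic multiplicity n_1 = 2.
 A = np.array([[1.0, 1.0],
               [0.0, 1.0]])
 lam = 1.0
 N = A.shape[0]
 # m = dim null(A - lam I) = N - rank(A - lam I)
 m = N - np.linalg.matrix_rank(A - lam * np.eye(N))
 print(m)  # 1: strictly less than the algebraic multiplicity 2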
The eigenvectors can be indexed by eigenvalue, using a double index, with v_{ij} being the jth eigenvector for the ith eigenvalue. The eigenvectors can also be indexed using the simpler notation of a single index v_k, with k = 1, 2, ..., N_v.
Eigendecomposition of a matrix
Let A be a square n × n matrix with n linearly independent eigenvectors q_i (where i = 1, ..., n). Then A can be factorized as
:A = Q \Lambda Q^{-1}
where Q is the square n × n matrix whose ith column is the eigenvector q_i of A, and Λ is the diagonal matrix whose diagonal elements are the corresponding eigenvalues, Λ_{ii} = λ_i. Note that only diagonalizable matrices can be factorized in this way. For example, the defective matrix
:\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}
cannot be factorized in this way, because its eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1, so it does not have a full set of linearly independent eigenvectors.
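A minimal numerical sketch of this factorization (assuming NumPy; the matrix is an arbitrary diagonalizable example): np.linalg.eig returns the eigenvalues and the matrix Q of eigenvector columns, from which A can be reconstructed:

 import numpy as np
 A = np.array([[4.0, 1.0],
               [2.0, 3.0]])
 # Columns of Q are the eigenvectors q_i; Lambda holds the eigenvalues
 # on its diagonal.
 eigenvalues, Q = np.linalg.eig(A)
 Lambda = np.diag(eigenvalues)
 # A = Q Lambda Q^{-1}; Q is invertible because the eigenvectors are
 # linearly independent (A has distinct eigenvalues 5 and 2).
 A_reconstructed = Q @ Lambda @ np.linalg.inv(Q)
 assert np.allclose(A, A_reconstructed)

In practice, when A is normal or real symmetric one would use np.linalg.eigh instead, which exploits the symmetry and returns an orthonormal Q, giving the spectral decomposition mentioned above.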