In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism in any basis (that is, the characteristic polynomial does not depend on the choice of a basis). The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero.
In spectral graph theory, the characteristic polynomial of a graph is the characteristic polynomial of its adjacency matrix.
Motivation
In linear algebra, eigenvalues and eigenvectors play a fundamental role, since, given a linear transformation, an eigenvector is a vector whose direction is not changed by the transformation, and the corresponding eigenvalue is the measure of the resulting change of magnitude of the vector.
More precisely, if the transformation is represented by a square matrix A, an eigenvector \mathbf{v} and the corresponding eigenvalue \lambda must satisfy the equation
:A \mathbf{v} = \lambda \mathbf{v},
or, equivalently,
:(\lambda I - A) \mathbf{v} = \mathbf{0},
where I is the identity matrix, and \mathbf{v} \neq \mathbf{0} (although the zero vector satisfies this equation for every \lambda, it is not considered as an eigenvector).
It follows that the matrix \lambda I - A must be singular, and its determinant
:\det(\lambda I - A)
must be zero. In other words, the eigenvalues of A are the roots of
:\det(xI - A),
which is a monic polynomial in x of degree n if A is an n \times n matrix. This polynomial is the ''characteristic polynomial'' of A.
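As a numerical illustration of this connection (a minimal Python/NumPy sketch; the 3 × 3 matrix is an arbitrary choice), the roots of the characteristic polynomial agree with the eigenvalues returned by a standard eigenvalue routine:

    import numpy as np

    # An arbitrary 3x3 example matrix (illustration only).
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [1.0, 0.0, 1.0]])

    # np.poly(A) returns the coefficients of det(xI - A), highest degree first.
    coeffs = np.poly(A)

    # The roots of the characteristic polynomial ...
    roots = np.roots(coeffs)

    # ... agree with the eigenvalues of A, up to ordering and round-off.
    eigvals = np.linalg.eigvals(A)
    print(np.sort_complex(roots))
    print(np.sort_complex(eigvals))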
Formal definition
Consider an n \times n matrix A. The characteristic polynomial of A, denoted by p_A(t), is the polynomial defined by
:p_A(t) = \det(t I - A),
where I denotes the n \times n identity matrix.

Some authors define the characteristic polynomial to be \det(A - t I). That polynomial differs from the one defined here by a sign (-1)^n, so it makes no difference for properties like having as roots the eigenvalues of A; however the definition above always gives a monic polynomial, whereas the alternative definition is monic only when n is even.
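The definition can be checked symbolically; the following SymPy sketch (with an arbitrarily chosen integer matrix) expands \det(tI - A) directly and compares it with SymPy's built-in charpoly method:

    from sympy import symbols, eye, Matrix

    t = symbols('t')
    A = Matrix([[1, 2, 0],
                [0, 3, 1],
                [4, 0, 5]])                 # arbitrary illustrative matrix

    p = (t * eye(3) - A).det().expand()     # det(tI - A), the definition used here
    print(p)
    print(A.charpoly(t).as_expr())          # SymPy's built-in characteristic polynomial agrees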
Examples
To compute the characteristic polynomial of the matrix
:A = \begin{pmatrix} 2 & 1 \\ -1 & 0 \end{pmatrix},
the determinant of the following is computed:
:tI - A = \begin{pmatrix} t-2 & -1 \\ 1 & t \end{pmatrix}
and found to be
:(t-2)t - 1(-1) = t^2 - 2t + 1,
the characteristic polynomial of A.

Another example uses hyperbolic functions of a hyperbolic angle \varphi. For the matrix take
:A = \begin{pmatrix} \cosh(\varphi) & \sinh(\varphi) \\ \sinh(\varphi) & \cosh(\varphi) \end{pmatrix}.
Its characteristic polynomial is
:\det(tI - A) = (t - \cosh(\varphi))^2 - \sinh^2(\varphi) = t^2 - 2t\cosh(\varphi) + 1 = (t - e^{\varphi})(t - e^{-\varphi}).
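Both computations can be reproduced symbolically; in the following SymPy sketch the two matrices above are entered verbatim and the stated polynomials and factorization are verified:

    from sympy import symbols, eye, Matrix, cosh, sinh, exp, simplify

    t, phi = symbols('t phi', real=True)

    # First example above.
    A = Matrix([[2, 1],
                [-1, 0]])
    print((t * eye(2) - A).det().expand())                   # t**2 - 2*t + 1

    # Hyperbolic example above.
    B = Matrix([[cosh(phi), sinh(phi)],
                [sinh(phi), cosh(phi)]])
    pB = (t * eye(2) - B).det().expand()
    print(simplify(pB - (t**2 - 2*t*cosh(phi) + 1)))          # 0: matches the polynomial above
    print(simplify(pB.rewrite(exp)
                   - ((t - exp(phi)) * (t - exp(-phi))).expand()))  # 0: and its factorization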
Properties
The characteristic polynomial p_A(t) of an n \times n matrix is monic (its leading coefficient is 1) and its degree is n. The most important fact about the characteristic polynomial was already mentioned in the motivational paragraph: the eigenvalues of A are precisely the roots of p_A(t) (this also holds for the minimal polynomial of A, but its degree may be less than n). All coefficients of the characteristic polynomial are polynomial expressions in the entries of the matrix. In particular its constant coefficient is \det(-A) = (-1)^n \det(A), the coefficient of t^n is one, and the coefficient of t^{n-1} is \operatorname{tr}(-A) = -\operatorname{tr}(A), where \operatorname{tr}(A) is the trace of A. (The signs given here correspond to the formal definition given in the previous section; for the alternative definition these would instead be \det(A) and (-1)^{n-1}\operatorname{tr}(A), respectively.)
For a 2 \times 2 matrix A, the characteristic polynomial is thus given by
:t^2 - \operatorname{tr}(A)\,t + \det(A).
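For instance, the following SymPy sketch (with generic symbolic entries a, b, c, d) confirms the 2 \times 2 formula:

    from sympy import symbols, eye, Matrix, expand

    t, a, b, c, d = symbols('t a b c d')
    A = Matrix([[a, b],
                [c, d]])

    p = (t * eye(2) - A).det().expand()
    formula = t**2 - A.trace() * t + A.det()
    print(expand(p - formula))              # 0: p_A(t) = t^2 - tr(A) t + det(A)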
Using the language of exterior algebra, the characteristic polynomial of an n \times n matrix A may be expressed as
:p_A(t) = \sum_{k=0}^n t^{n-k} (-1)^k \operatorname{tr}\left(\textstyle\bigwedge^k A\right),
where \operatorname{tr}\left(\bigwedge^k A\right) is the trace of the kth exterior power of A, which has dimension \binom{n}{k}. This trace may be computed as the sum of all principal minors of A of size k. The recursive Faddeev–LeVerrier algorithm computes these coefficients more efficiently.

When the characteristic of the field of the coefficients is 0, each such trace may alternatively be computed as a single determinant, that of the k \times k matrix,
:\operatorname{tr}\left(\textstyle\bigwedge^k A\right) = \frac{1}{k!}
\begin{vmatrix}
\operatorname{tr}A & k-1 & 0 & \cdots & 0 \\
\operatorname{tr}A^2 & \operatorname{tr}A & k-2 & \cdots & 0 \\
\vdots & \vdots & & \ddots & \vdots \\
\operatorname{tr}A^{k-1} & \operatorname{tr}A^{k-2} & & \cdots & 1 \\
\operatorname{tr}A^k & \operatorname{tr}A^{k-1} & & \cdots & \operatorname{tr}A
\end{vmatrix}.
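The Faddeev–LeVerrier recursion mentioned above can be sketched as follows (a minimal NumPy implementation for illustration only; the example matrix is arbitrary, and the result is compared with the coefficients returned by np.poly):

    import numpy as np

    def char_poly_coeffs(A):
        """Coefficients of det(tI - A), highest degree first, via the Faddeev-LeVerrier recursion."""
        n = A.shape[0]
        I = np.eye(n)
        M = np.zeros((n, n))        # M_0 = 0
        c = 1.0                     # leading (monic) coefficient
        coeffs = [c]
        for k in range(1, n + 1):
            M = A @ M + c * I               # M_k = A M_{k-1} + (previous coefficient) * I
            c = -np.trace(A @ M) / k        # coefficient of t^(n-k)
            coeffs.append(c)
        return np.array(coeffs)

    # Arbitrary example; the output matches np.poly(A), which also returns
    # the characteristic polynomial coefficients (highest degree first).
    A = np.array([[3.0, 1.0, 5.0],
                  [3.0, 3.0, 1.0],
                  [4.0, 6.0, 4.0]])
    print(char_poly_coeffs(A))
    print(np.poly(A))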
The Cayley–Hamilton theorem states that replacing t by A in the characteristic polynomial (interpreting the resulting powers as matrix powers, and the constant term c as c times the identity matrix) yields the zero matrix. Informally speaking, every matrix satisfies its own characteristic equation. This statement is equivalent to saying that the minimal polynomial of A divides the characteristic polynomial of A.
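A quick numerical check of this statement (a NumPy sketch with an arbitrary 2 \times 2 matrix):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])                  # arbitrary example
    c = np.poly(A)                              # coefficients of the characteristic polynomial

    # p(A) = A^2 + c[1]*A + c[2]*I should be the zero matrix.
    p_of_A = A @ A + c[1] * A + c[2] * np.eye(2)
    print(np.allclose(p_of_A, np.zeros((2, 2))))    # True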
Two similar matrices have the same characteristic polynomial. The converse, however, is not true in general: two matrices with the same characteristic polynomial need not be similar.
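For instance, the 2 \times 2 zero matrix and the matrix
:\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}
both have characteristic polynomial t^2, yet they are not similar, since the only matrix similar to the zero matrix is the zero matrix itself.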
The matrix A and its transpose have the same characteristic polynomial. A is similar to a triangular matrix if and only if its characteristic polynomial can be completely factored into linear factors over the field of the coefficients (the same is true with the minimal polynomial instead of the characteristic polynomial). In this case A is similar to a matrix in Jordan normal form.
Characteristic polynomial of a product of two matrices
If A and B are two square n \times n matrices then the characteristic polynomials of AB and BA coincide:
:p_{AB}(t) = p_{BA}(t).

When A is non-singular this result follows from the fact that AB and BA are similar:
:BA = A^{-1}(AB)A.

For the case where both A and B are singular, the desired identity is an equality between polynomials in t and the coefficients of the matrices. Thus, to prove this equality, it suffices to prove that it is verified on a non-empty open subset (for the usual topology, or, more generally, for the Zariski topology) of the space of all the coefficients. As the non-singular matrices form such an open subset of the space of all matrices, this proves the result.
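The identity can be illustrated with a singular factor, where the similarity argument alone does not apply (a SymPy sketch with arbitrarily chosen matrices):

    from sympy import symbols, eye, Matrix

    t = symbols('t')
    A = Matrix([[0, 1],
                [0, 0]])                    # singular, so AB and BA need not be similar
    B = Matrix([[1, 2],
                [3, 4]])

    p_AB = (t * eye(2) - A * B).det().expand()
    p_BA = (t * eye(2) - B * A).det().expand()
    print(p_AB)          # t**2 - 3*t
    print(p_BA)          # the same polynomial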
More generally, if A is a matrix of order m \times n and B is a matrix of order n \times m, then AB is an m \times m matrix, BA is an n \times n matrix, and one has
:p_{BA}(t) = t^{n-m} p_{AB}(t).

To prove this, one may suppose n > m, by exchanging, if needed, A and B. Then, by bordering A on the bottom by n - m rows of zeros, and B on the right by n - m columns of zeros, one gets two n \times n matrices A' and B' such that B'A' = BA and A'B' is equal to AB bordered by n - m rows and columns of zeros. The result follows from the case of square matrices, by comparing the characteristic polynomials of A'B' and AB.
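The rectangular case can likewise be checked on a small example (a SymPy sketch; the 2 \times 3 and 3 \times 2 matrices are arbitrary illustrations):

    from sympy import symbols, eye, Matrix

    t = symbols('t')
    # A is 2 x 3 (m = 2, n = 3) and B is 3 x 2, both chosen arbitrarily.
    A = Matrix([[1, 0, 2],
                [0, 1, 1]])
    B = Matrix([[1, 1],
                [2, 0],
                [0, 3]])
    m, n = 2, 3

    p_AB = (t * eye(m) - A * B).det().expand()      # degree m
    p_BA = (t * eye(n) - B * A).det().expand()      # degree n
    print((p_BA - t**(n - m) * p_AB).expand())       # 0, since p_BA(t) = t^(n-m) p_AB(t)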
Characteristic polynomial of A^k
If \lambda is an eigenvalue of a square matrix A with eigenvector \mathbf{v}, then \lambda^k is an eigenvalue of A^k because
:A^k \mathbf{v} = A^{k-1} A \mathbf{v} = \lambda A^{k-1} \mathbf{v} = \cdots = \lambda^k \mathbf{v}.

The multiplicities can be shown to agree as well, and this generalizes to any polynomial f(t) in place of t^k: if the characteristic polynomial of A has a factorization
:p_A(t) = (t - \lambda_1)(t - \lambda_2) \cdots (t - \lambda_n),
then the characteristic polynomial of the matrix f(A) is given by
:p_{f(A)}(t) = (t - f(\lambda_1))(t - f(\lambda_2)) \cdots (t - f(\lambda_n)).
That is, the algebraic multiplicity of \lambda in f(A) equals the sum of algebraic multiplicities of \lambda' in A over \lambda' such that f(\lambda') = \lambda. In particular,
:\operatorname{tr}(f(A)) = \sum_{i=1}^n f(\lambda_i) \quad\text{and}\quad \det(f(A)) = \prod_{i=1}^n f(\lambda_i).
Here a polynomial such as f(t) = t^3 + 1 is evaluated on a matrix A simply as f(A) = A^3 + I.

The theorem applies to matrices and polynomials over any field or commutative ring. However, the assumption that p_A(t) has a factorization into linear factors is not always true, unless the matrix is over an algebraically closed field such as the complex numbers.
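As an illustration of the statement about multiplicities, the following SymPy sketch uses a matrix with eigenvalues 2, 2, -2 (an arbitrary choice with a repeated eigenvalue) and the polynomial f(t) = t^2:

    from sympy import Matrix, symbols

    t = symbols('t')
    # Upper triangular, with eigenvalues 2, 2, -2.
    A = Matrix([[2, 1, 0],
                [0, 2, 0],
                [0, 0, -2]])

    print(A.charpoly(t).as_expr())          # t**3 - 2*t**2 - 4*t + 8, i.e. (t - 2)**2 * (t + 2)
    print((A**2).charpoly(t).as_expr())     # (t - 4)**3 expanded: each eigenvalue of A is squared,
                                            # and the multiplicities add up as described above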
Secular function and secular equation