
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally,
:A \text{ is symmetric} \iff A = A^\mathsf{T}.
Because equal matrices have equal dimensions, only square matrices can be symmetric.
The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if a_{ij} denotes the entry in the i-th row and j-th column, then
:A \text{ is symmetric} \iff a_{ji} = a_{ij} \text{ for all indices } i \text{ and } j.
Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.
In linear algebra, a real symmetric matrix represents a self-adjoint operator expressed in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.
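The defining check is mechanical. A minimal sketch in plain Python (matrices as lists of rows; the helper names `transpose` and `is_symmetric` are our own, not from any particular library):

```python
def transpose(m):
    """Transpose of a square matrix given as a list of rows."""
    n = len(m)
    return [[m[j][i] for j in range(n)] for i in range(n)]

def is_symmetric(m):
    """True exactly when m equals its transpose, i.e. m[i][j] == m[j][i]."""
    return m == transpose(m)

A = [[1, 7, 3],
     [7, 4, 5],
     [3, 5, 2]]
print(is_symmetric(A))                    # True
print(is_symmetric([[1, 2], [3, 4]]))     # False
```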
Example
The following 3 \times 3 matrix is symmetric:
:A = \begin{bmatrix} 1 & 7 & 3 \\ 7 & 4 & 5 \\ 3 & 5 & 2 \end{bmatrix},
since A = A^\mathsf{T}.
Properties
Basic properties
* The sum and difference of two symmetric matrices is symmetric.
* This is not always true for the product: given symmetric matrices A and B, the product AB is symmetric if and only if A and B commute, i.e., if AB = BA.
* For any integer n, A^n is symmetric if A is symmetric.
* If A^{-1} exists, it is symmetric if and only if A is symmetric.
* The rank of a symmetric matrix A is equal to the number of non-zero eigenvalues of A.
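The product caveat can be seen on a small example. This sketch (our own list-based helpers, not a library API) contrasts a non-commuting pair, whose product is not symmetric, with a commuting pair, whose product is:

```python
def matmul(a, b):
    """Product of two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_symmetric(m):
    n = len(m)
    return all(m[i][j] == m[j][i] for i in range(n) for j in range(n))

A = [[1, 2], [2, 3]]
B = [[0, 1], [1, 0]]   # symmetric, but A and B do not commute
C = [[2, 0], [0, 2]]   # scalar matrix: commutes with every matrix

print(is_symmetric(matmul(A, B)))   # False: AB = [[2, 1], [3, 2]]
print(is_symmetric(matmul(A, C)))   # True:  AC = [[2, 4], [4, 6]]
```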
Decomposition into symmetric and skew-symmetric
Any square matrix can uniquely be written as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition. Let \mathrm{Mat}_n denote the space of n \times n matrices. If \mathrm{Sym}_n denotes the space of n \times n symmetric matrices and \mathrm{Skew}_n the space of n \times n skew-symmetric matrices, then \mathrm{Mat}_n = \mathrm{Sym}_n + \mathrm{Skew}_n and \mathrm{Sym}_n \cap \mathrm{Skew}_n = \{0\}, i.e.
:\mathrm{Mat}_n = \mathrm{Sym}_n \oplus \mathrm{Skew}_n,
where \oplus denotes the direct sum. Let X \in \mathrm{Mat}_n, then
:X = \tfrac{1}{2}\left(X + X^\mathsf{T}\right) + \tfrac{1}{2}\left(X - X^\mathsf{T}\right).
Notice that \tfrac{1}{2}\left(X + X^\mathsf{T}\right) \in \mathrm{Sym}_n and \tfrac{1}{2}\left(X - X^\mathsf{T}\right) \in \mathrm{Skew}_n. This is true for every square matrix X with entries from any field whose characteristic is different from 2.
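The two formulas above translate directly into code. A minimal sketch, with our own helper names and plain lists as matrices:

```python
def transpose(m):
    n = len(m)
    return [[m[j][i] for j in range(n)] for i in range(n)]

def toeplitz_split(x):
    """Return (symmetric part, skew-symmetric part) of a square matrix."""
    n = len(x)
    t = transpose(x)
    sym = [[(x[i][j] + t[i][j]) / 2 for j in range(n)] for i in range(n)]
    skew = [[(x[i][j] - t[i][j]) / 2 for j in range(n)] for i in range(n)]
    return sym, skew

X = [[1, 2],
     [4, 3]]
S, K = toeplitz_split(X)
print(S)   # [[1.0, 3.0], [3.0, 3.0]]  -- symmetric
print(K)   # [[0.0, -1.0], [1.0, 0.0]] -- skew-symmetric; S + K = X
```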
A symmetric n \times n matrix is determined by \tfrac{1}{2}n(n+1) scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by \tfrac{1}{2}n(n-1) scalars (the number of entries above the main diagonal).
Matrix congruent to a symmetric matrix
Any matrix congruent to a symmetric matrix is again symmetric: if X is a symmetric matrix, then so is A X A^\mathsf{T} for any matrix A.
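A quick numeric check of the congruence property, using our own list-based helpers and an arbitrarily chosen non-symmetric A:

```python
def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(m):
    n = len(m)
    return [[m[j][i] for j in range(n)] for i in range(n)]

X = [[2, 1], [1, 3]]   # symmetric
A = [[1, 4], [0, 2]]   # arbitrary, not symmetric
Y = matmul(matmul(A, X), transpose(A))
print(Y)   # [[58, 26], [26, 12]] -- symmetric again
```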
Symmetry implies normality
A (real-valued) symmetric matrix is necessarily a normal matrix.
Real symmetric matrices
Denote by \langle \cdot, \cdot \rangle the standard inner product on \mathbb{R}^n. The real n \times n matrix A is symmetric if and only if
:\langle Ax, y \rangle = \langle x, Ay \rangle \quad \text{for all } x, y \in \mathbb{R}^n.
Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator A and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry, for each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.
The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix A there exists a real orthogonal matrix Q such that D = Q^\mathsf{T} A Q is a diagonal matrix. Every real symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.
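On a 2 \times 2 example the spectral theorem can be checked by hand: the symmetric matrix A = [[2, 1], [1, 2]] has eigenvalues 3 and 1 with orthonormal eigenvectors (1, 1)/\sqrt{2} and (1, -1)/\sqrt{2}. A sketch with our own list-based helpers:

```python
import math

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(m):
    n = len(m)
    return [[m[j][i] for j in range(n)] for i in range(n)]

s = 1 / math.sqrt(2)
Q = [[s, s],
     [s, -s]]          # orthogonal: columns are unit eigenvectors of A
A = [[2, 1],
     [1, 2]]
D = matmul(matmul(transpose(Q), A), Q)
# D equals diag(3, 1) up to floating-point error
print([[round(v, 9) for v in row] for row in D])
```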
If A and B are n \times n real symmetric matrices that commute, then they can be simultaneously diagonalized by an orthogonal matrix: there exists a basis of \mathbb{R}^n such that every element of the basis is an eigenvector for both A and B.

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the diagonal matrix D (above), and therefore D is uniquely determined by A up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.
Complex symmetric matrices
A complex symmetric matrix can be 'diagonalized' using a unitary matrix: thus if A is a complex symmetric matrix, there is a unitary matrix U such that U A U^\mathsf{T} is a real diagonal matrix with non-negative entries. This result is referred to as the Autonne–Takagi factorization. It was originally proved by Léon Autonne (1915) and Teiji Takagi (1925) and rediscovered with different proofs by several other mathematicians. In fact, the matrix B = A^\dagger A is Hermitian and positive semi-definite, so there is a unitary matrix V such that V^\dagger B V is diagonal with non-negative real entries. Thus C = V^\mathsf{T} A V is complex symmetric with C^\dagger C real. Writing C = X + iY with X and Y real symmetric matrices, C^\dagger C = X^2 + Y^2 + i(XY - YX). Thus XY = YX. Since X and Y commute, there is a real orthogonal matrix W such that both W X W^\mathsf{T} and W Y W^\mathsf{T} are diagonal. Setting U = W V^\mathsf{T} (a unitary matrix), the matrix U A U^\mathsf{T} is complex diagonal. Pre-multiplying U by a suitable diagonal unitary matrix (which preserves unitarity of U), the diagonal entries of U A U^\mathsf{T} can be made to be real and non-negative as desired. To construct this matrix, we express the diagonal matrix as
:U A U^\mathsf{T} = \operatorname{diag}(r_1 e^{i\theta_1}, r_2 e^{i\theta_2}, \dots, r_n e^{i\theta_n}).
The matrix we seek is simply given by D = \operatorname{diag}(e^{-i\theta_1/2}, e^{-i\theta_2/2}, \dots, e^{-i\theta_n/2}). Clearly D U A U^\mathsf{T} D = \operatorname{diag}(r_1, r_2, \dots, r_n) as desired, so we make the modification U' = D U. Since their squares are the eigenvalues of A^\dagger A, the diagonal entries coincide with the singular values of A. (Note, about the eigen-decomposition of a complex symmetric matrix A, the Jordan normal form of A may not be diagonal, therefore A may not be diagonalized by any similarity transformation.)
Decomposition
Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.

Cholesky decomposition states that every real positive-definite symmetric matrix A is a product of a lower-triangular matrix L and its transpose,
:A = L L^\mathsf{T}.
If the matrix is symmetric indefinite, it may be still decomposed as
:P A P^\mathsf{T} = L D L^\mathsf{T},
where P is a permutation matrix (arising from the need to pivot), L a lower unit triangular matrix, and D is a direct sum of symmetric 1 \times 1 and 2 \times 2 blocks, which is called Bunch–Kaufman decomposition.
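The Cholesky factorization above admits a very short implementation. A minimal unpivoted sketch (assumes the input is symmetric positive-definite; no error handling, names our own):

```python
import math

def cholesky(a):
    """Unpivoted Cholesky factor L with a = L L^T (a must be SPD)."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)    # diagonal entry
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]   # strictly lower part
    return L

A = [[4, 2],
     [2, 3]]
L = cholesky(A)
print(L)   # [[2.0, 0.0], [1.0, 1.4142135623730951]], i.e. L[1][1] = sqrt(2)
```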
A general (complex) symmetric matrix may be defective and thus not be diagonalizable. If A is diagonalizable it may be decomposed as
:A = Q \Lambda Q^\mathsf{T},
where Q is an orthogonal matrix (Q Q^\mathsf{T} = I), and \Lambda is a diagonal matrix of the eigenvalues of A. In the special case that A is real symmetric, then Q and \Lambda are also real. To see orthogonality, suppose x and y are eigenvectors corresponding to distinct eigenvalues \lambda_1, \lambda_2. Then
:\lambda_1 \langle x, y \rangle = \langle Ax, y \rangle = \langle x, Ay \rangle = \lambda_2 \langle x, y \rangle.
Since \lambda_1 and \lambda_2 are distinct, we have \langle x, y \rangle = 0.
Hessian
Symmetric n \times n matrices of real functions appear as the Hessians of twice differentiable functions of n real variables (the continuity of the second derivative is not needed, despite common belief to the opposite).

Every quadratic form q on \mathbb{R}^n can be uniquely written in the form q(x) = x^\mathsf{T} A x with a symmetric n \times n matrix A. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of \mathbb{R}^n, "looks like"
:q(x_1, \ldots, x_n) = \sum_{i=1}^{n} \lambda_i x_i^2
with real numbers \lambda_i. This considerably simplifies the study of quadratic forms, as well as the study of the level sets \{x : q(x) = 1\}, which are generalizations of conic sections.

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.
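Evaluating a quadratic form from its symmetric matrix is a direct double sum. A small sketch with our own helper name:

```python
def quadratic_form(a, x):
    """Evaluate x^T A x for a square matrix a and vector x."""
    n = len(x)
    return sum(a[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

A = [[2, 1], [1, 3]]   # represents q(x, y) = 2x^2 + 2xy + 3y^2
print(quadratic_form(A, [1, 1]))   # 7  (= 2 + 2 + 3)
print(quadratic_form(A, [1, 0]))   # 2
```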
Symmetrizable matrix
An n \times n matrix A is said to be symmetrizable if there exists an invertible diagonal matrix D and symmetric matrix S such that
:A = DS.
The transpose of a symmetrizable matrix is symmetrizable, since A^\mathsf{T} = (DS)^\mathsf{T} = SD = D^{-1}(DSD) and DSD is symmetric. A matrix A = (a_{ij}) is symmetrizable if and only if the following conditions are met:
# a_{ij} = 0 implies a_{ji} = 0 for all 1 \le i \le j \le n.
# a_{i_1 i_2} a_{i_2 i_3} \cdots a_{i_k i_1} = a_{i_2 i_1} a_{i_3 i_2} \cdots a_{i_1 i_k} for any finite sequence \left(i_1, i_2, \dots, i_k\right).
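As a concrete illustration, the product of an invertible diagonal D and a symmetric S need not itself be symmetric, yet it is symmetrizable by construction. A sketch with our own helpers and an example pair chosen for illustration:

```python
def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

D = [[2, 0], [0, 1]]   # invertible diagonal
S = [[1, 3], [3, 5]]   # symmetric
A = matmul(D, S)
print(A)   # [[2, 6], [3, 5]]: not symmetric, but symmetrizable since A = DS
```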
See also
Other types of symmetry or pattern in square matrices have special names; see for example:
* Skew-symmetric matrix (also called ''antisymmetric'' or ''antimetric'')
* Centrosymmetric matrix
* Circulant matrix
* Covariance matrix
* Coxeter matrix
* GCD matrix
* Hankel matrix
* Hilbert matrix
* Persymmetric matrix
* Sylvester's law of inertia
* Toeplitz matrix
* Transpositions matrix
See also symmetry in mathematics.
External links
* A brief introduction and proof of eigenvalue properties of the real symmetric matrix
* How to implement a Symmetric Matrix in C++