In the mathematical discipline of linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are many different matrix decompositions; each finds use among a particular class of problems.
Example
In numerical analysis, different decompositions are used to implement efficient matrix algorithms.
For example, when solving a system of linear equations ''A''x = b, the matrix ''A'' can be decomposed via the LU decomposition. The LU decomposition factorizes a matrix into a lower triangular matrix ''L'' and an upper triangular matrix ''U''. The systems ''L''(''U''x) = b and ''U''x = ''L''<sup>−1</sup>b require fewer additions and multiplications to solve, compared with the original system ''A''x = b, though one might require significantly more digits in inexact arithmetic such as floating point.
Similarly, the QR decomposition expresses ''A'' as ''QR'' with ''Q'' an orthogonal matrix and ''R'' an upper triangular matrix. The system ''Q''(''R''x) = b is solved by ''R''x = ''Q''<sup>T</sup>b = c, and the system ''R''x = c is solved by back substitution. The number of additions and multiplications required is about twice that of using the LU solver, but no more digits are required in inexact arithmetic because the QR decomposition is numerically stable.
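A minimal sketch of the two approaches, using NumPy and SciPy on an arbitrary, illustrative 3×3 system:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import lu_factor, lu_solve, solve_triangular

# An arbitrary small nonsingular system Ax = b.
A = np.array([[3., 2., -1.],
              [2., -2., 4.],
              [-1., 0.5, -1.]])
b = np.array([1., -2., 0.])

# LU route: factor once, then solve by forward/back substitution.
lu, piv = lu_factor(A)           # compact LU factorization with pivoting
x_lu = lu_solve((lu, piv), b)

# QR route: A = QR, so Rx = Q^T b is solved by back substitution.
Q, R = np.linalg.qr(A)
x_qr = solve_triangular(R, Q.T @ b)

assert np.allclose(A @ x_lu, b) and np.allclose(A @ x_qr, b)
</syntaxhighlight>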
Decompositions related to solving systems of linear equations
LU decomposition
*Traditionally applicable to: square matrix ''A'', although it can also be applied to rectangular matrices.
[If a non-square matrix is used, however, then the matrix ''U'' will also have the same rectangular shape as the original matrix ''A''. And so, calling the matrix ''U'' upper triangular would be incorrect as the correct term would be that ''U'' is the 'row echelon form' of ''A''. Other than this, there are no differences in LU factorization for square and non-square matrices.]
*Decomposition: ''A'' = ''LU'', where ''L'' is lower triangular and ''U'' is upper triangular.
*Related: the ''LDU'' decomposition is ''A'' = ''LDU'', where ''L'' is lower triangular with ones on the diagonal, ''U'' is upper triangular with ones on the diagonal, and ''D'' is a diagonal matrix.
*Related: the ''LUP'' decomposition is ''PA'' = ''LU'', where ''L'' is lower triangular, ''U'' is upper triangular, and ''P'' is a permutation matrix.
*Existence: An LUP decomposition exists for any square matrix ''A''. When ''P'' is an identity matrix, the LUP decomposition reduces to the LU decomposition.
*Comments: The LUP and LU decompositions are useful in solving an ''n''-by-''n'' system of linear equations ''A''x = b. These decompositions summarize the process of Gaussian elimination in matrix form. Matrix ''P'' represents any row interchanges carried out in the process of Gaussian elimination. If Gaussian elimination produces the row echelon form without requiring any row interchanges, then ''P'' = ''I'', so an LU decomposition exists.
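A minimal sketch of the LUP decomposition using SciPy, on an arbitrary matrix whose leading entry is zero so that pivoting is required; note that <code>scipy.linalg.lu</code> returns the factors in the form ''A'' = ''PLU'':

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import lu

A = np.array([[0., 5., 22.],
              [4., 2., 1.],
              [2., 7., 9.]])

# SciPy returns P, L, U with A = P @ L @ U, i.e. P.T @ A = L @ U,
# so P.T plays the role of the permutation in PA = LU.
P, L, U = lu(A)

assert np.allclose(A, P @ L @ U)
assert np.allclose(np.tril(L), L) and np.allclose(np.triu(U), U)
</syntaxhighlight>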
LU reduction
Block LU decomposition
Rank factorization
*Applicable to: ''m''-by-''n'' matrix ''A'' of rank ''r''
*Decomposition: ''A'' = ''CF'', where ''C'' is an ''m''-by-''r'' full column rank matrix and ''F'' is an ''r''-by-''n'' full row rank matrix.
*Comment: The rank factorization can be used to compute the Moore–Penrose pseudoinverse of ''A'', which one can apply to obtain all solutions of the linear system ''A''x = b.
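A minimal sketch, assuming one builds a rank factorization from the reduced SVD (one of several possible constructions) and then forms the pseudoinverse from it; the matrix and the tolerance are illustrative:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])        # rank 2

# Build A = CF from the reduced SVD, keeping only the nonzero singular values.
U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))
C = U[:, :r] * s[:r]                # m-by-r, full column rank
F = Vh[:r, :]                       # r-by-n, full row rank
assert np.allclose(A, C @ F)

# Moore-Penrose pseudoinverse from the rank factorization:
# A+ = F* (F F*)^{-1} (C* C)^{-1} C*
A_pinv = F.T @ np.linalg.inv(F @ F.T) @ np.linalg.inv(C.T @ C) @ C.T
assert np.allclose(A_pinv, np.linalg.pinv(A))
</syntaxhighlight>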
Cholesky decomposition
*Applicable to: square, Hermitian, positive definite matrix
*Decomposition: ''A'' = ''C''<sup>*</sup>''C'', where ''C'' is upper triangular with real positive diagonal entries
*Comment: if the matrix ''A'' is Hermitian and positive semi-definite, then it has a decomposition of the form ''A'' = ''C''<sup>*</sup>''C'' if the diagonal entries of ''C'' are allowed to be zero
*Uniqueness: for positive definite matrices the Cholesky decomposition is unique. However, it is not unique in the positive semi-definite case.
*Comment: if ''A'' is real and symmetric, ''C'' has all real elements
*Comment: An alternative is the
LDL decomposition, which can avoid extracting square roots.
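A minimal sketch of the Cholesky factorization with NumPy on an illustrative positive definite matrix; note that NumPy returns the lower triangular factor ''L'' with ''A'' = ''LL''<sup>*</sup>, i.e. ''C'' = ''L''<sup>*</sup> in the notation above:

<syntaxhighlight lang="python">
import numpy as np

# A Hermitian positive definite matrix (here real symmetric).
A = np.array([[4., 2., 2.],
              [2., 3., 1.],
              [2., 1., 5.]])

L = np.linalg.cholesky(A)          # lower triangular factor
assert np.allclose(A, L @ L.T)
</syntaxhighlight>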
QR decomposition
*Applicable to: ''m''-by-''n'' matrix ''A'' with linearly independent columns
*Decomposition: ''A'' = ''QR'', where ''Q'' is a unitary matrix of size ''m''-by-''m'', and ''R'' is an upper triangular matrix of size ''m''-by-''n''
*Uniqueness: In general it is not unique, but if ''A'' is of full rank, then there exists a single ''R'' that has all positive diagonal elements. If ''A'' is square, ''Q'' is also unique.
*Comment: The QR decomposition provides an effective way to solve the system of equations ''A''x = b. The fact that ''Q'' is orthogonal means that ''Q''<sup>T</sup>''Q'' = ''I'', so that ''A''x = b is equivalent to ''R''x = ''Q''<sup>T</sup>b, which is very easy to solve since ''R'' is triangular.
RRQR factorization
Interpolative decomposition
Decompositions based on eigenvalues and related concepts
Eigendecomposition
*Also called ''
spectral decomposition''.
*Applicable to: square matrix ''A'' with linearly independent eigenvectors (not necessarily distinct eigenvalues).
*Decomposition: ''A'' = ''VDV''<sup>−1</sup>, where ''D'' is a diagonal matrix formed from the eigenvalues of ''A'', and the columns of ''V'' are the corresponding eigenvectors of ''A''.
*Existence: An ''n''-by-''n'' matrix ''A'' always has ''n'' (complex) eigenvalues, which can be ordered (in more than one way) to form an ''n''-by-''n'' diagonal matrix ''D'' and a corresponding matrix of nonzero columns ''V'' that satisfies the eigenvalue equation ''AV'' = ''VD''. ''V'' is invertible if and only if the ''n'' eigenvectors are linearly independent (that is, each eigenvalue has geometric multiplicity equal to its algebraic multiplicity). A sufficient (but not necessary) condition for this to happen is that all the eigenvalues are different (in this case geometric and algebraic multiplicity are equal to 1).
*Comment: One can always normalize the eigenvectors to have length one (see the definition of the eigenvalue equation)
*Comment: Every normal matrix ''A'' (that is, a matrix for which ''AA''<sup>*</sup> = ''A''<sup>*</sup>''A'', where ''A''<sup>*</sup> is the conjugate transpose of ''A'') can be eigendecomposed. For a normal matrix ''A'' (and only for a normal matrix), the eigenvectors can also be made orthonormal (''VV''<sup>*</sup> = ''I'') and the eigendecomposition reads as ''A'' = ''VDV''<sup>*</sup>. In particular all unitary, Hermitian, or skew-Hermitian (in the real-valued case, all orthogonal, symmetric, or skew-symmetric, respectively) matrices are normal and therefore possess this property.
*Comment: For any real symmetric matrix ''A'', the eigendecomposition always exists and can be written as ''A'' = ''VDV''<sup>T</sup>, where both ''D'' and ''V'' are real-valued.
*Comment: The eigendecomposition is useful for understanding the solution of a system of linear ordinary differential equations or linear difference equations. For example, the difference equation ''x''<sub>''t''+1</sub> = ''Ax''<sub>''t''</sub> starting from the initial condition ''x''<sub>0</sub> = ''c'' is solved by ''x''<sub>''t''</sub> = ''A''<sup>''t''</sup>''c'', which is equivalent to ''x''<sub>''t''</sub> = ''VD''<sup>''t''</sup>''V''<sup>−1</sup>''c'', where ''V'' and ''D'' are the matrices formed from the eigenvectors and eigenvalues of ''A''. Since ''D'' is diagonal, raising it to the power ''t'' just involves raising each element on the diagonal to the power ''t''. This is much easier to do and understand than raising ''A'' to the power ''t'', since ''A'' is usually not diagonal.
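A minimal sketch of the eigendecomposition and of the difference-equation comment above, using NumPy on an illustrative 2×2 matrix with distinct eigenvalues:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Eigendecomposition A = V D V^{-1}.
eigvals, V = np.linalg.eig(A)
assert np.allclose(A, V @ np.diag(eigvals) @ np.linalg.inv(V))

# Difference equation x_{t+1} = A x_t: x_t = A^t x_0 = V D^t V^{-1} x_0,
# where D^t only requires raising the diagonal entries to the power t.
x0 = np.array([1.0, 0.0])
t = 50
x_t = V @ np.diag(eigvals ** t) @ np.linalg.inv(V) @ x0
assert np.allclose(x_t, np.linalg.matrix_power(A, t) @ x0)
</syntaxhighlight>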
Jordan decomposition
The Jordan normal form and the Jordan–Chevalley decomposition
*Applicable to: square matrix ''A''
*Comment: the Jordan normal form generalizes the eigendecomposition to cases where there are repeated eigenvalues and ''A'' cannot be diagonalized; the Jordan–Chevalley decomposition does this without choosing a basis.
Schur decomposition
*Applicable to: square matrix ''A''
*Decomposition (complex version): ''A'' = ''UTU''<sup>*</sup>, where ''U'' is a unitary matrix, ''U''<sup>*</sup> is the conjugate transpose of ''U'', and ''T'' is an upper triangular matrix called the complex Schur form which has the eigenvalues of ''A'' along its diagonal.
*Comment: if ''A'' is a normal matrix, then ''T'' is diagonal and the Schur decomposition coincides with the spectral decomposition.
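A minimal sketch of the complex Schur decomposition using SciPy on an arbitrary real matrix:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import schur

A = np.array([[0., 2., 2.],
              [0., 1., 2.],
              [1., 0., 1.]])

# Complex Schur form: A = U T U*, with T upper triangular and the
# eigenvalues of A on the diagonal of T.
T, U = schur(A, output='complex')
assert np.allclose(A, U @ T @ U.conj().T)
assert np.allclose(np.tril(T, -1), 0)      # T is upper triangular
</syntaxhighlight>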
Real Schur decomposition
*Applicable to: square matrix ''A''
*Decomposition: This is a version of Schur decomposition where ''V'' and ''S'' only contain real numbers. One can always write ''A'' = ''VSV''<sup>T</sup>, where ''V'' is a real orthogonal matrix, ''V''<sup>T</sup> is the transpose of ''V'', and ''S'' is a block upper triangular matrix called the real Schur form. The blocks on the diagonal of ''S'' are of size 1×1 (in which case they represent real eigenvalues) or 2×2 (in which case they are derived from complex conjugate eigenvalue pairs).
QZ decomposition
*Also called: ''generalized Schur decomposition''
*Applicable to: square matrices ''A'' and ''B''
*Comment: there are two versions of this decomposition: complex and real.
*Decomposition (complex version): ''A'' = ''QSZ''<sup>*</sup> and ''B'' = ''QTZ''<sup>*</sup>, where ''Q'' and ''Z'' are unitary matrices, the * superscript represents conjugate transpose, and ''S'' and ''T'' are upper triangular matrices.
*Comment: in the complex QZ decomposition, the ratios of the diagonal elements of ''S'' to the corresponding diagonal elements of ''T'', ''λ''<sub>''i''</sub> = ''S''<sub>''ii''</sub>/''T''<sub>''ii''</sub>, are the generalized eigenvalues that solve the generalized eigenvalue problem ''A''v = ''λB''v (where ''λ'' is an unknown scalar and v is an unknown nonzero vector).
*Decomposition (real version): ''A'' = ''QSZ''<sup>T</sup> and ''B'' = ''QTZ''<sup>T</sup>, where ''A'', ''B'', ''Q'', ''Z'', ''S'', and ''T'' are matrices containing real numbers only. In this case ''Q'' and ''Z'' are orthogonal matrices, the ''T'' superscript represents transposition, and ''S'' and ''T'' are block upper triangular matrices. The blocks on the diagonal of ''S'' and ''T'' are of size 1×1 or 2×2.
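A minimal sketch of the complex QZ decomposition using SciPy on an arbitrary pair of matrices; the example matrices and the residual tolerance are illustrative:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import qz

A = np.array([[1., 3., 0.],
              [2., 1., 4.],
              [0., 1., 1.]])
B = np.array([[2., 0., 1.],
              [1., 3., 0.],
              [0., 1., 2.]])

# Complex QZ (generalized Schur): A = Q S Z*, B = Q T Z*, both S and T
# upper triangular; the ratios of their diagonals are the generalized
# eigenvalues of the pair (A, B).
S, T, Q, Z = qz(A, B, output='complex')
assert np.allclose(A, Q @ S @ Z.conj().T)
assert np.allclose(B, Q @ T @ Z.conj().T)

lam = np.diag(S) / np.diag(T)          # generalized eigenvalues
assert all(abs(np.linalg.det(A - l * B)) < 1e-6 for l in lam)
</syntaxhighlight>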
Takagi's factorization
*Applicable to: square, complex, symmetric matrix ''A''.
*Decomposition: ''A'' = ''VDV''<sup>T</sup>, where ''D'' is a real nonnegative diagonal matrix, and ''V'' is unitary. ''V''<sup>T</sup> denotes the matrix transpose of ''V''.
*Comment: The diagonal elements of ''D'' are the nonnegative square roots of the eigenvalues of ''AA''<sup>*</sup>.
*Comment: ''V'' may be complex even if ''A'' is real.
*Comment: This is not a special case of the eigendecomposition (see above), which uses ''V''<sup>−1</sup> instead of ''V''<sup>T</sup>. Moreover, if ''A'' is not real, it is not Hermitian and the form using ''V''<sup>*</sup> also does not apply.
Singular value decomposition
*Applicable to: ''m''-by-''n'' matrix ''A''.
*Decomposition: ''A'' = ''UDV''<sup>*</sup>, where ''D'' is a nonnegative diagonal matrix, and ''U'' and ''V'' satisfy ''U''<sup>*</sup>''U'' = ''I'', ''V''<sup>*</sup>''V'' = ''I''. Here ''V''<sup>*</sup> is the conjugate transpose of ''V'' (or simply the transpose, if ''V'' contains real numbers only), and ''I'' denotes the identity matrix (of some dimension).
*Comment: The diagonal elements of ''D'' are called the
singular values of ''A''.
*Comment: Like the eigendecomposition above, the singular value decomposition involves finding basis directions along which matrix multiplication is equivalent to scalar multiplication, but it has greater generality since the matrix under consideration need not be square.
*Uniqueness: the singular values of ''A'' are always uniquely determined. ''U'' and ''V'' need not be unique in general.
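A minimal sketch of the (reduced) singular value decomposition with NumPy on an illustrative 2×3 matrix:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[1., 0., 2.],
              [0., 3., 1.]])          # 2-by-3; the matrix need not be square

# Reduced SVD: A = U @ diag(s) @ Vh with orthonormal columns/rows.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vh)
assert np.allclose(U.T @ U, np.eye(2)) and np.allclose(Vh @ Vh.T, np.eye(2))
</syntaxhighlight>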
Scale-invariant decompositions
Refers to variants of existing matrix decompositions, such as the SVD, that are invariant with respect to diagonal scaling.
*Applicable to: ''m''-by-''n'' matrix ''A''.
*Unit-Scale-Invariant Singular-Value Decomposition: ''A'' = ''DUSV''<sup>*</sup>''E'', where ''S'' is a unique nonnegative diagonal matrix of scale-invariant singular values, ''U'' and ''V'' are unitary matrices, ''V''<sup>*</sup> is the conjugate transpose of ''V'', and ''D'' and ''E'' are positive diagonal matrices.
*Comment: Is analogous to the SVD except that the diagonal elements of ''S'' are invariant with respect to left and/or right multiplication of ''A'' by arbitrary nonsingular diagonal matrices, as opposed to the standard SVD for which the singular values are invariant with respect to left and/or right multiplication of ''A'' by arbitrary unitary matrices.
*Comment: Is an alternative to the standard SVD when invariance is required with respect to diagonal rather than unitary transformations of ''A''.
*Uniqueness: The scale-invariant singular values of ''A'' (given by the diagonal elements of ''S'') are always uniquely determined. Diagonal matrices ''D'' and ''E'', and unitary ''U'' and ''V'', are not necessarily unique in general.
*Comment: ''U'' and ''V'' matrices are not the same as those from the SVD.
Analogous scale-invariant decompositions can be derived from other matrix decompositions; for example, to obtain scale-invariant eigenvalues.
Hessenberg decomposition
*Applicable to: square matrix ''A''.
*Decomposition: ''A'' = ''PHP''<sup>*</sup>, where ''H'' is the Hessenberg matrix and ''P'' is a unitary matrix.
*Comment: often the first step in the Schur decomposition.
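A minimal sketch of the Hessenberg decomposition using SciPy on a random matrix:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import hessenberg

A = np.random.default_rng(0).normal(size=(5, 5))

# A = Q H Q*, with H upper Hessenberg (zero below the first subdiagonal).
H, Q = hessenberg(A, calc_q=True)
assert np.allclose(A, Q @ H @ Q.conj().T)
assert np.allclose(np.tril(H, -2), 0)
</syntaxhighlight>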
Complete orthogonal decomposition
*Also known as: ''UTV decomposition'', ''ULV decomposition'', ''URV decomposition''.
*Applicable to: ''m''-by-''n'' matrix ''A''.
*Decomposition: ''A'' = ''UTV''<sup>*</sup>, where ''T'' is a triangular matrix, and ''U'' and ''V'' are unitary matrices.
*Comment: Similar to the singular value decomposition and to the Schur decomposition.
Other decompositions
Polar decomposition
*Applicable to: any square complex matrix ''A''.
*Decomposition: ''A'' = ''UP'' (right polar decomposition) or ''A'' = ''P''′''U'' (left polar decomposition), where ''U'' is a unitary matrix and ''P'' and ''P''′ are positive semidefinite Hermitian matrices.
*Uniqueness: ''P'' is always unique and equal to (''A''<sup>*</sup>''A'')<sup>1/2</sup> (which is always Hermitian and positive semidefinite). If ''A'' is invertible, then ''U'' is unique.
*Comment: Since any Hermitian matrix admits a spectral decomposition with a unitary matrix, ''P'' can be written as ''P'' = ''VDV''<sup>*</sup>. Since ''P'' is positive semidefinite, all elements in ''D'' are non-negative. Since the product of two unitary matrices is unitary, taking ''W'' = ''UV'' one can write ''A'' = ''U''(''VDV''<sup>*</sup>) = (''UV'')''DV''<sup>*</sup> = ''WDV''<sup>*</sup>, which is the singular value decomposition. Hence, the existence of the polar decomposition is equivalent to the existence of the singular value decomposition.
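A minimal sketch of the right polar decomposition using SciPy on an illustrative matrix:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import polar

A = np.array([[1., 2.],
              [3., 4.]])

# Right polar decomposition A = UP with U unitary and P Hermitian
# positive semidefinite (side='right' is the SciPy default).
U, P = polar(A)
assert np.allclose(A, U @ P)
assert np.allclose(U @ U.conj().T, np.eye(2))
assert np.allclose(P, P.conj().T) and np.all(np.linalg.eigvalsh(P) >= -1e-12)
</syntaxhighlight>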
Algebraic polar decomposition
*Applicable to: square, complex, non-singular matrix ''A''.
*Decomposition: ''A'' = ''QS'', where ''Q'' is a complex orthogonal matrix and ''S'' is a complex symmetric matrix.
*Uniqueness: If ''A''<sup>T</sup>''A'' has no negative real eigenvalues, then the decomposition is unique.
*Comment: The existence of this decomposition is equivalent to ''AA''<sup>T</sup> being similar to ''A''<sup>T</sup>''A''.
*Comment: A variant of this decomposition is ''A'' = ''RC'', where ''R'' is a real matrix and ''C'' is a circular matrix.
Mostow's decomposition
* Applicable to: square, complex, non-singular matrix ''A''.
* Decomposition: ''A'' = ''U''e<sup>i''M''</sup>e<sup>''S''</sup>, where ''U'' is unitary, ''M'' is real anti-symmetric and ''S'' is real symmetric.
* Comment: The matrix ''A'' can also be decomposed as ''A'' = ''U''<sub>2</sub>e<sup>''S''<sub>2</sub></sup>e<sup>i''M''<sub>2</sub></sup>, where ''U''<sub>2</sub> is unitary, ''M''<sub>2</sub> is real anti-symmetric and ''S''<sub>2</sub> is real symmetric.
Sinkhorn normal form
*Applicable to: square real matrix ''A'' with strictly positive elements.
*Decomposition: ''A'' = ''D''<sub>1</sub>''SD''<sub>2</sub>, where ''S'' is doubly stochastic and ''D''<sub>1</sub> and ''D''<sub>2</sub> are real diagonal matrices with strictly positive elements.
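A minimal sketch of the classical Sinkhorn–Knopp iteration, which alternately rescales rows and columns; the function name <code>sinkhorn_normal_form</code>, the tolerance, and the example matrix are illustrative choices, not a standard API:

<syntaxhighlight lang="python">
import numpy as np

def sinkhorn_normal_form(A, tol=1e-12, max_iter=10_000):
    """Return (D1, S, D2) with A = D1 @ S @ D2 and S doubly stochastic,
    for a square matrix A with strictly positive entries."""
    n = A.shape[0]
    c = np.ones(n)
    for _ in range(max_iter):
        r = 1.0 / (A @ c)              # rescale rows ...
        c = 1.0 / (A.T @ r)            # ... then columns
        S = np.diag(r) @ A @ np.diag(c)
        if (np.allclose(S.sum(axis=0), 1, rtol=0, atol=tol)
                and np.allclose(S.sum(axis=1), 1, rtol=0, atol=tol)):
            break
    # S = diag(r) A diag(c), so D1 = diag(1/r) and D2 = diag(1/c).
    return np.diag(1.0 / r), S, np.diag(1.0 / c)

A = np.array([[1., 2.],
              [3., 4.]])
D1, S, D2 = sinkhorn_normal_form(A)
assert np.allclose(A, D1 @ S @ D2)
</syntaxhighlight>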
Sectoral decomposition
*Applicable to: square, complex matrix ''A'' with numerical range contained in the sector ''S''<sub>''α''</sub> = {''r''e<sup>i''θ''</sup> ∈ ℂ : ''r'' > 0, |''θ''| ≤ ''α'' < π/2}.
*Decomposition: ''A'' = ''CZC''<sup>*</sup>, where ''C'' is an invertible complex matrix and ''Z'' = diag(e<sup>i''θ''<sub>1</sub></sup>, ..., e<sup>i''θ''<sub>''n''</sub></sup>) with all |''θ''<sub>''j''</sub>| ≤ ''α''.
Williamson's normal form
* Applicable to: square, positive-definite real matrix ''A'' with order 2''n''×2''n''.
* Decomposition: ''A'' = ''V''<sup>T</sup>(''D'' ⊕ ''D'')''V'', where ''V'' is a symplectic matrix and ''D'' is a nonnegative ''n''-by-''n'' diagonal matrix.
Matrix square root
* Decomposition: ''A'' = ''BB'', not unique in general.
* In the case of positive semidefinite ''A'', there is a unique positive semidefinite ''B'' such that ''A'' = ''B''<sup>*</sup>''B'' = ''BB''.
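A minimal sketch computing the principal matrix square root with SciPy on an illustrative positive definite matrix:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import sqrtm

A = np.array([[4., 1.],
              [1., 3.]])            # symmetric positive definite

# The principal square root B satisfies A = B @ B; for positive
# semidefinite A it is the unique positive semidefinite square root.
B = sqrtm(A)
assert np.allclose(A, B @ B)
assert np.allclose(B, B.conj().T)
</syntaxhighlight>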
Generalizations
There exist analogues of the SVD, QR, LU and Cholesky factorizations for quasimatrices and cmatrices or continuous matrices. A ‘quasimatrix’ is, like a matrix, a rectangular scheme whose elements are indexed, but one discrete index is replaced by a continuous index. Likewise, a ‘cmatrix’ is continuous in both indices. As an example of a cmatrix, one can think of the kernel of an integral operator.
These factorizations are based on early work by , and . For an account, and a translation to English of the seminal papers, see .
See also
* Matrix splitting
* Non-negative matrix factorization
* Principal component analysis
External links
Online Matrix Calculator
Wolfram Alpha Matrix Decomposition Computation » LU and QR Decomposition
Springer Encyclopaedia of Mathematics » Matrix factorization
GraphLab collaborative filtering library, large scale parallel implementation of matrix decomposition methods (in C++) for multicore.