Schur decomposition
In the mathematical discipline of linear algebra, the Schur decomposition or Schur triangulation, named after Issai Schur, is a matrix decomposition. It allows one to write an arbitrary complex square matrix as unitarily similar to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix.


Statement

The complex Schur decomposition reads as follows: if ''A'' is an ''n'' × ''n'' square matrix with complex entries, then ''A'' can be expressed as

A = Q U Q^{-1}

for some unitary matrix ''Q'' (so that the inverse ''Q''<sup>−1</sup> is also the conjugate transpose ''Q''* of ''Q''), and some upper triangular matrix ''U''. This is called a Schur form of ''A''. Since ''U'' is similar to ''A'', it has the same spectrum, and since it is triangular, its eigenvalues are the diagonal entries of ''U''.

The Schur decomposition implies that there exists a nested sequence of ''A''-invariant subspaces \{0\} = V_0 \subset V_1 \subset \cdots \subset V_n = \mathbb{C}^n, and that there exists an ordered orthonormal basis (for the standard Hermitian form of \mathbb{C}^n) such that the first ''i'' basis vectors span V_i for each ''i'' occurring in the nested sequence. Phrased somewhat differently, the first part says that a linear operator ''J'' on a complex finite-dimensional vector space stabilizes a complete flag.

There is also a real Schur decomposition. If ''A'' is an ''n'' × ''n'' square matrix with real entries, then ''A'' can be expressed as

A = Q H Q^{-1}

where ''Q'' is an orthogonal matrix and ''H'' is either upper or lower quasi-triangular. A quasi-triangular matrix is a matrix that, when expressed as a block matrix of 1 × 1 and 2 × 2 blocks, is triangular; the 2 × 2 diagonal blocks correspond to pairs of complex-conjugate eigenvalues. This is a stronger property than being Hessenberg. Just as in the complex case, a family of commuting real matrices \{A_i\} may be simultaneously brought to quasi-triangular form by an orthogonal matrix: there exists an orthogonal matrix ''Q'' such that, for every ''A<sub>i</sub>'' in the given family, H_i = Q A_i Q^{-1} is upper quasi-triangular.


Proof

A constructive proof for the Schur decomposition is as follows: every operator ''A'' on a complex finite-dimensional vector space has an eigenvalue ''λ'', corresponding to some eigenspace ''V<sub>λ</sub>''. Let ''V<sub>λ</sub>''<sup>⊥</sup> be its orthogonal complement. It is clear that, with respect to this orthogonal decomposition, ''A'' has the matrix representation (one can pick here any orthonormal bases ''Z''<sub>1</sub> and ''Z''<sub>2</sub> spanning ''V<sub>λ</sub>'' and ''V<sub>λ</sub>''<sup>⊥</sup> respectively)

\begin{bmatrix} Z_1 & Z_2 \end{bmatrix}^* A \begin{bmatrix} Z_1 & Z_2 \end{bmatrix} = \begin{bmatrix} \lambda \, I_\lambda & A_{12} \\ 0 & A_{22} \end{bmatrix} : \begin{matrix} V_\lambda \\ \oplus \\ V_\lambda^\perp \end{matrix} \rightarrow \begin{matrix} V_\lambda \\ \oplus \\ V_\lambda^\perp \end{matrix}

where ''I<sub>λ</sub>'' is the identity operator on ''V<sub>λ</sub>''. The above matrix would be upper-triangular except for the ''A''<sub>22</sub> block. But exactly the same procedure can be applied to the sub-matrix ''A''<sub>22</sub>, viewed as an operator on ''V<sub>λ</sub>''<sup>⊥</sup>, and its submatrices. Continue this way until the resulting matrix is upper triangular. Since each conjugation increases the dimension of the upper-triangular block by at least one, this process takes at most ''n'' steps, after which the space C''<sup>n</sup>'' is exhausted and the procedure has yielded the desired result.

The above argument can be slightly restated as follows: let ''λ'' be an eigenvalue of ''A'', corresponding to some eigenspace ''V<sub>λ</sub>''. ''A'' induces an operator ''T'' on the quotient space C''<sup>n</sup>''/''V<sub>λ</sub>''. This operator is precisely the ''A''<sub>22</sub> submatrix from above. As before, ''T'' has an eigenspace, say ''W<sub>μ</sub>'' ⊂ C''<sup>n</sup>'' modulo ''V<sub>λ</sub>''. Notice that the preimage of ''W<sub>μ</sub>'' under the quotient map is an invariant subspace of ''A'' that contains ''V<sub>λ</sub>''. Continue this way until the resulting quotient space has dimension 0. Then the successive preimages of the eigenspaces found at each step form a flag that ''A'' stabilizes.
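The recursion in this proof translates directly into code. The sketch below (the function name schur_by_deflation is chosen here purely for illustration) uses numpy.linalg.eig to supply an eigenvector at each step, so it demonstrates the argument rather than being a numerically robust algorithm:

```python
import numpy as np

def schur_by_deflation(A):
    """Triangularize A as in the constructive proof: split off one
    eigenvector, conjugate by a unitary, recurse on the A22 block."""
    A = np.asarray(A, dtype=complex)
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex), A.copy()
    # Any eigenpair will do; take the first one numpy returns.
    _, vecs = np.linalg.eig(A)
    v = vecs[:, 0] / np.linalg.norm(vecs[:, 0])
    # Extend v to an orthonormal basis Z = [Z1 | Z2] with Z1 = v.
    M = np.eye(n, dtype=complex)
    M[:, 0] = v
    Z, _ = np.linalg.qr(M)
    Z[:, 0] = v                  # undo any unit-phase factor from QR
    B = Z.conj().T @ A @ Z       # [[lambda, A12], [0, A22]] up to rounding
    # Recurse on A22, the operator induced on the orthogonal complement.
    Q2, _ = schur_by_deflation(B[1:, 1:])
    Q = Z.copy()
    Q[:, 1:] = Z[:, 1:] @ Q2
    U = Q.conj().T @ A @ Q
    return Q, U

A = np.random.default_rng(1).standard_normal((5, 5))
Q, U = schur_by_deflation(A)
print(np.allclose(Q @ U @ Q.conj().T, A))    # True
print(np.allclose(U, np.triu(U)))            # True: U is upper triangular
```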


Notes

Although every square matrix has a Schur decomposition, in general this decomposition is not unique. For example, the eigenspace ''V<sub>λ</sub>'' can have dimension > 1, in which case any orthonormal basis for ''V<sub>λ</sub>'' would lead to the desired result.

Write the triangular matrix ''U'' as ''U'' = ''D'' + ''N'', where ''D'' is diagonal and ''N'' is strictly upper triangular (and thus a nilpotent matrix). The diagonal matrix ''D'' contains the eigenvalues of ''A'' in arbitrary order (hence its Frobenius norm, squared, is the sum of the squared moduli of the eigenvalues of ''A'', while the Frobenius norm of ''A'', squared, is the sum of the squared singular values of ''A''). The nilpotent part ''N'' is generally not unique either, but its Frobenius norm is uniquely determined by ''A'' (because the Frobenius norm of ''A'' is equal to the Frobenius norm of ''U'' = ''D'' + ''N'').

It is clear that if ''A'' is a normal matrix, then ''U'' from its Schur decomposition must be a diagonal matrix and the column vectors of ''Q'' are the eigenvectors of ''A''. Therefore, the Schur decomposition extends the spectral decomposition. In particular, if ''A'' is positive definite, the Schur decomposition of ''A'', its spectral decomposition, and its singular value decomposition coincide.

A commuting family \{A_i\} of matrices can be simultaneously triangularized, i.e. there exists a unitary matrix ''Q'' such that, for every ''A<sub>i</sub>'' in the given family, Q A_i Q^* is upper triangular. This can be readily deduced from the above proof. Take an element ''A'' from \{A_i\} and again consider an eigenspace ''V<sub>A</sub>''. Then ''V<sub>A</sub>'' is invariant under all matrices in \{A_i\}. Therefore, all matrices in \{A_i\} must share one common eigenvector in ''V<sub>A</sub>''. Induction then proves the claim. As a corollary, we have that every commuting family of normal matrices can be simultaneously diagonalized.

In the infinite-dimensional setting, not every bounded operator on a Banach space has an invariant subspace. However, the upper-triangularization of an arbitrary square matrix does generalize to compact operators: every compact operator on a complex Banach space has a nest of closed invariant subspaces.
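A quick numerical check of the ''U'' = ''D'' + ''N'' splitting and of the Frobenius-norm statements above (a sketch with SciPy; the random test matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import schur

A = np.random.default_rng(2).standard_normal((4, 4))
U, Q = schur(A, output='complex')

D = np.diag(np.diag(U))      # eigenvalues of A on the diagonal
N = np.triu(U, k=1)          # strictly upper triangular, hence nilpotent

print(np.allclose(np.linalg.matrix_power(N, 4), 0))   # N^4 = 0
# ||A||_F^2 = ||D||_F^2 + ||N||_F^2, so ||N||_F is determined by A
# even though N itself is not unique.
print(np.isclose(np.linalg.norm(A, 'fro')**2,
                 np.linalg.norm(D, 'fro')**2 + np.linalg.norm(N, 'fro')**2))
```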


Computation

The Schur decomposition of a given matrix is numerically computed by the QR algorithm or its variants. In other words, the roots of the characteristic polynomial corresponding to the matrix are not necessarily computed beforehand in order to obtain its Schur decomposition. Conversely, the QR algorithm can be used to compute the roots of any given characteristic polynomial by finding the Schur decomposition of its companion matrix. Similarly, the QR algorithm is used to compute the eigenvalues of any given matrix, which are the diagonal entries of the upper triangular matrix of the Schur decomposition. Its complexity is \mathcal{O}(n^3).
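For example, the roots of a polynomial can be read off the Schur form of its companion matrix; this sketch uses SciPy's companion and schur (and is essentially the strategy behind numpy.roots):

```python
import numpy as np
from scipy.linalg import companion, schur

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
C = companion([1., -6., 11., -6.])
U, _ = schur(C, output='complex')
print(np.sort(np.diag(U).real))     # approximately [1. 2. 3.]
```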


Applications

Lie theory applications include:
* Every invertible operator is contained in a Borel group.
* Every operator fixes a point of the flag manifold.


Generalized Schur decomposition

Given square matrices ''A'' and ''B'', the generalized Schur decomposition factorizes both matrices as A = Q S Z^* and B = Q T Z^*, where ''Q'' and ''Z'' are unitary, and ''S'' and ''T'' are upper triangular. The generalized Schur decomposition is also sometimes called the QZ decomposition (Daniel Kressner: "Numerical Methods for General and Structured Eigenvalue Problems", Chap. 2, Springer, LNCSE-46, 2005). The generalized eigenvalues \lambda that solve the generalized eigenvalue problem A\mathbf{x} = \lambda B\mathbf{x} (where \mathbf{x} is an unknown nonzero vector) can be calculated as the ratio of the diagonal elements of ''S'' to those of ''T''. That is, using subscripts to denote matrix elements, the ''i''th generalized eigenvalue \lambda_i satisfies \lambda_i = S_{ii} / T_{ii}.
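A minimal sketch of the generalized (QZ) decomposition using SciPy's scipy.linalg.qz, cross-checked against the direct generalized eigensolver (the random test matrices are arbitrary):

```python
import numpy as np
from scipy.linalg import qz, eig

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Complex QZ: A = Q S Z*, B = Q T Z*, with S and T upper triangular.
S, T, Q, Z = qz(A, B, output='complex')
assert np.allclose(Q @ S @ Z.conj().T, A)
assert np.allclose(Q @ T @ Z.conj().T, B)

# Generalized eigenvalues as ratios of diagonal entries.
lam = np.diag(S) / np.diag(T)
print(np.sort_complex(lam))
print(np.sort_complex(eig(A, B, right=False)))   # should match
```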

