Singular value
In mathematics, in particular functional analysis, the singular values, or ''s''-numbers, of a compact operator T: X \rightarrow Y acting between Hilbert spaces X and Y are the square roots of the (necessarily non-negative) eigenvalues of the self-adjoint operator T^*T (where T^* denotes the adjoint of T).

The singular values are non-negative real numbers, usually listed in decreasing order (''σ''1(''T''), ''σ''2(''T''), …). The largest singular value ''σ''1(''T'') is equal to the operator norm of ''T'' (see Min-max theorem).

If ''T'' acts on Euclidean space \Reals^n, there is a simple geometric interpretation of the singular values: consider the image under T of the unit sphere; this is an ellipsoid, and the lengths of its semi-axes are the singular values of T (the figure provides an example in \Reals^2).

The singular values are the absolute values of the eigenvalues of a normal matrix ''A'', because the spectral theorem can be applied to obtain a unitary diagonalization of A as A = U\Lambda U^*. Therefore,
:\sqrt{A^* A} = \sqrt{U \Lambda^* \Lambda U^*} = U \left| \Lambda \right| U^*.

Most norms on Hilbert space operators studied are defined using ''s''-numbers. For example, the Ky Fan ''k''-norm is the sum of the first ''k'' singular values, the trace norm is the sum of all singular values, and the Schatten norm is the ''p''th root of the sum of the ''p''th powers of the singular values. Note that each norm is defined only on a special class of operators, hence ''s''-numbers are useful in classifying different operators.

In the finite-dimensional case, a matrix can always be decomposed in the form \mathbf{U \Sigma V^*}, where \mathbf{U} and \mathbf{V^*} are unitary matrices and \mathbf{\Sigma} is a rectangular diagonal matrix with the singular values lying on the diagonal. This is the singular value decomposition.
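The definitions above can be checked numerically. The sketch below (using NumPy; the matrix is an arbitrary example) computes the singular values both as square roots of the eigenvalues of A^*A and via an SVD routine, and confirms that the largest singular value equals the operator (spectral) norm.

```python
import numpy as np

# An arbitrary example matrix.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# (1) Square roots of the (non-negative) eigenvalues of A^* A,
# sorted in decreasing order.
eigvals = np.linalg.eigvalsh(A.conj().T @ A)
s_from_eig = np.sqrt(np.sort(eigvals)[::-1])

# (2) Singular values straight from the SVD A = U Sigma V^*.
s_from_svd = np.linalg.svd(A, compute_uv=False)

assert np.allclose(s_from_eig, s_from_svd)            # the two routes agree
assert np.isclose(s_from_svd[0], np.linalg.norm(A, 2))  # sigma_1 = operator norm
```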


Basic properties

For A \in \mathbb{C}^{m \times n}, and i = 1, 2, \ldots, \min\{m, n\}:

Min-max theorem for singular values. Here U: \dim(U) = i denotes a subspace of \mathbb{C}^n of dimension i.
:\begin{align} \sigma_i(A) &= \min_{U : \dim(U) = n - i + 1} \max_{x \in U, \left\| x \right\|_2 = 1} \left\| Ax \right\|_2, \\ \sigma_i(A) &= \max_{U : \dim(U) = i} \min_{x \in U, \left\| x \right\|_2 = 1} \left\| Ax \right\|_2. \end{align}

Matrix transpose and conjugate do not alter singular values:
:\sigma_i(A) = \sigma_i\left(A^\textsf{T}\right) = \sigma_i\left(A^*\right).

For any unitary U \in \mathbb{C}^{m \times m}, V \in \mathbb{C}^{n \times n}:
:\sigma_i(A) = \sigma_i(UAV).

Relation to eigenvalues:
:\sigma_i^2(A) = \lambda_i\left(AA^*\right) = \lambda_i\left(A^*A\right).

Relation to trace:
:\sum_{i=1}^n \sigma_i^2 = \operatorname{tr}\left(A^\ast A\right).

If A^\top A is full rank, the product of singular values is \sqrt{\det\left(A^\top A\right)}. If A A^\top is full rank, the product of singular values is \sqrt{\det\left(A A^\top\right)}. If A is square and full rank, the product of singular values is \left| \det A \right|.
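Several of these properties are easy to verify numerically. The sketch below (a NumPy example on a random complex matrix) checks invariance under transpose, conjugate transpose, and unitary multiplication, as well as the trace relation.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))
s = np.linalg.svd(A, compute_uv=False)

# Transpose and conjugate leave singular values unchanged.
assert np.allclose(s, np.linalg.svd(A.T, compute_uv=False))
assert np.allclose(s, np.linalg.svd(A.conj().T, compute_uv=False))

# Unitary invariance: sigma_i(UAV) = sigma_i(A) for unitary U, V.
# (Q factors of QR decompositions of random matrices are unitary.)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
V, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
assert np.allclose(s, np.linalg.svd(U @ A @ V, compute_uv=False))

# Relation to trace: sum of squared singular values equals tr(A^* A).
assert np.isclose((s**2).sum(), np.trace(A.conj().T @ A).real)
```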


Inequalities about singular values



Singular values of sub-matrices

For A \in \mathbb{C}^{m \times n}:
# Let B denote A with one of its rows ''or'' columns deleted. Then \sigma_{i+1}(A) \leq \sigma_i(B) \leq \sigma_i(A).
# Let B denote A with one of its rows ''and'' columns deleted. Then \sigma_{i+2}(A) \leq \sigma_i(B) \leq \sigma_i(A).
# Let B denote an (m-k) \times (n-l) submatrix of A. Then \sigma_{i+k+l}(A) \leq \sigma_i(B) \leq \sigma_i(A).
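The first interlacing inequality can be observed directly. The sketch below (a NumPy example; the matrix and deleted row are arbitrary) deletes one row of A and checks \sigma_{i+1}(A) \leq \sigma_i(B) \leq \sigma_i(A).

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 4))
B = np.delete(A, 2, axis=0)  # A with its third row deleted

sA = np.linalg.svd(A, compute_uv=False)  # decreasing order
sB = np.linalg.svd(B, compute_uv=False)

# sigma_{i+1}(A) <= sigma_i(B) <= sigma_i(A) for each valid i
for i in range(len(sB)):
    assert sB[i] <= sA[i] + 1e-12
    if i + 1 < len(sA):
        assert sA[i + 1] <= sB[i] + 1e-12
```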


Singular values of ''A'' + ''B''

For A, B \in \mathbb{C}^{m \times n}:
# \sum_{i=1}^{k} \sigma_i(A + B) \leq \sum_{i=1}^{k} \left( \sigma_i(A) + \sigma_i(B) \right), \quad k = \min\{m, n\}.
# \sigma_{i+j-1}(A + B) \leq \sigma_i(A) + \sigma_j(B), \quad i, j \in \mathbb{N},\ i + j - 1 \leq \min\{m, n\}.
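Both inequalities can be spot-checked numerically. The sketch below (a NumPy example on random square matrices) verifies the partial-sum inequality and the term-wise Weyl-type bound \sigma_{i+j-1}(A+B) \leq \sigma_i(A) + \sigma_j(B).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

sA = np.linalg.svd(A, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)
sAB = np.linalg.svd(A + B, compute_uv=False)

# Partial sums: sum of k largest sigma(A+B) <= sum of sigma(A) + sigma(B)
for k in range(1, n + 1):
    assert sAB[:k].sum() <= (sA[:k] + sB[:k]).sum() + 1e-12

# Term-wise: sigma_{i+j-1}(A+B) <= sigma_i(A) + sigma_j(B)  (1-based indices)
for i in range(1, n + 1):
    for j in range(1, n + 1):
        if i + j - 1 <= n:
            assert sAB[i + j - 2] <= sA[i - 1] + sB[j - 1] + 1e-12
```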


Singular values of ''AB''

For A, B \in \mathbb{C}^{n \times n}:
# \begin{align} \prod_{i=n-k+1}^{n} \sigma_i(A) \sigma_i(B) &\leq \prod_{i=n-k+1}^{n} \sigma_i(AB), \\ \prod_{i=1}^k \sigma_i(AB) &\leq \prod_{i=1}^k \sigma_i(A) \sigma_i(B), \\ \sum_{i=1}^k \sigma_i^p(AB) &\leq \sum_{i=1}^k \sigma_i^p(A) \sigma_i^p(B), \end{align} \quad k = 1, 2, \ldots, n,\ p > 0.
# \sigma_n(A) \sigma_i(B) \leq \sigma_i(AB) \leq \sigma_1(A) \sigma_i(B), \quad i = 1, 2, \ldots, n.

For A, B \in \mathbb{C}^{m \times n}:
:2 \sigma_i(A B^*) \leq \sigma_i\left(A^* A + B^* B\right), \quad i = 1, 2, \ldots, n.
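The product and sandwich bounds can also be sanity-checked. The sketch below (a NumPy example on random square matrices) verifies Horn's product inequality for the largest singular values and the bound \sigma_n(A)\sigma_i(B) \leq \sigma_i(AB) \leq \sigma_1(A)\sigma_i(B).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

sA = np.linalg.svd(A, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)
sAB = np.linalg.svd(A @ B, compute_uv=False)

# Product of k largest sigma(AB) <= product of k largest sigma(A) sigma(B)
for k in range(1, n + 1):
    assert np.prod(sAB[:k]) <= np.prod(sA[:k] * sB[:k]) * (1 + 1e-10)

# sigma_n(A) sigma_i(B) <= sigma_i(AB) <= sigma_1(A) sigma_i(B)
for i in range(n):
    assert sA[-1] * sB[i] <= sAB[i] + 1e-12
    assert sAB[i] <= sA[0] * sB[i] + 1e-12
```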


Singular values and eigenvalues

For A \in \mathbb{C}^{n \times n}:
# \lambda_i\left(A + A^*\right) \leq 2 \sigma_i(A), \quad i = 1, 2, \ldots, n.
# Assume \left| \lambda_1(A) \right| \geq \cdots \geq \left| \lambda_n(A) \right|. Then for k = 1, 2, \ldots, n:
## Weyl's theorem: \prod_{i=1}^k \left| \lambda_i(A) \right| \leq \prod_{i=1}^{k} \sigma_i(A).
## For p > 0: \sum_{i=1}^k \left| \lambda_i^p(A) \right| \leq \sum_{i=1}^{k} \sigma_i^p(A).
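Weyl's theorem and the Hermitian-part bound can be checked numerically. The sketch below (a NumPy example on a random real matrix) sorts the eigenvalues by decreasing modulus and compares the partial products with those of the singular values.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))

# Eigenvalues sorted by decreasing absolute value; singular values decreasing.
lam = np.linalg.eigvals(A)
lam = lam[np.argsort(-np.abs(lam))]
s = np.linalg.svd(A, compute_uv=False)

# Weyl: prod_{i<=k} |lambda_i| <= prod_{i<=k} sigma_i
for k in range(1, n + 1):
    assert np.prod(np.abs(lam[:k])) <= np.prod(s[:k]) * (1 + 1e-10)

# lambda_i(A + A^*) <= 2 sigma_i(A): A + A^T is symmetric, sort decreasing
mu = np.sort(np.linalg.eigvalsh(A + A.T))[::-1]
assert np.all(mu <= 2 * s + 1e-12)
```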


History

This concept was introduced by Erhard Schmidt in 1907. Schmidt called singular values "eigenvalues" at that time. The name "singular value" was first used by Smithies in 1937. In 1957, Allahverdiev proved the following characterization of the ''n''th ''s''-number (I. C. Gohberg and M. G. Krein, ''Introduction to the Theory of Linear Non-selfadjoint Operators'', Translations of Mathematical Monographs, Vol. 18, American Mathematical Society, Providence, R.I., 1969; translated from the Russian by A. Feinstein):
: s_n(T) = \inf\big\{\, \|T - L\| : L \text{ is an operator of finite rank} < n \,\big\}.
This formulation made it possible to extend the notion of ''s''-numbers to operators in Banach spaces.


See also

* Condition number
* Cauchy interlacing theorem or Poincaré separation theorem
* Schur–Horn theorem
* Singular value decomposition

