In mathematics, an invariant subspace of a linear mapping ''T'' : ''V'' → ''V'', i.e. from some vector space ''V'' to itself, is a subspace ''W'' of ''V'' that is preserved by ''T''; that is, ''T''(''W'') ⊆ ''W''.


General description

Consider a linear mapping T : V \to V and a subspace W of V. An invariant subspace W of T has the property that all vectors \mathbf{v} \in W are transformed by T into vectors also contained in W. This can be stated as

:\mathbf{v} \in W \implies T(\mathbf{v}) \in W.


Trivial examples of invariant subspaces

* \mathbb{R}^n: Since T maps every vector in \mathbb{R}^n into \mathbb{R}^n.
* \{\mathbf{0}\}: Since a linear map has to map \mathbf{0} \mapsto \mathbf{0}.
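Beyond these trivial cases, invariance can be checked directly from the definition: apply T to each basis vector of W and verify the image lands back in W; by linearity this settles all of W. A minimal Python sketch (the matrix and the plane W are illustrative choices, not taken from the text):

```python
# Sketch: check T(W) ⊆ W for W = span{e1, e2} in R^3 (vectors whose third
# coordinate is zero), using a hypothetical block upper-triangular matrix T.
def matvec(T, v):
    """Multiply a 3x3 matrix (given as a list of rows) by a vector."""
    return [sum(t * x for t, x in zip(row, v)) for row in T]

T = [[1, 2, 5],
     [3, 4, 6],
     [0, 0, 7]]   # the zero block below-left sends the plane into itself

basis_W = [[1, 0, 0], [0, 1, 0]]

# Each basis vector of W maps to a vector with third coordinate zero,
# hence (by linearity) T(W) ⊆ W.
assert all(matvec(T, v)[2] == 0 for v in basis_W)
```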


1-dimensional invariant subspace ''U''

A basis of a 1-dimensional space is simply a non-zero vector \mathbf{v}. Consequently, any vector \mathbf{x} \in U can be represented as \lambda \mathbf{v} where \lambda is a scalar. If we represent T by a matrix A then, for U to be an invariant subspace, it must satisfy

:\forall \mathbf{x} \in U \; \exists \alpha: A\mathbf{x} = \alpha \mathbf{v}.

We know that \mathbf{x} \in U \Rightarrow \mathbf{x} = \beta \mathbf{v} with some scalar \beta. Therefore, the condition for existence of a 1-dimensional invariant subspace is expressed as

:A\mathbf{v} = \lambda \mathbf{v},

where \lambda is a scalar (in the base field of the vector space). Note that this is the typical formulation of an eigenvalue problem, which means that any eigenvector of A spans a 1-dimensional invariant subspace of T.
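This correspondence can be made concrete with a small Python sketch (the matrix is an illustrative choice, not from the text): the eigenvector equation A\mathbf{v} = \lambda\mathbf{v} forces every multiple of \mathbf{v} to map to another multiple of \mathbf{v}, so its span is invariant.

```python
def matvec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1],
     [0, 3]]
v = [1, 0]            # eigenvector of A with eigenvalue λ = 2

# A v = 2 v, hence A(c v) = 2c v: every vector of span{v} stays in span{v}.
assert matvec(A, v) == [2 * x for x in v]
for c in [1, -4, 7]:
    cv = [c * x for x in v]
    assert matvec(A, cv) == [2 * x for x in cv]
```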


Formal description

An invariant subspace of a linear mapping

:T : V \to V

from some vector space ''V'' to itself is a subspace ''W'' of ''V'' such that ''T''(''W'') is contained in ''W''. An invariant subspace of ''T'' is also said to be ''T''-invariant. If ''W'' is ''T''-invariant, we can restrict ''T'' to ''W'' to arrive at a new linear mapping

:T|_W : W \to W.

This linear mapping is called the restriction of ''T'' on ''W'' and is defined by

:T|_W(\mathbf{w}) = T(\mathbf{w}) \text{ for all } \mathbf{w} \in W.

Next, we give a few immediate examples of invariant subspaces. Certainly ''V'' itself, and the subspace \{0\}, are trivially invariant subspaces for every linear operator ''T'' : ''V'' → ''V''. For certain linear operators there is no ''non-trivial'' invariant subspace; consider for instance a rotation of a two-dimensional real vector space.

Let v be an eigenvector of ''T'', i.e. ''T''v = λv. Then ''W'' = span{v} is ''T''-invariant. As a consequence of the fundamental theorem of algebra, every linear operator on a nonzero finite-dimensional complex vector space has an eigenvector. Therefore, every such linear operator has a non-trivial invariant subspace. The fact that the complex numbers are an algebraically closed field is required here. Comparing with the previous example, one can see that the invariant subspaces of a linear transformation depend on the base field of ''V''.

An invariant vector (i.e. a fixed point of ''T''), other than 0, spans an invariant subspace of dimension 1. An invariant subspace of dimension 1 will be acted on by ''T'' by a scalar, and consists of invariant vectors if and only if that scalar is 1.

As the above examples indicate, the invariant subspaces of a given linear transformation ''T'' shed light on the structure of ''T''. When ''V'' is a finite-dimensional vector space over an algebraically closed field, linear transformations acting on ''V'' are characterized (up to similarity) by the Jordan canonical form, which decomposes ''V'' into invariant subspaces of ''T''. Many fundamental questions regarding ''T'' can be translated to questions about invariant subspaces of ''T''.

More generally, invariant subspaces are defined for sets of operators as subspaces invariant for each operator in the set. Let ''L''(''V'') denote the algebra of linear transformations on ''V'', and Lat(''T'') be the family of subspaces invariant under ''T'' ∈ ''L''(''V''). (The "Lat" notation refers to the fact that Lat(''T'') forms a lattice; see discussion below.) Given a nonempty set Σ ⊂ ''L''(''V''), one considers the subspaces invariant under each ''T'' ∈ Σ. In symbols,

:\operatorname{Lat}(\Sigma) = \bigcap_{T \in \Sigma} \operatorname{Lat}(T).

For instance, it is clear that if Σ = ''L''(''V''), then Lat(Σ) = \{ \{0\}, V \}.

Given a representation of a group ''G'' on a vector space ''V'', we have a linear transformation ''T''(''g'') : ''V'' → ''V'' for every element ''g'' of ''G''. If a subspace ''W'' of ''V'' is invariant with respect to all these transformations, then it is a subrepresentation and the group ''G'' acts on ''W'' in a natural way.

As another example, let ''T'' ∈ ''L''(''V'') and Σ be the algebra generated by {1, ''T''}, where 1 is the identity operator. Then Lat(''T'') = Lat(Σ). Because ''T'' lies in Σ trivially, Lat(Σ) ⊂ Lat(''T''). On the other hand, Σ consists of polynomials in 1 and ''T'', and therefore the reverse inclusion holds as well.
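The rotation example above can be verified numerically: a 90° rotation of the real plane sends no nonzero vector to a multiple of itself, so no line through the origin is invariant. A minimal Python sketch (the angle and the sample vectors are illustrative choices):

```python
# R rotates the real plane by 90 degrees: (x, y) -> (-y, x).
def rotate(v):
    return [-v[1], v[0]]

# R v is parallel to v iff det[v | Rv] = 0; here det[v | Rv] = x^2 + y^2,
# which is strictly positive for every nonzero v, so no 1-dimensional
# subspace of R^2 is invariant under this rotation.
for v in [[1, 0], [1, 2], [-3, 5]]:
    rv = rotate(v)
    det = v[0] * rv[1] - v[1] * rv[0]
    assert det == v[0] ** 2 + v[1] ** 2 and det > 0
```

Over the complex numbers the same rotation does have eigenvectors (with eigenvalues ±i), matching the remark that invariant subspaces depend on the base field.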


Matrix representation

Over a finite-dimensional vector space, every linear transformation ''T'' : ''V'' → ''V'' can be represented by a matrix once a basis of ''V'' has been chosen.

Suppose now ''W'' is a ''T''-invariant subspace. Pick a basis ''C'' = {''v''<sub>1</sub>, ..., ''v''<sub>''k''</sub>} of ''W'' and complete it to a basis ''B'' of ''V''. Then, with respect to this basis, the matrix representation of ''T'' takes the form:

:T = \begin{bmatrix} T_{11} & T_{12} \\ 0 & T_{22} \end{bmatrix}

where the upper-left block ''T''<sub>11</sub> is the restriction of ''T'' to ''W''. In other words, given an invariant subspace ''W'' of ''T'', ''V'' can be decomposed into the direct sum

:V = W \oplus W'.

Viewing ''T'' as an operator matrix

:T = \begin{bmatrix} T_{11} & T_{12} \\ T_{21} & T_{22} \end{bmatrix} : \begin{matrix} W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix} W \\ \oplus \\ W' \end{matrix},

it is clear that ''T''<sub>21</sub> : ''W'' → ''W′'' must be zero.

Determining whether a given subspace ''W'' is invariant under ''T'' is ostensibly a problem of geometric nature. Matrix representation allows one to phrase this problem algebraically. The projection operator ''P'' onto ''W'' is defined by ''P''(''w'' + ''w′'') = ''w'', where ''w'' ∈ ''W'' and ''w′'' ∈ ''W′''. The projection ''P'' has matrix representation

:P = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{matrix} W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix} W \\ \oplus \\ W' \end{matrix}.

A straightforward calculation shows that ''W'' = ran ''P'', the range of ''P'', is invariant under ''T'' if and only if ''PTP'' = ''TP''. In other words, a subspace ''W'' being an element of Lat(''T'') is equivalent to the corresponding projection satisfying the relation ''PTP'' = ''TP''.

If ''P'' is a projection (i.e. ''P''<sup>2</sup> = ''P''), then so is 1 − ''P'', where 1 is the identity operator. It follows from the above that ''TP'' = ''PT'' if and only if both ran ''P'' and ran(1 − ''P'') are invariant under ''T''. In that case, ''T'' has matrix representation

:T = \begin{bmatrix} T_{11} & 0 \\ 0 & T_{22} \end{bmatrix} : \begin{matrix} \operatorname{ran} P \\ \oplus \\ \operatorname{ran}(1-P) \end{matrix} \rightarrow \begin{matrix} \operatorname{ran} P \\ \oplus \\ \operatorname{ran}(1-P) \end{matrix}.

Colloquially, a projection that commutes with ''T'' "diagonalizes" ''T''.
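The criterion ''PTP'' = ''TP'' can be tested mechanically. A small Python sketch with 2×2 matrices (illustrative examples, not from the text): ''P'' projects onto ''W'' = span{e<sub>1</sub>}; an upper-triangular ''T'' leaves ''W'' invariant and satisfies the relation, while a matrix with a nonzero lower-left entry does not.

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1, 0],
     [0, 0]]          # projection onto W = span{e1}

T_inv = [[1, 2],
         [0, 3]]      # upper triangular: W is T-invariant
T_not = [[1, 0],
         [2, 3]]      # nonzero lower-left entry: W is not invariant

assert matmul(matmul(P, T_inv), P) == matmul(T_inv, P)   # PTP = TP holds
assert matmul(matmul(P, T_not), P) != matmul(T_not, P)   # PTP = TP fails
```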


Invariant subspace problem

The invariant subspace problem concerns the case where ''V'' is a separable Hilbert space over the complex numbers, of dimension > 1, and ''T'' is a bounded operator. The problem is to decide whether every such ''T'' has a non-trivial, closed, invariant subspace. This problem remains unsolved. In the more general case where ''V'' is assumed to be a Banach space, there is an example of an operator without an invariant subspace due to Per Enflo (1976). A concrete example of an operator without an invariant subspace was produced in 1985 by Charles Read.


Invariant-subspace lattice

Given a nonempty set Σ ⊂ ''L''(''V''), the subspaces invariant under each element of Σ form a lattice, sometimes called the invariant-subspace lattice of Σ and denoted by Lat(Σ). The lattice operations are defined in a natural way: for Σ′ ⊂ Lat(Σ), the ''meet'' operation is defined by

:\bigwedge_{W \in \Sigma'} W = \bigcap_{W \in \Sigma'} W

while the ''join'' operation is defined by

:\bigvee_{W \in \Sigma'} W = \operatorname{span} \bigcup_{W \in \Sigma'} W.

A minimal element in Lat(Σ) is said to be a minimal invariant subspace.


Fundamental theorem of noncommutative algebra

Just as the fundamental theorem of algebra ensures that every linear transformation acting on a finite-dimensional complex vector space has a nontrivial invariant subspace, the ''fundamental theorem of noncommutative algebra'' asserts that Lat(Σ) contains nontrivial elements for certain Σ.

Theorem (Burnside). Assume ''V'' is a complex vector space of finite dimension. For every proper subalgebra Σ of ''L''(''V''), Lat(Σ) contains a nontrivial element.

Burnside's theorem is of fundamental importance in linear algebra. One consequence is that every commuting family in ''L''(''V'') can be simultaneously upper-triangularized. A nonempty set Σ ⊂ ''L''(''V'') is said to be triangularizable if there exists a basis {''e''<sub>1</sub>, ..., ''e''<sub>''n''</sub>} of ''V'' such that

:\operatorname{span}\{e_1, \ldots, e_k\} \in \operatorname{Lat}(\Sigma) \quad \text{for all } k \geq 1 \;.

In other words, Σ is triangularizable if there exists a basis such that every element of Σ has an upper-triangular matrix representation in that basis. It follows from Burnside's theorem that every commutative algebra Σ in ''L''(''V'') is triangularizable. Hence every commuting family in ''L''(''V'') can be simultaneously upper-triangularized.
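A toy illustration (a sketch, not the proof): any polynomial in ''T'' commutes with ''T'', and if ''T'' is upper triangular in some basis then so is every polynomial in ''T'', so one basis triangularizes the whole commutative algebra generated by ''T''.

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

T = [[1, 1],
     [0, 2]]                # upper triangular in the standard basis
B = matmul(T, T)            # B = T^2, a polynomial in T

# T and B commute, and both are upper triangular in the same basis:
assert matmul(T, B) == matmul(B, T)
assert T[1][0] == 0 and B[1][0] == 0
```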


Left ideals

If ''A'' is an algebra, one can define a ''left regular representation'' Φ on ''A'': Φ(''a'')''b'' = ''ab'' is a homomorphism from ''A'' to ''L''(''A''), the algebra of linear transformations on ''A''. The invariant subspaces of Φ are precisely the left ideals of ''A''. A left ideal ''M'' of ''A'' gives a subrepresentation of ''A'' on ''M''.

If ''M'' is a left ideal of ''A'', then the left regular representation Φ on ''A'' descends to a representation Φ′ on the quotient vector space ''A''/''M''. If [''b''] denotes an equivalence class in ''A''/''M'', then Φ′(''a'')[''b''] = [''ab'']. The kernel of the representation Φ′ is the set {''a'' ∈ ''A'' : ''ab'' ∈ ''M'' for all ''b''}.

The representation Φ′ is irreducible if and only if ''M'' is a maximal left ideal, since a subspace ''V'' ⊂ ''A''/''M'' is invariant under {Φ′(''a'') : ''a'' ∈ ''A''} if and only if its preimage under the quotient map, ''V'' + ''M'', is a left ideal in ''A''.
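A small concrete check (an illustrative example, not from the text): take ''A'' to be the algebra of 2×2 upper-triangular matrices and ''M'' its strictly upper-triangular matrices. Then ''M'' is a left ideal: left multiplication by any element of ''A'' keeps ''M'' inside ''M'', i.e. ''M'' is an invariant subspace of the left regular representation.

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def in_M(X):
    """Membership in M = strictly upper-triangular 2x2 matrices."""
    return X[0][0] == 0 and X[1][0] == 0 and X[1][1] == 0

# Sample elements of the upper-triangular algebra A and of the ideal M.
samples_A = [[[1, 2], [0, 3]], [[0, 5], [0, -1]]]
m = [[0, 7], [0, 0]]

# Phi(a)m = a m stays in M for every sample a, so M is Phi-invariant.
assert all(in_M(matmul(a, m)) for a in samples_A)
```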


Almost-invariant halfspaces

Related to invariant subspaces are so-called almost-invariant halfspaces (AIHSs). A closed subspace Y of a Banach space X is said to be almost-invariant under an operator T \in \mathcal{B}(X) if TY \subseteq Y + E for some finite-dimensional subspace E; equivalently, Y is almost-invariant under T if there is a finite-rank operator F \in \mathcal{B}(X) such that (T+F)Y \subseteq Y, i.e. if Y is invariant (in the usual sense) under T + F. In this case, the minimum possible dimension of E (or rank of F) is called the defect.

Clearly, every finite-dimensional and finite-codimensional subspace is almost-invariant under every operator. Thus, to make things nontrivial, we say that Y is a halfspace whenever it is a closed subspace with infinite dimension and infinite codimension.

The AIHS problem asks whether every operator admits an AIHS. In the complex setting it has already been solved: if X is a complex infinite-dimensional Banach space and T \in \mathcal{B}(X), then T admits an AIHS of defect at most 1. It is not currently known whether the same holds if X is a real Banach space. However, some partial results have been established: for instance, any self-adjoint operator on an infinite-dimensional real Hilbert space admits an AIHS, as does any strictly singular (or compact) operator acting on a real infinite-dimensional reflexive space.


See also

* Invariant manifold


Bibliography

* {{cite book |first1=Heydar |last1=Radjavi |first2=Peter |last2=Rosenthal |title=Invariant Subspaces |year=2003 |edition=Update of 1973 Springer-Verlag |isbn=0-486-42822-2 |publisher=Dover Publications}}