Invariant subspace
In mathematics, an invariant subspace of a linear mapping T : V → V from some vector space V to itself is a subspace W of V that is preserved by T; that is, T(W) ⊆ W.
General description
Consider a linear mapping T : V → V that transforms vectors of V. An invariant subspace W of V has the property that all vectors v ∈ W are transformed by T into vectors also contained in W. This can be stated as
v ∈ W ⟹ T(v) ∈ W.
Trivial examples of invariant subspaces
- V: since T maps every vector in V into V.
- {0}: since a linear map has to map 0 to 0, and 0 lies in every subspace.
One-dimensional invariant subspace U
A basis of a one-dimensional subspace U is simply a non-zero vector u, so every v ∈ U can be written as v = αu for some scalar α.
Therefore, the condition for the existence of a one-dimensional invariant subspace is expressed as:
T(u) = λu for some scalar λ.
Note that this is the typical formulation of an eigenvalue problem, which means that any eigenvector of T forms a one-dimensional invariant subspace in V.
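The eigenvalue condition can be checked numerically. The following is a minimal sketch with a hypothetical 2×2 matrix, using plain Python lists rather than a dedicated linear-algebra library:

```python
# A 2x2 matrix acting on R^2, written as nested lists.
T = [[2, 1],
     [0, 3]]

def apply(M, v):
    """Apply matrix M to vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# v = (1, 0) is an eigenvector of T with eigenvalue 2:
v = [1, 0]
Tv = apply(T, v)
print(Tv)   # [2, 0], i.e. 2 * v
```

Since T(v) = 2v, every scalar multiple of v is mapped back into span{v}, which is exactly the one-dimensional invariant subspace described above.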
Formal description
An invariant subspace of a linear mapping T : V → V from some vector space V to itself is a subspace W of V such that T(W) is contained in W. An invariant subspace of T is also said to be T-invariant.
If W is T-invariant, we can restrict T to W to arrive at a new linear mapping T|W : W → W.
This linear mapping is called the restriction of T on W and is defined by T|W(w) = T(w) for all w ∈ W.
Next, we give a few immediate examples of invariant subspaces.
Certainly V itself, and the subspace {0}, are trivially invariant subspaces for every linear operator T : V → V. For certain linear operators there is no non-trivial invariant subspace; consider for instance a rotation of a two-dimensional real vector space by an angle that is not a multiple of π.
Let v be an eigenvector of T, i.e. Tv = λv. Then W = span{v} is T-invariant. As a consequence of the fundamental theorem of algebra, every linear operator on a complex finite-dimensional vector space of positive dimension has an eigenvector; consequently, every such operator on a space of dimension at least 2 has a non-trivial invariant subspace. The fact that the complex numbers are algebraically closed is required here. Comparing with the previous example, one can see that the invariant subspaces of a linear transformation are dependent upon the underlying scalar field of V.
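The dependence on the scalar field can be made concrete. The sketch below (a hypothetical example in plain Python) checks that a 90° rotation of ℝ² sends no nonzero real vector to a multiple of itself:

```python
# Rotation of R^2 by 90 degrees: (x, y) -> (-y, x).
R = [[0, -1],
     [1,  0]]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def parallel(u, v):
    """True iff u and v are linearly dependent (2x2 determinant is zero)."""
    return u[0] * v[1] - u[1] * v[0] == 0

# For any nonzero v = (x, y), det[v, Rv] = x^2 + y^2 > 0,
# so Rv is never a real scalar multiple of v: R has no real
# eigenvector, hence no one-dimensional invariant subspace over R.
samples = [(1, 0), (0, 1), (3, -2), (5, 7)]
print(all(not parallel(list(v), apply(R, list(v))) for v in samples))  # True
```

Over ℂ the same matrix does have eigenvectors (for the eigenvalues ±i), so it acquires nontrivial invariant subspaces once the scalar field is enlarged.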
An invariant vector, other than 0, spans an invariant subspace of dimension 1. An invariant subspace of dimension 1 will be acted on by T by a scalar and consists of invariant vectors if and only if that scalar is 1.
As the above examples indicate, the invariant subspaces of a given linear transformation T shed light on the structure of T. When V is a finite-dimensional vector space over an algebraically closed field, linear transformations acting on V are characterized by the Jordan canonical form, which decomposes V into invariant subspaces of T. Many fundamental questions regarding T can be translated to questions about invariant subspaces of T.
More generally, invariant subspaces are defined for sets of operators as subspaces invariant for each operator in the set. Let L(V) denote the algebra of linear transformations on V, and let Lat(T) be the family of subspaces invariant under T ∈ L(V). Given a nonempty set Σ ⊂ L(V), one considers the subspaces invariant under each T ∈ Σ. In symbols,
Lat(Σ) = ⋂_{T ∈ Σ} Lat(T).
For instance, it is clear that if Σ = L(V), then Lat(Σ) = { {0}, V }.
Given a representation of a group G on a vector space V, we have a linear transformation T : V → V for every element g of G. If a subspace W of V is invariant with respect to all these transformations, then it is a subrepresentation and the group G acts on W in a natural way.
As another example, let T ∈ L(V) and let Σ be the algebra generated by {1, T}, where 1 is the identity operator. Then Lat(T) = Lat(Σ). Because T lies in Σ trivially, Lat(Σ) ⊂ Lat(T). On the other hand, Σ consists of polynomials in 1 and T, and a subspace invariant under T is invariant under every polynomial in T; therefore the reverse inclusion holds as well.
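The reverse inclusion can be illustrated numerically: a subspace invariant under T is automatically invariant under any polynomial in T. A minimal sketch, with a hypothetical matrix and polynomial, in plain Python:

```python
T = [[2, 1],
     [0, 3]]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# p(T) = T^2 + 2T + 1, assembled entrywise (1 denotes the identity).
I = [[1, 0], [0, 1]]
T2 = matmul(T, T)
pT = [[T2[i][j] + 2 * T[i][j] + I[i][j] for j in range(2)] for i in range(2)]

# W = span{(1,1)} is invariant under T, since T(1,1) = 3*(1,1).
# Hence p(T)(1,1) must again be a multiple of (1,1):
v = [1, 1]
print(apply(pT, v))   # [16, 16] = p(3)*(1,1), since p(3) = 9 + 6 + 1 = 16
```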
Matrix representation
Over a finite-dimensional vector space, every linear transformation T : V → V can be represented by a matrix once a basis of V has been chosen. Suppose now W is a T-invariant subspace. Pick a basis C = {v1, …, vk} of W and complete it to a basis B of V. Then, with respect to this basis, the matrix representation of T takes the form:
T = [ T11  T12 ]
    [  0   T22 ]
where the upper-left block T11 is the restriction of T to W.
In other words, given an invariant subspace W of T, V can be decomposed into the direct sum
V = W ⊕ W′.
Viewing T as an operator matrix
T = [ T11  T12 ] : W ⊕ W′ → W ⊕ W′,
    [ T21  T22 ]
it is clear that T21 : W → W′ must be zero.
Determining whether a given subspace W is invariant under T is ostensibly a problem of geometric nature. Matrix representation allows one to phrase this problem algebraically. Writing V = W ⊕ W′, the projection operator P onto W is defined by
P(w + w′) = w, where w ∈ W and w′ ∈ W′.
A straightforward calculation shows that W = Ran(P), the range of P, is invariant under T if and only if PTP = TP. In other words, a subspace W being an element of Lat(T) is equivalent to the corresponding projection satisfying the relation PTP = TP.
If P is a projection (i.e. P² = P), so is 1 − P, where 1 is the identity operator. It follows from the above that TP = PT if and only if both Ran(P) and Ran(1 − P) are invariant under T. In that case, T has matrix representation
T = [ T11   0  ] : Ran(P) ⊕ Ran(1 − P) → Ran(P) ⊕ Ran(1 − P).
    [  0   T22 ]
Colloquially, a projection that commutes with T "diagonalizes" T.
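The criterion PTP = TP can be tested directly. A minimal sketch with hypothetical 2×2 matrices in plain Python:

```python
T = [[2, 1],
     [0, 3]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# P projects onto W = span{(1,1)} along W' = span{(0,1)}:
# (x, y) = x*(1,1) + (y-x)*(0,1), so P(x, y) = (x, x).
P = [[1, 0],
     [1, 0]]
assert matmul(P, P) == P   # P is idempotent, hence a projection

# W is invariant under T, and indeed PTP == TP:
print(matmul(P, matmul(T, P)) == matmul(T, P))   # True

# span{(0,1)} is NOT invariant under T (T maps (0,1) to (1,3)),
# and the criterion detects this:
Q = [[0, 0],
     [0, 1]]
print(matmul(Q, matmul(T, Q)) == matmul(T, Q))   # False
```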
Invariant subspace problem
The invariant subspace problem concerns the case where V is a separable Hilbert space over the complex numbers, of dimension > 1, and T is a bounded operator. The problem is to decide whether every such T has a non-trivial, closed, invariant subspace. This problem is unsolved. In the more general case where V is hypothesized to be a Banach space, there is an example of an operator without an invariant subspace due to Per Enflo. A concrete example of an operator without an invariant subspace was produced in 1985 by Charles Read.
Invariant-subspace lattice
Given a nonempty Σ ⊂ L(V), the invariant subspaces invariant under each element of Σ form a lattice, sometimes called the invariant-subspace lattice of Σ and denoted by Lat(Σ). The lattice operations are defined in a natural way: for a family {W_i} of subspaces in Lat(Σ), the meet operation is defined by
∧_i W_i = ⋂_i W_i,
while the join operation is
∨_i W_i = span( ⋃_i W_i ).
A minimal element in Lat(Σ) is said to be a minimal invariant subspace.
Fundamental theorem of noncommutative algebra
Just as the fundamental theorem of algebra ensures that every linear transformation acting on a finite-dimensional complex vector space has a nontrivial invariant subspace, the fundamental theorem of noncommutative algebra asserts that Lat(Σ) contains nontrivial elements for certain Σ.
Theorem (Burnside). Assume V is a complex vector space of finite dimension. For every proper subalgebra Σ of L(V), Lat(Σ) contains a nontrivial element.
Burnside's theorem is of fundamental importance in linear algebra. One consequence is that every commuting family in L(V) can be simultaneously upper-triangularized.
A nonempty Σ ⊂ L(V) is said to be triangularizable if there exists a basis {e1, …, en} of V such that
span{e1, …, ek} ∈ Lat(Σ) for all k = 1, …, n.
In other words, Σ is triangularizable if there exists a basis such that every element of Σ has an upper-triangular matrix representation in that basis. It follows from Burnside's theorem that every commutative algebra Σ in L(V) is triangularizable. Hence every commuting family in L(V) can be simultaneously upper-triangularized.
Left ideals
If A is an algebra, one can define the left regular representation Φ on A: Φ(a)b = ab is a homomorphism from A to L(A), the algebra of linear transformations on A. The invariant subspaces of Φ are precisely the left ideals of A. A left ideal M of A gives a subrepresentation of A on M.
If M is a left ideal of A, consider the quotient vector space A/M. The left regular representation Φ on A then descends to a representation Φ′ on A/M: if [b] denotes the equivalence class of b in A/M, then Φ′(a)[b] = [ab]. The kernel of the representation Φ′ is the set {a ∈ A : ab ∈ M for all b ∈ A}.
The representation Φ′ is irreducible if and only if M is a maximal left ideal, since a subspace U ⊂ A/M is invariant under {Φ′(a) : a ∈ A} if and only if its preimage under the quotient map, U + M, is a left ideal in A.
Almost-invariant halfspaces
Related to invariant subspaces are so-called almost-invariant halfspaces (AIHSs). A closed subspace W of a Banach space X is said to be almost-invariant under an operator T if TW ⊆ W + E for some finite-dimensional subspace E; equivalently, W is almost-invariant under T if there is a finite-rank operator F such that (T + F)W ⊆ W, i.e. if W is invariant under T + F in the usual sense. In this case, the minimum possible dimension of E (or rank of F) is called the defect. Clearly, every finite-dimensional and finite-codimensional subspace is almost-invariant under every operator. Thus, to make things nontrivial, we say that W is a halfspace whenever it is a closed subspace with infinite dimension and infinite codimension.
The AIHS problem asks whether every operator admits an AIHS. In the complex setting it has already been solved; that is, if X is a complex infinite-dimensional Banach space and T is a bounded operator on X, then T admits an AIHS of defect at most 1. It is not currently known whether the same holds if X is a real Banach space. However, some partial results have been established: for instance, any self-adjoint operator on an infinite-dimensional real Hilbert space admits an AIHS, as does any strictly singular operator acting on a real infinite-dimensional reflexive space.