Operators on Infinite Dimensional Vector Spaces 5

Linear Operator Equations

Theorem 2.10: Let $A \in \mathcal{B}(\mathcal{H})$. Then:

$$\ker(A) = \text{Ran}(A^*)^{\perp}, \qquad \ker(A^*) = \text{Ran}(A)^{\perp}.$$

Proof: For $x \in \mathcal{H}$ we have the following sequence of equivalences:

$$Ax = 0 \iff (Ax, y) = 0 \ \forall y \in \mathcal{H} \iff (x, A^* y) = 0 \ \forall y \in \mathcal{H} \iff x \in \text{Ran}(A^*)^{\perp}.$$

Thus $\ker(A) = \text{Ran}(A^*)^{\perp}$. Since $A^{**} = A$, the second assertion follows from the first with $A$ replaced by $A^*$. $\blacksquare$
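As a sanity check, Theorem 2.10 can be verified numerically in finite dimensions (where every linear map is bounded) using NumPy. The matrix `A` below is an arbitrary rank-deficient example chosen for illustration, not one taken from the text:

```python
import numpy as np

# An arbitrary real 3x3 matrix of rank 2 (finite-dimensional stand-in
# for A in B(H); here A* is just the transpose).
A = np.array([[1., 2., 0.],
              [2., 4., 0.],
              [0., 0., 3.]])

# SVD: A = U S V^T. Columns of V for nonzero singular values span
# Ran(A*); the remaining columns span ker(A).
U, s, Vh = np.linalg.svd(A)
rank = int((s > 1e-10).sum())

ker_A = Vh[rank:].T      # orthonormal basis of ker(A)
ran_Astar = Vh[:rank].T  # orthonormal basis of Ran(A*)

# ker(A) = Ran(A*)^perp: every inner product between the bases vanishes.
print(np.abs(ran_Astar.T @ ker_A).max())  # ≈ 0
```

The orthogonality of these two subspaces is exactly the first identity of the theorem; since the dimensions add up to that of the whole space, each is the full orthogonal complement of the other.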

Theorem 2.11: Let $A \in \mathcal{B}(\mathcal{H})$. Then:

$$\mathcal{H} = \overline{\text{Ran}(A)} \oplus \ker A^* = \overline{\text{Ran}(A^*)} \oplus \ker A.$$

Proof: For every closed subspace $M \subset \mathcal{H}$ we have $\mathcal{H} = M \oplus M^{\perp}$. Furthermore, for every subspace $N \subset \mathcal{H}$ (not necessarily closed), $N^{\perp\perp} = \overline{N}$. Combining these facts with Theorem 2.10 yields the claim. $\blacksquare$
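The decomposition $\mathcal{H} = \overline{\text{Ran}(A)} \oplus \ker A^*$ can likewise be observed numerically: splitting an arbitrary vector into its two components and recombining them recovers the vector. The matrix and vector below are illustrative choices, not from the text:

```python
import numpy as np

# Arbitrary rank-2 example matrix (A* = A^T in the real case).
A = np.array([[1., 2., 0.],
              [2., 4., 0.],
              [0., 0., 3.]])

U, s, Vh = np.linalg.svd(A)
rank = int((s > 1e-10).sum())

ran_A = U[:, :rank]      # orthonormal basis of Ran(A) (closed here)
ker_Astar = U[:, rank:]  # orthonormal basis of ker(A*)

# Split an arbitrary y into its components in Ran(A) and ker(A*).
y = np.array([1., 0., 2.])
y_ran = ran_A @ (ran_A.T @ y)          # orthogonal projection onto Ran(A)
y_ker = ker_Astar @ (ker_Astar.T @ y)  # orthogonal projection onto ker(A*)

print(np.allclose(y, y_ran + y_ker))  # True: the two pieces sum back to y
```

In finite dimensions $\text{Ran}(A)$ is automatically closed, so no closure bar is needed; in infinite dimensions the closure in the theorem is essential.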

Discussion: Theorem 2.11 is important when we aim to solve the linear equation:

$$Ax = y.$$

Here $y$ is given and $x$ is the unknown. Clearly, the necessary and sufficient condition for solvability of this equation is $y \in \text{Ran}(A)$. Since $\text{Ran}(A) \subset \overline{\text{Ran}(A)} = (\ker A^*)^{\perp}$, a necessary condition for solvability is $y \perp \ker A^*$. If $\text{Ran}(A)$ is closed, this condition is also sufficient. Note also that when $A$ is self-adjoint, no adjoint needs to be computed: the condition becomes $y \perp \ker A$.
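This solvability criterion is easy to test in finite dimensions, where $\text{Ran}(A)$ is always closed, so $y \perp \ker A^*$ is both necessary and sufficient. The matrix, vectors, and helper function `solvable` below are illustrative assumptions (the example matrix happens to be symmetric, so $\ker A^* = \ker A$, matching the self-adjoint remark above):

```python
import numpy as np

# Arbitrary symmetric (hence self-adjoint) rank-2 example matrix.
A = np.array([[1., 2., 0.],
              [2., 4., 0.],
              [0., 0., 3.]])

U, s, _ = np.linalg.svd(A)
rank = int((s > 1e-10).sum())
ker_Astar = U[:, rank:]  # orthonormal basis of ker(A*)

def solvable(y, tol=1e-10):
    """Ax = y has a solution iff y is orthogonal to ker(A*)
    (Ran(A) is closed in finite dimensions)."""
    return bool(np.abs(ker_Astar.T @ y).max() < tol)

print(solvable(np.array([1., 2., 3.])))  # True: y lies in Ran(A)
print(solvable(np.array([1., 0., 0.])))  # False: y has a ker(A*) component
```

Checking a handful of inner products against a basis of $\ker A^*$ is often far cheaper than attempting to solve the equation directly and inspecting a residual.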