An explicit description of #\ker{L}# can be found by solving a homogeneous system of linear equations; this is not as simple as the description of #\im{L}# as a span. Still, the dimension of #\ker{L}# can be found quickly thanks to the following key result, known as the Rank-nullity theorem:

If # L :V\rightarrow W# is a linear map with #\dim{V}\lt \infty#, then

\[

\dim{V}=\dim{\ker{L}}+\dim{\im{L}}

\]

Write #n=\dim{V}# and #p=\dim{\ker{L}}#. Because #\ker{L}\subseteq V#, we have #p\leq n#.

Choose a basis #\basis{\vec{a}_1,\ldots ,\vec{a}_p}# for #\ker{L}# and extend it with #\vec{b}_{p+1},\ldots ,\vec{b}_n# to a basis for #V#:

\[

V=\linspan {\vec{a}_1,\ldots ,\vec{a}_p,\ \vec{b}_{p+1},\ldots ,\vec{b}_n}

\] Because # L \vec{a}_1=\cdots= L \vec{a}_p=\vec{0}#, it follows from *Image as a spanned subspace* that

\[\begin{array}{rcl}

\im{L}&=&\linspan{ L \vec{a}_1,\ldots , L \vec{a}_p, L \vec{b}_{p+1},\ldots , L \vec{b}_n} \\

&=&\linspan{ L \vec{b}_{p+1},\ldots , L \vec{b}_n} \end{array}

\] If we can prove that the vectors # L \vec{b}_{p+1},\ldots , L \vec{b}_n# are *linearly independent*, then we are done, because then #\dim{\im{L}}=n-p#, so

\[n=\dim{V}=p+(n-p)=\dim{\ker{L}}+\dim{\im{L}}\] In order to demonstrate the independence, we seek solutions for the unknowns #\alpha_{p+1},\ldots,\alpha_n# in the equation

\[

\alpha_{p+1}\cdot L \vec{b}_{p+1}+\cdots +\alpha_n\cdot L \vec{b}_n=\vec{0}

\] We reduce the equation to an equation without #L#:

\[\begin{array}{rcl}

L (\alpha_{p+1}\vec{b}_{p+1}+\cdots +\alpha_n\vec{b}_n)&=&\vec{0}

\\ \phantom{x}\color{blue}{\text{linearity of }L}&&\\

\alpha_{p+1}\vec{b}_{p+1}+\cdots +\alpha_n\vec{b}_n&\in&{\ker{L}}

\\ \phantom{x}\color{blue}{\text{definition of }\ker{L}}&&\\

\alpha_{p+1}\vec{b}_{p+1}+\cdots

+\alpha_n\vec{b}_n&=&\alpha_1\vec{a}_1+\cdots+\alpha_p\vec{a}_p\\ &&\text{for certain numbers}\\ && \alpha_1,\ldots ,\alpha_p\\ \phantom{x}\color{blue}{\basis{\vec{a}_1,\ldots,\vec{a}_p}\text{ is a basis of }\ker{L}}&&\\

-\alpha_1\vec{a}_1-\cdots -\alpha_p\vec{a}_p+\alpha_{p+1}\vec{b}_{p+1}+\cdots

+\alpha_n\vec{b}_n&=&\vec{0}\\\phantom{x}\color{blue}{\text{all terms carried to the left }}&&\\\end{array}

\] Because #\basis{\vec{a}_1,\ldots ,\vec{a}_p,\vec{b}_{p+1},\ldots ,\vec{b}_n}# is a basis, it follows that the only solution is \[\alpha_1=\cdots =\alpha_p=\alpha_{p+1}=\cdots =\alpha_n=0\] Therefore, according to the *Dependence criterion*, the system # L \vec{b}_{p+1},\ldots , L \vec{b}_n# is linearly independent.
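The Rank-nullity theorem can be checked on a concrete example. Below is a small sketch, assuming SymPy is available, in which a linear map #L:\mathbb{R}^4\to\mathbb{R}^3# is given by a #3\times 4# matrix chosen for illustration; the dimensions of its kernel and image are each computed independently by row reduction and then compared.

```python
from sympy import Matrix

# A linear map L: R^4 -> R^3, given by an illustrative 3x4 matrix.
# The third row is the sum of the first two, so the rank is 2.
A = Matrix([[1, 2, 0, 1],
            [0, 1, 1, 0],
            [1, 3, 1, 1]])

dim_V = A.cols                    # dim V = 4
dim_ker = len(A.nullspace())      # a basis of ker L, found by row reduction
dim_im = len(A.columnspace())     # a basis of im L (the pivot columns)

print(dim_V, dim_ker, dim_im)     # 4 2 2
assert dim_V == dim_ker + dim_im  # dim V = dim ker L + dim im L
```

Both `nullspace()` and `columnspace()` return explicit basis vectors, so the two dimensions are obtained without using the theorem itself.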

The image of the *perpendicular projection* #P_U# onto a subspace #U# of the inner product space #\mathbb{R}^n# is, of course, equal to #U#, while the null space consists of all vectors that are perpendicular to #U#, so it is equal to #U^{\perp}#. The above Rank-nullity theorem implies that #n=\dim{U} + \dim{U^{\perp}}#. This formula can be found in the theorem below.
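This can be illustrated numerically. The sketch below, assuming NumPy is available, builds the matrix of the perpendicular projection onto a two-dimensional subspace #U# of #\mathbb{R}^4# (the matrix #B# of basis vectors is an arbitrary choice) and checks that #\dim{U}+\dim{U^\perp}=4#; the dimension of #U^\perp# is computed from #I-P_U#, whose image is #U^\perp#.

```python
import numpy as np

# Basis vectors of U = span{u1, u2} in R^4 as columns (an arbitrary choice).
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])

# Matrix of the perpendicular projection: P_U = B (B^T B)^{-1} B^T
# (valid because the columns of B are linearly independent).
P = B @ np.linalg.inv(B.T @ B) @ B.T

n = P.shape[0]
dim_U = np.linalg.matrix_rank(P)                   # dim im P_U = dim U
dim_U_perp = np.linalg.matrix_rank(np.eye(n) - P)  # im(I - P_U) = U^perp

print(n, dim_U, dim_U_perp)   # 4 2 2
assert dim_U + dim_U_perp == n
```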

The word rank in the name Rank-nullity theorem refers to the dimension of the image of #L#; the word nullity refers to the dimension of the kernel. *Later* we will see that the rank as we know it for a matrix corresponding to #L# is the same as this dimension.

The Rank-nullity theorem also implies statements about dimensions of subspaces in which no linear map appears.

We recall that, if #U# and #W# are linear subspaces of a vector space #V#, the following two subsets of #V# are also linear subspaces of #V#:

- The *span* #\linspan{U\cup W}# of #U# and #W#. Because each element of it can be written as #\vec{u}+\vec{w}# for certain #\vec{u}\in U# and #\vec{w}\in W#, we also denote this linear subspace by #U+W#. It contains both #U# and #W# (and is the smallest subspace with this property).
- The *intersection* #U\cap W# of #U# and #W#. This subspace is contained in both #U# and #W# (and is the greatest with this property).

Let #U# and #W# be linear subspaces of #\mathbb{R}^n# and denote by #U^\perp# the subspace of #\mathbb{R}^n# consisting of all vectors perpendicular to each vector of #U# (with respect to the standard inner product on #\mathbb{R}^n#). The following two equalities between dimensions hold: \[ \begin{array}{rcl}n &=& \dim{U}+\dim{U^\perp}\\ \dim{U}+\dim{W} &=& \dim{U\cap W} + \dim{U+W}\end{array}\]

Proof of \(n = \dim{U}+\dim{U^\perp}\): Let \(P_U\) be the *orthogonal projection* from #\mathbb{R}^n# onto #U#.

The kernel of #P_U# is the orthogonal complement #U^\perp# of #U# in #\mathbb{R}^n# and the image is #U#. Thus, it follows from the Rank-nullity theorem that \[n = \dim{\mathbb{R}^n}=\dim{\ker{P_U}}+\dim{\im{P_U}}=\dim{U^\perp}+\dim{U}

\]

Proof of \(\dim{U}+\dim{W} = \dim{U\cap W} + \dim{U+W}\): This time we apply the Rank-nullity theorem to restrictions of the *orthogonal projection* #P_{W^\perp}# from #\mathbb{R}^n# onto #W^\perp#. The kernel of #P_{W^\perp}# is #\left(W^\perp\right)^\perp#; this subspace contains #W# and, by the equality just proved, has dimension #n-\dim{W^\perp}=\dim{W}#, so it is equal to #W#.

First we restrict #P_{W^\perp}# to #U#. The kernel of this restriction is #U\cap W# and its image is #P_{W^\perp}(U)#, so the Rank-nullity theorem gives

\[\dim{U} = \dim{U\cap W} + \dim{P_{W^\perp}(U)}\]

Next we restrict #P_{W^\perp}# to #U+W#. The kernel of this restriction is #(U+W)\cap W=W#. Moreover, because #P_{W^\perp}# maps every vector of #W# to #\vec{0}#, each vector #\vec{u}+\vec{w}# with #\vec{u}\in U# and #\vec{w}\in W# has image #P_{W^\perp}(\vec{u})#, so the image of the restriction is again #P_{W^\perp}(U)#. The Rank-nullity theorem now gives

\[\dim{U+W} = \dim{W} + \dim{P_{W^\perp}(U)}\]

We now derive the equality that needs to be proven:

\[\begin{array}{rcl} \dim{U}+\dim{W} &=&\dim{U\cap W}+\dim{P_{W^\perp}(U)}+\dim{W}\\&&\phantom{x}\color{blue}{\text{Rank-nullity for the restriction of }P_{W^\perp}\text{ to }U}\\ &=&\dim{U\cap W}+\dim{U+W}\\&&\phantom{x}\color{blue}{\text{Rank-nullity for the restriction of }P_{W^\perp}\text{ to }U+W}\\ \end{array}
\]
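The dimension formula just proved can be verified on a concrete pair of subspaces. In the sketch below (assuming SymPy is available; the subspaces are arbitrary choices), #\dim{U\cap W}# is computed independently via the null space of the block matrix #\left[\,B_U \mid -B_W\,\right]#, whose elements encode the pairs #(x,y)# with #B_U x = B_W y#.

```python
from sympy import Matrix

# U and W as column spans in R^4; each matrix has independent columns.
BU = Matrix([[1, 0],
             [0, 1],
             [0, 0],
             [0, 0]])            # U = span{e1, e2}
BW = Matrix([[1, 0],
             [0, 1],
             [0, 1],
             [0, 0]])            # W = span{e1, e2 + e3}

dim_U = BU.rank()
dim_W = BW.rank()
dim_sum = BU.row_join(BW).rank()           # dim(U + W): span of all columns
# v lies in U∩W iff v = BU*x = BW*y; since the columns of BU and of BW are
# independent, (x, y) -> BU*x is a bijection from the null space of
# [BU | -BW] onto U∩W, so the dimensions agree.
dim_cap = len(BU.row_join(-BW).nullspace())

print(dim_U, dim_W, dim_cap, dim_sum)      # 2 2 1 3
assert dim_U + dim_W == dim_cap + dim_sum
```

Here #U\cap W=\linspan{\vec{e}_1}# and #U+W=\linspan{\vec{e}_1,\vec{e}_2,\vec{e}_3}#, so the formula reads #2+2=1+3#.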

Suppose the linear map #L:\mathbb{R}^{4}\to\mathbb{R}^{3}# is surjective.

What is the dimension of the kernel of #L#?

#\dim{\ker{L}}=1#

According to the Rank-nullity theorem we have

\[\begin{array}{rcl} \dim{\ker{L}}&=&\dim{\mathbb{R}^{4}}-\dim{\im{L}}\\

&&\phantom{xx}\color{blue}{\text{rank-nullity theorem}}\\

&=& 4- 3\\

&&\phantom{xx}\color{blue}{L\text{ is surjective}}\\

&=& 1 \end{array} \]
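This computation can be replayed numerically. The sketch below, assuming NumPy is available, uses one arbitrary choice of surjective #3\times 4# matrix for #L# and recovers the kernel dimension from the rank.

```python
import numpy as np

# One arbitrary choice of surjective L: R^4 -> R^3 (its rank is 3).
A = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)   # dim im L = 3, so L is surjective
dim_ker = A.shape[1] - rank       # rank-nullity: dim ker L = 4 - 3

print(rank, dim_ker)              # 3 1
```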