Because *linear subspaces* are special sets, we can apply the known *operations on sets* to linear subspaces. The intersection of two linear subspaces yields a linear subspace, but the union, in general, does not. In order to be able to discuss the smallest linear subspace containing two linear subspaces, we introduce the notion of sum of two linear subspaces.

The **sum** of two linear subspaces #X# and #Y# of a vector space #V#, denoted as #X+Y#, is the set of all vectors of the form #\vec{x}+\vec{y}# for #\vec{x}\in X# and #\vec{y}\in Y#.

The vector space #\mathbb{R}^2# is the sum of the #x#-axis and the #y#-axis:

\[\mathbb{R}^2 = \{\rv{\lambda,0}\mid \lambda\in \mathbb{R}\}+\{\rv{0,\mu}\mid \mu\in \mathbb{R}\}\]

In the same way, we can define sums of more than two subsets of #V#. For instance, \[\mathbb{R}^3 = \{\rv{\lambda,0,0}\mid \lambda\in \mathbb{R}\}+\{\rv{0,\mu,0}\mid \mu\in \mathbb{R}\}+\{\rv{0,0,\nu}\mid \nu\in \mathbb{R}\}\]
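This decomposition can be checked numerically. The following is a minimal sketch in NumPy (the vector `v` is a made-up example):

```python
import numpy as np

# Spanning vectors of the x-, y-, and z-axes in R^3, as rows.
axes = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])

# The sum of the three axes is spanned by all three vectors together;
# rank 3 means this sum is all of R^3.
print(np.linalg.matrix_rank(axes))  # 3

# An arbitrary vector splits as a sum of one vector from each axis:
v = np.array([2.0, -5.0, 7.0])
parts = [v[i] * axes[i] for i in range(3)]
print(np.allclose(sum(parts), v))  # True
```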

If the subsets are linear subspaces, then so is their sum:

Let #U# and #W# be linear subspaces of a vector space #V#.

The following two subsets of #V# are linear subspaces:

- The sum #U+ W# is equal to the *span* #\linspan{U\cup W}# of #U\cup W#. This is the smallest linear subspace of #V# containing both #U# and #W#.
- The *intersection* #U\cap W# of #U# and #W# is a linear subspace of #V#. It is the largest linear subspace contained in both #U# and #W#.

Because of the definition of span, #U+W# is contained in #\linspan{U\cup W}#. On the other hand #U+W# is a linear subspace of #V#:

Any linear combination of two vectors from #U+W# is again a vector in #U+W#: if #u_1#, #u_2\in U# and #w_1#, #w_2\in W#, and #\alpha#, #\beta# are scalars, then

\[\alpha\cdot \left(u_1+w_1\right) +\beta\cdot \left(u_2+w_2 \right) = \left(\alpha\cdot u_1+\beta\cdot u_2\right) +\left(\alpha\cdot w_1 +\beta\cdot w_2 \right)\]

is the sum of the vector #\alpha\cdot u_1+\beta\cdot u_2# of #U# and the vector #\alpha\cdot w_1 +\beta\cdot w_2# of #W#.

Moreover, #\vec{0}= \vec{0}+\vec{0}# belongs to #U+W#.

Because #U+W# is a linear subspace of #V# that contains #U# (each element #u# of #U# can be written as #u+0\in U+W#), and (likewise) #W#, it contains #\linspan{U\cup W}#. Because #U+W# is also contained in #\linspan{U\cup W}#, the two subspaces #U+W# and #\linspan{U\cup W}# coincide.
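The identity #U+W=\linspan{U\cup W}# suggests a direct way to compute with sums of subspaces given by spanning vectors: stack all spanning vectors together and determine the rank. A small sketch in NumPy (the two subspaces of #\mathbb{R}^4# are made-up examples):

```python
import numpy as np

# Rows span two hypothetical subspaces U and W of R^4.
U = np.array([[1.0, 0.0, 2.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
W = np.array([[1.0, 1.0, 2.0, 0.0],   # equals the sum of U's rows, so it lies in U
              [0.0, 0.0, 0.0, 1.0]])

# U + W = span(U ∪ W): stack all spanning vectors of U and W.
S = np.vstack([U, W])
print(np.linalg.matrix_rank(S))  # 3 = dim(U + W), one less than 2 + 2
```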

The fact that the intersection #U\cap W# is a linear subspace of #V# has been proven *before*. If there were a larger linear subspace contained in both #U# and #W#, then it would contain a vector outside #U\cap W#, that is, a vector outside #U# or outside #W#, contradicting that it is contained in both. Therefore, #U\cap W# is the largest linear subspace contained in both #U# and #W#.

In general, the union #U\cup W# itself is not a linear subspace of #V#; this happens only when one of the two subspaces is contained in the other. For example, let #V = \mathbb{R}^2#, let #U# be the #x#-axis (the #1#-dimensional linear subspace spanned by #\rv{1,0}#), and let #W# be the #y#-axis (the #1#-dimensional linear subspace spanned by #\rv{0,1}#). Then #U\cup W# is nothing but the union of these two axes, while it is known that together they span the whole space #V#. Specifically: the sum #\rv{1,1}# of the vectors #\rv{1,0}\in U# and #\rv{0,1}\in W# does not belong to #U\cup W#, so #U\cup W# is not closed under addition.
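The failure of closure under addition can be made concrete in code. A sketch of the example above in NumPy, with a hypothetical membership test `in_axis_union` for the union of the two axes:

```python
import numpy as np

u = np.array([1.0, 0.0])   # spans U, the x-axis
w = np.array([0.0, 1.0])   # spans W, the y-axis

def in_axis_union(p):
    """Membership in U ∪ W: a point lies on the x-axis or on the y-axis."""
    return bool(np.isclose(p[1], 0.0) or np.isclose(p[0], 0.0))

print(in_axis_union(u), in_axis_union(w))  # True True
print(in_axis_union(u + w))                # False: (1,1) is not in U ∪ W
```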

Here are some simple but useful properties of sum and intersection.

Let #U# and #W# be linear subspaces of the vector space #V#.

- If #U\subseteq W#, then #\dim{U}\leq\dim{W}#, with equality if and only if #U=W#.
- \(U\cap W=U\Leftrightarrow U\subseteq W\).
- \(U+W=U\Leftrightarrow W\subseteq U\).

Choose a basis of #U#.

1. According to the *Growth criterion for independence*, the chosen basis of #U# can be extended to a basis of #W#. As a consequence, #\dim{U}\le\dim{W}#. If #U=W#, then clearly #\dim{U}=\dim{W}#.

Now suppose #\dim{U}=\dim{W}#. Then the extension of the basis of #U# to a basis of #W# adds no vectors, so #W# is spanned by the basis of #U#. This means that #W# coincides with #U#, which proves the first statement.

2. This rule holds even when #U# and #W# are arbitrary subsets of #V#. After all, each side of the equivalence states that every element of #U# also belongs to #W#.

3. Each vector in #W# also belongs to #U+W#. So if #U+W=U#, then each vector in #W# belongs to #U#; this is precisely the meaning of #W\subseteq U#. Conversely, if #W\subseteq U#, then #U+W# is already spanned by #U#, so #U+W = \linspan{U} = U#.
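Statement 3 yields a rank-based containment test for subspaces described by spanning vectors: #W\subseteq U# holds exactly when adjoining the spanning vectors of #W# to those of #U# does not increase the rank, i.e. when #\dim{(U+W)}=\dim{U}#. A sketch in NumPy (the matrices are made-up examples):

```python
import numpy as np

def contains(U, W):
    """Test W ⊆ U for subspaces spanned by the rows of U and W:
    W ⊆ U  ⇔  U + W = U  ⇔  dim(U + W) = dim U."""
    return np.linalg.matrix_rank(np.vstack([U, W])) == np.linalg.matrix_rank(U)

U = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # the x,y-plane in R^3
W = np.array([[1.0, 2.0, 0.0]])   # a line inside that plane

print(contains(U, W))  # True:  W ⊆ U, so U + W = U
print(contains(W, U))  # False: the plane U is not contained in the line W
```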

The *distributive laws* for three subsets #T#, #U#, #W# of #V# read

\[\begin{array}{rcl}T\cap(U\cup W) &=& (T\cap U)\cup (T\cap W)\\ T\cup(U\cap W) &=& (T\cup U)\cap (T\cup W)\end{array}\]

In general, these laws do not apply if we take #T#, #U#, #W# to be linear subspaces of #V# and replace #\cup# by the sum. For example, if #T=\linspan{\rv{1,1}}#, #U=\linspan{\rv{1,0}}#, and #W=\linspan{\rv{0,1}}# in #V = \mathbb{R}^2#, then

\[\begin{array}{rclclcl}T\cap(U+ W) &=& T\cap V = T&\ne& \{\vec{0}\} &=&(T\cap U)+ (T\cap W) \\ T+(U\cap W) &=&T+\{\vec{0}\}=T&\ne&V&=&(T+ U)\cap (T+ W)\end{array}\]
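This counterexample can be traced with ranks: the dimension of a sum equals the rank of all spanning vectors stacked together, and two distinct lines through the origin intersect only in #\{\vec{0}\}#. A numerical sketch in NumPy:

```python
import numpy as np
rank = np.linalg.matrix_rank

T = np.array([[1.0, 1.0]])   # span{(1,1)}
U = np.array([[1.0, 0.0]])   # span{(1,0)}
W = np.array([[0.0, 1.0]])   # span{(0,1)}

# U + W has dimension 2, so it is all of R^2 and T ∩ (U + W) = T.
print(rank(np.vstack([U, W])))   # 2

# T, U, W are pairwise distinct lines, so T ∩ U = T ∩ W = {0},
# and hence (T ∩ U) + (T ∩ W) = {0} ≠ T.
print(rank(np.vstack([T, U])))   # 2, so T ≠ U
print(rank(np.vstack([T, W])))   # 2, so T ≠ W
```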

Later we will encounter special cases where these laws do apply.

Because the intersection and sum of two linear subspaces are both linear subspaces again, they have a basis and a dimension, which we will consider here.

Let #U# and #W# be linear subspaces of a vector space #V# with finite dimensions #\dim{U} = k# and #\dim{W} = m#. Let #\basis{\vec{a}_1,\ldots,\vec{a}_r}# be a basis of #U\cap W#. Extend it to a basis #\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s}# of #U#, where #s=k-r#, and also to a basis #\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{c}_1,\ldots,\vec{c}_t}# of #W#, where #t=m-r#. Then \[ \basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s,\vec{c}_1,\ldots,\vec{c}_t} \] is a basis of #U+W#.

In particular, \[ \dim{U}+\dim{W}=\dim{U+W}+\dim{U\cap W} \]

According to the definition of linear span, \[ \begin{array}{rcl}U+W&=&\linspan{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s,\vec{a}_1,\ldots,\vec{a}_r,\vec{c}_1,\ldots,\vec{c}_t}\\&=&\linspan{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s,\vec{c}_1,\ldots,\vec{c}_t}\end{array}\]

It remains to prove that this system of vectors is linearly independent. Because #\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s}# is linearly independent (since it is a basis of #U#), according to the Growth criterion for independence, we only need to show that each #\vec{c}_j# is linearly independent of #\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s,\vec{c}_1,\ldots,\vec{c}_{j-1}}#. If not, then there are scalars #\lambda_1,\ldots,\lambda_r#, #\mu_1,\ldots,\mu_s#, #\nu_1,\ldots,\nu_{j-1}# such that \[ \vec{c}_j=\sum_{i=1}^r\lambda_i\vec{a}_i+\sum_{i=1}^s\mu_i\vec{b}_i+\sum_{i=1}^{j-1}\nu_i\vec{c}_i\]

Consider the vector #\vec{c} = \vec{c}_j-\sum_{i=1}^{j-1}\nu_i\vec{c}_i#. It belongs to #W# and is not equal to #\vec{0}# (for \(\basis{\vec{c}_1,\ldots,\vec{c}_j}\) are linearly independent), and by the displayed equality it is contained in \(\linspan {\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s} = U\). This means that #\vec{c}\in U\cap W#. Moreover, #\vec{c}# does not lie in #\linspan{\vec{a}_1,\ldots,\vec{a}_r}#, since \(\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{c}_1,\ldots,\vec{c}_t}\) is linearly independent. So, according to the Growth criterion for independence, \(\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{c}}\) is a linearly independent system in #U\cap W#. This contradicts the fact that \(\basis{\vec{a}_1,\ldots,\vec{a}_r}\) is a basis of #U\cap W#. We conclude that the vectors #\basis{\vec{a}_1,\ldots,\vec{a}_r,\vec{b}_1,\ldots,\vec{b}_s,\vec{c}_1,\ldots,\vec{c}_t}# are linearly independent.

The last statement follows directly from: \[\begin{array}{rcl}\dim{U+W}&=&r+s+t\\ &&\phantom{xx}\color{blue}{\text{the number of vectors in the basis found}}\\ &=&k+m-r\\ &&\phantom{xx}\color{blue}{s=k-r\text{ and }t=m-r}\\&=&\dim{U}+\dim{W}-\dim{U\cap W}\\ &&\phantom{xx}\color{blue}{\dim{U} = k\text{, }\dim{W} = m\text{, }\dim{U\cap W} = r}\end{array}\]

Set #V=\mathbb{R}^3#, #U=\linspan{\rv{1,0,0},\rv{0,1,0}}# (the #x,y#-plane) and #W=\linspan{\rv{1,0,0},\rv{0,0,1}}# (the #x,z#-plane). Then #U+W=\mathbb{R}^3# because all of the standard basis vectors occur among the spanning vectors of #U# or #W#. We also see that #U\cap W# contains the #1#-dimensional subspace #\linspan{\rv{1,0,0}}# (the #x#-axis), since this vector occurs in the spanning set of both #U# and #W#. In order to verify that #U\cap W# coincides with the #x#-axis we use the Dimension theorem: \[\dim{U\cap W}=\dim{U}+\dim{W} -\dim{U+W}=2+2-3=1\] From #\dim{U\cap W}=1# and #\linspan{\rv{1,0,0}}\subseteq U\cap W# we conclude #U\cap W=\linspan{\rv{1,0,0}}#.
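The dimension computation in this example can be reproduced numerically; a sketch using NumPy's matrix rank:

```python
import numpy as np
rank = np.linalg.matrix_rank

U = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # the x,y-plane
W = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])   # the x,z-plane

dim_U, dim_W = rank(U), rank(W)
dim_sum = rank(np.vstack([U, W]))      # dim(U + W)
dim_cap = dim_U + dim_W - dim_sum      # Dimension theorem
print(dim_U, dim_W, dim_sum, dim_cap)  # 2 2 3 1
```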

The dimension of #V# need not be finite. This is because the whole argument takes place within the linear subspace #U+W#, which is finite-dimensional. In fact, the dimension of #U+W# is at most #k+m#, since it is spanned by the union of a basis of #U# and a basis of #W#.