### Linear maps: Matrices of Linear Maps

### Relationship to systems of linear equations

Take another look at the system of linear equations
\[ \begin{array}{ccccccccc} a_{11} x_1&+&a_{12}x_2&+&\ldots&+&a_{1n}x_n&=&b_1\\ \vdots &&\vdots&&\vdots &&\vdots&&\vdots\\ a_{m1} x_1&+&a_{m2}x_2&+&\ldots&+&a_{mn}x_n &=&b_m \end{array} \]
with #(m\times n)#-coefficient matrix #A#. The system of equations can be written as a vector equation

\[ A\vec{x} = \vec{b} \]

The matrix #A# determines the linear map # L_A :\mathbb{R}^n\rightarrow\mathbb{R}^m#. In terms of this linear map the system can also be written as

\[ L_A(\vec{x})=\vec{b} \]
We recall that a system of linear equations is *consistent* if it has a solution.

**Dimension of the solution space of a system of linear equations**
Let #A# be an #(m\times n)#-matrix, and let #k# denote the dimension of the **column space** of #A#, that is, the subspace of #\mathbb{R}^m# *spanned* by the columns of #A#. Moreover, let #\vec{b}# be a vector in #\mathbb{R}^m# and consider the system #A\vec{x}=\vec{b}# of #m# linear equations in the #n# unknowns given by the coordinates of #\vec{x}#.

- The system is consistent if and only if #\vec{b}# lies in the column space of #A#.
- If the system is consistent, then the *dimension* of the solution space equals #n-k#.
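As a quick numerical illustration of the statement above (with a made-up matrix and right-hand side, not ones from the text), one can compute #k# as the rank of #A# and compare it with the rank of the augmented matrix:

```python
import numpy as np

# Hypothetical example: a 3x4 matrix, so m = 3 equations in n = 4 unknowns.
# The third row is the sum of the first two, so the rank is 2.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])   # chosen so that b = col1 + 2*col3, say

n = A.shape[1]
k = np.linalg.matrix_rank(A)                        # dim of the column space
k_aug = np.linalg.matrix_rank(np.column_stack([A, b]))

consistent = (k == k_aug)     # b lies in the column space of A
print(consistent, n - k)      # True 2: the solution space has dimension n - k
```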

This method of solving extends to arbitrary finite-dimensional vector spaces.

**From full inverse image of a linear map to matrix equation**
Let #L:V\to W# be a linear map and let #\vec{b}\in W#. The equation \[ L(\vec{v})=\vec{b}\] in the unknown vector #\vec{v}# in #V# has a solution #\vec{v}_0# if and only if #\vec{b}# lies in #\im{L}#. In that case, the solution space is the affine subspace \[\vec{v}_0+\ker{L}\]

If #V# has finite dimension #n# and #W# has finite dimension #m#, then this solution can be found, after a choice of bases #\alpha# for #V# and #\beta# for #W#, by solving the system of linear equations with unknown #\vec{x}# in #\mathbb{R}^n# given by

\[{}_\beta L_\alpha \vec{x}=\beta(\vec{b})\]

The solution then consists of all vectors #\vec{v}=\alpha^{-1}(\vec{x})#, where #\vec{x}# is a solution of the system of linear equations.
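One way to make this coordinate bookkeeping concrete is to store the basis vectors of #\alpha# and #\beta# as the columns of invertible matrices; then #\alpha^{-1}# is just multiplication by the corresponding matrix. The sketch below (all matrices are hypothetical illustrations, not from the text) takes #V = W = \mathbb{R}^2#, builds the matrix of #L# with respect to the bases, solves in coordinates, and maps the solution back to #V#:

```python
import numpy as np

# Hypothetical data: M is the standard matrix of L on V = W = R^2,
# and the columns of B are the basis vectors of alpha = beta.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Matrix of L relative to the bases: build vectors from alpha-coordinates,
# apply L, then take beta-coordinates.
L_ab = np.linalg.inv(B) @ M @ B

b = np.array([3.0, 3.0])
b_coords = np.linalg.solve(B, b)      # beta(b): the coordinates of b

x = np.linalg.solve(L_ab, b_coords)   # solve the coordinate system
v = B @ x                             # v = alpha^{-1}(x): back to V

# Check: v really solves L(v) = b in the standard description.
assert np.allclose(M @ v, b)
```

Since #M# is invertible here, the kernel is trivial and the solution is unique; for a singular #M# one would add #\ker{L}# to #v#, as in the statement above.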

Consider the system of linear equations with *augmented matrix* \[A' = \left({\begin{array}{ccc |c} 3 & -2 & -5 & -12 \\ 0 & 1 & 1 & 6 \\ 2 & -2 & -4 & -12 \\ \end{array}}\right) \]

Express the solution set in the form #\vec{p} + \text{span}\left(\vec{v}_1,\ldots, \vec{v}_t\right)#, where #\vec{p}# is a particular solution to the system and #\vec{v}_1,\ldots, \vec{v}_t# are *linearly independent*.

We write

\[ A = \matrix{3 & -2 & -5 \\ 0 & 1 & 1 \\ 2 & -2 & -4 \\ }\phantom{xxx}\text{ and }\phantom{xxx} \vec{b} = \matrix{-12 \\ 6 \\ -12 \\ }\]

so #A' = \left(\begin{array}{c|c}A &\vec{b}\end{array}\right)#.

By *row reduction*, we can bring the augmented matrix to

\[ \matrix{1 & 0 & -1 & 0 \\ 0 & 1 & 1 & 6 \\ 0 & 0 & 0 & 0 \\ } \]

From this *reduced row echelon form* we can read off that the nullspace of #A# is # \text{span}\left( \matrix{1 \\ -1 \\ 1 \\ } \right)# and that \(\vec{p}= \matrix{ 2 \\ 4 \\ 2 } \) is a particular solution of #A\vec{x} = \vec{b}#. Thus, we arrive at the answer

\[\matrix{ 2 \\ 4 \\ 2 } + \text{span}\left( \matrix{1 \\ -1 \\ 1 \\ } \right)\]

The *rank* of #A# equals #2#. This is also the rank of #A'#, so solutions exist. In fact, since the kernel of #A# has dimension #t = 3 - 2 = 1#, the number of linearly independent vectors needed to span the direction space is #t#.
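The claims in this example are easy to verify numerically. The following sketch checks, with the matrices defined above, that #\vec{p}# solves the system, that the spanning vector lies in the nullspace, and that the rank of #A# is #2#:

```python
import numpy as np

# The matrices from the worked example above.
A = np.array([[3.0, -2.0, -5.0],
              [0.0,  1.0,  1.0],
              [2.0, -2.0, -4.0]])
b = np.array([-12.0, 6.0, -12.0])

p = np.array([2.0, 4.0, 2.0])    # claimed particular solution
v = np.array([1.0, -1.0, 1.0])   # claimed spanning vector of the nullspace

print(np.allclose(A @ p, b))          # True: p solves A x = b
print(np.allclose(A @ v, 0))          # True: v lies in the nullspace of A
print(np.linalg.matrix_rank(A))       # 2, so dim ker A = 3 - 2 = 1
```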
