### Vector spaces: Coordinates

### Basis and echelon form

In the theory *Finding bases* we discussed two methods for finding a basis of a vector space. One of the two methods does not need a set of spanning vectors; it makes use of the *Growth criterion for independence*. The other method does use a spanning set of vectors and works by *thinning out*, a process that requires a dependence calculation at each step. By switching to coordinates, we can carry out this second method efficiently by means of *row reduction*.

We recall that a matrix is in *echelon form* if it has the following two characteristics:

- in the rows below a leading element, all of the elements in the column of that leading element, as well as in the columns to the left of it, are zero;
- all null rows are at the bottom.

The *leading element* of a row in a matrix is the first element (from the left) of the row that is not zero. The general form of a matrix in echelon form is:

\[

\left(\,\begin{array}{cccccccccccccccccc}

0 & \ldots & 0 & 1 & \ast & \ldots & \ast & \ast & \ast & \ldots & \ast & \ast & \ast & \ldots & \ast & \ast & \ldots & \ast\\

0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 1 & \ast & \ldots & \ast & \ast & \ast & \ldots & \ast & \ast & \ldots & \ast\\

0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 1 & \ast & \ldots & \ast & \ast & \ldots & \ast\\

\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots\\

0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 1 & \ast & \ldots & \ast\\

0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & \ldots & 0\\

\vdots & & \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & & \vdots & \vdots & \vdots & & \vdots & \vdots & & \vdots\\

0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & 0 & \ldots & 0 & 0 & \ldots & 0

\end{array}\,\right)

\]
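The two characteristics above can be checked mechanically. As an illustration (a sketch in Python; the function names are our own, not part of the theory), the following fragment tests whether a matrix, given as a list of rows, is in echelon form. The first characteristic is equivalent to requiring that the leading elements move strictly to the right from one nonzero row to the next:

```python
def leading_index(row):
    """Column index of the leading (first nonzero) element, or None for a null row."""
    for j, x in enumerate(row):
        if x != 0:
            return j
    return None

def is_echelon(matrix):
    """True if the leading elements move strictly to the right from row to row
    and all null rows are at the bottom."""
    previous = -1          # column of the previous leading element
    seen_null_row = False
    for row in matrix:
        lead = leading_index(row)
        if lead is None:
            seen_null_row = True
            continue
        if seen_null_row:      # a nonzero row below a null row
            return False
        if lead <= previous:   # leading element not strictly to the right
            return False
        previous = lead
    return True
```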

We view the rows of an #(m\times n)#-matrix as vectors of #\mathbb{R}^n#.

**Echelon form and independence**

The rows of a matrix in echelon form that are distinct from the null row are independent.

In particular, the *rank* of a matrix is equal to the dimension of the span of its row vectors.

The proposition *Standard operations with spanning sets* and the above theorem show how to find a basis of a subspace spanned by vectors in the coordinate space #\mathbb{R}^n#:

**Finding bases by coordinates**

Let #\vec{a}_1,\ldots,\vec{a}_m# be vectors in #\mathbb{R}^n#. A basis of #\linspan{\vec{a}_1,\ldots,\vec{a}_m}# can be found by use of the following three steps:

- Set up the matrix #M# whose rows are the vectors #\vec{a}_1,\ldots,\vec{a}_m#.
- Row reduce #M# to an echelon form #T#.
- The set of non-zero row vectors of #T# is a basis of #\linspan{\vec{a}_1,\ldots,\vec{a}_m}#.

From the theorem *Standard operations with spanning sets* it follows that the span of the rows of #M# is equal to the span of the rows of #T#. Therefore, the rows of #T# that are not equal to the null row span #\linspan{\vec{a}_1,\ldots,\vec{a}_m}#. By the theorem *Echelon form and independence*, these rows are also independent. They thus form a basis of #\linspan{\vec{a}_1,\ldots,\vec{a}_m}#.
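The three steps can be sketched in code. The following Python fragment (our own sketch; the helper names are not from the theory) row reduces the matrix with exact rational arithmetic, so that no rounding occurs, and keeps the nonzero rows:

```python
from fractions import Fraction

def row_reduce(rows):
    """Reduce a list of row vectors to reduced echelon form.
    Row operations do not change the span of the rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0]) if m else 0):
        # find a row at or below pivot_row with a nonzero entry in this column
        for r in range(pivot_row, len(m)):
            if m[r][col] != 0:
                m[pivot_row], m[r] = m[r], m[pivot_row]
                break
        else:
            continue  # no pivot in this column
        pivot = m[pivot_row][col]
        m[pivot_row] = [x / pivot for x in m[pivot_row]]  # make the pivot 1
        for r in range(len(m)):
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

def basis_of_span(rows):
    """Steps 1-3: set up the matrix, row reduce, keep the nonzero rows."""
    return [row for row in row_reduce(rows) if any(x != 0 for x in row)]
```

Applied to spanning vectors in #\mathbb{R}^n#, the nonzero rows returned are exactly the basis described in the theorem.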

If a vector space #W# is not a coordinate space, we can apply the same techniques after transition to coordinates with respect to a given basis #\basis{\vec{b}_1,\ldots,\vec{b}_n}# of #W#. If #W# is described by a spanning set of #m# vectors, we first determine the coordinates with respect to #\basis{\vec{b}_1,\ldots,\vec{b}_n}# of the spanning vectors, then apply the above theorem *Finding bases by coordinates*, and finally write the basis vectors found in #\mathbb{R}^n# as linear combinations of #\vec{b}_1,\ldots,\vec{b}_n#. This way we find a basis of #W#.

If we apply this process to the vector space of linear polynomial functions on #\mathbb{R}^n#, with basis #\basis{x_1, x_2,\ldots,x_n,1}#, then we wind up using the row reduction process of the *augmented coefficient matrix* pertaining to a system of linear equations. This idea is further discussed in one of the examples below.

As an example, let #V# be the span of the vectors #\rv{1,0,2,0}#, #\rv{1,1,2,1}#, #\rv{2,-1,4,-1}#, and #\rv{1,3,2,3}# in #\mathbb{R}^4#. We form the matrix

\[ \left(\begin{array}{cccc}

1&0&2&0 \\

1&1&2&1\\

2&-1&4&-1 \\

1&3&2&3

\end{array} \right) \]

Row reduction does not change the space spanned by the rows. We reduce the matrix to *reduced echelon form* (reducing to echelon form would suffice) and find:

\[\left(\begin{array}{cccc}

1&0&2&0\\

0&1&0&1\\

0&0&0&0\\

0&0&0&0

\end{array} \right)\]

We conclude that the vectors #\rv{1,0,2,0}# and #\rv{0,1,0,1}# are coordinate vectors of a basis for #V#. Therefore, a good answer is \[\basis{\rv{1,0,2,0},\rv{0,1,0,1}}\]
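As a quick sanity check (our own verification, not part of the theory), we can confirm that each of the four original spanning vectors is a linear combination of the two basis vectors found. Since the pivots sit in the first two columns, the coefficients of each spanning vector are simply its first two entries:

```python
# Basis found by row reduction, and the original spanning vectors.
basis = [(1, 0, 2, 0), (0, 1, 0, 1)]
spanning = [(1, 0, 2, 0), (1, 1, 2, 1), (2, -1, 4, -1), (1, 3, 2, 3)]

def combine(c1, c2):
    """Return c1 * basis[0] + c2 * basis[1]."""
    return tuple(c1 * a + c2 * b for a, b in zip(*basis))

# The pivots are in columns 1 and 2, so the coefficients of each
# spanning vector are its first two coordinates.
for v in spanning:
    assert combine(v[0], v[1]) == v
```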
