### Vector spaces: Coordinates

### Coordinates of sums of scalar multiples

The two operations of a vector space, addition and scalar multiplication, can be carried out in its *coordinate space*.

Operations on vector spaces in terms of coordinate spaces

Let #n# be a natural number and #\basis{\vec{a}_1 ,\ldots ,\vec{a}_n}# a basis of a vector space #V#. By #\vec{x}# and #\vec{y}# we denote vectors in #V# with coordinate vectors #\rv{x_1 ,\ldots ,x_n }# and #\rv{y_1 ,\ldots ,y_n }#, respectively, with respect to the given basis, and by #\lambda# we denote a real number.

- The coordinate vector of #\vec{x}+\vec{y}# with respect to the basis #\basis{\vec{a}_1 ,\ldots ,\vec{a}_n}# is \[\rv{x_1 +y_1 ,\ldots ,x_n +y_n }\]
- The coordinate vector of #\lambda \cdot\vec{x}# with respect to the basis #\basis{\vec{a}_1 ,\ldots ,\vec{a}_n}# is \[\rv{\lambda \cdot x_1 ,\ldots ,\lambda\cdot x_n }\]
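The two rules above can be checked numerically: once a basis is fixed, addition and scalar multiplication act componentwise on the coordinates, and the basis itself never has to appear. A minimal sketch (the coordinate vectors and the scalar are made-up examples, using NumPy):

```python
import numpy as np

# Coordinate vectors of x and y with respect to a fixed basis (a_1, ..., a_n);
# the numbers are arbitrary examples.
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 0.5])
lam = 2.0

coords_sum = x + y       # coordinate vector of x + y: componentwise sum
coords_scaled = lam * x  # coordinate vector of lam * x: componentwise scaling

print(coords_sum)     # [5.  1.  3.5]
print(coords_scaled)  # [2. 4. 6.]
```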

The following theorem shows that independence can be examined at the level of coordinates.

Independence expressed in terms of coordinates

Let #\alpha# be a basis of the #n#-dimensional vector space #V# and let #\vec{a}_1 , \ldots ,\vec{a}_m# be a sequence of vectors of #V#. Then:

- #\vec{a}_1 , \ldots ,\vec{a}_m# is independent if and only if the associated set of coordinate vectors in #\mathbb{R}^n# with respect to #\alpha# is independent.
- #\vec{a}_1 , \ldots ,\vec{a}_m# is a basis of #V# if and only if the corresponding set of coordinate vectors in #\mathbb{R}^n# with respect to #\alpha# is a basis of #\mathbb{R}^n#.
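By the first statement, independence of #\vec{a}_1 , \ldots ,\vec{a}_m# can be decided by a rank computation on the matrix of coordinate vectors: the rows are independent exactly when the rank equals #m#, and they form a basis when, in addition, #m = n#. A sketch with made-up coordinate vectors in #\mathbb{R}^3#:

```python
import numpy as np

# Coordinate vectors (with respect to some basis alpha of a 3-dimensional V)
# of three vectors a_1, a_2, a_3; the numbers are made-up examples.
coords = np.array([
    [1.0, 0.0, 2.0],   # coordinates of a_1
    [0.0, 1.0, 1.0],   # coordinates of a_2
    [1.0, 1.0, 3.0],   # coordinates of a_3 (= a_1 + a_2, so dependent)
])

m, n = coords.shape
independent = np.linalg.matrix_rank(coords) == m
is_basis = independent and m == n

print(independent)  # False: a_3 = a_1 + a_2
print(is_basis)     # False
```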

The above theory also leads to a different way of looking at systems of linear equations.

Systems of equations and coordinatization

Consider the system of linear equations in the unknowns #x_1,\ldots,x_n#:

\[
\begin{array}{ccc}
a_{11}x_1 +a_{12}x_2+\cdots +a_{1n}x_n &=& b_1\\
\vdots & & \vdots \\
a_{m1} x_1 +a_{m2}x_2+\cdots +a_{mn}x_n &=& b_m
\end{array}
\]

Let #\vec{k}_1,\ldots ,\vec{k}_n# be the columns of the coefficient matrix and put #\vec{b}=\matrix{b_1\\ \vdots\\ b_m}#. Then we can write the system in *vector form* as

\[
x_1\cdot\vec{k}_1+x_2\cdot\vec{k}_2+\cdots+x_n\cdot\vec{k}_n=\vec{b}
\]

- The system has at least one solution if and only if #\vec{b}# lies in the space spanned by the columns.
- The system has exactly one solution if and only if, moreover, the columns #\vec{k}_1,\ldots ,\vec{k}_n# are independent. In this case, the solution gives the coordinates of the vector #\vec{b}# with respect to the basis #\basis{\vec{k}_1,\ldots ,\vec{k}_n}#.
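Both criteria can be tested numerically on the vector form. In the sketch below (a made-up example with #m=3# equations and #n=2# unknowns), a zero residual from least squares means #\vec{b}# lies in the column span, and full column rank means the solution is unique:

```python
import numpy as np

# Columns k_1, k_2 of the coefficient matrix, and right-hand side b;
# here b = 2*k_1 + 3*k_2, so b lies in the column span.
A = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 1.0],
])
b = np.array([2.0, 3.0, 5.0])

# Least squares finds x minimizing ||A @ x - b||; if A @ x reproduces b
# exactly, the system x_1*k_1 + x_2*k_2 = b is consistent.
x, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
consistent = np.allclose(A @ x, b)
unique = consistent and rank == A.shape[1]

print(consistent)  # True: b is in the span of the columns
print(unique)      # True: the columns are independent
print(x)           # ≈ [2., 3.]: the coordinates of b w.r.t. (k_1, k_2)
```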
