### Linear maps: Dual vector spaces

### The notion of dual space

According to *Linearity of sum and scalar multiples of linear maps*, we can add linear maps from one vector space to another and multiply them by scalars. These operations satisfy the vector space axioms:

The linear space of linear maps

Let #V# and #W# be vector spaces. The set #L(V,W)# of linear maps from #V# to #W# is a vector space with the *addition and scalar multiplication introduced above*.

If #V# has finite dimension #n# and #W# has finite dimension #m#, then the vector space #L(V,W)# is *isomorphic* to the vector space #M_{m\times n}# of all #(m\times n)#-matrices. More precisely, if we choose a basis #\alpha# of #V# and a basis #\beta # of #W#, then the map #\varphi : L(V,W)\to M_{m\times n}# determined by #\varphi(L) =\left(\beta L\alpha^{-1}\right)_\varepsilon# is an isomorphism.
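The correspondence between linear maps and matrices can be sketched numerically. The following illustrative Python (all names are chosen here for illustration; vectors are lists, and a map is represented by its matrix with respect to the standard bases) shows that the sum of two linear maps corresponds to the entrywise sum of their matrices:

```python
# Illustrative sketch: linear maps R^2 -> R^3 represented as 3x2 matrices
# (rows as lists), with respect to the standard bases.

def apply(M, v):
    """Apply the matrix M (list of rows) to the vector v."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

def mat_add(A, B):
    """Entrywise sum: the matrix of the sum of two linear maps."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_scale(c, A):
    """Entrywise scaling: the matrix of c times a linear map."""
    return [[c * a for a in row] for row in A]

A = [[1, 0], [0, 1], [2, 3]]   # matrix of a map L1 : R^2 -> R^3
B = [[0, 1], [1, 0], [1, 1]]   # matrix of a map L2 : R^2 -> R^3
v = [5, 7]

# (L1 + L2)(v) computed via the summed matrix agrees with L1(v) + L2(v):
lhs = apply(mat_add(A, B), v)
rhs = [x + y for x, y in zip(apply(A, v), apply(B, v))]
assert lhs == rhs
```

The analogous check for scalar multiples uses `mat_scale`; together these asserts mirror the statement that #\varphi# respects the vector space operations.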

A special case occurs if #W=\mathbb{R}#. We then speak of **linear functions** or **linear functionals** on #V#. We now focus on the space of linear functions.

Dual space

Let #V# be a real vector space. The vector space #L(V,\mathbb{R})# of linear maps #V\rightarrow \mathbb{R}# is called the **dual space** of #V#. This vector space is denoted as #V^\star#.

In the case of a vector space #V# of finite dimension, we can use an inner product to construct an isomorphism between the dual space #V^\star# and #V#.

Inner product with a fixed vector is a linear function

If #V# is a real inner product space and #\vec{a} \in V#, then the map # L_{\vec{a}} : V \rightarrow \mathbb{R}#, defined by # L_{\vec{a}}( \vec{x}) = \dotprod{\vec{a} }{\vec{x}}#, is a linear function.
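A quick numerical check of this statement (illustrative Python; we take #V=\mathbb{R}^3# with the standard dot product, and the specific vectors are chosen only for the example):

```python
# Illustrative check that L_a(x) = <a, x> is linear
# (V = R^3 with the standard dot product).

def dot(a, x):
    return sum(ai * xi for ai, xi in zip(a, x))

a = [2, -1, 4]
x = [1, 2, 3]
y = [0, 5, -2]
c = 3

L_a = lambda v: dot(a, v)

# Additivity: L_a(x + y) = L_a(x) + L_a(y)
assert L_a([xi + yi for xi, yi in zip(x, y)]) == L_a(x) + L_a(y)
# Homogeneity: L_a(c*x) = c * L_a(x)
assert L_a([c * xi for xi in x]) == c * L_a(x)
```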

The linear map that assigns to #\vec{a}# the element #L_{\vec{a}}# of #V^\star#, is injective. In particular, it is an isomorphism #V\to V^\star# if #V# is finite-dimensional.
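The injectivity can also be sketched numerically: evaluating #L_{\vec{a}}# on the standard basis vectors recovers the coordinates of #\vec{a}#, so distinct vectors give distinct functionals. An illustrative Python sketch (with #V=\mathbb{R}^3# and the standard dot product; the vector is an arbitrary example):

```python
# Illustrative sketch: V = R^3 with the standard dot product.

def dot(a, x):
    return sum(ai * xi for ai, xi in zip(a, x))

def L(a):
    """The linear functional L_a(x) = <a, x>."""
    return lambda x: dot(a, x)

a = [2, -1, 4]
e = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # standard basis

# L_a(e_i) = a_i, so the functional L_a determines a;
# hence a -> L_a is injective.
recovered = [L(a)(ei) for ei in e]
assert recovered == a
```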

What is the matrix of the linear map #L:V\to V^\star# that assigns to #\vec{a}# in #V# the dual vector #L_{\vec{a}}#, given by #L_{\vec{a}}(\vec{v}) = \dotprod{\vec{a}}{\vec{v}}#, with respect to the standard basis #\varepsilon# of #V=\mathbb{R}^3# and the dual basis #\beta=\left(e_1^\star, e_2^\star, e_3^\star\right)# of #V^\star#?

Write #\vec{a} = \sum_{i=1}^3 a_i\vec{e}_i#. Since the inner product is linear in its first argument,

\[L_{\vec{a}}(\vec{v}) = \dotprod{\vec{a}}{\vec{v}} = \sum_{i=1}^3 a_i\,\dotprod{\vec{e}_i}{\vec{v}} = \sum_{i=1}^3 a_i\,{e_i}^\star(\vec{v})\]

so \[L_{\vec{a}} = \sum_{i=1}^3a_i{e_i}^\star\]

Therefore, the image of the basis vector #\vec{e}_j# under #L# is #{e_j}^\star#, which implies that the requested matrix is equal to \[{}_{\beta}L_{\varepsilon} = I_3\]
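The computation above can be mirrored numerically. In this illustrative Python sketch (#V=\mathbb{R}^3# with the standard dot product), the #j#-th column of the matrix holds the #\beta#-coordinates of #L(\vec{e}_j)#, which are its values on the basis vectors #\vec{e}_i# since #{e_i}^\star(\vec{e}_k)=\delta_{ik}#:

```python
# Illustrative sketch: the matrix of L : a -> L_a with respect to the
# standard basis of R^3 and the dual basis (e_1*, e_2*, e_3*).

def dot(a, x):
    return sum(ai * xi for ai, xi in zip(a, x))

e = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # standard basis epsilon

# Entry (i, j) is the i-th beta-coordinate of L(e_j),
# i.e. the value L(e_j)(e_i) = <e_j, e_i>.
matrix = [[dot(e[j], e[i]) for j in range(3)] for i in range(3)]
assert matrix == e  # the 3x3 identity matrix I_3
```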
