### Inner Product Spaces: Orthogonal projections

### Orthogonal projection

In an inner product space, each point #\vec{p}# has a unique nearest point on each finite-dimensional subspace #W#. This unique point is the orthogonal projection of #\vec{p}# onto #W#. These results remain valid in the more general case where #W# is an *affine subspace*.

**Orthogonal projection** Let #W# be a finite-dimensional affine subspace of an inner product space #V# and #\vec{x}# a vector of #V#. Then there exists a unique vector #\vec{y}# in #W# such that #\vec{x}-\vec{y}# is perpendicular to #W#.

This vector #\vec{y}# is called the **orthogonal projection** of #\vec{x}# on #W#, and is denoted by #P_W(\vec{x})#.

Here are some useful properties of the orthogonal projection on an affine subspace.

**Properties of the orthogonal projection** Let #V# be an inner product space with affine subspace #W=\vec{a}+U# for a vector #\vec{a}# and a linear subspace #U# of #V#. Suppose that #\basis{\vec{a}_1, \ldots ,\vec{a}_k}# is an orthonormal basis of #U# for a natural number #k#. The orthogonal projection #P_W(\vec{x})# of a vector #\vec{x}# of #V# onto #W# satisfies the following properties:

- #\vec{x}-P_W(\vec{x})# is orthogonal to each vector from #W#.
- The orthogonal projection #P_W(\vec{x})# is given by \[\vec{a} + (\dotprod{(\vec{x}-\vec{a})}{\vec{a}_1})\,\vec{a}_1 + \cdots +(\dotprod{(\vec{x}-\vec{a})}{\vec{a}_k})\,\vec{a}_k\]
- The distance from #\vec{x}# to a vector from #W# is minimal for the orthogonal projection on #W#: \[\norm{\vec{x}-P_W(\vec{x})}=\min_{\vec{w}\in W}\norm{\vec{x}-\vec{w}}\]
- The orthogonal projection is the unique vector for which this minimum occurs.
- #\norm{P_W(\vec{x})}\leq\norm{\vec{x}}# with equality if and only if #\vec{x}=P_W(\vec{x})#.
- The equality #P_W(\vec{x})=\vec{x}# holds if and only if #\vec{x}# lies in #W#.

The **distance** between #\vec{x}# and #W# is defined by #\norm{\vec{x}-P_W(\vec{x})}# as in statement 3.
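The projection formula in statement 2 and the distance in statement 3 can be turned into a short numerical sketch. The following Python snippet uses NumPy; the function names `project_affine` and `distance_to` are illustrative choices, not from any particular library, and the orthonormal basis of #U# is assumed to be given:

```python
import numpy as np

def project_affine(x, a, onb):
    """P_W(x) for W = a + span(onb), where onb is an orthonormal
    basis a_1, ..., a_k of U (statement 2 of the properties)."""
    p = np.array(a, dtype=float)
    for a_i in onb:
        # add the component <(x - a), a_i> a_i of x - a along a_i
        p += np.dot(x - a, a_i) * a_i
    return p

def distance_to(x, a, onb):
    """Distance from x to W, i.e. ||x - P_W(x)|| (statement 3)."""
    return np.linalg.norm(x - project_affine(x, a, onb))
```

For a linear subspace, take #\vec{a}=\vec{0}#.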

As a worked example, consider the subspace #W =\linspan{\left[ 4 , -4 , 2 \right] }# of #\mathbb{R}^3# with the standard inner product. Then #P_W(\left[ 1 , -3 , -1 \right] ) = {{7}\over{9}}\cdot \left[ 2 , -2 , 1 \right] #.

First, we normalize the vector #{\left[ 4 , -4 , 2 \right] }# to get an orthonormal basis for #W#. Because \[{\norm{\left[ 4 , -4 , 2 \right] }} =\sqrt{(4)^2+(-4)^2+(2)^2}=6 \] we find the normalized basis vector \[\vec{a}_1=\dfrac{1}{6}\cdot {\left[ 4 , -4 , 2 \right] } =\left[ {{2}\over{3}} , -{{2}\over{3}} , {{1}\over{3}} \right] \]
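This normalization step can be checked numerically; a minimal sketch with NumPy (the variable names are ours):

```python
import numpy as np

v = np.array([4., -4., 2.])
norm_v = np.linalg.norm(v)   # sqrt(4^2 + (-4)^2 + 2^2) = 6
a1 = v / norm_v              # normalized basis vector [2/3, -2/3, 1/3]
```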

Now, the orthogonal projection is given by

\[\begin{array}{rcl} P_W(\vec{x})&=&(\dotprod{\vec{x}}{\vec{a}_1})\,\vec{a}_1\\ &=&\displaystyle\left(\dotprod{\left[ 1 , -3 , -1 \right] }{\left[ {{2}\over{3}} , -{{2}\over{3}} , {{1}\over{3}} \right] }\right)\cdot{\left[ {{2}\over{3}} , -{{2}\over{3}} , {{1}\over{3}} \right] }\\&=&\displaystyle {{7}\over{3}} \cdot {\left[ {{2}\over{3}} , -{{2}\over{3}} , {{1}\over{3}} \right] }\\&=&\displaystyle {{7}\over{9}}\cdot \left[ 2 , -2 , 1 \right] \end{array}\]

The distance from # \left[ 1 , -3 , -1 \right] # to the subspace #W =\linspan{\left[ 4 , -4 , 2 \right] }# is equal to the length of the difference vector of #\left[ 1 , -3 , -1 \right] # and the projection #{{{7}\over{9}}\cdot \left[ 2 , -2 , 1 \right] }#:

\[\norm{\left[ 1 , -3 , -1 \right] - {{7}\over{9}}\cdot \left[ 2 , -2 , 1 \right] }=\norm{\left[ -{{5}\over{9}} , -{{13}\over{9}} , -{{16}\over{9}} \right] } = {{5\cdot \sqrt{2}}\over{3}}\]
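The whole computation, projection and distance, can be verified in a few lines of Python with NumPy (a sketch; the variable names are ours):

```python
import numpy as np

x = np.array([1., -3., -1.])
a1 = np.array([2/3, -2/3, 1/3])   # orthonormal basis vector of W
p = np.dot(x, a1) * a1            # P_W(x) = (x . a1) a1
d = np.linalg.norm(x - p)         # distance from x to W, 5*sqrt(2)/3
```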
