In the chapter *Linear maps* we studied properties of a linear map #V\rightarrow V#, where #V# is a vector space. If #V# is a *real inner product space*, then the linear maps #V\to V# that preserve the *real inner product* are of importance. We will see that these are precisely those linear maps from an inner product space to itself that preserve *length*.

Let #V# be a real inner product space. A linear map #L :V\rightarrow V# is called **orthogonal** if #\norm{L(\vec{x})}=\norm{\vec{x}}# for each #\vec{x}\in V#.

In other words, a linear map #L :V\rightarrow V# is orthogonal if the *length* is invariant under #L#. In this case, we also say that #L# preserves length.

Due to the linearity of the map, #L# is orthogonal if and only if the *distance* is invariant under #L#, that is, if #\norm{L(\vec{x})-L(\vec{y})}=\norm{\vec{x}-\vec{y}}# for all #\vec{x},\vec{y}\in V#.

The identity map #I_V:V\rightarrow V# (also referred to as #I#) satisfies #{ I_V}(\vec{x})=\vec{x}# for each vector #\vec{x}#. In particular, we have #\norm{I_V(\vec{x})} =\norm{\vec{x}}#. Therefore, the identity map is orthogonal.

Also #-{I_V}# is orthogonal, since #\norm{-\vec{x}}=\norm{\vec{x}}# for each #\vec{x}\in V#.

For #\lambda \neq \pm 1#, the scalar multiplication #\lambda \cdot I_V # is not orthogonal.

We examine the orthogonal maps #V\rightarrow V# in case #\dim {V}=1#. The only linear maps on a vector space of dimension #1# are the scalar multiplications, that is, multiplications by a number #\lambda#. The length is invariant under the linear transformation #L(\vec{x}) = \lambda\cdot \vec{x}# on #V# if and only if #\lambda =1# or #\lambda =-1#.

- For #\lambda =1# we find #L=I_V#, a direct orthogonal map.
- For #\lambda =-1# we find the map #L=-I_V#; this is the reflection about the origin.

The linear map #L: \mathbb{R}^2 \rightarrow \mathbb{R}^2# given by
\[
L(\rv{x,y})=\frac{1}{5}\rv{3x+4y,-4x+3y}
\]
is orthogonal because

\[
\begin{array}{rcl}
\norm{L(\rv{x,y})}^2 & =&\displaystyle\frac{1}{25}((3x+4y)^2 + (-4x+3y)^2)\\\\
&=&\dfrac{1}{25}(9x^2 +16y^2 + 24xy + 16x^2 +9y^2 -24xy)\\\\
& =& x^2 +y^2\\\\
&=&\norm{\,\rv{x,y}\,}^2
\end{array}
\]
This suffices for the proof that \(\norm{L(\rv{x,y})}=\norm{\,\rv{x,y}\,}\) because the norm takes non-negative values only.
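The computation above can also be checked numerically. The following short Python sketch (our own illustration, not part of the text) evaluates the map #L# on random vectors and confirms that the Euclidean length is unchanged:

```python
# Numerical sanity check (a sketch, not part of the formal argument):
# L(x, y) = (1/5)(3x + 4y, -4x + 3y) should preserve Euclidean length.
import math
import random

def L(x, y):
    return ((3 * x + 4 * y) / 5, (-4 * x + 3 * y) / 5)

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-10, 10), random.uniform(-10, 10)
    u, v = L(x, y)
    # ||L(x, y)|| == ||(x, y)|| up to floating-point rounding
    assert math.isclose(math.hypot(u, v), math.hypot(x, y), rel_tol=1e-9)
```

Such a check does not replace the algebraic proof, but it catches sign or coefficient errors quickly.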

Let #\vec{a}# be a vector of length #1# in the inner product space #V#. We consider the **orthogonal reflection** #S_{\vec{a}}:V\rightarrow V# about the subspace #\{\vec{a}\}^{\perp}#. This map is given by the rule \[S_{\vec{a}}(\vec{x})=\vec{x}-2(\dotprod{\vec{x}}{\vec{a}})\cdot \vec{a}\] We can obtain this mapping rule by the following reasoning. Let #\vec{x}# be a vector of #V#. Move from #\vec{x}# along the direction of #\vec{a}# until you arrive at #\{\vec{a}\}^{\perp}#, that is, at the intersection of the straight line #\vec{x}+\lambda \cdot \vec{a}# with #\{\vec{a}\}^{\perp}#. The intersection condition #\dotprod{(\vec{x}+\lambda\cdot\vec{a})}{\vec{a}}=0# gives #\lambda =-(\dotprod{\vec{x}}{\vec{a}} )# because #\dotprod{\vec{a}}{\vec{a}}=1#, and thus we are at the vector #\vec{x}-(\dotprod{\vec{x}}{\vec{a}} )\, \vec{a}#. To find the image of #\vec{x}# under the reflection, we need to subtract #(\dotprod{\vec{x}}{\vec{a} })\, \vec{a}# twice from #\vec{x}#.

An orthogonal reflection preserves the length of a vector. This geometrically known fact follows immediately from the following calculation of the square of the length of #S_{\vec{a}} (\vec{x})#:

\[
\begin{array}{rcl}
\norm{S_{\vec{a}} ( \vec{x})}^2 &=& \dotprod{(\vec{x}-2(\dotprod{\vec{x}}{\vec{a}})\vec{a})}{(\vec{x}-2(\dotprod{\vec{x}}{\vec{a}})\vec{a})}\\
& =&(\dotprod{\vec{x}}{\vec{x}})- 4(\dotprod{\vec{x}}{\vec{a}})\cdot(\dotprod{\vec{x}}{\vec{a}}) +4(\dotprod{\vec{x}}{\vec{a}})^2 (\dotprod{\vec{a}}{\vec{a}})\\
& =&\dotprod{\vec{x}}{\vec{x}}\\
&=& \norm{\vec{x}}^2
\end{array}
\]
For an arbitrary nonzero vector #\vec{v}#, we set #S_{\vec{v}} = S_{\vec{a}}#, where #\vec{a}# is the *normalized vector* #\frac{1}{\norm{\vec{v}}}\vec{v}#. The rule for this map is

\[S_{\vec{v}}(\vec{x})=\vec{x}-2\dfrac{\dotprod{\vec{x}}{\vec{v}}}{\dotprod{\vec{v}}{\vec{v}}}\cdot \vec{v}\]
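The rule for #S_{\vec{v}}# translates directly into code. The following Python sketch (names such as `reflect` are our own) implements the reflection with NumPy and verifies the properties discussed above: the length is preserved, and reflecting twice gives the identity:

```python
# Sketch of the orthogonal reflection S_v about the hyperplane {v}^perp,
# using the rule S_v(x) = x - 2 (x.v / v.v) v.
import numpy as np

def reflect(x, v):
    """Orthogonal reflection of x about the hyperplane perpendicular to v."""
    x = np.asarray(x, dtype=float)
    v = np.asarray(v, dtype=float)
    return x - 2 * (x @ v) / (v @ v) * v

x = np.array([1.0, 2.0, 3.0])
v = np.array([0.0, 0.0, 2.0])   # reflect about the xy-plane
y = reflect(x, v)
assert np.allclose(y, [1.0, 2.0, -3.0])                  # z-coordinate flips
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))  # length preserved
assert np.allclose(reflect(y, v), x)                     # reflecting twice is the identity
```

Note that only the direction of `v` matters: the rule divides by #\dotprod{\vec{v}}{\vec{v}}#, so no separate normalization step is needed.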

If #\dim{V} \gt 1# and if #P: V\to V# is the *orthogonal projection* onto the straight line #\linspan{\vec{a}}# through the origin spanned by a vector #\vec{a}#, then #P# is not an orthogonal map. To see this, choose a nonzero vector #\vec{b}# perpendicular to #\vec{a}# (this is possible because #\dim{V} \gt 1#). Then we have #{P}(\vec{b})=\vec{0}#, so #0=\norm{P(\vec{b})}\lt \norm{\vec{b}}#.

The **translation** on a vector space #V# along a vector #\vec{a}# of #V# is the map #T_{\vec{a}}# given by \( T_{\vec{a}}(\vec{x}) = \vec{x}+\vec{a}\). The map is the identity if and only if #\vec{a} = \vec{0}#. If #\vec{a}# is distinct from the zero vector, then translation along #\vec{a}# is not linear (after all, the image of #\vec{0}# under #T_{\vec{a}}# is #\vec{a}#, distinct from the null vector). But if #V# is an inner product space, then \( T_{\vec{a}}\) preserves distance. After all, for each pair of vectors #\vec{x}#, #\vec{y}# of #V# we have \[\norm{T_{\vec{a}}(\vec{x})-T_{\vec{a}}(\vec{y})}=\norm{(\vec{x}+\vec{a})-(\vec{y}+\vec{a})}=\norm{\vec{x}-\vec{y}}\] Thus there are non-linear maps #V\to V# that preserve distance.

Translation along \({\vec{a}}\) also does not preserve length if #\vec{a} \ne\vec{0}#. This can be seen in the simple example with #\vec{x}=\vec{0}#. Then we have \[\norm{T_{\vec{a}}(\vec{x})}=\norm{\vec{a}}\neq 0=\norm{\vec{0}}=\norm{\vec{x}}\]
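The two properties of translation can be illustrated in a few lines of Python (our own sketch, with an arbitrary choice of #\vec{a}#): distances are invariant, but lengths and linearity fail as soon as #\vec{a}\ne\vec{0}#:

```python
# Sketch: the translation T_a(x) = x + a preserves distances
# but not lengths, and is not linear when a != 0.
import numpy as np

a = np.array([1.0, -2.0])     # an arbitrary nonzero translation vector
T = lambda x: x + a

x = np.array([3.0, 4.0])
y = np.array([-1.0, 0.5])
# distance is invariant: ||T(x) - T(y)|| == ||x - y||
assert np.isclose(np.linalg.norm(T(x) - T(y)), np.linalg.norm(x - y))

zero = np.zeros(2)
# length is not preserved: ||T(0)|| = ||a|| != 0
assert not np.isclose(np.linalg.norm(T(zero)), 0.0)
# T is not linear: T(0) != 0
assert not np.allclose(T(zero), zero)
```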

*Later* we will look at maps of the more general form #T:V\to W# which preserve distance. Here, unlike the case of orthogonal maps, #V# and #W# need not be the same. Such a map is called an **isometry**. The map #L:\mathbb{R}^2\to\mathbb{R}^3# given by #L(\rv{x,y}) = \rv{x,y,0}# is an example.

For a linear map, orthogonality can also be determined on the basis of the inner product:

Let #V# be an inner product space. A linear map #L: V\to V# is orthogonal if and only if #\dotprod{L(\vec{x})}{L(\vec{y})} = \dotprod{\vec{x}}{\vec{y}} # for all vectors #\vec{x}# and #\vec{y}# of #V#.

If the linear map #L:V\to V# preserves the inner product (that is, for every #\vec{x}# and #\vec{y}# in #V# we have #\dotprod{L(\vec{x})}{L(\vec{y})} = \dotprod{\vec{x}}{\vec{y}}#), then #L# also leaves invariant the length because of the following equalities \[ \norm{\vec{x}} = \sqrt{\dotprod{\vec{x}}{\vec{x}}}=\sqrt{\dotprod{L(\vec{x})}{L(\vec{x})}}=\norm{L(\vec{x})}\]

Proving invariance of the inner product for a linear map preserving length is a little more difficult. For this purpose we rely on the *polarization formula,* which, for arbitrary vectors #\vec{a}# and #\vec{b}# of #V#, reads \[ \dotprod{\vec{a}}{\vec{b}}=\frac{1}{2}\left(\norm{\vec{a}+\vec{b}}^2 -\norm{\vec{a}}^2 -\norm{\vec{b}}^2\right)\] Invariance of the inner product can now be derived as follows.

\[\begin{array}{rcl}\dotprod{L(\vec{x})}{L(\vec{y})}&=&\dfrac{1}{2}\left(\norm{L(\vec{x})+L(\vec{y})}^2-\norm{L(\vec{x})}^2-\norm{L(\vec{y})}^2\right) \\&&\phantom{xx}\color{blue}{\text{polarization formula}}\\&=&\dfrac{1}{2}\left(\norm{L(\vec{x}+\vec{y})}^2-\norm{L(\vec{x})}^2-\norm{L(\vec{y})}^2\right)\\&&\phantom{xx}\color{blue}{\text{linearity of }L}\\&=&\dfrac{1}{2}\left(\norm{ \vec{x}+\vec{y}}^2-\norm{\vec{x}}^2-\norm{\vec{y}}^2\right)\\&&\phantom{xx}\color{blue}{\text{length is preserved}}\\ &=&\dotprod{\vec{x}}{\vec{y}}\\&&\phantom{xx}\color{blue}{\text{polarization formula}}\end{array}\]
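The polarization formula itself is easy to test numerically. The following Python sketch (our own illustration) checks it for random vectors in #\mathbb{R}^3# with the standard inner product:

```python
# Sketch: numerical check of the polarization formula
#   a . b = (1/2)(||a + b||^2 - ||a||^2 - ||b||^2)
# for the standard inner product on R^3.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    a = rng.normal(size=3)
    b = rng.normal(size=3)
    lhs = a @ b
    rhs = 0.5 * (np.linalg.norm(a + b) ** 2
                 - np.linalg.norm(a) ** 2
                 - np.linalg.norm(b) ** 2)
    assert np.isclose(lhs, rhs)
```

This is exactly the identity that lets us recover the inner product from lengths alone, which is why preserving length forces preserving the inner product.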

Translation on an inner product space #V# along a vector #\vec{a}# distinct from the zero vector is a map that does not preserve the inner product: the map #T_{\vec{a}}# given by \(T_{\vec{a}}(\vec{x}) = \vec{x}+\vec{a}\) satisfies \[\dotprod{T_{\vec{a}}(\vec{0})}{T_{\vec{a}}(\vec{0})} = \dotprod{\vec{a}}{\vec{a}} \gt 0 = \dotprod{\vec{0}}{\vec{0}} \]

In the chapter *Inner product spaces* we introduced the concept of length on the basis of the inner product. To keep in line with this set-up, we could also have chosen to define an orthogonal map as a map that preserves the inner product. This characterization tells us that these two definitions are equivalent. In applications, it is more convenient to work with the definition according to which the length is preserved.

We view # \mathbb{R}^3# as the inner product space with standard inner product.

Does there exist a real number #a# such that #L :\mathbb{R}^3\to \mathbb{R}^3# is an orthogonal map with \(L (\rv{1,0,0})=\frac{1}{4}\,\rv{a, -3, 6 }\)?

No

The map #L# is linear, and so is orthogonal if and only if #\norm{L(\vec{x})}=\norm{\vec{x}}# for all vectors #\vec{x}#. In particular, we must have #\norm{L(\rv{1,0,0})}=\norm{\rv{1,0,0}}#. This leads to the following equation with unknown #a#, which we rewrite: \[\begin{array}{rcl}\norm{\dfrac{1}{4}\,\rv{a, -3, 6 }}&=&1\\ &&\phantom{xx}\color{blue}{\text{mapping rule for }L\text{ substituted}}\\ \dfrac{1}{4}\cdot\norm{\rv{a, -3, 6 }}&=&1\\ &&\phantom{xx}\color{blue}{\text{homogeneity of the norm}}\\ \norm{\rv{a, -3, 6}}&=&4\\ &&\phantom{xx}\color{blue}{\text{multiplied by }4}\\ a^2+(-3)^2+{6}^2&=&{4}^2\\ &&\phantom{xx}\color{blue}{\text{length calculated and both sides squared}}\\ a^2&=&-29\\ &&\phantom{xx}\color{blue}{\text{constant terms carried to the right-hand side}}\\ \text{no solution}&&\\ &&\phantom{xx}\color{blue}{\text{a square of a real number is never negative}}\end{array}\] Therefore, the answer is No.
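The conclusion can also be seen numerically: already #(-3)^2+6^2=45# exceeds #4^2=16#, so no choice of #a# can make the image a unit vector. A short Python sketch (with our own helper name `norm_image`) makes this concrete:

```python
# Sketch: the length of L(1,0,0) = (1/4)(a, -3, 6) is at least
# sqrt(45)/4 > 1 for every real a, so L cannot be orthogonal.
import math

def norm_image(a):
    """Length of (1/4) * (a, -3, 6) for a given real a."""
    return math.sqrt(a * a + 9 + 36) / 4

# The length is smallest at a = 0, and even there it exceeds 1.
assert min(norm_image(a / 10) for a in range(-100, 101)) == norm_image(0)
assert norm_image(0) > 1
```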