### Orthogonal and symmetric maps: Orthogonal maps

### Some properties of orthogonal maps

Here are some general properties of orthogonal maps.

**Properties of orthogonal maps** Let #V# be a real inner product space and #L:V\to V# an orthogonal map.

- If also #M :V\rightarrow V# is an orthogonal map, then the composition #L\,M# is orthogonal too.
- The map #L# is injective.
- If #V# is finite dimensional, then #L# is invertible and #L^{-1}# is orthogonal.
- Each real eigenvalue of #L# equals #1# or #-1#.
- If #W# is a finite-dimensional linear subspace of #V# which is invariant under #L#, then also the orthogonal complement #W^\perp# is invariant under #L#.
- If #L# fixes the *orthogonal complement* of a nonzero vector #\vec{v}# of #V#, then #L# is either the identity or the *orthogonal reflection* #S_{\vec{v}}#.
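These properties can be checked numerically on a small example. The sketch below (not part of the original text) uses NumPy with an arbitrary rotation and a reflection of the plane as sample orthogonal maps:

```python
import numpy as np

# Two sample orthogonal maps of R^2: a rotation and a reflection.
theta = 0.7
L = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation
M = np.array([[1.0,  0.0],
              [0.0, -1.0]])                        # reflection in the x-axis

def is_orthogonal(Q):
    """Q is orthogonal iff Q^T Q equals the identity."""
    return np.allclose(Q.T @ Q, np.eye(Q.shape[0]))

# The composition of orthogonal maps is orthogonal.
assert is_orthogonal(L) and is_orthogonal(M) and is_orthogonal(L @ M)

# Each real eigenvalue of an orthogonal map is 1 or -1:
# the reflection M has eigenvalues 1 and -1.
assert np.allclose(sorted(np.linalg.eigvals(M).real), [-1.0, 1.0])

# The inverse of an orthogonal map is orthogonal (in fact it is the transpose).
assert np.allclose(np.linalg.inv(L), L.T) and is_orthogonal(L.T)
```

The final assertion also illustrates why orthogonal maps on a finite-dimensional space are invertible: the transpose of the matrix serves as the inverse.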

\[A = \matrix{-{{9}\over{17}} & {{12}\over{17}} & -{{8}\over{17}} \\ {{12}\over{17}} & {{1}\over{17}} & -{{12}\over{17}} \\ -{{8}\over{17}} & -{{12}\over{17}} & -{{9}\over{17}} \\ }\]

The vector #\rv{-2,-3,2}# is fixed by #L#, so it lies in the eigenspace for the eigenvalue #1#. Because #L# is orthogonal, its only possible real eigenvalues are #1# and #-1#. The eigenspace for the eigenvalue #-1# is either #1#-dimensional or #2#-dimensional. In the first case #L# would have a non-real eigenvalue, and then its complex conjugate would also be an eigenvalue; together with the eigenvalues #1# and #-1# this would account for more than three dimensions, which is impossible because the dimension of the inner product space equals #3#. Therefore, the eigenspace of #L# for the eigenvalue #-1# has dimension #2#.
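As a numerical sanity check (a NumPy sketch, not part of the original solution), one can verify that the matrix #A# above is orthogonal, fixes #\rv{-2,-3,2}#, and has eigenvalues #1#, #-1#, #-1#:

```python
import numpy as np

# The matrix A from the example, with entries in seventeenths.
A = np.array([[-9, 12,  -8],
              [12,  1, -12],
              [-8, -12, -9]]) / 17.0

v = np.array([-2.0, -3.0, 2.0])

# A is orthogonal: A^T A = I.
assert np.allclose(A.T @ A, np.eye(3))

# v is fixed by A, i.e. it is an eigenvector with eigenvalue 1.
assert np.allclose(A @ v, v)

# The eigenvalues are -1, -1, 1, so the (-1)-eigenspace is 2-dimensional.
assert np.allclose(np.sort(np.linalg.eigvals(A).real), [-1, -1, 1])
```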

We find a vector perpendicular to both \( \rv{-2,-3,2}\) and \(\rv{2,0,2}\). Such a vector can be found by solving a system of linear equations; a faster method uses the *cross product*:

\[\rv{-2,-3,2}\times \rv{2,0,2} = \rv{-6,8,6 }\] This is an eigenvector of #L# with eigenvalue #-1#. Thus we find that the matrix #L_\beta# of #L# relative to the basis

\[\beta = \basis{\rv{-2,-3,2},\rv{2,0,2},\rv{-6,8,6 }}\] is the diagonal matrix with diagonal entries #1#, #-1#, #-1#. We conclude that the matrix of #L# (relative to the standard basis #\varepsilon#) is equal to
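The cross-product step and the perpendicularity claims can be checked with NumPy (a sketch, not part of the original solution):

```python
import numpy as np

a = np.array([-2, -3, 2])
b = np.array([2, 0, 2])

# The cross product is perpendicular to both of its factors.
c = np.cross(a, b)
assert list(c) == [-6, 8, 6]
assert a @ c == 0 and b @ c == 0

# The two given vectors are themselves perpendicular,
# so beta is an orthogonal basis.
assert a @ b == 0
```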

\[\begin{array}{rcl} L_{\varepsilon} &=& {}_\varepsilon I_\beta \,L_\beta \, {}_\beta I_\varepsilon\\

&=&\matrix{-2 & 2 & -6 \\ -3 & 0 & 8 \\ 2 & 2 & 6 \\ }\,\matrix{1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \\ } \,\matrix{-2 & 2 & -6 \\ -3 & 0 & 8 \\ 2 & 2 & 6 \\ }^{-1} \\

&=&\matrix{-2 & -2 & 6 \\ -3 & 0 & -8 \\ 2 & -2 & -6 \\ }\, \matrix{-{{2}\over{17}} & -{{3}\over{17}} & {{2}\over{17}} \\ {{1}\over{4}} & 0 & {{1}\over{4}} \\ -{{3}\over{68}} & {{1}\over{17}} & {{3}\over{68}} \\ } \\

&=& \matrix{-{{9}\over{17}} & {{12}\over{17}} & -{{8}\over{17}} \\ {{12}\over{17}} & {{1}\over{17}} & -{{12}\over{17}} \\ -{{8}\over{17}} & -{{12}\over{17}} & -{{9}\over{17}} \\ }

\end{array}\]
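The change-of-basis computation above can be reproduced numerically; the sketch below (using NumPy, not part of the original solution) builds #{}_\varepsilon I_\beta# from the basis vectors and confirms that the product equals the matrix #A#:

```python
import numpy as np

# Columns of B are the basis beta: the fixed vector followed by
# the two (-1)-eigenvectors.
B = np.array([[-2, 2, -6],
              [-3, 0,  8],
              [ 2, 2,  6]], dtype=float)
L_beta = np.diag([1.0, -1.0, -1.0])

# L_eps = B . L_beta . B^{-1}  (change of basis to the standard basis).
L_eps = B @ L_beta @ np.linalg.inv(B)

expected = np.array([[-9, 12,  -8],
                     [12,  1, -12],
                     [-8, -12, -9]]) / 17.0
assert np.allclose(L_eps, expected)
```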

