### Orthogonal and symmetric maps: Orthogonal maps

### Some properties of orthogonal maps

Here are some general properties of orthogonal maps.

**Properties of orthogonal maps.** Let #V# be a real inner product space and let #L:V\to V# be an orthogonal map.

- If also #M :V\rightarrow V# is an orthogonal map, then the composition #L\,M# is orthogonal too.
- The map #L# is injective.
- If #V# is finite dimensional, then #L# is invertible and #L^{-1}# is orthogonal.
- Each real eigenvalue of #L# equals #1# or #-1#.
- If #W# is a finite-dimensional linear subspace of #V# which is invariant under #L#, then also the orthogonal complement #W^\perp# is invariant under #L#.
- If #L# fixes the *orthogonal complement* of a nonzero vector #\vec{v}# of #V#, then #L# is either the identity or the *orthogonal reflection* #S_{\vec{v}}#.
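The last two properties can be illustrated numerically. The sketch below (plain Python with exact `fractions.Fraction` arithmetic; the helper names `dot` and `reflect` are ours, not from the text) builds the orthogonal reflection #S_{\vec{v}}# for #\vec{v}=\rv{2,-2,1}# and checks that it fixes #\vec{v}^\perp#, negates #\vec{v}#, and preserves inner products:

```python
from fractions import Fraction as F

def dot(u, v):
    # standard inner product on R^3
    return sum(a * b for a, b in zip(u, v))

def reflect(v, x):
    # S_v(x) = x - 2 <x,v>/<v,v> v : reflection in the hyperplane v-perp
    c = F(2) * dot(x, v) / dot(v, v)
    return [xi - c * vi for xi, vi in zip(x, v)]

v = [F(2), F(-2), F(1)]

# S_v fixes every vector perpendicular to v ...
w = [F(1), F(0), F(-2)]          # dot(w, v) == 0
assert reflect(v, w) == w

# ... and negates v itself, so its eigenvalues are 1, 1, -1
assert reflect(v, v) == [-a for a in v]

# S_v preserves inner products, hence is orthogonal
x, y = [F(1), F(2), F(3)], [F(0), F(1), F(-1)]
assert dot(reflect(v, x), reflect(v, y)) == dot(x, y)
```

Because every vector splits as a component in #\vec{v}^\perp# plus a multiple of #\vec{v}#, these two eigenvalue checks determine #S_{\vec{v}}# completely.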

In the worked example below, #L:\mathbb{R}^3\to\mathbb{R}^3# is an orthogonal map that fixes the vector #\rv{2,-2,1}# and sends #\rv{1,0,-2}# to its negative; its matrix relative to the standard basis is

#A = \matrix{-{{1}\over{9}} & -{{8}\over{9}} & {{4}\over{9}} \\ -{{8}\over{9}} & -{{1}\over{9}} & -{{4}\over{9}} \\ {{4}\over{9}} & -{{4}\over{9}} & -{{7}\over{9}} \\ }#

The vector #\rv{2,-2,1}# is fixed by #L# and so lies in the eigenspace with respect to the eigenvalue #1#. Because #L# is orthogonal, its only real eigenvalues are #1# and #-1#. The eigenspace with respect to the eigenvalue #-1# is either #1#-dimensional or #2#-dimensional. In the first case the remaining eigenvalue would be non-real, so its complex conjugate would also be an eigenvalue; together with #1# and #-1# this gives four eigenvalues counted with multiplicity, which is impossible because the dimension of the inner product space equals #3#. Therefore, the eigenspace of #L# with respect to #-1# has dimension #2#.
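These claims can be checked directly on the matrix #A#. The sketch below (exact `fractions.Fraction` arithmetic; the helper `apply` is ours) verifies that #A# is orthogonal and acts as stated on #\rv{2,-2,1}# and on #\rv{1,0,-2}#, the #-1#-eigenvector used in the solution:

```python
from fractions import Fraction as F

A = [[F(-1, 9), F(-8, 9), F(4, 9)],
     [F(-8, 9), F(-1, 9), F(-4, 9)],
     [F(4, 9), F(-4, 9), F(-7, 9)]]

def apply(M, x):
    # matrix-vector product M x
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

# A is orthogonal: A^T A = I, checked exactly
I = [[F(int(i == j)) for j in range(3)] for i in range(3)]
At = [[A[j][i] for j in range(3)] for i in range(3)]
AtA = [[sum(At[i][k] * A[k][j] for k in range(3)) for j in range(3)]
       for i in range(3)]
assert AtA == I

# eigenvalue 1: the fixed vector (2,-2,1)
v = [F(2), F(-2), F(1)]
assert apply(A, v) == v

# eigenvalue -1: (1,0,-2) is sent to its negative
w = [F(1), F(0), F(-2)]
assert apply(A, w) == [-a for a in w]
```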

We find a vector perpendicular to both \( \rv{2,-2,1}\) and \(\rv{1,0,-2}\). This can be done by solving a system of linear equations; a faster method uses the *cross product*:

\[\rv{2,-2,1}\times \rv{1,0,-2} = \rv{4,5,2 }\] This should be an eigenvector of #L# with eigenvalue #-1#. Thus we find that the matrix #L_\beta# of #L# relative to the basis
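The cross product and its perpendicularity to both factors can be confirmed in a few lines (plain Python; the helper `cross` is ours):

```python
def cross(u, v):
    # cross product of two vectors in R^3
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

n = cross([2, -2, 1], [1, 0, -2])
assert n == [4, 5, 2]

# n is perpendicular to both factors:
assert sum(a*b for a, b in zip(n, [2, -2, 1])) == 0
assert sum(a*b for a, b in zip(n, [1, 0, -2])) == 0
```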

\[\beta = \basis{\rv{2,-2,1},\rv{1,0,-2},\rv{4,5,2 }}\] is the diagonal matrix with diagonal entries #1#, #-1#, #-1#. We conclude that the matrix of #L# (relative to the standard basis #\varepsilon#) is equal to

\[\begin{array}{rcl} L_{\varepsilon} &=& {}_\varepsilon I_\beta \,L_\beta \, {}_\beta I_\varepsilon\\

&=&\matrix{2 & 1 & 4 \\ -2 & 0 & 5 \\ 1 & -2 & 2 \\ }\,\matrix{1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \\ } \,\matrix{2 & 1 & 4 \\ -2 & 0 & 5 \\ 1 & -2 & 2 \\ }^{-1} \\

&=&\matrix{2 & -1 & -4 \\ -2 & 0 & -5 \\ 1 & 2 & -2 \\ }\, \matrix{{{2}\over{9}} & -{{2}\over{9}} & {{1}\over{9}} \\ {{1}\over{5}} & 0 & -{{2}\over{5}} \\ {{4}\over{45}} & {{1}\over{9}} & {{2}\over{45}} \\ } \\

&=& \matrix{-{{1}\over{9}} & -{{8}\over{9}} & {{4}\over{9}} \\ -{{8}\over{9}} & -{{1}\over{9}} & -{{4}\over{9}} \\ {{4}\over{9}} & -{{4}\over{9}} & -{{7}\over{9}} \\ }

\end{array}\]
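The whole change-of-basis computation can be reproduced exactly (a sketch with `fractions.Fraction`; the helpers `matmul` and `inverse3` are ours). The columns of `B` are the basis #\beta#, and `B · D · B^{-1}` recovers #A#:

```python
from fractions import Fraction as F

def matmul(X, Y):
    # product of two 3x3 matrices
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inverse3(M):
    # inverse of a 3x3 matrix via the adjugate: M^{-1} = adj(M) / det(M)
    def cof(i, j):
        r = [a for a in range(3) if a != i]
        c = [b for b in range(3) if b != j]
        minor = M[r[0]][c[0]] * M[r[1]][c[1]] - M[r[0]][c[1]] * M[r[1]][c[0]]
        return (-1) ** (i + j) * minor
    det = sum(M[0][j] * cof(0, j) for j in range(3))
    return [[cof(j, i) / det for j in range(3)] for i in range(3)]

# columns of B are the eigenbasis beta
B = [[F(2), F(1), F(4)],
     [F(-2), F(0), F(5)],
     [F(1), F(-2), F(2)]]
# L_beta: diagonal with entries 1, -1, -1
D = [[F(1), F(0), F(0)],
     [F(0), F(-1), F(0)],
     [F(0), F(0), F(-1)]]

L_eps = matmul(matmul(B, D), inverse3(B))

A = [[F(-1, 9), F(-8, 9), F(4, 9)],
     [F(-8, 9), F(-1, 9), F(-4, 9)],
     [F(4, 9), F(-4, 9), F(-7, 9)]]
assert L_eps == A
```

Exact rationals avoid the floating-point round-off that would otherwise make the final equality test fragile.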

