Orthogonal and symmetric maps: Classification of orthogonal maps
The theorem below gives criteria for when two orthogonal maps on a finite-dimensional inner product space are conjugate. It also shows that, if the two maps are conjugate, the conjugator can be chosen to be orthogonal.
Conjugate orthogonal maps Let #V# be an inner product space of finite dimension and suppose that #L# and #M# are orthogonal maps #V\to V#. Then the following statements are equivalent:
- There is an orthogonal map #X:V\to V# such that #M = X\, L\, X^{-1}#.
- There is an invertible linear map #Y: V\to V# such that #M = Y\, L\, Y^{-1}#.
- The characteristic polynomials of #L# and #M# are equal.
- The complex eigenvalues of #L# (with their multiplicities) coincide with those of #M#.
- The maps #L# and #M# have the same orthogonal Jordan normal form.
- There are orthonormal bases #\alpha# and #\beta# of #V# with the property that #L_\alpha = M_\beta#.
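The equivalence of statements 1 and 3 can be illustrated numerically. The sketch below (in NumPy; the planar rotations are chosen here purely as an illustration) checks that two rotations of the plane through opposite angles have the same characteristic polynomial and are indeed conjugate by a reflection, which is an orthogonal map.

```python
import numpy as np

def rot(phi):
    """The 2x2 rotation matrix through angle phi."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s], [s, c]])

phi = 0.7                 # an arbitrary sample angle
L = rot(phi)
M = rot(-phi)             # rotation through the opposite angle

# Statement 3: equal characteristic polynomials.
assert np.allclose(np.poly(L), np.poly(M))

# Statement 1: conjugate by an orthogonal map (here a reflection).
X = np.array([[1.0, 0.0],
              [0.0, -1.0]])
assert np.allclose(M, X @ L @ np.linalg.inv(X))
```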
As an example, consider the linear map #L:\mathbb{R}^3\to\mathbb{R}^3# determined by the matrix
\[
A=\matrix{{{2}\over{3}} & -{{1}\over{3}} & -{{2}\over{3}} \\ -{{1}\over{3}} & {{2}\over{3}} & -{{2}\over{3}} \\ {{2}\over{3}} & {{2}\over{3}} & {{1}\over{3}} \\ }
\]
The matrix #A# is orthogonal, so #L# is orthogonal as well. We first calculate the determinant of #A# and find #\det(L) =1#. Hence, the map #L# is a rotation about the axis spanned by an eigenvector with eigenvalue #1#.
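These two facts can be verified numerically (a minimal sketch in NumPy; the matrix #A# is simply re-entered from the example):

```python
import numpy as np

# The matrix A from the example.
A = np.array([[ 2, -1, -2],
              [-1,  2, -2],
              [ 2,  2,  1]]) / 3

# Orthogonality: the columns form an orthonormal system, i.e. A^T A = I.
assert np.allclose(A.T @ A, np.eye(3))

# Determinant 1, so L is a rotation rather than a reflection composed with one.
assert np.isclose(np.linalg.det(A), 1.0)
```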
We can find this eigenvector with eigenvalue #1# by solving the equation #(A -I_3)\vec{x}=\vec{0}#, that is, by solving a homogeneous system of linear equations with coefficient matrix \[{{1}\over{3}}\cdot \matrix{-1 & -1 & -2 \\ -1 & -1 & -2 \\ 2 & 2 & -2 \\ }\] The reduced echelon form of this matrix is \[\matrix{1 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ } \] from which it follows that the axis of rotation #E_{1}# is spanned by the vector \[\vec{b}_1 =\frac{1}{\sqrt{2}} \rv{ -1 , 1 , 0 } \] of length #1#.

The angle of rotation can be found by using the trace: \[1+2\cos(\varphi)={{5}\over{3}}\] This gives #\cos(\varphi)={{1}\over{3}}#, so \[\varphi =\arccos\left ({{1}\over{3}} \right)\]

The angle of rotation can also be found as follows. The plane perpendicular to the axis of rotation is \[{{y}\over{\sqrt{2}}}-{{x}\over{\sqrt{2}}}=0\] Choose a vector of length #1# contained in it, for example \[\vec{b}_2 =\rv{ 0 , 0 , 1 } \] Then \[A\, \vec{b}_2=\rv{ -{{2}\over{3}} , -{{2}\over{3}} , {{1}\over{3}} } \] Of course, this vector again lies in the plane \[{{y}\over{\sqrt{2}}}-{{x}\over{\sqrt{2}}}=0\] since the plane is invariant under #A#. The angle of rotation is the angle between #\vec{b}_2# and #A\, \vec{b}_2#. This value can be found using the inner product:
\[
\begin{array}{rcl}
\dotprod{\vec{b}_2}{(A \,\vec{b}_2)}&=&\norm{\vec{b}_2}\,\norm{A\, \vec{b}_2}\cos(\varphi)\\
{{1}\over{3}} &=&1\cdot1\cdot\cos(\varphi)\\
\end{array}
\] consistent with the earlier result #\cos(\varphi) ={{1}\over{3}}#.
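Both routes to the angle can be checked numerically (a NumPy sketch; #A#, #\vec{b}_1#, and #\vec{b}_2# are re-entered from the text):

```python
import numpy as np

A = np.array([[ 2, -1, -2],
              [-1,  2, -2],
              [ 2,  2,  1]]) / 3

# The axis: a unit eigenvector for eigenvalue 1.
w, V = np.linalg.eig(A)
axis = np.real(V[:, np.isclose(w, 1)]).ravel()
axis /= np.linalg.norm(axis)
b1 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2)
assert np.allclose(axis, b1) or np.allclose(axis, -b1)

# The angle via the trace: tr(A) = 1 + 2 cos(phi).
cos_phi = (np.trace(A) - 1) / 2
assert np.isclose(cos_phi, 1/3)

# The same value via the inner product with b2 = (0, 0, 1).
b2 = np.array([0.0, 0.0, 1.0])
assert np.isclose(b2 @ (A @ b2), cos_phi)
```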
Geometrically, it is clear that the result does not depend on which nonzero vector in the plane of rotation you take, but this can also be verified by calculation.
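Such a verification by calculation can be sketched as follows (in NumPy, with randomly chosen vectors in the plane of rotation; the seed and sample count are arbitrary):

```python
import numpy as np

A = np.array([[ 2, -1, -2],
              [-1,  2, -2],
              [ 2,  2,  1]]) / 3

# The plane of rotation is spanned by (1, 1, 0)/sqrt(2) and (0, 0, 1).
u = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v = np.array([0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
for _ in range(5):
    a, b = rng.normal(size=2)
    x = a * u + b * v               # a random nonzero vector in the plane
    # A preserves lengths, so cos(phi) = <x, A x> / <x, x>.
    cos_phi = (x @ (A @ x)) / (x @ x)
    assert np.isclose(cos_phi, 1/3)
```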
According to the theory, the orthogonal Jordan normal form of this matrix is
\[J = \matrix{1&0&0\\ 0&\cos(\varphi)&-\sin(\varphi)\\ 0&\sin(\varphi)&\cos(\varphi)}= \matrix{1&0&0\\ 0&{{1}\over{3}}&-\frac{2}{3}\sqrt{2}\\ 0&\frac{2}{3}\sqrt{2}&{{1}\over{3}}}\]

We finish by determining an orthonormal basis #\beta# with respect to which this is the matrix of #L# (that is to say, such that \(L_\beta = {}_\beta I_\varepsilon\, A\, {}_\varepsilon I_\beta =J\)). We start with the orthonormal system #\basis{\vec{b}_1,\vec{b}_2}#. We still need to find a vector of length #1# which is perpendicular to the first two basis vectors: \[\vec{a}_3 =\rv{ {{1}\over{\sqrt{2}}} , {{1}\over{\sqrt{2}}} , 0 } \]

For the orientation of the angle #\varphi# in the orthogonal Jordan normal form, we determine the inner product #\dotprod{(A\vec{b}_2)}{\vec{a}_3}#. This is equal to \[\dotprod{\rv{ -{{2}\over{3}} , -{{2}\over{3}} , {{1}\over{3}} } }{\rv{ {{1}\over{\sqrt{2}}} , {{1}\over{\sqrt{2}}} , 0 } } =-\frac{2}{3}\sqrt{2}\] Because this number is negative, and we need the value to be #\sin(\varphi)#, we take #\vec{b}_3# to be the vector #-\vec{a}_3#. Indeed, denoting element #(i,j)# of #J# by #J_{ij}#, we find \[\dotprod{(A\,\vec{b}_2)}{\vec{b}_3}=\dotprod{\left(J_{12}\vec{b}_1+J_{22}\vec{b}_2+J_{32}\vec{b}_3\right)}{\vec{b}_3}=J_{32}=\sin(\varphi)\] where we used the orthonormality of #\beta#.

This determines #\beta =\basis{\vec{b}_1,\vec{b}_2,\vec{b}_3}# completely. The basis consists of the columns of the matrix \[{}_\varepsilon I_\beta =\matrix{-{{1}\over{\sqrt{2}}} & 0 & -{{1}\over{\sqrt{2}}} \\ {{1}\over{\sqrt{2}}} & 0 & -{{1}\over{\sqrt{2}}} \\ 0 & 1 & 0 \\ }\] The following calculation shows that this answer is correct:
\[\begin{array}{rcl} {}_\beta I_\varepsilon \, A\, {}_\varepsilon I_\beta &=&{\matrix{-{{1}\over{\sqrt{2}}} & 0 & -{{1}\over{\sqrt{2}}} \\ {{1}\over{\sqrt{2}}} & 0 & -{{1}\over{\sqrt{2}}} \\ 0 & 1 & 0 \\ }}^{-1}\, \, \matrix{{{2}\over{3}} & -{{1}\over{3}} & -{{2}\over{3}} \\ -{{1}\over{3}} & {{2}\over{3}} & -{{2}\over{3}} \\ {{2}\over{3}} & {{2}\over{3}} & {{1}\over{3}} \\ } \, \matrix{-{{1}\over{\sqrt{2}}} & 0 & -{{1}\over{\sqrt{2}}} \\ {{1}\over{\sqrt{2}}} & 0 & -{{1}\over{\sqrt{2}}} \\ 0 & 1 & 0 \\ }\\
&=& \matrix{-{{1}\over{\sqrt{2}}} & {{1}\over{\sqrt{2}}} & 0 \\ 0 & 0 & 1 \\ -{{1}\over{\sqrt{2}}} & -{{1}\over{\sqrt{2}}} & 0 \\ } \, \matrix{-{{1}\over{\sqrt{2}}} & -{{2}\over{3}} & -{{1}\over{3\cdot \sqrt{2}}} \\ {{1}\over{\sqrt{2}}} & -{{2}\over{3}} & -{{1}\over{3\cdot \sqrt{2}}} \\ 0 & {{1}\over{3}} & -{{2^{{{3}\over{2}}}}\over{3}} \\ }\\
&=& \matrix{1&0&0\\ 0&{{1}\over{3}}&-\frac{2}{3}\sqrt{2}\\ 0&\frac{2}{3}\sqrt{2}&{{1}\over{3}}}
\end{array}\]
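The orientation argument and this final change of basis can be replayed numerically (a NumPy sketch; all data are re-entered from the text):

```python
import numpy as np

A = np.array([[ 2, -1, -2],
              [-1,  2, -2],
              [ 2,  2,  1]]) / 3
b1 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2)
b2 = np.array([0.0, 0.0, 1.0])
a3 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)

# Orientation test: (A b2) . a3 is negative, so taking b3 = -a3
# makes the inner product equal to sin(phi) > 0.
assert (A @ b2) @ a3 < 0
b3 = -a3
sin_phi = 2 * np.sqrt(2) / 3
assert np.isclose((A @ b2) @ b3, sin_phi)

# The basis beta as columns of B = _eps I_beta; B is orthogonal,
# so its inverse is its transpose.
B = np.column_stack([b1, b2, b3])
assert np.allclose(B.T @ B, np.eye(3))

# Conjugating A by B yields the orthogonal Jordan normal form J.
J = np.array([[1.0,     0.0,      0.0],
              [0.0,     1/3, -sin_phi],
              [0.0, sin_phi,      1/3]])
assert np.allclose(B.T @ A @ B, J)
```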