### Orthogonal and symmetric maps: Isometries

### Equivalence of isometries

The question of when two linear isometries are equal up to a choice of basis is answered by the following theorem, which is comparable to the theorem *Matrix equivalence* for general linear maps. In this case, any two linear isometries between the same pair of finite-dimensional inner product spaces are matrix equivalent, and the corresponding invertible matrices can be chosen to be orthogonal.

**Matrix equivalence for linear isometries**

Let #V# and #W# be finite-dimensional inner product spaces of dimension #m# and #n#, respectively. Suppose that #L# is a linear isometry #V\to W#.

- We have #n\ge m#.
- If #\alpha# is an orthonormal basis of #V#, then there is a basis #\beta# of #W# such that the matrix #{}_\beta L_\alpha# is equal to the #(n\times m)#-matrix \(\matrix{I_m\\ 0}\).
- For each linear isometry #M:V\to W# there is an orthogonal map #Y:W\to W# such that #L=Y\,M#.
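The second statement can be made concrete for #V=\mathbb{R}^m# and #W=\mathbb{R}^n#: the matrix of a linear isometry has orthonormal columns, and extending those columns to an orthonormal basis of #\mathbb{R}^n# gives an orthogonal matrix #Y# with #A = Y\,\matrix{I_m\\0}#. The following sketch in Python with NumPy (not part of the course text; the function name `extend_to_orthogonal` is ours, and the added columns are unique only up to the choice of complement) illustrates the construction:

```python
import numpy as np

def extend_to_orthogonal(A):
    """Extend an (n x m) matrix A with orthonormal columns to an
    orthogonal (n x n) matrix Y whose first m columns are those of A.
    The appended columns are an orthonormal basis of the orthogonal
    complement of col(A), read off from the left singular vectors."""
    n, m = A.shape
    U, _, _ = np.linalg.svd(A)         # U is an (n x n) orthogonal matrix
    return np.hstack([A, U[:, m:]])    # U[:, m:] spans col(A)'s complement

# A linear isometry R^2 -> R^3: the columns are orthonormal.
A = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Y = extend_to_orthogonal(A)
J = np.vstack([np.eye(2), np.zeros((1, 2))])  # the (3 x 2) matrix (I_2; 0)

print(np.allclose(Y.T @ Y, np.eye(3)))  # True: Y is orthogonal
print(np.allclose(Y @ J, A))            # True: A = Y J
```

Since #J# simply selects the first #m# columns of #Y#, placing the columns of #A# first automatically yields #A = Y\,J#.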

According to *Matrix equivalence for linear isometries*, the linear isometry #L:\mathbb{R}^2\to\mathbb{R}^3# given by the matrix \[A = \matrix{0 & 0 \\ 1 & 0 \\ 0 & 1 \\ }\] is matrix equivalent to the matrix \(J =\matrix{1 & 0 \\ 0 & 1 \\ 0 & 0 \\ }\). There is even an orthogonal matrix #Y# such that #A = Y\, J#. Determine such a matrix.

#Y = \matrix{0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ }#

If we begin with the standard basis of #\mathbb{R}^2#, then the two columns of #A# form an orthonormal system in #\mathbb{R}^3#. We extend this pair with a third vector to an orthonormal basis of #\mathbb{R}^3# and append this vector as a column to the matrix #A# in order to obtain the matrix #Y#:

\[ Y =\matrix{0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ }\] The matrix #A# consists of the first two columns of #Y# and thus can be written as \[A =\matrix{0 & 0 \\ 1 & 0 \\ 0 & 1 \\ } =\matrix{0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ }\, \matrix{1 & 0 \\ 0 & 1 \\ 0 & 0 \\ }= Y\,J \]
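As a quick numerical check (an illustrative Python/NumPy snippet, not part of the course material), one can verify that this #Y# is orthogonal and that #A = Y\,J#:

```python
import numpy as np

# The matrices from the worked example above.
A = np.array([[0, 0],
              [1, 0],
              [0, 1]], dtype=float)
J = np.array([[1, 0],
              [0, 1],
              [0, 0]], dtype=float)
Y = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)

# Y is orthogonal: its transpose is its inverse.
print(np.allclose(Y.T @ Y, np.eye(3)))  # True
# Multiplying by J selects the first two columns of Y, which give A.
print(np.allclose(Y @ J, A))            # True
```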
