### Invariant subspaces of linear maps: Eigenvalues and eigenvectors

### Determining eigenvalues and eigenvectors

The following theorem shows that we can find a maximal set of linearly independent eigenvectors by finding these for all eigenvalues separately.

Independence of eigenvectors for different eigenvalues

Let #V# be a vector space and # L :V\rightarrow V# a linear map. Suppose that #\lambda_1,\ldots ,\lambda_n# are pairwise distinct eigenvalues of #L#. Suppose further that, for #i=1,\ldots,n#, #\alpha_i# is a set of linearly independent eigenvectors in the eigenspace # E_i = \ker\left(L-\lambda_i\,I_V\right)#. Then the union of #\alpha_1,\ldots,\alpha_n# is linearly independent.
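As a quick numerical illustration of this statement (the matrix below is an assumed example, not part of the theorem): eigenvectors taken for distinct eigenvalues, stacked as columns, form a matrix of full rank.

```python
import numpy as np

# Assumed example matrix with the distinct eigenvalues 2 and 5
# (characteristic polynomial: (lambda - 2)(lambda - 5)).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns are eigenvectors

# The eigenvalues are distinct, so the union of the (one-element)
# independent sets per eigenvalue is linearly independent: full rank.
assert len(set(np.round(eigenvalues, 6))) == 2
assert np.linalg.matrix_rank(eigenvectors) == 2
```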

Determination of eigenvalues and eigenvectors can thus be carried out as follows.

Let #V# be an #n#-dimensional vector space, where #n# is a natural number, and let #L:V\to V# be a linear map. The eigenvalues of #L# and a maximal set of linearly independent eigenvectors can be found as follows:

- Set up the matrix #A = L_{\alpha}# of the linear map # L :V \rightarrow V# with respect to a basis #\alpha# of choice.
- Set up the *characteristic equation* #\det (A-\lambda \cdot I_n)=0# and solve it.
- For each eigenvalue #\lambda#, solve the system #(A-\lambda\cdot I_n)\vec{v}=\vec{0}#. Each solution provides the coordinates of a vector from #E_{\lambda}#; together the solutions constitute the space #E_{\lambda}#. Choose a basis of each solution space. The union of these bases over all of the eigenvalues is a maximal set of linearly independent eigenvectors, given in terms of coordinates with respect to the basis #\alpha#.
- Go back, if desired, from coordinates to vectors in #V#.
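The steps above can also be sketched numerically; a minimal illustration with NumPy, where the matrix is an assumed example and `np.linalg.eig` performs the second and third steps in one call:

```python
import numpy as np

# Step 1 (assumed example): the matrix of L with respect to some basis.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Steps 2-3: the eigenvalues and a matrix whose columns are
# corresponding (normalized) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column satisfies A v = lambda v up to floating-point error.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```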

Some of the examples below show that a given linear transformation # L :V \rightarrow V# does not always have a basis of eigenvectors, so # L# cannot always be represented by a diagonal matrix. Yet, we can often find a fairly simple form of the matrix. *Later* we will discuss this in greater detail.

As a worked example, consider the matrix \(A = \matrix{-27 & 8\\ -66 & 19}\). We will find:

- List of eigenvalues: \([-3,-5]\)
- Matrix whose columns are corresponding eigenvectors: \(\matrix{1&-4\\ 3 & -11}\)

We start by solving the *characteristic equation* \(\det(A-\lambda\cdot I)=0\) of the matrix \(A\). Calculating and factoring the characteristic polynomial gives

\[ \begin{array}{rcl}
\det(A-\lambda I) &=& \left\vert \begin{array}{cc} -27-\lambda & 8 \\ -66 & 19-\lambda \end{array} \right\vert\\
&=& (-27-\lambda)(19-\lambda)-8\cdot(-66) \\
&=& \lambda^2+8\lambda+15 \\
&=& \left(\lambda+3\right)\cdot \left(\lambda+5\right)
\end{array}\] This means that the eigenvalues are \(\lambda_1 = -3\) and \(\lambda_2 = -5\).
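A quick sanity check of this factorization, sketched in plain Python arithmetic:

```python
# det(A - lambda I) for A = [[-27, 8], [-66, 19]], as expanded above.
def char_poly(lam):
    return (-27 - lam) * (19 - lam) - 8 * (-66)

# Both claimed eigenvalues are roots of the characteristic polynomial...
assert char_poly(-3) == 0
assert char_poly(-5) == 0

# ...and the factored form (lambda + 3)(lambda + 5) agrees at another point.
assert char_poly(7) == (7 + 3) * (7 + 5)
```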

Next we calculate the corresponding eigenvector for each eigenvalue.

For \(\lambda_1 = -3\) we determine the kernel of \(A +3\cdot I_2\) by *row reducing* the coefficient matrix:

\[\begin{array}{rcl}
A +3\cdot I_2 &=& \matrix{-24 & 8 \\ -66 & 22}\\
&\sim& \matrix{-66 & 22 \\ -24 & 8}\\
&\sim& \matrix{1 & -{{1}\over{3}} \\ -24 & 8}\\
&\sim& \matrix{1 & -{{1}\over{3}} \\ 0 & 0}
\end{array}\] Thus the eigenspace for \(\lambda_1 = -3\) equals \(\linspan{ \cv{1\\ 3}} \).
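That \(\cv{1\\ 3}\) is indeed an eigenvector for \(\lambda_1 = -3\) can be checked directly; a small sketch in plain Python:

```python
# Verify A v = -3 v for v = (1, 3) with A = [[-27, 8], [-66, 19]].
A = [[-27, 8], [-66, 19]]
v = [1, 3]

# Matrix-vector product computed entry by entry.
Av = [A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1]]

assert Av == [-3 * v[0], -3 * v[1]]  # Av = (-3, -9) = -3 * (1, 3)
```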

For \(\lambda_2 = -5\) we determine the kernel of \(A +5\cdot I_2\) by row reducing the coefficient matrix:

\[\begin{array}{rcl}
A +5\cdot I_2 &=& \matrix{-22 & 8 \\ -66 & 24}\\
&\sim& \matrix{-66 & 24 \\ -22 & 8}\\
&\sim& \matrix{1 & -{{4}\over{11}} \\ -22 & 8}\\
&\sim& \matrix{1 & -{{4}\over{11}} \\ 0 & 0}
\end{array}\] Thus the eigenspace for \(\lambda_2 = -5\) equals \(\linspan{ \cv{-4\\-11}}\).

We conclude that the list of eigenvalues is #\rv{-3,-5}# and that a matrix whose columns are corresponding eigenvectors is \(\matrix{1&-4\\ 3 & -11}\).
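The conclusion can be verified in one stroke: with \(P\) the matrix of eigenvectors and \(D\) the diagonal matrix of eigenvalues, the relation \(AP = PD\) must hold. A short check with NumPy:

```python
import numpy as np

A = np.array([[-27, 8], [-66, 19]])
P = np.array([[1, -4], [3, -11]])   # columns: the eigenvectors found above
D = np.diag([-3, -5])               # the corresponding eigenvalues

# A P = P D holds exactly when column i of P is an eigenvector of A
# for the eigenvalue in position (i, i) of D.
assert np.array_equal(A @ P, P @ D)
```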
