### Invariant subspaces of linear maps: Eigenvalues and eigenvectors

### Determining eigenvalues and eigenvectors

The following theorem shows that we can find a maximal set of linearly independent eigenvectors by finding such a set for each eigenvalue separately.

**Independence of eigenvectors for different eigenvalues**

Let #V# be a vector space and # L :V\rightarrow V# a linear map. Suppose that #\lambda_1,\ldots ,\lambda_n# are mutually distinct eigenvalues of #L#. Suppose further that, for #i=1,\ldots,n#, #\alpha_i# is a set of linearly independent eigenvectors in the eigenspace # E_i = \ker{\left(L-\lambda_i\,I_V\right)}#. Then the union of #\alpha_1,\ldots,\alpha_n# is linearly independent.
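The theorem can be illustrated numerically (a sketch using numpy; the matrix below is a made-up example, not taken from the text): eigenvectors belonging to distinct eigenvalues, stacked as columns, form a matrix of full rank.

```python
import numpy as np

# Hypothetical example matrix with two distinct eigenvalues, 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors

# The eigenvalues are distinct, so by the theorem the eigenvectors are
# linearly independent: the matrix having them as columns has full rank.
print(sorted(np.round(eigvals).astype(int).tolist()))  # [2, 3]
print(np.linalg.matrix_rank(eigvecs))                  # 2
```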

Determination of eigenvalues and eigenvectors can thus be carried out as follows.

Let #V# be an #n#-dimensional vector space, where #n# is a natural number, and let #L:V\to V# be a linear map. The eigenvalues of #L# and a maximal set of linearly independent eigenvectors can be found as follows:

- Set up the matrix #A = L_{\alpha}# of the linear map # L :V \rightarrow V# with respect to a basis #\alpha# of choice.
- Set up the *characteristic equation* #\det (A-\lambda \cdot I_n)=0# and solve it; the solutions are the eigenvalues of #L#.
- For each eigenvalue #\lambda#, solve the system #(A-\lambda\cdot I_n)\vec{v}=\vec{0}#. Each solution provides the coordinates of a vector of #E_{\lambda}#, and together the solutions constitute the whole space #E_{\lambda}#. Choose a basis of this solution space. The union of these bases over all eigenvalues is a maximal set of linearly independent eigenvectors, given in terms of coordinates with respect to the basis #\alpha#.
- If desired, translate the coordinates back into vectors of #V#.
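The steps above can be sketched numerically (a sketch using numpy, applied to the matrix of the example worked out below; the tolerance `1e-8` for detecting zero singular values is an assumption):

```python
import numpy as np

# Matrix of L with respect to the chosen basis (the example worked out below).
A = np.array([[-1.0, -12.0],
              [2.0, 9.0]])

# Step 2: set up and solve the characteristic equation det(A - lambda*I) = 0.
char_poly = np.poly(A)             # coefficients of det(A - lambda*I), highest degree first
eigenvalues = np.roots(char_poly)  # its roots are the eigenvalues

# Step 3: for each eigenvalue, solve (A - lambda*I) v = 0. The kernel can be
# read off from the SVD: right-singular vectors belonging to (numerically)
# zero singular values span the eigenspace.
for lam in sorted(eigenvalues.real, reverse=True):
    M = A - lam * np.eye(2)
    _, s, Vt = np.linalg.svd(M)
    kernel = Vt[s < 1e-8]          # rows form a basis of the eigenspace
    print(round(lam), np.round(kernel, 3))
```

The printed basis vectors are unit-length scalar multiples of the hand-computed eigenvectors.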

Some of the examples below show that a given linear transformation # L :V \rightarrow V# does not always have a basis of eigenvectors, so # L# cannot always be represented by a diagonal matrix. Yet, we can often find a fairly simple form of the matrix. Later we will discuss this in greater detail.

- List of eigenvalues: \([5,3]\)
- Matrix whose columns are corresponding eigenvectors: \(\matrix{-2&-3\\ 1 & 1}\)

We start by solving the *characteristic equation* \(\det(A-\lambda\cdot I)=0\) of the matrix \(A = \matrix{-1 & -12\\ 2 & 9}\). Calculating and factoring the characteristic polynomial gives

\[ \begin{array}{rcl}
\det(A-\lambda I) &=& \left\vert \begin{array}{cc} -1-\lambda & -12 \\ 2 & 9-\lambda \end{array} \right\vert\\
&=& (-1-\lambda)(9-\lambda)+12\cdot2 \\
&=& \lambda^2-8\lambda+15 \\
&=& \left(\lambda-5\right)\cdot \left(\lambda-3\right)
\end{array}\]
This means that the eigenvalues are \(\lambda_1 = 5\) and \(\lambda_2 = 3\).
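As a sanity check (a sketch using numpy; for a square matrix, `np.poly` returns the coefficients of its characteristic polynomial), the computation can be confirmed like this:

```python
import numpy as np

A = np.array([[-1.0, -12.0],
              [2.0, 9.0]])
coeffs = np.poly(A)  # characteristic polynomial of A, highest degree first
print(np.round(coeffs).tolist())                       # [1.0, -8.0, 15.0]
print(sorted(np.round(np.roots(coeffs)).tolist()))     # [3.0, 5.0]
```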

Next we calculate the corresponding eigenvector for each eigenvalue.

For \(\lambda_1 = 5\) we determine the kernel of \(A -5\cdot I_2\) by *row reducing* the coefficient matrix:

\[\begin{array}{rcl}
A -5\cdot I_2 &=& \matrix{-6 & -12 \\ 2 & 4}\\
&\sim& \matrix{1 & 2 \\ 2 & 4}\\
&\sim& \matrix{1 & 2 \\ 0 & 0}
\end{array}\]
Thus the eigenspace for \(\lambda_1 = 5\) equals \(\linspan{ \cv{-2\\ 1}} \).

For \(\lambda_2 = 3\) we determine the kernel of \(A -3\cdot I_2\) by row reducing the coefficient matrix:

\[\begin{array}{rcl}
A -3\cdot I_2 &=& \matrix{-4 & -12 \\ 2 & 6}\\
&\sim& \matrix{1 & 3 \\ 2 & 6}\\
&\sim& \matrix{1 & 3 \\ 0 & 0}
\end{array}\]
Thus the eigenspace for \(\lambda_2 = 3\) equals \(\linspan{ \cv{-3\\1}}\).
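Both eigenspace computations can be double-checked directly (a sketch using numpy): applying \(A-\lambda\cdot I_2\) to the claimed basis vector must give the zero vector.

```python
import numpy as np

A = np.array([[-1.0, -12.0],
              [2.0, 9.0]])
v1 = np.array([-2.0, 1.0])  # claimed basis of the eigenspace for lambda = 5
v2 = np.array([-3.0, 1.0])  # claimed basis of the eigenspace for lambda = 3

print((A - 5 * np.eye(2)) @ v1)  # [0. 0.]
print((A - 3 * np.eye(2)) @ v2)  # [0. 0.]
```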

We conclude that the list of eigenvalues is #\rv{5,3}# and that a matrix whose columns are corresponding eigenvectors is \(\matrix{-2&-3\\ 1 & 1}\).
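Finally, the answer can be cross-checked against numpy's eigen-solver (a sketch; note that `np.linalg.eig` normalizes eigenvectors to unit length, so its columns are scalar multiples of the hand-computed ones and only the component ratios can be compared):

```python
import numpy as np

A = np.array([[-1.0, -12.0],
              [2.0, 9.0]])
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

order = np.argsort(eigvals)[::-1]     # reorder the eigenvalues as [5, 3]
print(np.round(eigvals[order]).tolist())                         # [5.0, 3.0]
# First over second component reproduces the hand-computed directions.
print(np.round(eigvecs[0, order] / eigvecs[1, order]).tolist())  # [-2.0, -3.0]
```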
