On-Line Geometric Modeling Notes
EIGENVALUES AND EIGENVECTORS


In engineering applications, eigenvalue problems are among the most important problems connected with matrices. In this section we give the basic definitions of eigenvalues and eigenvectors and develop some basic results about their use.



What are Eigenvalues and Eigenvectors?

Let $A$ be an $n \times n$ matrix and consider the vector equation

$\displaystyle A \vec{v} = \lambda \vec{v}$

where $\lambda$ is a scalar value.

It is clear that if $\vec{v} = \vec{0}$, we have a solution for any value of $\lambda$. A value of $\lambda$ for which the equation has a solution with $\vec{v} \neq \vec{0}$ is called an eigenvalue or characteristic value of the matrix $A$. The corresponding solutions $\vec{v} \neq \vec{0}$ are called eigenvectors or characteristic vectors of $A$. In this problem, we are looking for nonzero vectors that, when multiplied by the matrix $A$, give a scalar multiple of themselves.

The set of eigenvalues of $A$ is commonly called the spectrum of $A$, and the largest of the absolute values of the eigenvalues is called the spectral radius of $A$.
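
As a concrete numerical illustration (ours, not part of the original notes), the following NumPy sketch computes the spectrum and spectral radius of a small example matrix:

import numpy as np

# An arbitrary 2 x 2 example matrix (ours).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)             # the spectrum of A
spectral_radius = np.max(np.abs(eigenvalues))  # largest |eigenvalue|

print(eigenvalues)       # [3. 1.] (order may vary)
print(spectral_radius)   # 3.0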


How Do We Calculate the Eigenvalues?

It is easy to see that the equation

$\displaystyle A \vec{v} = \lambda \vec{v}$

can be rewritten as

$\displaystyle (A - \lambda I)\,\vec{v} = \vec{0}$

where $I$ is the identity matrix. A homogeneous system of this form has a solution with $\vec{v} \neq \vec{0}$ only if the coefficient matrix is singular (see Cramer's Rule) - that is, only if

$\displaystyle \det(A - \lambda I) = 0$

Since the left-hand side of this equation is a polynomial in $\lambda$, commonly called the characteristic polynomial of $A$, we only need to find the roots of this polynomial to find the eigenvalues.
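
For example (an illustration of ours, not from the original notes), for the matrix

$\displaystyle A =
\left[
\begin{array}{cc}
2 & 1 \\
1 & 2
\end{array}
\right]
$

the characteristic polynomial is

$\displaystyle \det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 3)(\lambda - 1)$

so the eigenvalues of $A$ are $\lambda_1 = 3$ and $\lambda_2 = 1$.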

We note that to obtain a complete set of eigenvalues, one may have to extend the scope of this discussion to the field of complex numbers, since the characteristic polynomial of a real matrix may have complex roots.
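
For example (again our illustration), the real rotation matrix

$\displaystyle A =
\left[
\begin{array}{cc}
0 & -1 \\
1 & 0
\end{array}
\right]
$

has characteristic polynomial $\lambda^2 + 1$, so its eigenvalues $\lambda = \pm i$ are purely imaginary even though every entry of $A$ is real.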


How Do We Calculate the Eigenvectors?

The eigenvalues must be determined first. Once these are known, the corresponding eigenvectors can be calculated directly from the linear system

$\displaystyle (A - \lambda I)\,\vec{v} = \vec{0}$

It should be noted that if $\vec{v}$ is an eigenvector, then so is $k \vec{v}$ for any nonzero scalar $k$.
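
Continuing the $2 \times 2$ example above (ours, not from the original notes): for $\lambda_1 = 3$ the system $(A - 3I)\,\vec{v} = \vec{0}$ becomes

$\displaystyle
\left[
\begin{array}{cc}
-1 & 1 \\
1 & -1
\end{array}
\right]
\left[
\begin{array}{c}
v_1 \\
v_2
\end{array}
\right]
=
\left[
\begin{array}{c}
0 \\
0
\end{array}
\right]
$

which forces $v_1 = v_2$, so $\vec{v}_1 = (1, 1)^T$ (or any nonzero multiple of it) is an eigenvector for $\lambda_1 = 3$. The same computation with $\lambda_2 = 1$ gives $\vec{v}_2 = (1, -1)^T$.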


Right Eigenvectors

Given an eigenvalue $\lambda$, an eigenvector $\vec{r}$ that satisfies

$\displaystyle A \vec{r} = \lambda \vec{r}$

is sometimes called a right eigenvector for the matrix $A$ corresponding to the eigenvalue $\lambda$. If $\lambda_1, \lambda_2, \ldots, \lambda_r$ are the eigenvalues and $\vec{r}_1, \vec{r}_2, \ldots, \vec{r}_r$ are the corresponding right eigenvectors, then it is easy to see that the set of right eigenvectors forms a basis of a vector space, since eigenvectors corresponding to distinct eigenvalues are linearly independent. If this vector space is of dimension $n$, then we can construct an $n \times n$ matrix $R$ whose columns are the components of the right eigenvectors, and this matrix has the property that

$\displaystyle A R = R \Lambda$

where $\Lambda$ is the diagonal matrix

$\displaystyle \Lambda =
\left[
\begin{array}{cccc}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{array}
\right]
$

whose diagonal elements are the eigenvalues. By appropriate numbering of the eigenvalues and eigenvectors, it is possible to arrange the columns of the matrix $R$ so that $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$.
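
The relation $A R = R \Lambda$ is easy to verify numerically. A minimal NumPy sketch (ours), reusing the example matrix from earlier:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding right eigenvectors.
values, R = np.linalg.eig(A)

# Renumber so that lambda_1 >= lambda_2 >= ... >= lambda_n
# (the eigenvalues of this example are real).
order = np.argsort(values)[::-1]
values, R = values[order], R[:, order]

Lambda = np.diag(values)
assert np.allclose(A @ R, R @ Lambda)   # A R = R Lambda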


Left Eigenvectors

A vector $\vec{l}$ such that

$\displaystyle \vec{l}^{\,T} A = \lambda \vec{l}^{\,T}$

is called a left eigenvector for $A$ corresponding to the eigenvalue $\lambda$. If $\lambda_1, \lambda_2, \ldots, \lambda_r$ are the eigenvalues and $\vec{l}_1, \vec{l}_2, \ldots, \vec{l}_r$ are the corresponding left eigenvectors, then it is easy to see that the set of left eigenvectors forms a basis of a vector space. If this vector space is of dimension $n$, then we can construct an $n \times n$ matrix $L$ whose rows are the components of the left eigenvectors, and this matrix has the property that

$\displaystyle L A = \Lambda L$

It is possible to choose the left eigenvectors $\vec{l}_1, \vec{l}_2, \ldots$ and right eigenvectors $\vec{r}_1, \vec{r}_2, \ldots$ so that

$\displaystyle \vec{l}_i \cdot \vec{r}_j =
\begin{cases}
1 & \text{if } i = j \\
0 & \text{otherwise}
\end{cases}
$

This is easily done if we define $ L = R^{-1}$ and define the components of the left eigenvectors to be the elements of the respective rows of $ L$. Beginning with $ A R = R \Lambda$ and multiplying both sides on the left by $ R^{-1}$, we obtain

$\displaystyle R^{-1} A R = \Lambda$

and multiplying on the right by $ R^{-1}$, we have

$\displaystyle R^{-1} A = \Lambda R^{-1}$

which implies that any row of $ R^{-1}$ satisfies the properties of a left eigenvector.
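
This, too, is easy to check numerically; a short NumPy sketch (ours), again with the example matrix from earlier:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

values, R = np.linalg.eig(A)
Lambda = np.diag(values)

# Take the rows of R^{-1} as the left eigenvectors.
L = np.linalg.inv(R)

assert np.allclose(L @ A, Lambda @ L)   # L A = Lambda L
assert np.allclose(L @ R, np.eye(2))    # l_i . r_j = 1 if i = j, else 0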


Diagonalization of a Matrix

Given an $n \times n$ matrix $A$, we say that $A$ is diagonalizable if there is an invertible matrix $X$ so that

$\displaystyle X^{-1} A X = \Lambda$

where

$\displaystyle \Lambda =
\left[
\begin{array}{cccc}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{array}
\right]
$

It is clear from the above discussion that if all the eigenvalues are real and distinct, then we can use the matrix of right eigenvectors $R$ as $X$.
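
A final NumPy sketch (ours) diagonalizes the example matrix and shows one standard payoff of diagonalization: powers of $A$ reduce to powers of the diagonal entries, since $A^k = X \Lambda^k X^{-1}$.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The eigenvalues of this example are real and distinct,
# so the matrix of right eigenvectors serves as X.
values, X = np.linalg.eig(A)
Lambda = np.diag(values)

assert np.allclose(np.linalg.inv(X) @ A @ X, Lambda)

# A^k = X Lambda^k X^{-1}; compare against repeated multiplication.
A5 = X @ np.diag(values**5) @ np.linalg.inv(X)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))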




All contents copyright (c) ... University of California, Davis.
All rights reserved.


Ken Joy
2000-11-28