
Eigenvector

A right eigenvector satisfies

\begin{displaymath}
{\hbox{\sf A}}{\bf X}=\lambda {\bf X},
\end{displaymath} (1)

where ${\bf X}$ is a column Vector. The right Eigenvalues therefore satisfy
\begin{displaymath}
\vert{\hbox{\sf A}}-\lambda{\hbox{\sf I}}\vert=0.
\end{displaymath} (2)
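As a quick numerical check of (1) and (2), a minimal sketch using NumPy's np.linalg.eig, which returns right eigenpairs (the matrix here is an arbitrary choice):

```python
import numpy as np

# An arbitrary example matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the right eigenvectors
# (one eigenvector per column)
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, x in zip(eigenvalues, eigenvectors.T):
    # A X = lambda X  (equation 1)
    assert np.allclose(A @ x, lam * x)
    # |A - lambda I| = 0  (equation 2)
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
```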

A left eigenvector satisfies
\begin{displaymath}
{\bf X}{\hbox{\sf A}} =\lambda_L {\bf X},
\end{displaymath} (3)

where ${\bf X}$ is a row Vector, so
\begin{displaymath}
({\bf X}{\hbox{\sf A}})^{\rm T} = \lambda_L {\bf X}^{\rm T}
\end{displaymath} (4)


\begin{displaymath}
{\hbox{\sf A}}^{\rm T}{\bf X}^{\rm T} = \lambda_L {\bf X}^{\rm T},
\end{displaymath} (5)

where ${\bf X}^{\rm T}$ is the transpose of ${\bf X}$. The left Eigenvalues satisfy
\begin{displaymath}
\vert{\hbox{\sf A}}^{\rm T}-\lambda_L{\hbox{\sf I}}\vert
= \vert({\hbox{\sf A}}-\lambda_L{\hbox{\sf I}})^{\rm T}\vert
= \vert{\hbox{\sf A}}-\lambda_L{\hbox{\sf I}}\vert = 0,
\end{displaymath} (6)

(since $\vert{\hbox{\sf A}}\vert=\vert{\hbox{\sf A}}^{\rm T}\vert$) where $\vert{\hbox{\sf A}}\vert$ is the Determinant of A. But this is the same equation satisfied by the right Eigenvalues, so the left and right Eigenvalues are the same. Let ${\bf X}_R$ be a Matrix formed by the columns of the right eigenvectors and ${\bf X}_L$ be a Matrix formed by the rows of the left eigenvectors. Let
\begin{displaymath}
{\hbox{\sf D}}\equiv \left[{\matrix{\lambda_1 & \cdots & 0\cr \vdots & \ddots & \vdots\cr 0 & \cdots & \lambda_n}}\right].
\end{displaymath} (7)

Then
\begin{displaymath}
{\hbox{\sf A}}{\bf X}_R = {\bf X}_R {\hbox{\sf D}}\qquad {\bf X}_L{\hbox{\sf A}}= {\hbox{\sf D}}{\bf X}_L
\end{displaymath} (8)


\begin{displaymath}
{\bf X}_L{\hbox{\sf A}}{\bf X}_R = {\bf X}_L{\bf X}_R{\hbox{\sf D}}\qquad
{\bf X}_L{\hbox{\sf A}}{\bf X}_R = {\hbox{\sf D}}{\bf X}_L{\bf X}_R,
\end{displaymath} (9)

so
\begin{displaymath}
{\bf X}_L{\bf X}_R{\hbox{\sf D}}= {\hbox{\sf D}}{\bf X}_L{\bf X}_R.
\end{displaymath} (10)

But this equation is of the form ${\hbox{\sf C}}{\hbox{\sf D}}= {\hbox{\sf D}}{\hbox{\sf C}}$, where ${\hbox{\sf D}}$ is a Diagonal Matrix, so provided the Eigenvalues are distinct, ${\hbox{\sf C}}\equiv {\bf X}_L{\bf X}_R$ must itself be diagonal (a matrix commutes with a diagonal matrix of distinct entries only if it is itself diagonal). In particular, if A is a Symmetric Matrix, then the left and right eigenvectors are transposes of each other. If A is a Self-Adjoint Matrix, then the left and right eigenvectors are Hermitian conjugates (conjugate transposes) of each other.
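These relations are easy to check numerically; a sketch using NumPy, where the left eigenvectors are obtained, as in the derivation above, from the (right) eigenvectors of ${\hbox{\sf A}}^{\rm T}$ (the matrix is an arbitrary example with distinct Eigenvalues):

```python
import numpy as np

# An arbitrary example matrix with distinct eigenvalues (5 and 2)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Right eigenvectors form the columns of X_R
eigvals, X_R = np.linalg.eig(A)
D = np.diag(eigvals)

# Left eigenvectors form the rows of X_L; they are the (right)
# eigenvectors of A^T, with columns reordered so the eigenvalues
# line up with eigvals before transposing
eigvals_L, V = np.linalg.eig(A.T)
order = [int(np.argmin(np.abs(eigvals_L - lam))) for lam in eigvals]
X_L = V[:, order].T

assert np.allclose(A @ X_R, X_R @ D)   # A X_R = X_R D   (8)
assert np.allclose(X_L @ A, D @ X_L)   # X_L A = D X_L   (8)

# C = X_L X_R is diagonal, as argued above
C = X_L @ X_R
assert np.allclose(C, np.diag(np.diag(C)))
```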


Given a $3\times 3$ Matrix A with eigenvectors ${\bf x}_1$, ${\bf x}_2$, and ${\bf x}_3$ and corresponding Eigenvalues $\lambda_1$, $\lambda_2$, and $\lambda_3$, then an arbitrary Vector ${\bf y}$ can be written

\begin{displaymath}
{\bf y}=b_1{\bf x}_1+b_2{\bf x}_2+b_3{\bf x}_3.
\end{displaymath} (11)

Applying the Matrix A,
\begin{displaymath}
{\hbox{\sf A}}{\bf y} = b_1{\hbox{\sf A}}{\bf x}_1+b_2{\hbox{\sf A}}{\bf x}_2+b_3{\hbox{\sf A}}{\bf x}_3
= \lambda_1\left({b_1{\bf x}_1+{\lambda_2\over\lambda_1}b_2{\bf x}_2+{\lambda_3\over\lambda_1}b_3{\bf x}_3}\right),
\end{displaymath} (12)

so
\begin{displaymath}
{\hbox{\sf A}}^n{\bf y}={\lambda_1}^n \left[{b_1{\bf x}_1+\left({\lambda_2\over\lambda_1}\right)^n b_2{\bf x}_2+\left({\lambda_3\over\lambda_1}\right)^n b_3{\bf x}_3}\right].
\end{displaymath} (13)

If $\vert\lambda_1\vert>\vert\lambda_2\vert, \vert\lambda_3\vert$ and $b_1\not=0$, the ${\bf x}_2$ and ${\bf x}_3$ terms are suppressed as $n\to\infty$, so for large $n$
\begin{displaymath}
{\hbox{\sf A}}^n {\bf y}\approx {\lambda_1}^n b_1{\bf x}_1,
\end{displaymath} (14)

so repeated application of the matrix to an arbitrary vector results in a vector proportional to the Eigenvector whose Eigenvalue has the largest absolute value.
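This observation is the basis of the power method for finding a dominant eigenpair. A minimal sketch (the matrix, starting vector, and iteration count are arbitrary choices; renormalizing at each step keeps the iterate from overflowing):

```python
import numpy as np

# Example matrix with eigenvalues 3 and 1; dominant eigenvector (1, 1)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
y = np.array([1.0, 0.0])   # arbitrary starting vector with b_1 != 0

for _ in range(50):
    y = A @ y
    y = y / np.linalg.norm(y)   # renormalize each iteration

# The Rayleigh quotient of the converged vector estimates the
# dominant eigenvalue
lam = y @ A @ y
assert np.isclose(lam, 3.0)
# The iterate converges (up to sign) to the unit dominant eigenvector
assert np.allclose(np.abs(y), [1 / np.sqrt(2), 1 / np.sqrt(2)])
```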

See also Eigenfunction, Eigenvalue


References

Arfken, G. ``Eigenvectors, Eigenvalues.'' §4.7 in Mathematical Methods for Physicists, 3rd ed. Orlando, FL: Academic Press, pp. 229-237, 1985.

Press, W. H.; Flannery, B. P.; Teukolsky, S. A.; and Vetterling, W. T. ``Eigensystems.'' Ch. 11 in Numerical Recipes in FORTRAN: The Art of Scientific Computing, 2nd ed. Cambridge, England: Cambridge University Press, pp. 449-489, 1992.




© 1996-9 Eric W. Weisstein
1999-05-25