# Cayley Hamilton Theorem

by negation
Tags: cayley, hamilton, theorem
The scalar $\lambda$ is an "eigenvalue" for the matrix A if and only if there exists a non-zero vector v such that $Av = \lambda v$. It can be shown that $\lambda$ is an eigenvalue for A if and only if it satisfies A's characteristic equation. A vector v satisfying $Av = \lambda v$ is an eigenvector for A corresponding to the eigenvalue $\lambda$ (some people require that v be non-zero to be an "eigenvector", but I prefer to include the 0 vector as an eigenvector for every eigenvalue).

Further, if we can find n independent eigenvectors for A (always true if A has n distinct eigenvalues, and often true even when the eigenvalues are not all distinct), then the matrix P having those eigenvectors as columns is invertible and $P^{-1}AP = D$, where D is the diagonal matrix having the eigenvalues of A on its diagonal. Then it is also true that $PDP^{-1} = A$ and
$$A^n = (PDP^{-1})^n = (PDP^{-1})(PDP^{-1})\cdot\cdot\cdot(PDP^{-1}) = PD(P^{-1}P)D(P^{-1}P)\cdot\cdot\cdot(P^{-1}P)DP^{-1} = PD^nP^{-1}.$$
Of course, $D^n$ is easy to calculate: it is the diagonal matrix having the nth powers of the entries of D on its diagonal.

Notice that this requires that we can find n independent eigenvectors for A (such a matrix is said to be "diagonalizable"). There exist non-diagonalizable matrices. They can be put into what is called "Jordan normal form", which is slightly more complicated than a diagonal matrix, and computing powers is correspondingly a little more complicated.
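As a concrete check of both ideas above, here is a minimal sketch in pure Python (no libraries; the matrix, its eigenvectors, and all helper names are illustrative choices, not part of the original post). It takes $A = \begin{pmatrix}2 & 1\\ 1 & 2\end{pmatrix}$, whose characteristic polynomial is $\lambda^2 - 4\lambda + 3$ with eigenvalues 3 and 1, verifies that A satisfies its own characteristic equation (Cayley-Hamilton), and then computes $A^5$ both directly and as $PD^5P^{-1}$:

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matpow(X, n):
    """Naive n-th power of a 2x2 matrix (n >= 1), by repeated multiplication."""
    R = X
    for _ in range(n - 1):
        R = matmul(R, X)
    return R

A = [[2, 1], [1, 2]]   # eigenvalues 3 and 1
I = [[1, 0], [0, 1]]

# Characteristic polynomial: det(A - lambda*I) = lambda^2 - 4*lambda + 3.
# Cayley-Hamilton: A itself satisfies it, i.e. A^2 - 4A + 3I = 0.
A2 = matmul(A, A)
CH = [[A2[i][j] - 4 * A[i][j] + 3 * I[i][j] for j in range(2)]
      for i in range(2)]
print(CH)  # [[0, 0], [0, 0]]

# Diagonalization: columns of P are the eigenvectors (1,1) and (1,-1).
P = [[1, 1], [1, -1]]
Pinv = [[0.5, 0.5], [0.5, -0.5]]   # inverse of P
D = [[3, 0], [0, 1]]               # eigenvalues on the diagonal

# A^5 via P D^5 P^{-1}: D^5 just raises each diagonal entry to the 5th power.
D5 = [[3**5, 0], [0, 1**5]]
A5_diag = matmul(matmul(P, D5), Pinv)
print(A5_diag)       # [[122.0, 121.0], [121.0, 122.0]]
print(matpow(A, 5))  # [[122, 121], [121, 122]] -- same matrix
```

Both routes give the same $A^5$, but the diagonalized route needs only scalar powers plus two fixed matrix multiplications, which is the point of the $PD^nP^{-1}$ formula.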