Generalized Eigenvectors?

I see that a generalized eigenvector can be represented as such:

$(A - \lambda I)x_{k+1} = x_k$, where A is a square matrix, the $x_k$ are (generalized) eigenvectors, $\lambda$ is the eigenvalue, and I is the identity matrix.

This might be used, for example, when an eigenvalue is repeated but the characteristic equation yields only one eigenvector; we can then use that eigenvector to find further (generalized) eigenvectors.

Can someone explain this formula? I don't really get it. My textbook has a similar formula:

$x_{k+1} = Ax_k$

which I understand perfectly. But how does the former formula make sense?
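For concreteness, the chain relation can be tried numerically. The matrix below is my own illustrative choice: $A = \begin{pmatrix}2 & 1\\ 0 & 2\end{pmatrix}$ has the repeated eigenvalue $\lambda = 2$ but only one independent eigenvector. Since $A - \lambda I$ is singular, a least-squares solve is used in place of a direct solve:

```python
import numpy as np

# Defective example matrix: eigenvalue 2 repeated, one eigenvector.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
M = A - lam * np.eye(2)

x1 = np.array([1.0, 0.0])   # the ordinary eigenvector: M @ x1 = 0

# Solve (A - lam*I) x2 = x1 for a generalized eigenvector.
# M is singular, so use the minimum-norm least-squares solution.
x2, *_ = np.linalg.lstsq(M, x1, rcond=None)

print(x2)                    # a generalized eigenvector
print(M @ x2)                # should reproduce x1
```

The key check is that `M @ x2` gives back `x1`, exactly as the formula $(A - \lambda I)x_{k+1} = x_k$ requires.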
Are you talking about an iterative method to find the eigenvalues and eigenvectors? This looks like the inverse power method. The eigenvectors of $A$ and $(A - \lambda I)$ are identical, but the eigenvalues differ by $\lambda$. The inverse power method converges to the eigenvalue with the smallest modulus, so if $\lambda$ is close to an eigenvalue of $A$, the corresponding eigenvalue of $(A - \lambda I)$ will be close to 0 and the iteration will converge rapidly to that eigenvalue/eigenvector pair.
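A minimal sketch of the shifted inverse power iteration described above (the matrix and shift are illustrative choices, not from the thread). Each step solves $(A - \sigma I)y = x$ and normalizes; the iterate converges to the eigenvector whose eigenvalue is nearest the shift $\sigma$:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])      # eigenvalues are 5 and 2
sigma = 1.9                     # shift chosen close to the eigenvalue 2

M = A - sigma * np.eye(2)
x = np.array([1.0, 1.0])        # arbitrary starting vector

for _ in range(50):
    x = np.linalg.solve(M, x)   # one step of inverse iteration
    x = x / np.linalg.norm(x)   # normalize to avoid overflow

lam_est = x @ A @ x             # Rayleigh quotient of the converged vector
print(lam_est)                  # approximately 2, the eigenvalue nearest sigma
```

Because $(A - \sigma I)^{-1}$ has eigenvalue $1/(\lambda - \sigma)$, the component along the eigenvector nearest $\sigma$ is amplified the most at each step, which is why a good shift makes convergence fast.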
Suppose the characteristic equation for a matrix A is $(x - \lambda)^2 = 0$. Then $\lambda$ is an eigenvalue having "algebraic multiplicity" 2, while its "geometric multiplicity", the number of independent eigenvectors corresponding to that eigenvalue, is either 1 or 2.

The crucial point is that every matrix satisfies its own characteristic equation. That is, $(A - \lambda I)^2 = 0$, so that $(A - \lambda I)^2 v = 0$ for every vector v.

Now, it might happen that $(A - \lambda I)v = 0$ for every vector v, from which $(A - \lambda I)^2 v = (A - \lambda I)((A - \lambda I)v) = (A - \lambda I)0 = 0$ follows immediately. That would be the case if there are two independent eigenvectors, i.e. if the geometric multiplicity were 2. If not, if A has only one independent eigenvector, it must still be true that $(A - \lambda I)^2 v = 0$, even if v is NOT an eigenvector of A. But in that case, $u = (A - \lambda I)v$ is NOT 0, and yet we still must have $(A - \lambda I)^2 v = (A - \lambda I)((A - \lambda I)v) = (A - \lambda I)u = 0$, which says that u is an eigenvector of A. That is, in order to have $(A - \lambda I)^2 v = 0$, either v is an eigenvector of A or $u = (A - \lambda I)v$ is an eigenvector.
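The argument above can be checked on a concrete defective matrix (my own choice, $A = \begin{pmatrix}2 & 1\\ 0 & 2\end{pmatrix}$, where $\lambda = 2$ has algebraic multiplicity 2 but geometric multiplicity 1). Cayley-Hamilton gives $(A - \lambda I)^2 = 0$, and for a vector v that is not an eigenvector, $u = (A - \lambda I)v$ turns out to be one:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
N = A - lam * np.eye(2)

print(N @ N)                 # zero matrix: A satisfies (x - 2)^2 = 0

v = np.array([0.0, 1.0])     # NOT an eigenvector: N @ v is nonzero
u = N @ v                    # but u = (A - lam*I) v ...
print(N @ u)                 # ... IS an eigenvector: (A - lam*I) u = 0
```

So v is a generalized eigenvector, and applying $(A - \lambda I)$ to it produces the ordinary eigenvector, exactly the chain $(A - \lambda I)x_{k+1} = x_k$ from the question.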

 Tags eigen values, eigen vectors, linear algebra, matrix algebra
