1. Dec 10, 2009

### sjeddie

Is at least one eigenvector guaranteed to exist given that we have found at least one eigenvalue? So, for example, given that we have found an eigenvalue of multiplicity 2 of a matrix, are we guaranteed to find at least 1 eigenvector of that matrix? Why or why not?

2. Dec 10, 2009

### rochfor1

3. Dec 11, 2009

### sjeddie

Thanks.

So it is possible to have fewer eigenvectors than eigenvalues (according to Wikipedia's explanation of geometric multiplicity), so is it possible to have no eigenvectors at all for some eigenvalues?

4. Dec 11, 2009

### rochfor1

Not quite. Every eigenvalue will have at least one eigenvector. It is possible, however, for a repeated eigenvalue to have fewer linearly independent eigenvectors (its geometric multiplicity) than the number of repetitions of that eigenvalue (its algebraic multiplicity). There will always be at least one eigenvector, though.
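As a quick sketch (not from this thread), the classic shear matrix shows the gap rochfor1 describes: its eigenvalue 1 is repeated twice (algebraic multiplicity 2), but the eigenspace is only one-dimensional (geometric multiplicity 1). The variable names here are my own.

```python
import numpy as np

# Shear matrix: characteristic polynomial (lambda - 1)^2, so the
# eigenvalue 1 has algebraic multiplicity 2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues, _ = np.linalg.eig(A)
print(eigenvalues)  # the eigenvalue 1 appears twice

# Geometric multiplicity = dimension of the null space of (A - 1*I).
# rank(A - I) = 1, so the eigenspace has dimension 2 - 1 = 1.
geometric_multiplicity = 2 - np.linalg.matrix_rank(A - np.eye(2))
print(geometric_multiplicity)  # 1
```

So this matrix still has an eigenvector (e.g. (1, 0)), just not two independent ones.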

5. Dec 11, 2009

### sjeddie

thank you very much rochfor1!

6. Dec 12, 2009

### HallsofIvy

The definition of "eigenvalue" is "$\lambda$ is an eigenvalue for linear operator A if and only if there exists a non-zero vector, v, such that $Av= \lambda v$".

Such a vector is, of course, an eigenvector, so, by definition, there exists at least one eigenvector corresponding to any eigenvalue. And, in fact, any non-zero multiple of an eigenvector, or any non-zero linear combination of eigenvectors corresponding to the same eigenvalue, is also an eigenvector - there necessarily exist an infinite number of eigenvectors corresponding to any eigenvalue. Together with the zero vector, they form a subspace.

Rochfor1 is specifically talking about the number of independent eigenvectors corresponding to each eigenvalue - the dimension of that subspace. That's the "geometric multiplicity" of that eigenvalue.
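HallsofIvy's subspace point can be checked numerically; this is just an illustrative sketch with a matrix and vectors of my own choosing. The diagonal matrix below has eigenvalue 2 with a 2-dimensional eigenspace, and any non-zero linear combination of eigenvectors for that eigenvalue is again an eigenvector.

```python
import numpy as np

# Eigenvalue 2 has algebraic AND geometric multiplicity 2 here.
A = np.diag([2.0, 2.0, 5.0])

v1 = np.array([1.0, 0.0, 0.0])  # eigenvector for eigenvalue 2
v2 = np.array([0.0, 1.0, 0.0])  # an independent eigenvector for 2

w = 3.0 * v1 - 4.0 * v2         # arbitrary non-zero linear combination
print(np.allclose(A @ w, 2.0 * w))  # True: w still satisfies A w = 2 w
```

The check `A @ w == 2 * w` holding for every such combination is exactly what it means for the eigenvectors (plus the zero vector) to form a subspace.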