# Generalized vectors. Eigenvalues/Eigenvectors.

1. Nov 19, 2012

### Zondrina

1. The problem statement, all variables and given/known data

Let $A \in M_{22} (\mathbb{R})$ with one single eigenvalue λ and one single eigenvector v. We denote by w the generalized vector such that $(A - λI)w = v$. Prove that v and w are linearly independent.

2. Relevant equations

I know that if A has only one eigenvalue λ and one eigenvector v, then the equation Av = λv is satisfied. That is, (A - λI)v = 0.
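As a concrete sanity check (my own example, not part of the problem), a 2×2 Jordan block is exactly this kind of matrix: it has a single eigenvalue and, up to scaling, a single eigenvector. A quick numpy sketch, assuming λ = 3 and v = (1, 0):

```python
import numpy as np

# Hypothetical example: the 2x2 Jordan block with eigenvalue lam = 3.
# Its only eigenvalue is 3, and up to scaling its only eigenvector is v = (1, 0).
lam = 3.0
A = np.array([[lam, 1.0],
              [0.0, lam]])
B = A - lam * np.eye(2)

v = np.array([1.0, 0.0])

print(np.allclose(B @ v, 0))                   # (A - lam*I) v = 0, so v is an eigenvector
print(np.allclose(np.linalg.eigvals(A), lam))  # both eigenvalues of A equal lam
```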

3. The attempt at a solution

I thought about this a bit, but I'm having trouble getting this one going. I thought about letting B = (A - λI) so that we get two equations :

Bv = 0 and Bw = v

Then I thought that it could be broken down into two cases, one where λ = 0 and one where λ ≠ 0, but I'm not sure this is the right path to take.

Any pointers?

2. Nov 19, 2012

### Dick

What's the definition of linearly independent?

3. Nov 19, 2012

### Zondrina

There are several equivalent formulations, but if $v_1, ..., v_n$ is a set of vectors, then $v_1, ..., v_n$ are L.I. if the equation :

$c_1v_1 + \cdots + c_nv_n = 0$

has only the trivial solution. That is, all the scalars are zero.

4. Nov 19, 2012

### Dick

That's the one. So if c1*v+c2*w=0 you want to show c1 and c2 are 0. Start by applying B to that equation. What do you conclude from that?

Last edited: Nov 19, 2012
5. Nov 19, 2012

### Zondrina

Okay so we want to show : $c_1v + c_2w = 0$ has only the trivial solution.

We know that if we multiply both sides by B = (A-λI) we get :

$B(c_1v + c_2w) = 0$
$c_1Bv + c_2Bw = 0$

We know that Bv = 0 because v is an eigenvector for the matrix A.

$c_2Bw = 0$

We also know that Bw = v since w is our generalized vector.

$c_2v = 0$

Now, since eigenvectors are non-zero, it must be the case that c2 = 0.

6. Nov 19, 2012

### Dick

Sure. Now put c2=0 into your original equation. What do you conclude about c1?

7. Nov 19, 2012

### Zondrina

Now, subbing c2 = 0 back into our original equation yields :

$c_1v = 0$

Once again, since v is an eigenvector, it is non-zero. Which tells us that c1 must be zero.

Thus, c1 = c2 = 0 which implies that v and w are linearly independent.
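As a numerical sanity check (not part of the proof), the conclusion can be illustrated with an assumed 2×2 Jordan block, where λ = 3, v = (1, 0) is the eigenvector, and w = (0, 1) satisfies (A - λI)w = v:

```python
import numpy as np

# Assumed concrete instance: 2x2 Jordan block with eigenvalue lam = 3,
# eigenvector v = (1, 0), and generalized vector w = (0, 1).
lam = 3.0
A = np.array([[lam, 1.0],
              [0.0, lam]])
B = A - lam * np.eye(2)
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

print(np.allclose(B @ w, v))  # w really is a generalized vector for v
# v and w are linearly independent iff the matrix [v | w] is invertible:
print(abs(np.linalg.det(np.column_stack([v, w]))) > 1e-12)
```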

EDIT : Q.E.D

Thanks.

Last edited: Nov 19, 2012
8. Nov 20, 2012

### HallsofIvy

That's true for any A such that $\lambda$ is an eigenvalue with eigenvector v. The fact that A has only that eigenvalue and only one independent eigenvector tells you that there exists a vector u such that $(A- \lambda I)u\ne 0$ but $(A- \lambda I)^2u= 0$.

(It is never true that a linear operator has "one single eigenvector" because any multiple of an eigenvector is an eigenvector.)
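This behavior can be illustrated numerically with an assumed 2×2 Jordan block (λ = 3, and u = (0, 1) as the generalized vector), where B = A - λI kills u only after two applications:

```python
import numpy as np

# Assumed example: Jordan block with eigenvalue lam = 3.
lam = 3.0
A = np.array([[lam, 1.0],
              [0.0, lam]])
B = A - lam * np.eye(2)
u = np.array([0.0, 1.0])  # a generalized vector (my choice for this example)

print(not np.allclose(B @ u, 0))  # (A - lam*I) u != 0
print(np.allclose(B @ B @ u, 0))  # (A - lam*I)^2 u = 0
```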
