Prove Eigenvectors Linearly Independent: v & w

  • Thread starter: evilpostingmong
  • Tags: Eigenvector, Proof
SUMMARY

The discussion centers on proving that eigenvectors v and w corresponding to different (nonzero) eigenvalues are linearly independent. The proof rests on the defining equations A v = c v and A w = d w, where c ≠ d. One route assumes a dependence relation w = k v and derives a contradiction; another applies (A - cI) to a relation a1 v + a2 w = 0 and shows that the only solution is a1 = 0 and a2 = 0, confirming the linear independence of v and w.

PREREQUISITES
  • Understanding of eigenvectors and eigenvalues in linear algebra.
  • Familiarity with matrix operations and properties of identity matrices.
  • Knowledge of proof techniques, particularly proof by contradiction.
  • Basic concepts of linear independence in vector spaces.
NEXT STEPS
  • Study the properties of eigenvalues and eigenvectors in depth.
  • Learn about proof techniques in linear algebra, focusing on contradiction proofs.
  • Explore the implications of linear independence in vector spaces.
  • Investigate applications of eigenvectors in systems of differential equations.
USEFUL FOR

Students of linear algebra, mathematicians, and anyone involved in theoretical physics or engineering who needs to understand the concepts of eigenvectors and their properties.

evilpostingmong

Homework Statement


If v and w are eigenvectors with different (nonzero) eigenvalues, prove that they are
linearly independent.

Homework Equations


The Attempt at a Solution


Define an operator A such that A is an n×n matrix and Av = cIv, with c an eigenvalue and v an eigenvector. Define a basis <v1...vn> in which v = v_i and w = v_k, with 1 <= i <= n and 1 <= k <= n, and let c_{i,i} be an entry of A. I is the identity matrix.

Consider I*v, an n×1 column matrix with its lone nonzero entry (1) at position (i,1). Let the value at c_{i,i} = c. Multiplying I*v by A gives c*Iv. If I*w (another n×1 column matrix) had its 1 at position (i,1), it would correspond with c_{i,i} in A and we would get c*Iw. But we assume w has a different eigenvalue. Therefore I*w must have its 1 at a different position, corresponding with a different entry of A (call it k). Since I*v must have its 1 at a row different from I*w's, let c*I*a1*v + k*I*a2*w = 0; since the 1s are at different rows, and c and I and k are nonzero, a1 and a2 must be 0, so we have c*I*0*v + k*I*0*w = 0*v + 0*w = 0. Thus a1 and a2 are trivial, so v and w are linearly independent.

I kind of have a gut feeling that this may be too wordy.
 
What you have looks correct, but - as you said - perhaps you are overdoing it a little :)

I would just start with stating that v and w are eigenvectors:
(1a) A v = c v
(1b) A w = d w
for some numbers c and d. We know that c is not equal to d and neither is equal to 0.

Now being linearly independent means that there does not exist a number k such that w = k v (recall that eigenvectors are nonzero by definition). A non-existence statement is not pleasant to work with, so a proof by contradiction suggests itself. Suppose that there does exist a k such that
(2) w = k v.

Now can you derive a contradiction?
(Note: the rest of the proof is rather straightforward, because all you have to work with are equations (1a), (1b) and (2)).
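For the record, here is one way the contradiction can fall out, a minimal sketch using only (1a), (1b) and (2); it is not necessarily the only route:

Applying A to (2) and using (1a): A w = A(k v) = k A v = k c v.
On the other hand, by (1b) and then (2): A w = d w = d k v.
Subtracting the two expressions for A w: k c v - d k v = k(c - d) v = 0.
Since v is an eigenvector, v ≠ 0, and since c ≠ d, this forces k = 0. But then (2) gives w = 0 v = 0, contradicting that w is an eigenvector.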
 
This is a special case of the last problem you posted. You don't need a basis and you don't need a matrix. If A v = a v and A u = b u (a not equal to b), you want to show that if c v + d u = 0 then both c and d are zero. Hit that equation with (A - aI).
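Spelled out, a sketch of the suggested computation, using only the two eigenvector equations above:

(A - aI)(c v + d u) = c(A v - a v) + d(A u - a u) = c(a v - a v) + d(b u - a u) = d(b - a) u = (A - aI) 0 = 0.
Since b ≠ a and u ≠ 0, it follows that d = 0. The original equation then reduces to c v = 0, and v ≠ 0 forces c = 0.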
 
