Regarding the linear dependence of eigenvectors

  • #1
Let's say we have a set of eigenvectors of a certain ##n \times n## matrix. I understand why the vectors are linearly independent if each vector belongs to a distinct eigenvalue.
However, suppose the set consists of subsets of vectors, where the vectors of each subset belong to the same eigenvalue. For example, in a ##7 \times 7## matrix, ##v_1, v_2, v_3## belong to ##λ_1##, ##v_4, v_5## belong to ##λ_2##, and ##v_6, v_7## belong to ##λ_3##. How do we prove the vectors are still linearly independent?
 

Answers and Replies

  • #2
StoneTemplePython
The most literal answer is to do Gram-Schmidt (or perhaps row operations) on ##v_1, v_2, v_3##, then repeat on ##v_4, v_5##, and finally on ##v_6, v_7##. If in each case you don't get the zero vector as part of your set, then each subset is linearly independent, and that is enough: in any linear combination of all seven vectors that sums to zero, group the terms by eigenvalue; each group sum is either zero or an eigenvector of its eigenvalue, and since eigenvectors belonging to distinct eigenvalues are linearly independent, every group sum must be zero. Independence within each subset then forces all the coefficients to be zero.
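As a minimal numerical sketch of that check (everything below is made up for illustration, and numpy is assumed): build a 7×7 matrix with the eigenvalue pattern from the question, verify each subset has full rank (the Gram-Schmidt / row-operation criterion), and then verify the combined set does too.

```python
import numpy as np

# Hypothetical 7x7 example: the eigenvectors are the columns of a random matrix V,
# with eigenvalues grouped as in the question (three, two, and two alike).
rng = np.random.default_rng(0)
V = rng.standard_normal((7, 7))
D = np.diag([2, 2, 2, 5, 5, -1, -1])
A = V @ D @ np.linalg.inv(V)          # by construction, A V = V D

groups = {2: V[:, 0:3], 5: V[:, 3:5], -1: V[:, 5:7]}

# Within each eigenvalue's subset: full column rank means Gram-Schmidt would
# never produce the zero vector, i.e. the subset is linearly independent.
for vecs in groups.values():
    assert np.linalg.matrix_rank(vecs) == vecs.shape[1]

# Independence within each subset, plus distinct eigenvalues, gives
# independence of the whole collection of seven vectors.
assert np.linalg.matrix_rank(V) == 7
```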

Now more generally, note that there could be special structure in your matrix that guarantees linear independence between (well-chosen) eigenvectors. E.g., in an absorbing-state Markov chain, the eigenvectors associated with the absorbing states can be taken to be mutually orthonormal. (There's a similar, more general argument about recurrent classes and so forth, but that's outside the scope.)
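To make the absorbing-state remark concrete, here is a tiny made-up example; I'm reading the claim in the column-stochastic convention, where the indicator vector of each absorbing state is an eigenvector for eigenvalue 1, and those indicators are orthonormal by construction.

```python
import numpy as np

# Made-up 3-state absorbing chain, column-stochastic convention
# (each column sums to 1); states 0 and 1 are absorbing, state 2 is transient.
P = np.array([[1.0, 0.0, 0.3],
              [0.0, 1.0, 0.2],
              [0.0, 0.0, 0.5]])

e0 = np.array([1.0, 0.0, 0.0])   # indicator of absorbing state 0
e1 = np.array([0.0, 1.0, 0.0])   # indicator of absorbing state 1

# Both indicators are eigenvectors for eigenvalue 1 ...
assert np.allclose(P @ e0, e0)
assert np.allclose(P @ e1, e1)

# ... and they are mutually orthonormal.
assert np.isclose(e0 @ e1, 0.0) and np.isclose(e0 @ e0, 1.0)
```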

The biggest case to be aware of is when your matrix is Hermitian (or, over the reals, symmetric): via the Schur decomposition it can always be diagonalized, because the eigenvectors can be chosen to be mutually orthonormal. (There's a bit more to it with normal matrices, etc., but the big one is symmetry.)
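A quick numerical illustration of the symmetric case (random matrix, numpy assumed): `np.linalg.eigh` returns an orthonormal set of eigenvectors, so linear independence comes for free.

```python
import numpy as np

# Made-up real symmetric matrix.
rng = np.random.default_rng(1)
B = rng.standard_normal((7, 7))
S = (B + B.T) / 2

w, Q = np.linalg.eigh(S)               # eigenvalues w, eigenvectors as columns of Q

# The columns of Q really are eigenvectors: S Q = Q diag(w).
assert np.allclose(S @ Q, Q @ np.diag(w))

# They are mutually orthonormal, hence certainly linearly independent.
assert np.allclose(Q.T @ Q, np.eye(7))
```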
 
  • #3
I see, thanks for the assist :).
 
