Regarding the linear dependence of eigenvectors

SUMMARY

The discussion focuses on the linear independence of eigenvectors of a square matrix, specifically when several eigenvectors correspond to the same eigenvalue. It is established that applying the Gram-Schmidt process (or row operations) within each same-eigenvalue subset demonstrates linear independence, provided no zero vector is produced; independence across distinct eigenvalues then holds automatically. Additionally, it is noted that matrices with special structure, such as Hermitian or real symmetric matrices, guarantee via the Schur decomposition that the eigenvectors can be chosen mutually orthonormal, which ensures linear independence.

PREREQUISITES
  • Understanding of eigenvalues and eigenvectors in linear algebra
  • Familiarity with the Gram-Schmidt process for orthogonalization
  • Knowledge of matrix types, specifically Hermitian and symmetric matrices
  • Basic concepts of Schur decomposition in linear algebra
NEXT STEPS
  • Study the Gram-Schmidt process in detail to apply it to eigenvector sets
  • Learn about Schur decomposition and its implications for matrix diagonalization
  • Explore properties of Hermitian and symmetric matrices in relation to eigenvectors
  • Investigate absorbing state Markov chains and their eigenvector properties
USEFUL FOR

Mathematicians, data scientists, and students of linear algebra seeking to deepen their understanding of eigenvector properties and their applications in various matrix types.

Adgorn
Let's say we have a set of eigenvectors of a certain ##n##-square matrix. I understand why the vectors are linearly independent if each vector belongs to a distinct eigenvalue.
However, suppose the set comprises subsets of vectors, where the vectors of each subset belong to the same eigenvalue. For example, in a 7-square matrix, ##v_1, v_2, v_3## belong to ##λ_1##, ##v_4, v_5## belong to ##λ_2##, and ##v_6, v_7## belong to ##λ_3##. How do we prove the vectors are still linearly independent?
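For concreteness, here is a minimal NumPy sketch of this exact setup; the particular eigenvalues, seed, and matrix are arbitrary illustrative choices, and the rank check is a numerical sanity check rather than a proof.

```python
# Minimal numerical illustration (NumPy); a sanity check, not a proof.
import numpy as np

rng = np.random.default_rng(0)

# Eigenvalues 1, 2, 3 with multiplicities 3, 2, 2, as in the question.
D = np.diag([1.0, 1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
S = rng.standard_normal((7, 7))      # random invertible change of basis
A = S @ D @ np.linalg.inv(S)         # A has the prescribed eigenvalues

# Because A @ S == S @ D, the columns of S are eigenvectors of A:
# columns 0-2 for eigenvalue 1, columns 3-4 for 2, columns 5-6 for 3.
print(np.linalg.matrix_rank(S))      # 7 -> all seven are independent
```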
 
StoneTemplePython
The most literal answer is to do Gram-Schmidt (or perhaps row operations) on ##v_1, v_2, v_3##, then repeat on ##v_4, v_5##, then finally on ##v_6, v_7##. If in each case you don't get the zero vector as part of your set, then you have linear independence. This works because eigenvectors belonging to distinct eigenvalues are automatically linearly independent, so it suffices to verify independence within each same-eigenvalue subset.
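A sketch of that procedure in Python/NumPy (the helper name and tolerance are my own choices): run Gram-Schmidt separately on each same-eigenvalue subset, and treat a near-zero remainder as a sign of dependence.

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-10):
    """Orthonormalize `vectors`; a near-zero remainder means the
    input list was linearly dependent."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        for b in basis:
            w = w - (w @ b) * b            # remove component along b
        norm = np.linalg.norm(w)
        if norm < tol:
            raise ValueError("Gram-Schmidt hit ~0: dependent subset")
        basis.append(w / norm)
    return basis

# Apply per eigenvalue: gram_schmidt([v1, v2, v3]), gram_schmidt([v4, v5]),
# gram_schmidt([v6, v7]); if none raises, the full set is independent.
```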

Now more generally, note that there could be special structure in your matrix that guarantees linear independence between (well-chosen) eigenvectors. E.g. the absorbing states in absorbing-state Markov chains yield mutually orthonormal eigenvectors. (There's a similar, more general argument about recurrent classes and so forth, but that's outside the scope here.)
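To illustrate with a toy example (my own 4-state chain, row-stochastic convention): each absorbing state contributes a standard-basis left eigenvector with eigenvalue 1, and distinct standard-basis vectors are orthonormal by construction.

```python
import numpy as np

# Toy 4-state absorbing chain; states 0 and 1 are absorbing (P[i, i] = 1).
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.3, 0.2, 0.4, 0.1],
    [0.1, 0.4, 0.2, 0.3],
])

e0 = np.array([1.0, 0.0, 0.0, 0.0])   # standard basis vector for state 0
e1 = np.array([0.0, 1.0, 0.0, 0.0])   # standard basis vector for state 1

# Each absorbing state gives a left eigenvector of P with eigenvalue 1...
print(np.allclose(e0 @ P, e0), np.allclose(e1 @ P, e1))  # True True
# ...and distinct standard basis vectors are orthonormal.
print(e0 @ e1, e0 @ e0, e1 @ e1)       # 0.0 1.0 1.0
```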

The biggest case to be aware of is when your matrix is Hermitian (or, over the reals, symmetric): then via the Schur decomposition it can always be diagonalized, because the eigenvectors can be chosen to be mutually orthonormal. (There's a bit more to it with normal matrices, etc., but the big one is symmetry.)
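A quick NumPy check of this (random symmetric matrix, arbitrary seed): `np.linalg.eigh`, which is designed for Hermitian/symmetric input, returns eigenvectors that are already orthonormal.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                     # symmetrize: A is real symmetric

w, V = np.linalg.eigh(A)              # eigh assumes Hermitian/symmetric input

print(np.allclose(V.T @ V, np.eye(5)))        # True: columns orthonormal
print(np.allclose(A, V @ np.diag(w) @ V.T))   # True: A = V diag(w) V^T
```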
 
Adgorn
I see, thanks for the assist :).
 
