Linear independence of eigenvectors

In summary, for an arbitrary n x n matrix it is not generally true that the eigenvectors are linearly independent. Eigenvectors belonging to distinct eigenvalues always are; with repeated eigenvalues, it may or may not be possible to find n independent eigenvectors. For a Hermitian matrix (in particular, any positive-definite matrix), a set of n linearly independent (even orthonormal) eigenvectors always exists. A Hermitian matrix can certainly have repeated eigenvalues, so n x n Hermitian matrices need not have n distinct eigenvalues; the identity matrix, whose only eigenvalue is 1, is the standard example. The product of a positive-definite and a positive-semidefinite matrix, while not Hermitian in general, is similar to a Hermitian matrix and therefore also has n linearly independent eigenvectors.
  • #1
AxiomOfChoice
How can you show that an arbitrary [itex]n \times n[/itex] matrix has [itex]n[/itex] linearly independent eigenvectors? What if all you know about the matrix is that it's the product of a positive-definite matrix and a semi-positive-definite matrix?
 
  • #2
It is NOT true in general that the eigenvectors are linearly independent. It's only true for those eigenvectors corresponding to DISTINCT eigenvalues.

In the case of repeated eigenvalues, it may or may not be possible to find independent eigenvectors. For example, the identity matrix has only one eigenvalue, 1, repeated n times. In this case, EVERY nonzero vector is an eigenvector, but if you simply pick n of them arbitrarily, nothing guarantees they will be linearly independent. (However, in this case at least it's POSSIBLE to choose n independent eigenvectors, e.g. the standard basis.)
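As a small numerical sketch of the identity-matrix example (using NumPy; the vector chosen here is arbitrary):

```python
import numpy as np

I = np.eye(3)
v = np.array([2.0, -1.0, 5.0])  # an arbitrary nonzero vector

# Any nonzero v satisfies I v = 1 * v, so every nonzero vector is an
# eigenvector of the identity for the (triply repeated) eigenvalue 1.
is_eigenvector = np.allclose(I @ v, 1.0 * v)

# One valid choice of 3 independent eigenvectors is the standard basis,
# i.e. the columns of the identity itself, which has full rank.
rank = np.linalg.matrix_rank(np.eye(3))
print(is_eigenvector, rank)  # True 3
```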

For some matrices with repeated eigenvalues, it's not even possible to choose n independent eigenvectors: for example, the matrix

[0 1]
[0 0]

has only one (repeated) eigenvalue, 0. Any eigenvector [x y]^T must satisfy y = 0, so this restricts the eigenvectors to a one-dimensional subspace, and therefore you can't have two independent eigenvectors.
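This can be confirmed numerically. The eigenvectors for eigenvalue 0 form the null space of the matrix, so its dimension (the geometric multiplicity) is n minus the rank:

```python
import numpy as np

# The nilpotent matrix from the example above: its sole eigenvalue is 0,
# with algebraic multiplicity 2.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Eigenvectors for eigenvalue 0 form the null space of A - 0*I = A.
# rank(A) = 1, so the eigenspace is 2 - 1 = 1 dimensional: there is no
# way to choose two linearly independent eigenvectors.
geometric_multiplicity = A.shape[0] - np.linalg.matrix_rank(A)
print(geometric_multiplicity)  # 1
```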

On the other hand, for a positive definite matrix, you are guaranteed to be able to find a linearly independent (even orthonormal) set of eigenvectors. This is even true for any Hermitian matrix by the spectral theorem (see, e.g., Horn and Johnson, "Matrix Analysis," theorem 2.5.6).
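A quick numerical check of the spectral theorem (the Hermitian matrix built here is just a random example): `numpy.linalg.eigh` is specialized for Hermitian input and returns an orthonormal set of eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary 4x4 Hermitian matrix as H = M + M^H.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = M + M.conj().T

# eigh returns real eigenvalues w and eigenvectors as the columns of V.
w, V = np.linalg.eigh(H)

# Per the spectral theorem, the columns of V are orthonormal
# (V^H V = I), hence linearly independent.
is_orthonormal = np.allclose(V.conj().T @ V, np.eye(4))
print(is_orthonormal)  # True
```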

A positive-definite or positive-semidefinite matrix is by definition Hermitian, but the product of two Hermitian matrices is generally NOT Hermitian. Even so, the answer to your second question is yes: if [itex]A[/itex] is positive definite and [itex]B[/itex] is positive semidefinite, then [itex]AB = A^{1/2}(A^{1/2} B A^{1/2})A^{-1/2}[/itex] is similar to the Hermitian matrix [itex]A^{1/2} B A^{1/2}[/itex], so [itex]AB[/itex] has [itex]n[/itex] linearly independent eigenvectors and real, nonnegative eigenvalues. But the proof is not so elementary.
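A numerical sketch of this similarity argument (the matrices A and B below are hypothetical examples): the product of a positive-definite A and a positive-semidefinite B is similar to the Hermitian matrix A^{1/2} B A^{1/2}, so the two share the same real, nonnegative spectrum.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A = M M^T + I is positive definite; B = N N^T with a rank-2 factor N
# is positive semidefinite (but not definite).
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)
N = rng.standard_normal((n, 2))
B = N @ N.T

# Build A^{1/2} from the eigendecomposition of A (all eigenvalues > 0).
w_A, V_A = np.linalg.eigh(A)
A_half = V_A @ np.diag(np.sqrt(w_A)) @ V_A.T

# AB = A^{1/2} (A^{1/2} B A^{1/2}) A^{-1/2} is similar to the symmetric
# positive-semidefinite matrix S, so they share eigenvalues.
S = A_half @ B @ A_half
eig_AB = np.sort(np.linalg.eigvals(A @ B).real)
eig_S = np.sort(np.linalg.eigvalsh(S))
same_spectrum = np.allclose(eig_AB, eig_S)
print(same_spectrum)  # True
```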
 
  • #3
Thanks - that's helpful.

Is it possible for a Hermitian matrix to have a repeated eigenvalue, or several repeated eigenvalues? Or do we know that [itex]n \times n[/itex] Hermitian matrices will have [itex]n[/itex] distinct eigenvalues?
 
  • #4
AxiomOfChoice said:
Thanks - that's helpful.

Is it possible for a Hermitian matrix to have a repeated eigenvalue, or several repeated eigenvalues? Or do we know that [itex]n \times n[/itex] Hermitian matrices will have [itex]n[/itex] distinct eigenvalues?

Sure, the identity matrix is Hermitian and all of its eigenvalues are 1.
 

1. What does it mean for eigenvectors to be linearly independent?

Linear independence of a set of eigenvectors means that no eigenvector in the set can be expressed as a linear combination of the others. Note that this is stronger than requiring that no two of them are multiples of each other: for example, [1 0], [0 1], and [1 1] are pairwise non-proportional yet linearly dependent.

2. Why is it important to determine if eigenvectors are linearly independent?

Determining the linear independence of eigenvectors is important because it allows us to find a basis for the vector space spanned by the eigenvectors. This basis can then be used to simplify calculations and solve systems of equations.

3. How do I determine if a set of eigenvectors is linearly independent?

To determine whether a set of n eigenvectors in an n-dimensional space is linearly independent, we can use the determinant method: construct the n x n matrix with the eigenvectors as columns and compute its determinant. If the determinant is nonzero, the eigenvectors are linearly independent. (For fewer than n vectors, check the rank of the matrix instead.)
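A minimal sketch of the determinant test (the three vectors here are arbitrary illustrative choices, not eigenvectors of any particular matrix):

```python
import numpy as np

# Three candidate vectors in R^3, stacked as the columns of a matrix.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 1.0])
V = np.column_stack([v1, v2, v3])

# V is upper triangular with ones on the diagonal, so det(V) = 1:
# nonzero, hence the vectors are linearly independent.
det = np.linalg.det(V)
independent = not np.isclose(det, 0.0)
print(det, independent)  # 1.0 True
```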

4. Are eigenvectors linearly independent if they have different eigenvalues?

Yes; in fact, eigenvectors corresponding to distinct eigenvalues are always linearly independent. So different eigenvalues do not merely allow independence, they guarantee it; the subtleties discussed above arise only when eigenvalues repeat.
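This guarantee is easy to check numerically on a small example (the matrix below is a hypothetical one with distinct eigenvalues 2 and 3):

```python
import numpy as np

# An upper-triangular matrix with distinct eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# eig returns the eigenvalues w and eigenvectors as the columns of V.
w, V = np.linalg.eig(A)

# The eigenvalues are distinct, so the eigenvector matrix must have
# full rank, i.e. the eigenvectors are linearly independent.
distinct = len(np.unique(np.round(w, 8))) == 2
rank = np.linalg.matrix_rank(V)
print(distinct, rank)  # True 2
```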

5. What happens if a set of eigenvectors is not linearly independent?

If a set of eigenvectors is not linearly independent, it means that there are redundant vectors in the set. This can lead to complications in calculations and may require finding a different set of eigenvectors that are linearly independent.
