Linear independence of eigenvectors

An arbitrary n x n matrix need not have n linearly independent eigenvectors; independence is guaranteed only for eigenvectors belonging to distinct eigenvalues. When eigenvalues are repeated, an independent set may still exist (as for the identity matrix), but an arbitrary choice of eigenvectors need not be independent, and some matrices confine all their eigenvectors to a lower-dimensional subspace, so no independent set of n exists. Hermitian matrices, including positive-definite ones, always admit a complete orthonormal set of eigenvectors by the spectral theorem, even though they can have repeated eigenvalues, as the identity matrix shows.
AxiomOfChoice
How can you show that an arbitrary n \times n matrix has n linearly independent eigenvectors? What if all you know about the matrix is that it's the product of a positive-definite matrix and a positive-semidefinite matrix?
 
It is NOT true in general that an n \times n matrix has n linearly independent eigenvectors. Independence is only guaranteed for eigenvectors corresponding to DISTINCT eigenvalues.

In the case of repeated eigenvalues, it may or may not be possible to find n independent eigenvectors. For example, the identity matrix has only one eigenvalue, 1, repeated n times. In this case, EVERY nonzero vector is an eigenvector, and an arbitrary choice of n of them need not be linearly independent (take the same vector n times, say). However, in this case at least it's POSSIBLE to choose n independent eigenvectors, e.g., the standard basis.
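A quick numerical sketch of this point (using numpy; the specific vectors are just illustrative):

```python
import numpy as np

I = np.eye(3)

# Every nonzero vector is an eigenvector of the identity with eigenvalue 1:
v = np.array([1.0, 2.0, 3.0])
assert np.allclose(I @ v, 1.0 * v)

# An arbitrary choice of 3 eigenvectors need not be independent...
bad = np.column_stack([v, 2 * v, 3 * v])
print(np.linalg.matrix_rank(bad))   # 1 -- all three are parallel

# ...but an independent (even orthonormal) choice certainly exists:
good = np.eye(3)                    # the standard basis vectors
print(np.linalg.matrix_rank(good))  # 3 -- independent
```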

For some matrices with repeated eigenvalues, it's not even possible to choose n independent eigenvectors: for example, the matrix

[0 1]
[0 0]

has only one (repeated) eigenvalue, 0. Any eigenvector [x y]^T must satisfy y = 0, so this restricts the eigenvectors to a one-dimensional subspace, and therefore you can't have two independent eigenvectors.
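You can verify this numerically; here is a sketch that computes the eigenspace of that matrix as the null space (via SVD, which handles the defective case more robustly than eig):

```python
import numpy as np

N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Both eigenvalues are 0:
print(np.linalg.eigvals(N))          # [0. 0.]

# Eigenvectors are the nonzero solutions of N v = 0; for v = [x, y]^T
# this forces y = 0, so the eigenspace is one-dimensional:
_, s, Vt = np.linalg.svd(N)
kernel = Vt[s < 1e-12]               # basis for the null space of N
print(kernel.shape[0])               # 1 -- only one independent eigenvector
```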

On the other hand, for a positive definite matrix, you are guaranteed to be able to find a linearly independent (even orthonormal) set of eigenvectors. This is even true for any Hermitian matrix by the spectral theorem (see, e.g., Horn and Johnson, "Matrix Analysis," theorem 2.5.6).
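A small numerical illustration of the spectral theorem (the particular random construction of a positive-definite matrix here is just for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a real symmetric positive-definite matrix: B B^T is PSD,
# and adding 4I pushes all eigenvalues strictly above zero.
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)

vals, Q = np.linalg.eigh(A)   # eigh exploits symmetry

# Spectral theorem in action: real (here positive) eigenvalues,
# orthonormal eigenvectors, and A = Q diag(vals) Q^T.
assert np.all(vals > 0)
assert np.allclose(Q.T @ Q, np.eye(4))
assert np.allclose(Q @ np.diag(vals) @ Q.T, A)
```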

As for your second question: the product PS of a positive-definite matrix P and a positive-semidefinite matrix S need not itself be Hermitian (a product of Hermitian matrices is Hermitian only if they commute). But PS is similar to the Hermitian matrix P^{1/2} S P^{1/2}, since PS = P^{1/2} (P^{1/2} S P^{1/2}) P^{-1/2}, and that matrix is positive semidefinite. So PS is diagonalizable with real, nonnegative eigenvalues, and the answer is yes, it has n linearly independent eigenvectors — but the proof is not so elementary.
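A numerical check of this similarity argument (PS is generally not Hermitian, yet it is similar to the Hermitian matrix P^{1/2} S P^{1/2}; the random construction is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Positive-definite P, and positive-semidefinite S of rank n-1:
B = rng.standard_normal((n, n))
P = B @ B.T + np.eye(n)
C = rng.standard_normal((n, n - 1))
S = C @ C.T

M = P @ S
assert not np.allclose(M, M.T)       # the product is NOT Hermitian

# Form P^{1/2} from the eigendecomposition of P:
w, Q = np.linalg.eigh(P)
Proot = Q @ np.diag(np.sqrt(w)) @ Q.T
H = Proot @ S @ Proot                # Hermitian, similar to M
assert np.allclose(H, H.T)

# Hence M has real, nonnegative eigenvalues and a full set of
# linearly independent eigenvectors:
vals, vecs = np.linalg.eig(M)
assert np.allclose(vals.imag, 0)
assert np.min(vals.real) > -1e-9
assert np.linalg.matrix_rank(vecs) == n
assert np.allclose(np.sort(np.linalg.eigvalsh(H)), np.sort(vals.real))
```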
 
Thanks - that's helpful.

Is it possible for a Hermitian matrix to have a repeated eigenvalue, or several repeated eigenvalues? Or do we know that n \times n Hermitian matrices will have n distinct eigenvalues?
 
AxiomOfChoice said:
Thanks - that's helpful.

Is it possible for a Hermitian matrix to have a repeated eigenvalue, or several repeated eigenvalues? Or do we know that n \times n Hermitian matrices will have n distinct eigenvalues?

Sure, the identity matrix is Hermitian and all of its eigenvalues are 1.
 