Understanding Multiplicity and Dimension in Eigenvectors

Summary
The discussion centers on understanding the implications of a characteristic polynomial for a square matrix A, specifically regarding its eigenvalues and eigenvectors. The characteristic polynomial provided indicates that A is a 10x10 matrix with an eigenvalue of 0, confirming that A is singular and thus not invertible. The participants clarify that the nullspace dimension must be at least 1 due to the presence of the zero eigenvalue, and they debate the correct interpretation of multiplicities in relation to the dimensions of eigenspaces. Ultimately, it is concluded that the nullspace dimension is at most 2, corresponding to the multiplicity of the zero eigenvalue. The discussion highlights the importance of understanding eigenvalues and their multiplicities in determining matrix properties.
EvLer
Hi all, I have a homework problem that I would like someone to check:

this relates to the eigenvectors: in the problem we are given characteristic polynomial, where I put x instead of lambda:
p(x) = x^2*(x+5)^3*(x -7)^5
Also given A is a square matrix, and then these questions (my answers):

- size of A? (10x10? since the degree of p is 2 + 3 + 5 = 10?)
- can A be invertible? (I think "no", otherwise no eigenvectors could be found)
- possible dimensions for nullspace of A (at least 3 and at most 10?)
- what can be said about dim. of x = 7 eigenspace? (that according to its multiplicity, dim. can be at most 5 but at least 1?)
Do I understand correctly this whole "multiplicity" concept? I am really shaky on the first question.

Thanks in advance.
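As a quick sanity check on the multiplicity bookkeeping above, here is a small sympy sketch (my own illustration, not part of the original post) that expands the given characteristic polynomial, reads off its degree, and lists each root with its algebraic multiplicity:

```python
# Sketch with sympy: the degree of p(x) gives the size n of A,
# and roots() returns each eigenvalue with its algebraic multiplicity.
import sympy as sp

x = sp.symbols('x')
p = x**2 * (x + 5)**3 * (x - 7)**5

n = sp.degree(p, x)     # 2 + 3 + 5 = 10, so A is 10x10
mults = sp.roots(p, x)  # {0: 2, -5: 3, 7: 5}
print(n, mults)
```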
 
EvLer said:
- can A be invertible? (I think "no", otherwise no eigenvectors could be found)
The determinant of the matrix equals the constant term of the characteristic polynomial, up to a sign of (-1)^n (which is +1 here, since n = 10). That constant term is 0 in this case, so the matrix is singular. But in general, a matrix can be invertible and still have a full complement of eigenvectors.

- possible dimensions for nullspace of A (at least 3 and at most 10?)
I'm going with at least 1, because 0 is an eigenvalue.
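Both points can be sanity-checked numerically. The numpy sketch below (my own illustration, not from the thread) verifies that for an n x n matrix with n even, the constant term of det(xI - A) equals det(A); in general the relation carries a factor of (-1)^n:

```python
# Sanity check: constant term of the characteristic polynomial vs. det(A),
# for an even-sized matrix (like the 10x10 in the problem).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))

coeffs = np.poly(A)            # coefficients of det(xI - A), leading coefficient 1
constant_term = coeffs[-1]     # equals (-1)**n * det(A); +det(A) here since n = 10
print(np.isclose(constant_term, np.linalg.det(A)))
```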
 
Don Aman said:
The determinant of the matrix equals the constant term of the characteristic polynomial, up to a sign of (-1)^n (which is +1 here, since n = 10). That constant term is 0 in this case, so the matrix is singular. But in general, a matrix can be invertible and still have a full complement of eigenvectors.
The way I understood it from lecture is that we are looking for the values of lambda that make A - lambda*I non-invertible. So are you saying that we do not know whether the original matrix is invertible or not?
possible dimensions of nullspace of A: I'm going with at least 1, because 0 is an eigenvalue.
Ok, I see that 0 is an eigenvalue; I missed it the first time. But is it not true that each multiplicity of lambda is supposed to 'produce' at least one eigenvector? Then it would be at least 3, since it's the nullspace of the whole matrix and not an eigenspace corresponding to a particular lambda?
Sorry that I am so hard-headed :redface: .
Thanks in advance .
 
A is invertible iff 0 isn't an eigenvalue. (That is because the determinant is the product of the eigenvalues.)
0 is an eigenvalue here (substituting x = 0 into the polynomial gives 0), therefore A is singular. (I'm just repeating what Don Aman said; it looked like you didn't understand.)
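The determinant-as-product-of-eigenvalues fact is easy to illustrate numerically. This small numpy sketch (my own addition, using a toy diagonal matrix rather than the 10x10 from the problem) shows that the product of the eigenvalues equals the determinant, so a zero eigenvalue forces singularity:

```python
# The determinant is the product of the eigenvalues,
# so A is singular exactly when 0 is an eigenvalue.
import numpy as np

A = np.diag([0.0, -5.0, 7.0])        # eigenvalues read off the diagonal

eigenvalues = np.linalg.eigvals(A)
print(np.isclose(np.prod(eigenvalues), np.linalg.det(A)))
print(np.isclose(np.linalg.det(A), 0.0))   # 0 is an eigenvalue, so det(A) = 0
```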
 
EvLer said:
the way I understood it from lecture is that we are looking for values of lambda that would make A non-invertible if subtracted by diagonal from original A
Right. That is, after all, the definition of an eigenvalue.
so you are saying that we do not know if the original matrix is invertible or not?
No. I'm saying the original matrix is singular. Singular is another word for non-invertible. I'm also saying that your original reason for suspecting that it is singular:
EvLer said:
- can A be invertible? (I think "no", otherwise no eigenvectors could be found)
is not good. A matrix with characteristic polynomial (x-1)(x-2) has eigenvectors you can find, and is still invertible.

The reasoning I gave originally was that the determinant of a matrix is (up to sign) the constant term of its characteristic polynomial. The characteristic polynomial you gave has no constant term, whereas the one I gave has constant term 2. Thus mine comes from an invertible matrix and yours does not. And we neither know nor care whether either matrix has a full complement of eigenvectors.

Ok, I see that 0 is an eigenvalue; I missed it the first time. But is it not true that each multiplicity of lambda is supposed to 'produce' at least one eigenvector?
Right. So this matrix has at least one nonzero vector with eigenvalue 0, and that vector lies in the null space. If the matrix is diagonalizable, then it has two linearly independent eigenvectors with eigenvalue 0, in which case the null space has dimension 2. Since the root x = 0 has multiplicity 2, there cannot be more than 2 linearly independent eigenvectors with eigenvalue 0.

If the other eigenspaces have dimensions matching their multiplicities, then the null space has dimension less than or equal to 2. If they do not, the null space still has dimension at most 2, since an eigenvalue's geometric multiplicity never exceeds its algebraic multiplicity. So it can never be more than 2.
Then it would be at least 3...since it's nullspace of the whole matrix and not an eigenspace corresponding to particular lambda?
Sorry that I am so hard-headed :redface: .
Thanks in advance .
I'm not sure where you're getting 3 from. A diagonal matrix has a null space whose dimension is exactly the number of zeros on its diagonal, which is why I counted the multiplicity of the 0 eigenvalue. Of course, we don't know whether our matrix is diagonalizable, so we have to be a little more careful.
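Both possibilities can be exhibited concretely. The sketch below (my own construction, not from the thread) builds two 10x10 matrices with the same characteristic polynomial x^2 (x+5)^3 (x-7)^5: a diagonal one with nullity 2, and a non-diagonalizable one, with a 2x2 Jordan block at eigenvalue 0, with nullity 1:

```python
# Two 10x10 matrices with characteristic polynomial x^2 (x+5)^3 (x-7)^5:
# the null space has dimension 2 in the diagonalizable case, 1 otherwise.
import numpy as np

diag_entries = [0, 0, -5, -5, -5, 7, 7, 7, 7, 7]
A_diag = np.diag(diag_entries).astype(float)  # diagonalizable: nullity 2

A_jordan = A_diag.copy()
A_jordan[0, 1] = 1.0   # 2x2 Jordan block for eigenvalue 0: nullity drops to 1

def nullity(M):
    # dimension of the null space = n - rank (rank-nullity theorem)
    return M.shape[0] - np.linalg.matrix_rank(M)

print(nullity(A_diag), nullity(A_jordan))
```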
 
Thanks for the extended version of the answer. I re-read my textbook too; I think I get it now :smile:
I was totally off on the nullspace: I missed that the nullspace of A is the solution set of Ax = 0, i.e. the eigenspace corresponding to the eigenvalue 0, whose multiplicity is 2, so its dimension is at most 2 and at least 1.

Thanks to all again!
 
