Understanding Multiplicity and Dimension in Eigenvectors

In summary, the conversation revolved around understanding the concept of eigenvectors and how they relate to the characteristic polynomial of a matrix. It was discussed that the size of a matrix equals the degree of its characteristic polynomial, i.e., the sum of the multiplicities of the polynomial's roots. It was also mentioned that a matrix can be invertible and still have a full complement of eigenvectors, so the existence of eigenvectors by itself says nothing about invertibility. The discussion then shifted to the possible dimensions for the nullspace of the matrix, with the conclusion that the dimension can range from 1 to 2: at least 1 because 0 is an eigenvalue, and at most 2 because the root 0 has multiplicity 2. Overall, the conversation helped clarify the "multiplicity" concept and the relationship between eigenvalues and eigenvectors.
  • #1
EvLer
Hi all, I have a homework problem that I would like someone to check:

this relates to eigenvectors: in the problem we are given the characteristic polynomial, where I put x instead of lambda:
p(x) = x^2 (x + 5)^3 (x - 7)^5
We are also given that A is a square matrix, and then there are these questions (my answers in parentheses):

- size of A? (10x10?? by looking at the multiplicities?)
- can A be invertible? (I think "no"; otherwise it would not be possible to find eigenvectors)
- possible dimensions for nullspace of A (at least 3 and at most 10?)
- what can be said about dim. of x = 7 eigenspace? (that according to its multiplicity, dim. can be at most 5 but at least 1?)
Do I understand this whole "multiplicity" concept correctly? I am really shaky on the first question.

Thanks in advance.
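A quick numerical sanity check of these answers (a sketch of my own, assuming NumPy; the diagonal matrix below is just one convenient example with this characteristic polynomial, not the A from the problem):

```python
import numpy as np

# One matrix with characteristic polynomial x^2 (x + 5)^3 (x - 7)^5:
# the eigenvalue 0 twice, -5 three times, and 7 five times on the diagonal.
A = np.diag([0, 0, -5, -5, -5, 7, 7, 7, 7, 7])

print(A.shape)                   # (10, 10): the degree 2 + 3 + 5 = 10 is the size of A
print(np.linalg.det(A))          # 0.0: A is singular, because 0 is an eigenvalue
print(np.linalg.matrix_rank(A))  # 8: for this example the nullspace has dimension 10 - 8 = 2
```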
 
  • #2
EvLer said:
- can A be invertible? (I think "no"; otherwise it would not be possible to find eigenvectors)
Up to sign, the determinant of the matrix is the constant term of the characteristic polynomial, which in this case is 0, so the matrix is singular. But in general, a matrix can be invertible and still have a full complement of eigenvectors.

- possible dimensions for nullspace of A (at least 3 and at most 10?)
I'm going with at least 1, because 0 is an eigenvalue.
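A small symbolic check of that constant-term fact (my own sketch, assuming SymPy; the 2x2 matrix is an arbitrary example):

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[2, 1],
               [0, 3]])               # arbitrary example matrix

p = (x * sp.eye(2) - A).det()         # characteristic polynomial det(xI - A)
print(sp.expand(p))                   # x**2 - 5*x + 6
print(p.subs(x, 0))                   # constant term: 6
print(A.det())                        # 6: matches the constant term up to the sign (-1)^n
```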
 
  • #3
Don Aman said:
Up to sign, the determinant of the matrix is the constant term of the characteristic polynomial, which in this case is 0, so the matrix is singular. But in general, a matrix can be invertible and still have a full complement of eigenvectors.
the way I understood it from lecture is that we are looking for the values of lambda that make A - lambda*I non-invertible, so you are saying that we do not know whether the original matrix is invertible or not?
possible dimensions of nullspace of A: I'm going with at least 1, because 0 is an eigenvalue.
Ok, I see that 0 is an eigenvalue, I kind of missed it the first time, but isn't it true that each multiplicity of lambda is supposed to "produce" at least one eigenvector? Then it would be at least 3... since it's the nullspace of the whole matrix and not an eigenspace corresponding to a particular lambda?
Sorry that I am so hard-headed :redface:.
Thanks in advance.
 
  • #4
A is invertible iff 0 isn't an eigenvalue. (That is due to the fact that the determinant is the product of the eigenvalues.)
0, here, is an eigenvalue (since plugging it into your polynomial gives 0), therefore A is singular. (I'm just repeating what Don Aman said; it looked like you didn't understand.)
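To see the determinant-is-the-product-of-eigenvalues fact numerically (a sketch, assuming NumPy; the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # arbitrary example matrix

vals = np.linalg.eigvals(A)      # eigenvalues 5 and 2
print(np.prod(vals))             # 10.0 (up to rounding): product of the eigenvalues...
print(np.linalg.det(A))          # 10.0: ...equals det(A), so det(A) = 0 exactly
                                 # when 0 is an eigenvalue
```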
 
  • #5
EvLer said:
the way I understood it from lecture is that we are looking for the values of lambda that make A - lambda*I non-invertible
Right. That is, after all, the definition of an eigenvalue.
so you are saying that we do not know whether the original matrix is invertible or not?
No. I'm saying the original matrix is singular. Singular is another word for non-invertible. I'm also saying that your original reason for suspecting that it is singular:
EvLer said:
- can A be invertible? (I think "no"; otherwise it would not be possible to find eigenvectors)
is not good. A matrix with characteristic polynomial (x - 1)(x - 2) has eigenvectors you can find, and it is still invertible.

The reasoning I gave originally was that, up to sign, the determinant of a matrix is the constant term of its characteristic polynomial. The characteristic polynomial you gave has constant term 0, whereas the characteristic polynomial I gave has constant term 2. Thus mine comes from an invertible matrix and yours does not. And we neither know nor care whether either matrix has a full complement of eigenvectors.
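A concrete instance of that (x - 1)(x - 2) example (a sketch, assuming NumPy; the particular matrix is my own choice):

```python
import numpy as np

# A matrix whose characteristic polynomial is (x - 1)(x - 2), with constant term 2.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])

vals, vecs = np.linalg.eig(A)
print(vals)              # [1. 2.]: eigenvectors exist for both eigenvalues...
print(np.linalg.inv(A))  # ...yet A is invertible, because 0 is not an eigenvalue
```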

Ok, I see that 0 is an eigenvalue, I kind of missed it the first time, but isn't it true that each multiplicity of lambda is supposed to "produce" at least one eigenvector?
Right. So this matrix has at least one eigenvector whose eigenvalue is 0, and that vector lies in the null space. If the matrix is diagonalizable, then it has two linearly independent eigenvectors with eigenvalue 0, in which case the null space has dimension 2. Since the root x = 0 has multiplicity 2, there cannot be more than 2 linearly independent eigenvectors with eigenvalue 0.

If the other eigenspaces have dimensions matching their multiplicities, then the null space has dimension at most 2. If they do not, the null space still has dimension at most 2: the dimension of an eigenspace never exceeds the multiplicity of its eigenvalue, so it can never be more than 2.
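Here is a minimal sketch of both cases (assuming NumPy and SciPy; the 3x3 matrices are illustrative, with a single eigenvalue 3 standing in for the nonzero roots):

```python
import numpy as np
from scipy.linalg import null_space   # assumes SciPy is available

# Both matrices have x = 0 as a root of multiplicity 2 in the characteristic polynomial.
D = np.diag([0.0, 0.0, 3.0])          # diagonalizable case
J = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 3.0]])       # Jordan-block case, not diagonalizable

print(null_space(D).shape[1])   # 2: the nullspace dimension matches the multiplicity
print(null_space(J).shape[1])   # 1: the eigenspace is smaller than the multiplicity
```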
Then it would be at least 3... since it's the nullspace of the whole matrix and not an eigenspace corresponding to a particular lambda?
Sorry that I am so hard-headed :redface:.
Thanks in advance.
I'm not sure where you're getting 3 from. The null space of A consists only of the eigenvectors with eigenvalue 0, not of the eigenvectors for the other eigenvalues. A diagonal matrix has a null space whose dimension is the number of zeros on its diagonal, which is why I counted the multiplicity of the 0 eigenvalue. Of course, we don't know whether our matrix is diagonalizable, so we have to be a little more careful.
 
  • #6
Thanks for the extended version of the answer. I re-read my textbook too, and I think I get it now :smile:
I was totally off on the nullspace: I missed that the nullspace of A is the solution set of Ax = 0, i.e., the eigenspace corresponding to the eigenvalue 0, whose multiplicity is 2, so the dimension is at most 2 and at least 1.

Thanks to all again!
 

1. What is the difference between multiplicity and dimension?

The multiplicity (algebraic multiplicity) of an eigenvalue is the number of times it appears as a root of the characteristic polynomial, while the dimension of its eigenspace (the geometric multiplicity) is the number of linearly independent eigenvectors belonging to that eigenvalue.

2. How are multiplicity and dimension related?

For every eigenvalue, the dimension of the eigenspace is at least 1 and at most the algebraic multiplicity. In the example above, the eigenvalue 7 has multiplicity 5, so its eigenspace has dimension between 1 and 5.

3. Can the dimension of an eigenspace be smaller than the multiplicity?

Yes. In a 2x2 Jordan block with 0 on the diagonal and 1 above it, the eigenvalue 0 has multiplicity 2 but its eigenspace has dimension 1. When the dimensions match the multiplicities for every eigenvalue, the matrix is diagonalizable.

4. How does the characteristic polynomial determine the size of the matrix?

The characteristic polynomial of an n x n matrix has degree n, so the size of the matrix equals the sum of the multiplicities of the roots. Here 2 + 3 + 5 = 10, so A is 10 x 10.

5. How do the eigenvalues determine whether a matrix is invertible?

A matrix is invertible if and only if 0 is not an eigenvalue, since the determinant is the product of the eigenvalues. The nullspace of A is exactly the eigenspace of the eigenvalue 0, so in this problem its dimension is at least 1 and at most 2.
