When there is a double root for the eigenvalue, how many eigenvectors?

Petrus
Hello MHB,
I have one question. If I want to find a basis for the kernel and the eigenvalue is a double root, but for that eigenvalue I only find one eigenvector (basis vector), what conclusion can I draw? Is it true that if an eigenvalue is a double root, then it will ALWAYS have two eigenvectors (basis vectors)?

Regards,
$$|\pi\rangle$$
 
Re: One basis vector or two for a double root?

Petrus said:
If I want to find a basis for the kernel and the eigenvalue is a double root, but for that eigenvalue I only find one eigenvector (basis vector), what conclusion can I draw? Is it true that if an eigenvalue is a double root, then it will ALWAYS have two eigenvectors (basis vectors)?
Not necessarily. When there is a double root for the eigenvalue there will always be at least one eigenvector. There may or may not be a second, linearly independent, eigenvector. For example, the matrices $\begin{bmatrix}1&0\\ 0&1 \end{bmatrix}$ and $\begin{bmatrix}1&1\\ 0&1 \end{bmatrix}$ both have a repeated eigenvalue $1$, but the first one has two linearly independent eigenvectors and the second one only has one.
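The two matrices in the reply can be checked numerically. This is a small sketch using numpy (assuming it is available): the number of linearly independent eigenvectors for an eigenvalue $\lambda$ is the geometric multiplicity $\dim\ker(M - \lambda I) = n - \operatorname{rank}(M - \lambda I)$.

```python
import numpy as np

# Both matrices have the repeated eigenvalue 1 (algebraic multiplicity 2).
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])   # identity: two independent eigenvectors
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # Jordan block: only one independent eigenvector

lam = 1.0
for name, M in (("A", A), ("B", B)):
    # geometric multiplicity = n - rank(M - lam*I)
    geo = M.shape[0] - np.linalg.matrix_rank(M - lam * np.eye(2))
    print(name, "geometric multiplicity:", geo)
# A has geometric multiplicity 2, B has geometric multiplicity 1
```

So the algebraic multiplicity (2 in both cases) only gives an upper bound; the geometric multiplicity can be anywhere from 1 up to that bound.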
 