Eigenvalue with multiplicity k resulting in k orthogonal eigenvectors?


Discussion Overview

The discussion revolves around the properties of eigenvalues and eigenvectors of symmetric matrices, specifically addressing the claim that an eigenvalue with multiplicity k has k orthogonal eigenvectors. Participants explore examples and counterexamples, questioning the interpretation of this property and the existence of orthogonal eigenvectors.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant presents a symmetric matrix and finds that the eigenvectors corresponding to an eigenvalue with multiplicity 2 are not orthogonal, challenging the stated property.
  • The author of the notes acknowledges that the found eigenvectors are not orthogonal but suggests that there are indeed orthogonal eigenvectors to be found.
  • Another participant emphasizes that there are infinitely many eigenvectors for any eigenvalue, and that the eigenvectors corresponding to a single eigenvalue of multiplicity k span a subspace of dimension k, allowing for the construction of orthogonal eigenvectors.
  • Participants discuss the process of deriving general solutions for eigenvectors and the conditions under which they can be orthogonal.

Areas of Agreement / Disagreement

Participants generally agree that eigenvectors corresponding to different eigenvalues of a symmetric matrix are orthogonal, but there is disagreement regarding the interpretation of eigenvectors corresponding to the same eigenvalue and whether the initial claim about orthogonality holds in all cases. For the specific example provided, the discussion resolves the question by constructing an orthogonal pair within the two-dimensional eigenspace.

Contextual Notes

Participants note that the eigenvectors found are not the only eigenvectors, and the discussion highlights the complexity of deriving orthogonal eigenvectors from a given eigenspace. There is an acknowledgment of the need for further exploration to find orthogonal pairs.

Who May Find This Useful

This discussion may be useful for students and practitioners of linear algebra, particularly those interested in the properties of eigenvalues and eigenvectors of symmetric matrices.

el_llavero
I am somewhat confused about this property of an eigenvalue when A is a symmetric matrix, I will state it exactly as it was presented to me.

"Properties of the eigenvalue when A is symmetric.
If an eigenvalue \lambda has multiplicity k, there will be k (repeated k times),
orthogonal eigenvectors corresponding to this root."

So I decided to test this property with a few matrices and I encountered one particular matrix that may provide a counterexample to this property.


()^T == column vector

A symmetric matrix: A= (5,4,2),(4,5,2),(2,2,2) with corresponding eigenvalue eigenvector pairs:
(-1,1,0)^T,(-1/2,0,1)^T map to eigenvalue 1
(2,2,1)^T maps to eigenvalue 10

Eigenvalue 1 has multiplicity 2. The eigenvectors I found corresponding to 1 are (-1,1,0)^T and (-1/2,0,1)^T; their dot product is (-1,1,0)^T (dot) (-1/2,0,1)^T = 1/2, not 0, therefore the eigenvectors I found for this root are not orthogonal to each other, though they are linearly independent.
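The eigenpairs and the dot product above can be spot-checked numerically. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

v1 = np.array([-1.0, 1.0, 0.0])   # claimed eigenvector for eigenvalue 1
v2 = np.array([-0.5, 0.0, 1.0])   # claimed eigenvector for eigenvalue 1
v3 = np.array([2.0, 2.0, 1.0])    # claimed eigenvector for eigenvalue 10

# A v should equal lambda v for each claimed pair
print(np.allclose(A @ v1, 1 * v1))   # True
print(np.allclose(A @ v2, 1 * v2))   # True
print(np.allclose(A @ v3, 10 * v3))  # True

# The two found eigenvectors for eigenvalue 1 are not orthogonal:
print(v1 @ v2)  # 0.5
```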

So I asked the author of these notes if I had perhaps misinterpreted this property. Is the correct interpretation perhaps:

If an eigenvalue has multiplicity k, there will be k (repeated k times), eigenvectors corresponding to this root that are orthogonal to eigenvectors corresponding to different roots?

and the author replied

"Yes, you are right that the two eigenvectors corresponding root one are not orthogonal. But the property says there ARE orthogonal ones. So you can keep looking for orthogonal ones.

How about the pair of (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?"
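The author's suggested pair can be verified directly: both vectors should be eigenvectors for eigenvalue 1, and their dot product should vanish. A quick sketch, assuming NumPy:

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

u1 = np.array([1.0, -1.0, 0.0])
u2 = np.array([1.0, 1.0, -4.0])

print(np.allclose(A @ u1, u1))  # True: eigenvector for eigenvalue 1
print(np.allclose(A @ u2, u2))  # True: eigenvector for eigenvalue 1
print(u1 @ u2)                  # 0.0 -> orthogonal
```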

So just to restate and compare

"If an eigenvalue \lambda has multiplicity k, there will be k (repeated k times), orthogonal eigenvectors corresponding to this root."

but the author says "that the two eigenvectors corresponding root one are not orthogonal"
and then says that the property says "there ARE orthogonal ones. So you can keep looking for orthogonal ones."

Can someone help me make sense of this?

- I found the eigenvectors corresponding to the eigenvalue(root),
- there were two so the eigenvalue has multiplicity 2
- however, the 2 eigenvectors (corresponding to) the eigenvalue (root) were not orthogonal: v1 (dot) v2 != 0
**** But I'm supposed to keep looking for orthogonal ones ?

The property says:

"there will be k (repeated k times), orthogonal eigenvectors corresponding to this root."


The author says:

that the two eigenvectors corresponding root one are not orthogonal

(The property says) != (The author says)

but I'm supposed to keep looking for orthogonal ones?? I'm supposed to somehow derive the pair (1,-1,0) and (1,1,-4) instead of the pair of (1,-1,0) and (-1/2,0,1)?

Does this make sense to anyone? If so, please help me make sense of it, and enlighten me as to how I can derive the pair (1,-1,0) and (1,1,-4). Thanks in advance.
 
I'm not sure what part is causing you trouble. It is true that, for a symmetric matrix, eigenvectors corresponding to different eigenvalues will be orthogonal. But you seem to be under the impression that the two eigenvectors you found are the only eigenvectors. You quote the author as saying "Yes, you are right that the two eigenvectors corresponding root one are not orthogonal" (added emphasis). I doubt he/she said exactly that. More likely he/she said that those two eigenvectors that you found were not orthogonal. It is just wrong to talk about "the" two eigenvectors. There are always an infinite number of eigenvectors for any eigenvalue.

It is also true that the eigenvectors corresponding to a single eigenvalue of multiplicity k (again for a symmetric matrix) span a subspace of dimension k. Of course, you can always construct an orthonormal basis for the subspace so there will be k orthogonal eigenvectors.
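This is exactly what a numeric eigensolver does for you. A sketch, assuming NumPy: `np.linalg.eigh` (for symmetric/Hermitian matrices) returns an orthonormal set of eigenvectors, so the repeated eigenvalue 1 automatically gets two orthogonal eigenvectors spanning its 2-dimensional eigenspace.

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

w, V = np.linalg.eigh(A)                 # columns of V are eigenvectors
print(np.round(w, 6))                    # eigenvalue 1 appears twice, plus 10
print(np.allclose(V.T @ V, np.eye(3)))   # True: columns are orthonormal
```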

But any subspace will also contain eigenvectors that are NOT orthogonal. In your example, any eigenvectors corresponding to eigenvalue 1 must satisfy 2x+ 2y+ z= 0 or z= -2x- 2y. The easy way to get eigenvectors that span the "eigenspace" is take x= 1, y= 0 to get z= -2 giving an eigenvector <1, 0, -2> and then take x= 0, y= 1 to get z= -2 again giving another eigenvector <0, 1, -2>. Those are not orthogonal since their dot product is 4 but if, instead of x= 1, y= 0, you take x= -4, y= 5, you get z= 8- 10= -2 and so <-4, 5, -2> is an eigenvector. <1, 0, -2>.<-4, 5, -2>= 0. Those eigenvectors are orthogonal.
 
HallsofIvy said:
It is just wrong to talk about "the" two eigenvectors. There are always an infinite number of eigenvectors for any eigenvalue.


Let me add a disclaimer to my post:
I have not taken a linear algebra class; however, the course I will be taking this Fall '09 semester uses a lot of linear algebra, and the main linear algebra techniques are covered in a preliminary pdf. I think I have done a good job of self-learning many of the topics in that pdf, but in the resources I've been using there is nothing regarding properties of the eigenvalues when a matrix is symmetric, or properties of eigenvalues and eigenvectors in general, other than how to derive them from a matrix and some other basic information.

HallsofIvy,


For not being sure what part is causing me trouble, you really clarified many things for me. Unfortunately, I did quote the author word for word, and he did say "the two eigenvectors corresponding root one." This was probably my main source of confusion: I was assuming there were only two eigenvectors, which I now know is wrong. I never encountered any information that explicitly stated "there are always an infinite number of eigenvectors for any eigenvalue," and the author's reply didn't shed much light on this either. I am familiar with many of the concepts you talk about in the rest of your post; I'll comment on the rest after I've gone through it more thoroughly and gotten a handle on the infinite-eigenvectors concept.
 
I went back to the primary book I'm using and saw that I was dealing with a homogeneous system of linear equations, because all the constant terms are zero; therefore there are many solutions. Then I read further into deriving general solutions. In this case:

In terms of row vectors, we can express the general solution as (-x₂-(1/2)x₃, x₂, x₃), and separating the variables in the general solution gives
(-x₂-(1/2)x₃, x₂, x₃) = x₂(-1,1,0) + x₃(-1/2,0,1). However, to make the vectors easier to work with, we can eliminate the fraction by writing the general solution in terms of x₁ and x₂: from x₁ = -x₂-(1/2)x₃ we get -(1/2)x₃ = x₁+x₂, i.e. x₃ = -2x₁-2x₂, giving the general solution (x₁, x₂, -2x₁-2x₂). To get the basis, separate the variables in the general solution:
(x₁, x₂, -2x₁-2x₂) = x₁(1,0,-2) + x₂(0,1,-2)
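Either parametrization can be spot-checked numerically: every basis vector from both forms should still be an eigenvector of A for eigenvalue 1. A sketch, assuming NumPy:

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

# Basis from the first parametrization, and from the fraction-free one
basis_a = [np.array([-1.0, 1.0, 0.0]), np.array([-0.5, 0.0, 1.0])]
basis_b = [np.array([1.0, 0.0, -2.0]), np.array([0.0, 1.0, -2.0])]

for v in basis_a + basis_b:
    print(np.allclose(A @ v, v))   # True for all four: eigenvalue 1
```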

For two vectors to be orthogonal they must have the property that their dot product equals zero. For a two-dimensional subspace W of R³ with basis {(a₁,a₂,a₃), (b₁,b₂,b₃)}, orthogonality of the basis vectors means

a₁b₁ + a₂b₂ + a₃b₃ = 0

In our particular case the basis for the subspace is the set of vectors {(1,0,-2), (0,1,-2)}, with the corresponding general solution (x₁,x₂,-2x₁-2x₂).

To get two orthogonal vectors in this eigenspace, fix one basis vector, say (1,0,-2), and require a vector of the general form (x₁,x₂,-2x₁-2x₂) to be orthogonal to it:

(1,0,-2) (dot) (x₁,x₂,-2x₁-2x₂) = x₁ + 4x₁ + 4x₂ = 5x₁ + 4x₂ = 0

so x₁ = -(4/5)x₂. Plug values satisfying this into the general solution, e.g. x₂ = 5 and x₁ = -4, which gives (-4,5,-2), and you've got yourself two orthogonal eigenvectors: (1,0,-2) and (-4,5,-2).
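A quick numeric check of this step, as a sketch assuming NumPy: take the first basis vector of the eigenspace, pick a second member of the general-solution family whose dot product with it is zero, and confirm both really are eigenvectors of A.

```python
import numpy as np

A = np.array([[5.0, 4.0, 2.0],
              [4.0, 5.0, 2.0],
              [2.0, 2.0, 2.0]])

# General solution of the eigenspace for eigenvalue 1: (x1, x2, -2*x1 - 2*x2)
def eigvec(x1, x2):
    return np.array([x1, x2, -2.0 * x1 - 2.0 * x2])

v = eigvec(1.0, 0.0)     # basis vector (1, 0, -2)
# Orthogonality to v works out to 5*x1 + 4*x2 = 0; x1 = -4, x2 = 5 satisfies it.
w = eigvec(-4.0, 5.0)    # (-4, 5, -2)

print(v @ w)                     # 0.0 -> orthogonal
print(np.allclose(A @ v, v))     # True: still an eigenvector for 1
print(np.allclose(A @ w, w))     # True: still an eigenvector for 1
```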
 
I think I may have some inaccuracies in the above post.

Nevertheless, I've cleared up my confusion with these orthogonal eigenvectors and how an orthogonal basis still spans the same eigenspace. I saw how you plugged in values for x and y in order to cancel out the 4. However, I did some more research and found a general way of obtaining an orthogonal set from a set of linearly independent vectors, and then producing an orthonormal set: it's called the Gram-Schmidt orthogonalization process.
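Gram-Schmidt on the two eigenvectors found earlier can be sketched in a few lines (assuming NumPy; the helper function name is my own):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        u = v.astype(float).copy()
        for b in basis:
            u -= (u @ b) / (b @ b) * b   # subtract the projection onto b
        basis.append(u)
    return basis

v1 = np.array([-1.0, 1.0, 0.0])   # eigenvectors for eigenvalue 1
v2 = np.array([-0.5, 0.0, 1.0])
u1, u2 = gram_schmidt([v1, v2])

print(u2)        # (-0.25, -0.25, 1.0), still satisfies 2x + 2y + z = 0
print(u1 @ u2)   # 0.0: orthogonal eigenvectors for eigenvalue 1
```

Because u2 is a linear combination of eigenvectors for the same eigenvalue, it stays inside the eigenspace, which is why the orthogonalized pair are still eigenvectors.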
 
