Finding eigenvectors of a matrix that has 2 equal eigenvalues


Discussion Overview

The discussion revolves around the process of finding eigenvectors for a matrix with repeated eigenvalues, specifically addressing the case of matrix A with eigenvalues -3, 3, and 3. Participants explore the implications of having equal eigenvalues on the number of linearly independent eigenvectors and the methods used to compute them.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant states that matrix A has eigenvalues -3, 3, and 3, and questions whether it should have three linearly independent eigenvectors.
  • Another participant explains that a matrix with r equal eigenvalues can have anywhere from one to r eigenvectors, providing examples to illustrate this point.
  • Some participants note discrepancies in the number of eigenvectors found, with one claiming to find only one eigenvector for the repeated eigenvalue while others find two linearly independent solutions.
  • A later reply confirms that two linearly independent eigenvectors for the eigenvalue 3 can be found, correcting an earlier mistake in calculations.
  • One participant emphasizes that the assumption of having three linearly independent eigenvectors due to three eigenvalues is incorrect, highlighting a general principle regarding eigenvalues and eigenvectors.

Areas of Agreement / Disagreement

There is no consensus on the initial assumption that three eigenvalues imply three linearly independent eigenvectors. Some participants agree on the general principle that this is not necessarily true, while others initially struggle with this concept.

Contextual Notes

Participants express uncertainty regarding the calculation methods and the implications of repeated eigenvalues on the number of eigenvectors. There are references to specific calculations that led to confusion, indicating potential limitations in understanding the relationship between eigenvalues and eigenvectors.

aija
Matrix A=
2 1 2
1 2 -2
2 -2 -1

It's known that it has eigenvalues d1 = -3, d2 = d3 = 3.

Because it has 3 eigenvalues, it should have 3 linearly independent eigenvectors, right?

I tried to solve it on paper and got only 1 linearly independent vector from d1=-3 and 1 from d2=d3=3.

The method I used was:
[A-dI]v=0
and from this equation I used Gaussian elimination to find v1, v2 and v3

Even wolfram alpha finds only 1 solution from this:
http://www.wolframalpha.com/input/?i=-x+++y+++2z+=+0,+x+-+5y+-+2z+=+0,+2x+-+2y+-+4z+=+0
^
this is the system of equations from [A-3I]v=0 (3 is the eigenvalue d2=d3)

I don't see any way to get 2 linearly independent vectors from this solution
y=0, z=x/2

All I get is vectors of the form
t*[2 0 1]T, where t is a member of ℝ

here's matrix A in wolfram alpha: http://www.wolframalpha.com/input/?i={{2,+1,+2},+{1,+2,+-2},+{2,+-2,+-1}}

It shows that there is an eigenvector v3 = [1 1 0]T, but I don't see how to get it. Obviously my way of solving this problem doesn't work, so what did I forget to do in my solution, or what did I do wrong, and why doesn't it work this way?

PS. I'm not sure if this should be in the homework section, because this is more of a general problem and I don't understand why it doesn't work the way I tried to solve it. Matrix A could be any matrix with two equal eigenvalues.
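The dimension of the eigenspace can be cross-checked numerically. A minimal sketch using numpy (not mentioned in the thread): the rank of A - 3I determines how many independent eigenvectors the eigenvalue 3 has, since the eigenspace is the null space of A - 3I.

```python
import numpy as np

A = np.array([[2, 1, 2],
              [1, 2, -2],
              [2, -2, -1]], dtype=float)

# The eigenvalues should come out as -3, 3, 3 (up to rounding).
evals = np.sort(np.linalg.eigvals(A).real)
print(evals)

# For the repeated eigenvalue 3, check the rank of A - 3I.
# Rank 1 means the null space (the eigenspace of 3) is 2-dimensional,
# so there ARE two linearly independent eigenvectors for eigenvalue 3.
rank = np.linalg.matrix_rank(A - 3 * np.eye(3))
print(rank)  # 1
```

If the Gaussian elimination is done without arithmetic slips, A - 3I reduces to a single nonzero row, which is exactly what rank 1 means.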
 
when a matrix has r equal eigenvalues, the number of eigenvectors (using complex numbers) can be anywhere from one to r.

e.g. a square 2by2 matrix which has a "1" in the upper right hand corner, and all other entries zero, has only one eigenvector.

a square r by r matrix which has s ones just above the diagonal and all other entries zero, should have r-s eigenvectors.
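The r-s rule above can be checked numerically. A sketch with numpy (my own illustration, not from the thread), counting independent eigenvectors as the dimension of the null space:

```python
import numpy as np

# 2 by 2 matrix with a single "1" in the upper-right corner:
# eigenvalue 0 with multiplicity 2, but only ONE independent eigenvector.
J = np.array([[0.0, 1.0],
              [0.0, 0.0]])
# dim of the eigenspace of 0 = n - rank(J - 0*I)
print(2 - np.linalg.matrix_rank(J))  # 1

# 4 by 4 matrix with s = 2 ones just above the diagonal:
# eigenvalue 0 with multiplicity r = 4, and r - s = 2 independent eigenvectors.
M = np.zeros((4, 4))
M[0, 1] = 1.0
M[2, 3] = 1.0
print(4 - np.linalg.matrix_rank(M))  # 2
```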
 
mathwonk said:
when a matrix has r equal eigenvalues, the number of eigenvectors (using complex numbers) can be anywhere from one to r.

e.g. a square 2by2 matrix which has a "1" in the upper right hand corner, and all other entries zero, has only one eigenvector.

a square r by r matrix which has s ones just above the diagonal and all other entries zero, should have r-s eigenvectors.
Ok, but according to wolfram alpha this matrix still has 3 eigenvectors, and I'm wondering why I can only find the first two eigenvectors using the method I used?
 
When I solve (A-3I)X=0, I find two linearly independent solutions, i.e. eigenvectors to the eigenvalue 3:
[1 1 0]T and [2 0 1]T.
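Both claimed solutions are easy to verify by direct multiplication, e.g. with a short numpy check (my own sketch):

```python
import numpy as np

A = np.array([[2, 1, 2],
              [1, 2, -2],
              [2, -2, -1]], dtype=float)

# Check that A v = 3 v for both candidate eigenvectors.
for v in (np.array([1.0, 1.0, 0.0]), np.array([2.0, 0.0, 1.0])):
    print(A @ v, 3 * v)  # the two printed vectors agree
```

Neither vector is a scalar multiple of the other, so they are linearly independent and span the full 2-dimensional eigenspace of the eigenvalue 3.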
 
Erland said:
When I solve (A-3I)X=0, I find two linearly independent solutions, i.e. eigenvectors to the eigenvalue 3:
[1 1 0]T and [2 0 1]T.
Thanks,

I tried it again and now I get it. I just made a little mistake calculating 2-3 (I got -5 instead of -1).

It's weird because I calculated this twice (making the same mistake both times) and checked that I had everything right, but still didn't notice it.
 
my point was that your question:

"Because it has 3 eigenvalues {-3,3,3}, it should have 3 linearly independent eigenvectors, right?"

has answer
"no, not right."

and misunderstanding this general principle is more harmful in the long run than adding 2 and -3 and getting -5.
 
mathwonk said:
my point was that your question:

"Because it has 3 eigenvalues {-3,3,3}, it should have 3 linearly independent eigenvectors, right?"

has answer
"no, not right."

and misunderstanding this general principle is more harmful in the long run than adding 2 and -3 and getting -5.
Yes, I understood that as well, thanks.
 
