
Repeated Eigenvalues and their Eigenvectors

  1. Jul 19, 2012 #1
    I have a matrix and can't seem to get my head around finding all of its eigenvectors.

    The matrix is A:

    (1 0 0 0
    1 0 0 1
    0 1 0 0
    0 0 1 0)

    I got the eigenvalues as:
    λ1 = 1, λ2 = λ3 = λ4 = 0

    For λ1:

    The eigenvector V1 is (0, 1, 1, 1).

    For λ2 -> λ4:

    The only eigenvector I could make out is V2 = (0, 0, 0, 0).

    To calculate the remaining eigenvectors I solved for P using the formula (A - λI)P = K,
    where K is an eigenvector of the matrix A associated with the eigenvalue (in this case V2). But substituting λ2 and V2 into the equation only leads back to the same equation again, which I don't think is correct. Is there a better method I can follow, or is there something I am not doing correctly?
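
    For reference, one quick way to sanity-check eigenvalues numerically is through the characteristic polynomial, e.g. with Octave's poly (just a rough sketch):
    Code (Text):

    A = [1 0 0 0; 1 0 0 1; 0 1 0 0; 0 0 1 0];
    p = poly(A)     % coefficients of det(lambda*I - A), highest power first
    roots(p)        % the eigenvalues; a nonzero constant term in p means 0 is not a root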

    Thanks
     
  3. Jul 19, 2012 #2

    Simon Bridge


    One of your eigenvalues is 3-fold degenerate.
    This means that the system of equations is not independent.
    To get three distinct eigenvectors you need to add an additional constraint that they be orthogonal.

    http://mathworld.wolfram.com/Eigenvalue.html
    ... scroll down to "degenerate".

    Hmmm ... I'm getting:
    Code (Text):

    octave:2> A
    A =

       1   0   0   0
       1   0   0   1
       0   1   0   0
       0   0   1   0

    octave:3> l=eig(A)
    l =

      -0.50000 + 0.86603i
      -0.50000 - 0.86603i
       1.00000 + 0.00000i
       1.00000 + 0.00000i
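
    For completeness, eig can also return the eigenvectors along with the eigenvalues (a sketch only; the columns of V are the eigenvectors and diag(D) the matching eigenvalues):
    Code (Text):

    octave:4> [V, D] = eig(A)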


     
     
    Last edited: Jul 19, 2012
  4. Jul 19, 2012 #3
    Thanks Simon I will have a look into that.
     
  5. Jul 19, 2012 #4

    HallsofIvy


    No. The very definition of "eigenvalue" is "there exists a non-zero vector, v, such that [itex]Av= \lambda v[/itex]", so there MUST be a non-zero eigenvector. (Many people will not count the zero vector as an eigenvector. I prefer to, so that "the set of all eigenvectors corresponding to a given eigenvalue" will be a subspace. In any case, there must exist non-zero eigenvectors.)
    Any eigenvector corresponding to eigenvalue 0 must satisfy
    [tex]\begin{bmatrix}1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 1\\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}\begin{bmatrix}x \\ y \\ z \\ u\end{bmatrix}= \begin{bmatrix}x \\ x+ u\\ y \\ z\end{bmatrix}= 0\begin{bmatrix}x \\ y \\ z \\ u\end{bmatrix}[/tex]

    which gives the four equations x= 0, x+ u= 0, y= 0, z= 0. Yes, that has x= y= z= u= 0 as the only solution. And what that means is that there does NOT exist such a non-zero vector, so 0 is NOT an eigenvalue!

    When I calculate the characteristic equation, I get [itex](1- \lambda)(1- \lambda^3)= 0[/itex] which does NOT have 0 as a root.
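
    (Expanding along the first row, for instance, since only its first entry is non-zero:
    [tex]\det(A- \lambda I)= \begin{vmatrix}1-\lambda & 0 & 0 & 0 \\ 1 & -\lambda & 0 & 1\\ 0 & 1 & -\lambda & 0 \\ 0 & 0 & 1 & -\lambda \end{vmatrix}= (1- \lambda)\begin{vmatrix}-\lambda & 0 & 1 \\ 1 & -\lambda & 0 \\ 0 & 1 & -\lambda\end{vmatrix}= (1- \lambda)(1- \lambda^3)[/tex])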

    Those are NOT, strictly speaking, "eigenvectors". They are "generalized eigenvectors".

    0 is NOT an eigenvalue.
     
    Last edited: Jul 19, 2012
  6. Jul 19, 2012 #5

    Simon Bridge


    Blondii may not have realized that you can have complex roots ;)
     
  7. Jul 19, 2012 #6
    Thanks HallsofIvy, I realize now where my mistake was. I was not calculating the determinant for the characteristic equation properly and always kept stopping after the first phase (I guess that's what you get for staying awake too long).

    Thanks guys, your input was much appreciated.
     
  8. Jul 19, 2012 #7
    So after recalculating, would my eigenvalues be:

    λ1 = 1
    λ2 = 1
    λ3 = ([itex]\frac{1}{2}[/itex])(-1 + [itex]\sqrt{3}[/itex]i)
    λ4 = ([itex]\frac{1}{2}[/itex])(-1 - [itex]\sqrt{3}[/itex]i)

    Is this correct now?
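
    (For reference, this would follow from factoring HallsofIvy's characteristic equation above: [tex](1- \lambda)(1- \lambda^3)= (1- \lambda)^2(1+ \lambda+ \lambda^2)= 0[/tex] where the quadratic factor gives [itex]\lambda= \frac{1}{2}(-1 \pm \sqrt{3}i)[/itex].)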
     
  9. Jul 19, 2012 #8
    With these new corrections there will be a repeated root of 1, which will also generate linearly independent eigenvectors, one of which is the zero vector [0, 0, 0, 0].
     
  10. Jul 20, 2012 #9

    HallsofIvy


    No, the zero vector is NOT "independent" of any vectors! You should NEVER find the 0 vector as an eigenvector. Stop that!

    With eigenvalue 1, doing what I did before gives the four equations:
    x= x
    x+ u= y
    z= y
    u= z

    The last two equations tell us y= z= u. Replacing u by y in the second equation, x+ y= y, we get x= 0. That is, < x, y, z, u>= <0, y, y, y>= y<0, 1, 1, 1>. So <0, 1, 1, 1> is an eigenvector and, because this only spans a 1-dimensional space, you need to find a "generalized eigenvector" by solving (A- I)v= <0, 1, 1, 1>.
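
    A minimal sketch of that last step in Octave (same A as above; row-reducing the augmented system is not the only way, but it shows every solution at once):
    Code (Text):

    A = [1 0 0 0; 1 0 0 1; 0 1 0 0; 0 0 1 0];
    k = [0; 1; 1; 1];          % the eigenvector found for lambda = 1
    rref([A - eye(4), k])      % last column of the reduced form gives one particular solution v of (A - I)v = k

    Any multiple of <0, 1, 1, 1> can be added to such a v, since (A - I) sends <0, 1, 1, 1> to zero.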
     
  11. Jul 20, 2012 #10
    I understand the zero vector can't be used, but after solving the equation earlier that seems to be the only possible solution, because all other vectors that come out of solving that equation still won't be linearly independent of <0,1,1,1>.
    I tried inputting the matrix into Mathematica to calculate its eigenvalues and eigenvectors, and it still returned the zero eigenvector along with the two complex and the real eigenvectors.

    I'm not really sure now how to go about this. If the zero eigenvector can't be used, and even trying to calculate the generalized eigenvector won't return a favorable outcome, is there some higher math that needs to be incorporated?
     
  12. Jul 20, 2012 #11
    OK, after considering the equation again I came up with the eigenvector <2, 1, 1, 1>.
    Would this be acceptable, since it seems to be linearly independent and still satisfies the equation (A- I)v= <0, 1, 1, 1>?
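
    One quick way to check a candidate like this is to multiply it out and compare with <0, 1, 1, 1>, e.g. a sketch in Octave:
    Code (Text):

    v = [2; 1; 1; 1];
    (A - eye(4)) * v     % should come out as [0; 1; 1; 1] if v solves (A - I)v = <0, 1, 1, 1>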
     