
Very simple matrix mechanics problem

  1. Jan 2, 2007 #1
    1. The problem statement, all variables and given/known data

    This is more a question of 'what do my notes mean by x', so bear with me. The question is: 'show that the matrix

    [tex]\left(\begin{array}{cc} 0 & 1\\ 0 & 0 \end{array}\right)[/tex]

    does not have a complete set of eigenvectors'. I'm given the explanation that a set of eigenvectors is complete if an arbitrary column vector can be constructed by

    [tex]b_{r} = \alpha^{(k)} v_{r}^{(k)}[/tex]

    (with summation over the label k implied).

    ...OK, so how does this make the set 'complete', and how am I supposed to make a judgement about the nature of the original matrix if it isn't even mentioned in the equation I'm told to use?

    2. Relevant equations

    See above

    3. The attempt at a solution

    None. I don't have the faintest idea what I'm supposed to be doing.

    edit: bah. I cut and pasted the matrix code from elsewhere on this forum and at first it refused to render. In case it still doesn't: the matrix is [[0, 1], [0, 0]].
     
    Last edited: Jan 2, 2007
  2. Jan 2, 2007 #2

    cristo

    Find the eigenvectors of the matrix. Then find a vector that cannot be constructed from them using your given formula.
     
  3. Jan 3, 2007 #3
    If you have studied "matrix mechanics" or linear algebra, you should know that a matrix that is not invertible cannot have a complete set of eigenvectors.
     
  4. Jan 3, 2007 #4

    Dick

    Careful. [[0,0],[0,0]] has a complete set of eigenvectors.
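
    (Every nonzero vector is an eigenvector of the zero matrix with eigenvalue 0, so e.g. (1, 0) and (0, 1) already give a complete set.)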
     
  5. Jan 3, 2007 #5
    I've supposedly 'studied' matrix mechanics. This consisted of two lectures with an utter idiot who was filling in for the regular lecturer. Nobody in that class had the faintest idea what was going on. I'm still not aware of how, in context, a matrix can have eigenvalues - I'm sure it's obvious but the concept has only been introduced to me through wave mechanics and so I don't have the faintest idea what I'm doing.

    This question is for a computational assignment - I don't have a course in matrix mechanics, and the current lecturer is under the erroneous assumption that this class has some grounding in it. I haven't been able to bring this up with him because (i) it's the Christmas break and this assignment is due the first day back, and (ii) he's never there himself and gets his postgrads to teach the class.

    I have, since I posted this question, been able to make *some* headway. Using the idea that det(matrix - eigenvalue * identity matrix) = 0, I've been able to find the eigenvalues. As the unsymmetric matrix only has one eigenvalue, it obviously only has one eigenvector, which is one short of what is required for a complete set.
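
    Explicitly, the characteristic equation for this matrix comes out as

    [tex]\det\left(\begin{array}{cc} -\lambda & 1\\ 0 & -\lambda \end{array}\right) = \lambda^{2} = 0[/tex]

    so [itex]\lambda = 0[/itex] is the only eigenvalue (a double root).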

    Finding the eigenvectors is where I have another problem. The second part asks for the eigenvalues and eigenvectors of [[0,1],[1,0]]. Finding the eigenvalues is fine, and the eigenvector that corresponds to lambda = 1 is OK, but when lambda = -1, there are two possible eigenvectors, [-1, 1] and [1, -1]. A buddy insists that these are equivalent, and I cannot see why.

    Please bear in mind that I KNOW NOTHING ABOUT MATRIX MECHANICS AND AM MAKING THIS UP AS I GO ALONG.

    Thank you for your patience.
     
  6. Jan 3, 2007 #6

    Dick

    You are getting there. If v is an eigenvector then c*v (where c is a number) is also an eigenvector for the same eigenvalue, right? So v and c*v aren't really 'different' eigenvectors (they are linearly dependent). What does that say about your two 'different' eigenvectors for eigenvalue -1?
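
    (Concretely, [itex](-1, 1) = -1 \cdot (1, -1)[/itex], so your two candidates are scalar multiples of one another: the same eigenvector up to normalisation.)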
     
  7. Jan 3, 2007 #7

    Dick

    BTW, it's untrue that an asymmetric matrix 'obviously' has only one eigenvalue. While you're practising, try [[0,1],[4,0]].
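
    (Its characteristic equation is [itex]\lambda^{2} - 4 = 0[/itex], so it has two distinct eigenvalues, [itex]\lambda = \pm 2[/itex].)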
     
  8. Jan 4, 2007 #8
    I was under the impression that there was *always* (no exceptions) a one-to-one correspondence between eigenvalues and eigenvectors. It would follow that if there is only one extant eigenvalue for some particular matrix, it only has one eigenvector. We're given the 'lemma' (I believe that's the correct usage of the term) that a complete set of eigenvectors always has n members, where n is the dimension of the matrix being considered - so I'd take this as being sufficient as a 'proof'.
     
  9. Jan 4, 2007 #9

    cristo

    A complete set of eigenvectors always has n linearly independent members. However, this is not a proof that to each eigenvalue there exists one and only one eigenvector. Consider the equation for an eigenvector: [tex]A\bold{x}=\lambda \bold{x}[/tex]. Now, clearly [itex]c\bold{x}[/itex], where c is any nonzero real number, is also an eigenvector. Thus, to each eigenvalue there exist infinitely many eigenvectors (corresponding to multiplication by a constant).
     
    Last edited: Jan 4, 2007
  10. Jan 4, 2007 #10

    HallsofIvy

    A "complete set of eigenvectors", for a linear transformation A, is a basis for the vector space consisting of eigenvectors of A. If A is written as a matrix in terms of that basis, it is diagonal with its eigenvalues on the main diagonal and 0s elsewhere.
     
  11. Jan 4, 2007 #11
    This is beyond the rigour required for this assignment, as far as I can see.

    What cristo said is clear; but is it true that each eigenvalue will only have one of these sets of eigenvectors, i.e. a group of infinitely many, all of the form (real number) * (some vector)? That is, each eigenvalue will not correspond to more than one linearly independent eigenvector?

    I'm basically trying to satisfy the question asked: "prove that these do not form a complete set" without invoking the messy column vector proof I stated in my first post. Is the assertion above sufficient to do that for a fairly lenient marker unconcerned with the mathematical ins and outs?
     
  12. Jan 4, 2007 #12

    cristo

    That's correct.

    Take your set of evectors to be one evector corresponding to each evalue. Then, to show that the set is not complete, find a vector that cannot be expressed as a linear combination of these evectors (i.e. show that this set of linearly independent evectors does not form a basis for the vector space).
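
    (Concretely: for the matrix in question the only independent evector is [itex](1, 0)[/itex], up to scale, and [itex](0, 1) = \alpha (1, 0)[/itex] has no solution for [itex]\alpha[/itex], so that vector can never be constructed.)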
     
  13. Jan 4, 2007 #13
    Aha, when you explain it like that, it becomes blindingly obvious what the equation is trying to express. I'm imagining it as showing that the unsymmetric matrix doesn't define a full coordinate system for its vectors to exist in, and thus is not 'complete'. Is this a reasonable way of picturing the situation, or am I somewhat off the mark?
     
  14. Jan 4, 2007 #14

    Dick

    Sort of, if you're not too fussy about the use of words like 'coordinate system'. The matrix's characteristic equation has 0 as a double root, indicating that there might be two independent evectors associated with it. In fact there is only one, so the eigenvectors don't span the whole space.
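
    For anyone who wants a numerical sanity check, here is a minimal sketch in Python (assuming numpy is available; the exact eigenvectors returned for a defective matrix depend on the library, but the rank test should come out as 1):

    [code]
    import numpy as np

    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])

    # Columns of V are numpy's (numerical) eigenvectors of A.
    eigvals, V = np.linalg.eig(A)
    print(eigvals)  # [0. 0.] -- zero is a double eigenvalue

    # A complete set of eigenvectors would make V rank 2. Here the two
    # columns come out (numerically) parallel, so the rank is only 1
    # and the eigenvectors do not span the plane.
    print(np.linalg.matrix_rank(V))  # 1
    [/code]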
     