# Very simple matrix mechanics problem

#### Sojourner01

1. Homework Statement

This is more a question of 'what do my notes mean by x', so bear with me. The question is 'show that the matrix $$\left( \begin{array}{cc} 0 & 1\\ 0 & 0 \end{array} \right)$$ does not have a complete set of eigenvectors'. I'm given the explanation that a set of eigenvectors is complete if an arbitrary column vector can be constructed by

$$b_{r} = \alpha^{(k)} v_{r}^{(k)}$$

(with summation over the repeated index $k$ implied).

...OK, so? How does this make the set 'complete', and how am I supposed to make a judgement about the nature of the original matrix if it's not mentioned in the equation I'm told to use?

2. Homework Equations

See above

3. The Attempt at a Solution

None. I don't have the faintest idea what I'm supposed to be doing.

edit: bah. I've cut and pasted the matrix code from elsewhere on this forum and all of a sudden it doesn't want to do as it's told. I give up. The matrix is [[0, 1], [0, 0]].


#### cristo

Staff Emeritus
Find the eigenvectors of the matrix. Then find an arbitrary vector that cannot be constructed by the formula you were given.

#### Jezuz

If you have studied "matrix mechanics" or linear algebra you should know that a matrix that is not invertible cannot have a complete set of eigenvectors.

#### Dick

Homework Helper
Jezuz said:
If you have studied "matrix mechanics" or linear algebra you should know that a matrix that is not invertible cannot have a complete set of eigenvectors.
Careful. [[0,0],[0,0]] has a complete set of eigenvectors.

#### Sojourner01

I've supposedly 'studied' matrix mechanics. This consisted of two lectures with an utter idiot who was filling in for the regular lecturer. Nobody in that class had the faintest idea what was going on. I'm still not aware of how, in context, a matrix can have eigenvalues - I'm sure it's obvious but the concept has only been introduced to me through wave mechanics and so I don't have the faintest idea what I'm doing.

This question is for a computational assignment - I don't have a course in matrix mechanics and the current lecturer is under the erroneous assumption that this class has some grounding in it. I haven't been able to bring this up with him because (i) it's the Christmas break and this assignment is due the first day back and (ii) he's never there himself and gets his postgrads to teach the class.

I have, since I posted this question, been able to make *some* headway. Using the idea that det(matrix - eigenvalue * identity matrix) = 0, I've been able to find the eigenvalues. As the unsymmetric matrix only has one eigenvalue, it obviously only has one eigenvector, which is one short of what is required to be a complete set.
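Since this is a computational assignment, the step above can be sanity-checked numerically. The sketch below uses NumPy (an assumption; the thread doesn't name any tools) to confirm that [[0,1],[0,0]] has only the repeated eigenvalue 0 and only one independent eigenvector:

```python
import numpy as np

# The matrix from the problem statement.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Roots of det(A - lambda*I) = 0.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # lambda = 0, twice (a repeated root)

# For lambda = 0, the eigenvectors span the null space of A.
# A has rank 1, so that null space is only 1-dimensional:
# one independent eigenvector, one short of a complete set for a 2x2 matrix.
print(np.linalg.matrix_rank(A))
```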

I have another problem with finding the eigenvectors. The second part asks for the eigenvalues and eigenvectors of [[0,1],[1,0]]. Finding the eigenvalues is fine, and the eigenvector that corresponds to lambda = 1 is OK, but when lambda = -1 there are two possible eigenvectors, [-1, 1] and [1, -1]. A buddy insists that these are equivalent, and I cannot see why.

Please bear in mind that I KNOW NOTHING ABOUT MATRIX MECHANICS AND AM MAKING THIS UP AS I GO ALONG.

#### Dick

Homework Helper
You are getting there. If v is an eigenvector then c*v (where c is a number) is also an eigenvector for the same eigenvalue, right? So v and c*v aren't really 'different' eigenvectors (they are linearly dependent). What does that say about your two 'different' eigenvectors for eigenvalue -1?
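Concretely (a NumPy check, not part of the assignment's required working): for [[0,1],[1,0]] and lambda = -1, the two candidate eigenvectors differ only by the constant c = -1, so they are the 'same' eigenvector up to scale:

```python
import numpy as np

B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

v1 = np.array([-1.0, 1.0])
v2 = np.array([1.0, -1.0])

# Both satisfy B v = (-1) v, so both are eigenvectors for lambda = -1 ...
print(B @ v1, B @ v2)

# ... and v2 = (-1) * v1, so they are linearly dependent:
print(np.allclose(v2, -1 * v1))  # True
```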

#### Dick

Homework Helper
BTW, untrue that an asymmetric matrix 'obviously' has only one eigenvalue. While you are practicing try [[0,1],[4,0]].
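A quick numerical check of this example (again using NumPy, purely for illustration): the asymmetric matrix [[0,1],[4,0]] has two distinct eigenvalues, not one.

```python
import numpy as np

C = np.array([[0.0, 1.0],
              [4.0, 0.0]])

# det(C - lambda*I) = lambda^2 - 4 = 0 gives lambda = -2 and lambda = +2:
# asymmetric, yet two distinct eigenvalues (and two independent eigenvectors).
w, v = np.linalg.eig(C)
print(np.sort(w))
```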

#### Sojourner01

I was under the impression that there was *always* (no exceptions) a one-to-one correspondence between eigenvalues and eigenvectors. It would follow that if there is only one extant eigenvalue for some particular matrix, it only has one eigenvector. We're given the 'lemma' (I believe that's the correct usage of the term) that a complete set of eigenvectors always has n members, where n is the dimension of the matrix being considered - so I'd take this as being sufficient as a 'proof'.

#### cristo

Staff Emeritus
A complete set of eigenvectors always has n linearly independent members. However, this is not a proof that to each eigenvalue there exists one and only one eigenvector. Consider the equation for an eigenvector: $$A\mathbf{x}=\lambda \mathbf{x}.$$ Now, clearly $c\mathbf{x}$, where $c$ is any nonzero element of $\mathbb{R}$, is an eigenvector also. Thus, to each eigenvalue there exist infinitely many eigenvectors (corresponding to multiplication by a constant).


#### HallsofIvy

Homework Helper
A "complete set of eigenvectors", for a linear transformation A, is a basis for the vector space consisting of eigenvectors of A. If A is written as a matrix in terms of that basis, it is diagonal with its eigenvalues on the main diagonal and 0s elsewhere.
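This diagonalisation can be seen numerically for the second matrix in the thread, [[0,1],[1,0]] (a NumPy sketch, not from the original posts): changing basis to its eigenvectors produces a diagonal matrix with the eigenvalues on the main diagonal.

```python
import numpy as np

B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Columns of P are the eigenvectors; w holds the eigenvalues.
w, P = np.linalg.eig(B)

# Rewriting B in the eigenvector basis: D = P^{-1} B P.
D = np.linalg.inv(P) @ B @ P

# D is diagonal, with the eigenvalues on the main diagonal.
print(np.round(D, 10))
print(w)
```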

#### Sojourner01

This is beyond the level of rigour required for this assignment, as far as I can see.

cristo said:
However, this is not a proof that to each eigenvalue there exists one and only one eigenvector. Consider the equation for an eigenvector: [...] where $c$ is in $\mathbb{R}$, is an eigenvector also. Thus, to each eigenvalue, there exist infinitely many eigenvectors (corresponding to multiplication by a constant).
This is clear; but since it's already been said that:

Dick said:
So v and c*v aren't really 'different' eigenvectors (they are linearly dependent).
...Is it true that each eigenvalue will only have one of these sets of eigenvectors, i.e. a group of infinitely many, as cristo described, all of the form (real number) * (some vector)? That is, each eigenvalue will not correspond to more than one linearly independent eigenvector?

I'm basically trying to satisfy the question asked: "prove that these do not form a complete set" without invoking the messy column vector proof I stated in my first post. Is the assertion above sufficient to do that for a fairly lenient marker unconcerned with the mathematical ins and outs?

#### cristo

Staff Emeritus
Sojourner01 said:
That is, each eigenvalue will not correspond to more than one linearly independent eigenvector?
That's correct for matrices like these, whose eigenvalues are distinct (though not in general: a repeated eigenvalue, as in the zero matrix Dick mentioned, can have more than one linearly independent eigenvector).

Sojourner01 said:
I'm basically trying to satisfy the question asked: "prove that these do not form a complete set" without invoking the messy column vector proof I stated in my first post. Is the assertion above sufficient to do that for a fairly lenient marker unconcerned with the mathematical ins and outs?
Take your set of evectors to be one evector corresponding to each evalue. Then, to show that the set is not complete, find an arbitrary vector that cannot be expressed as a linear combination of these evectors (i.e. show that this set of linearly independent evectors does not form a basis for the vector space).
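The whole argument for the original matrix can be sketched in a few lines of NumPy (an illustration, under the assumption that a numerical check is acceptable for this computational assignment): the single independent eigenvector of [[0,1],[0,0]] is [1,0], and a vector such as [0,1] has no expansion in it.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# The only independent eigenvector of A (eigenvalue 0) is v = [1, 0]:
v = np.array([1.0, 0.0])
print(np.allclose(A @ v, 0.0 * v))  # True: A v = 0 v

# Try to express b = [0, 1] as alpha * v.  Every multiple of v has
# second component zero, so no alpha works; least squares confirms
# a nonzero residual, i.e. b lies outside the span of the eigenvectors.
b = np.array([0.0, 1.0])
alpha, residual, *_ = np.linalg.lstsq(v.reshape(2, 1), b, rcond=None)
print(residual)  # nonzero -> the eigenvectors do not form a complete set
```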

#### Sojourner01

Aha. When you explain it like that, it becomes blindingly obvious what the equation is trying to express. I'm imagining it as showing that the unsymmetric matrix's eigenvectors don't define a full coordinate system for arbitrary vectors to exist in, and thus the set is not 'complete'. Is this a reasonable way to picture the situation, or am I somewhat off the mark?