Very simple matrix mechanics problem

  • #1
Sojourner01

Homework Statement

This is more a question on 'what do my notes mean by x', so bear with me. The question is: 'show that the matrix [tex]\left(
\begin{array}{cc}
0 & 1\\
0 & 0
\end{array}
\right)[/tex] does not have a complete set of eigenvectors'. I'm given the explanation that a set of eigenvectors is complete if an arbitrary column vector can be constructed by

[tex]b_{r} = \alpha^{(k)} v_{r}^{(k)}[/tex] (presumably with an implicit sum over the eigenvector label k)

...OK, so? How does this make the set 'complete', and how am I supposed to make a judgement about the nature of the original matrix if it isn't mentioned in the equation I'm told to use?

Homework Equations

See above

The Attempt at a Solution

None. I don't have the faintest idea what I'm supposed to be doing.

edit: bah. I've cut and pasted the matrix code from elsewhere on this forum and all of a sudden it doesn't want to do as it's told. I give up. The matrix is | 0 1 |
| 0 0 |
 
Last edited:
  • #2
Find the eigenvectors of the matrix. Then find an arbitrary vector that cannot be constructed by the formula you were given.
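
As a concrete check of that first step, here is a minimal sketch, assuming Python with sympy is available (not part of the assignment's required method):

[code]
from sympy import Matrix

# The matrix from the problem statement.
A = Matrix([[0, 1],
            [0, 0]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenvectors).
print(A.eigenvects())
# [(0, 2, [Matrix([[1], [0]])])]
[/code]

The eigenvalue 0 is a double root of the characteristic equation, yet only one eigenvector, (1, 0)^T, comes out. So any vector with a nonzero second component is a candidate for the 'cannot be constructed' step.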
 
  • #3
If you have studied "matrix mechanics" or linear algebra you should know that a matrix that is not invertible cannot have a complete set of eigenvectors.
 
  • #4
Jezuz said:
If you have studied "matrix mechanics" or linear algebra you should know that a matrix that is not invertible cannot have a complete set of eigenvectors.

Careful. [[0,0],[0,0]] has a complete set of eigenvectors.
 
  • #5
I've supposedly 'studied' matrix mechanics. This consisted of two lectures with an utter idiot who was filling in for the regular lecturer. Nobody in that class had the faintest idea what was going on. I'm still not aware of how, in context, a matrix can have eigenvalues - I'm sure it's obvious but the concept has only been introduced to me through wave mechanics and so I don't have the faintest idea what I'm doing.

This question is for a computational assignment. I don't have a course in matrix mechanics, and the current lecturer is under the erroneous assumption that this class has some grounding in it. I haven't been able to bring this up with him because (i) it's the Christmas break and this assignment is due the first day back, and (ii) he's never there himself and gets his postgrads to teach the class.

I have, since I posted this question, been able to make *some* headway. Using the idea that det(matrix - eigenvalue * identity matrix) = 0, I've been able to find the eigenvalues. As the unsymmetric matrix only has one eigenvalue, it obviously only has one eigenvector, which is one short of what is required for a complete set.

Finding the eigenvectors is where I have another problem. The second part asks for the eigenvalues and eigenvectors of [[0,1],[1,0]]. Finding the eigenvalues is fine, and the eigenvector that corresponds to lambda = 1 is OK, but when lambda = -1, there are two possible eigenvectors, [-1, 1] and [1, -1]. A buddy insists that these are equivalent, and I cannot see why.

Please bear in mind that I KNOW NOTHING ABOUT MATRIX MECHANICS AND AM MAKING THIS UP AS I GO ALONG.

Thank you for your patience.
 
  • #6
You are getting there. If v is an eigenvector, then c*v (where c is a nonzero number) is also an eigenvector for the same eigenvalue, right? So v and c*v aren't really 'different' eigenvectors (they are linearly dependent). What does that say about your two 'different' eigenvectors for eigenvalue -1?
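
To see this concretely for the second matrix, a short sketch under the same sympy assumption:

[code]
from sympy import Matrix

B = Matrix([[0, 1],
            [1, 0]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenvectors);
# sympy picks one representative eigenvector per eigenvalue.
print(B.eigenvects())
# [(-1, 1, [Matrix([[-1], [1]])]), (1, 1, [Matrix([[1], [1]])])]

# The two candidate answers for lambda = -1 are scalar multiples of
# each other, hence linearly dependent: 'equivalent' as eigenvectors.
print(Matrix([-1, 1]) == -1 * Matrix([1, -1]))   # True
[/code]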
 
  • #7
BTW, it is untrue that an asymmetric matrix 'obviously' has only one eigenvalue. While you are practicing, try [[0,1],[4,0]].
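
For that practice matrix, a similar sketch (same sympy assumption) shows two distinct eigenvalues:

[code]
from sympy import Matrix

C = Matrix([[0, 1],
            [4, 0]])

# eigenvals() returns {eigenvalue: algebraic multiplicity}.
print(C.eigenvals())
# {2: 1, -2: 1} (order may vary): two distinct eigenvalues, each with
# its own eigenvector, even though the matrix is not symmetric.
[/code]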
 
  • #8
I was under the impression that there was *always* (no exceptions) a one-to-one correspondence between eigenvalues and eigenvectors. It would follow that if there is only one extant eigenvalue for some particular matrix, it only has one eigenvector. We're given the 'lemma' (I believe that's the correct usage of the term) that a complete set of eigenvectors always has n members, where n is the dimension of the matrix being considered - so I'd take this as being sufficient as a 'proof'.
 
  • #9
A complete set of eigenvectors always has n linearly independent members. However, this is not a proof that to each eigenvalue there exists one and only one eigenvector. Consider the equation for an eigenvector: [tex]A\mathbf{x}=\lambda \mathbf{x}[/tex]. Now, clearly [itex]c\mathbf{x}[/itex], where c is a nonzero real number, is an eigenvector also. Thus, to each eigenvalue, there exist infinitely many eigenvectors (corresponding to multiplication by a constant).
 
Last edited:
  • #10
A "complete set of eigenvectors", for a linear transformation A, is a basis for the vector space consisting of eigenvectors of A. If A is written as a matrix in terms of that basis, it is diagonal with its eigenvalues on the main diagonal and 0s elsewhere.
 
  • #11
This is beyond the level of rigour this assignment calls for, as far as I can see.

cristo said:
However, this is not a proof that to each eigenvalue there exists one and only one eigenvector. Consider the equation for an eigenvector: [...] where c is a nonzero real number, is an eigenvector also. Thus, to each eigenvalue, there exist infinitely many eigenvectors (corresponding to multiplication by a constant).

This is clear; but since it's already been said that:

Dick said:
So v and c*v aren't really 'different' eigenvectors (they are linearly dependent).

...Is it true that each eigenvalue will only have one of these sets of eigenvectors, i.e. a group of infinitely many, as cristo described, all of the form (real number) * (some vector)? That is, each eigenvalue will not correspond to more than one linearly independent eigenvector?

I'm basically trying to satisfy the question asked: "prove that these do not form a complete set" without invoking the messy column vector proof I stated in my first post. Is the assertion above sufficient to do that for a fairly lenient marker unconcerned with the mathematical ins and outs?
 
  • #12
Sojourner01 said:
That is, each eigenvalue will not correspond to more than one linearly independent eigenvector?
That's correct for the matrices in this problem (though in general a repeated eigenvalue can have more than one linearly independent eigenvector; the identity matrix is an example).

I'm basically trying to satisfy the question asked: "prove that these do not form a complete set" without invoking the messy column vector proof I stated in my first post. Is the assertion above sufficient to do that for a fairly lenient marker unconcerned with the mathematical ins and outs?

Take your set of evectors to be one evector corresponding to each evalue. Then, to show that the set is not complete, find an arbitrary vector that cannot be expressed as a linear combination of these evectors (i.e. show that this set of linearly independent evectors does not form a basis for the vector space).
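
To make that last step concrete, a minimal sketch, again assuming sympy; the choice of (0, 1)^T as the unreachable vector is just one convenient example:

[code]
from sympy import Matrix, symbols, solve

# The only linearly independent eigenvector of [[0, 1], [0, 0]].
v = Matrix([1, 0])
b = Matrix([0, 1])   # a vector we try, and fail, to reach
a = symbols('a')

# Try to solve a*v = b; solve() treats each matrix entry as an equation.
print(solve(a * v - b, a))
# []  (no solution): (0, 1)^T is not a linear combination of the
# eigenvectors, so the set is not complete.
[/code]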
 
  • #13
Aha. When you explain it like that, it becomes blindingly obvious what the equation is trying to express. I'm imagining it as showing that the unsymmetric matrix doesn't define a full coordinate system for its vectors to exist in, and thus is not 'complete'. Is this a reasonable picture of the situation, or am I somewhat off the mark?
 
  • #14
Sort of, if you are not too fussy about the use of words like 'coordinate system'. The matrix's characteristic equation has 0 as a double root, indicating that there might be two independent evectors associated with it. In fact there is only one, so the eigenvectors don't span the whole space.
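
That 'only one' claim can be read off from the nullspace of A - lambda*I; a last sketch under the same sympy assumption:

[code]
from sympy import Matrix, eye

A = Matrix([[0, 1],
            [0, 0]])
lam = 0  # the double root of the characteristic equation

# The nullspace of (A - lam*I) contains the eigenvectors for lam.
print((A - lam * eye(2)).nullspace())
# [Matrix([[1], [0]])]: a single basis vector, so geometric multiplicity 1
# against algebraic multiplicity 2, and the eigenvectors cannot span
# the 2-dimensional space.
[/code]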
 

