# Dirac Notation, Observables, and Eigenvalues, OH MY!

by Chelsea S
Tags: dirac notation, eigenvalues, observables
 P: 12 Alright... So I'm in an 'introductory' Q.M. class in college right now. It's the only one that this two-year college has, so I don't have any upper-division Q.M. profs to talk to about this, and since my prof is equally confused, I turn to the internet.

Okay, so everyone knows that <ψ|Aψ> = <a>, where A is my observable and <a> is my eigenvalue for the equation. Yay, verily, yay. Now, we were going through a proof, and to begin he said what he was going to do, but didn't tell us why he was doing it (this is how he teaches, mysterious ways to keep our attention). Anyway... he gets to the end and the statement looks like this:

<f|g> = <ψ|ABψ> - <a><ψ|Bψ> = AB<ψ|ψ> - <a><b><ψ|ψ>

And everyone in the class says, "Wait, how in the hell did you not get eigenvalues for the AB combination in the wave function in the first set? (this one: AB<ψ|ψ>)" ... And there was silence... "Well, two years ago when I wrote these notes it made sense. I have no idea," he says. So we discussed it for a little while and came up with some ideas as to why it probably couldn't.

Physically, I think (and please correct me if my thoughts are wrong here): you have two particles that you want to measure *whatever* on in the same wave function. Your probability of measuring both of them at the same time where you want them to be would be less than if you had just a single one. That's what I would like to think anyway. So if indeed <ψ|ABψ> = <a><b>, the probability would be greater to find them both in the same wave function. That's why I think it doesn't work.

I can't find anything anywhere about the nitty-gritty of a rule that says this can't work, and I can't come up with any math-like reasons or proofs as to why it can't. So... why does <ψ|ABψ> = AB<ψ|ψ> and not <ψ|ABψ> = <a><b><ψ|ψ>?
Homework HW Helper Thanks P: 12,974
No wonder you are confused!

A and B are operators corresponding to two different observables, right? Do they commute?
I don't have the context for <f|g> ...
Why would you be thinking in terms of more than one particle?
What has position to do with it?

I'd normally write <ψ|Aψ> = <A> rather than <a> but I suppose there's nothing wrong with it.

I'd have expected AB|ψ> to represent operating first by B and then by A. Is |ψ> supposed to be a simultaneous eigenvector of A and B?

I'm not sure AB<ψ|ψ> makes sense: after all, AB is a product of operators, and operators don't normally mean anything without something to operate on.

As written:
<ψ|ABψ> = AB <ψ|ψ> - <a><b><ψ|ψ> + <a><ψ|Bψ>

So I think more context is needed.
What is supposed to be demonstrated here?

A bit of a nitpick: <a> would be the average value of the observable, not the eigenvalue... unless |ψ> is an eigenvector of A, in which case we'd write <ψ|A|ψ> = a<ψ|ψ> = a ... in fact, we'd probably call the eigenvector |a>. Eigenvalues belong to eigenvectors, not equations.
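To make that nitpick concrete, here's a tiny numerical sketch (my own toy example, not from the thread) showing that <ψ|A|ψ> equals an eigenvalue only when |ψ> is an eigenvector of A:

```python
import numpy as np

# <psi|A|psi> is the *average* value of A; it equals an eigenvalue
# only when |psi> is an eigenvector of A.
A = np.array([[1.0, 0.0],
              [0.0, 3.0]])               # Hermitian observable, eigenvalues 1 and 3

eigvec = np.array([1.0, 0.0])            # eigenvector of A with eigenvalue 1
mix = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition: NOT an eigenvector

exp_eig = eigvec @ A @ eigvec            # an actual eigenvalue: 1.0
exp_mix = mix @ A @ mix                  # the average (1+3)/2 = 2.0, not an eigenvalue
print(exp_eig, exp_mix)
```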
 P: 134 You seem to be confused about what is what, such as equating scalars with operators. Why not pick up Shankar and clear it up?
P: 123
OK, I think what you are trying to prove is the general uncertainty principle. Now, forget everything after the third line... if a professor really did this, you'd better tell him to quit teaching QM.
First of all, there is no need to mention eigenvalues, so A will be an operator and <A> = <ψ|Aψ> its expectation value. In the fourth line, just leave the first term as it was: <ψ|ABψ> = <AB>, and the remaining terms should not have the factor <ψ|ψ> (remember the definition of the expectation value). Do some algebra to get: <f|g> = <AB> - <A><B> , and doing the same for <g|f> you should get: <g|f> = <BA> - <A><B> . Subtract the results to get:
<f|g> - <g|f> = <AB> - <BA> = <AB - BA> = <[A,B]>
Plug this into σA²σB² ≥ |(1/2i)(<f|g> - <g|f>)|² to get:
σA σB ≥ (1/2)|<[A,B]>|
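If it helps, the final inequality can be sanity-checked numerically. A minimal sketch, assuming A = σx and B = σy as a concrete example (my own choice, not from the thread):

```python
import numpy as np

# Check sigma_A * sigma_B >= (1/2)|<[A,B]>| for random normalized states,
# with A = sigma_x and B = sigma_y.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

rng = np.random.default_rng(0)
for _ in range(100):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)

    def ev(op):                                   # expectation value <psi|op|psi>
        return psi.conj() @ op @ psi

    var_x = ev(sx @ sx).real - ev(sx).real ** 2   # sigma_x^2 = <A^2> - <A>^2
    var_y = ev(sy @ sy).real - ev(sy).real ** 2
    comm = sx @ sy - sy @ sx                      # [sigma_x, sigma_y] = 2i sigma_z
    bound = 0.5 * abs(ev(comm))
    assert np.sqrt(var_x * var_y) >= bound - 1e-9
print("uncertainty bound holds for 100 random states")
```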
 P: 12 Oh, okay, that makes sense! So... you CAN just pull AB out as if it were a single operator... And, yes, this was a professor. Thanks for the reply; I was seriously going to use a whole notebook worth of paper trying to figure out the way he did it XD
P: 123
 Quote by Chelsea S So... you CAN just pull AB out as if it were a single operator...
Of course... "AB" IS a single operator: it is the composition of two operators, which is itself an operator.
 P: 12 That is totally what I thought! I asked him if there was a reason why the two together just didn't make a 'separate' observable, and he was like, "Yeah, I'm not sure if it does." I said, "Well, do they have an effect on each other when they're multiplied together?" He said, "I'll have to look it up." Okay. Whew.
P: 123
 Quote by Chelsea S That is totally what I thought! I asked him if there was a reason why the two together just didn't make a 'separate' observable, and he was like, "Yeah, I'm not sure if it does." I said, "Well, do they have an effect on each other when they're multiplied together?" He said, "I'll have to look it up." Okay. Whew.

Well, you have to distinguish the terms "operator" and "observable". An operator is nothing more than a function on a Hilbert space: its input is a state vector and its output is again a state vector. But not every operator corresponds to an observable. A necessary condition for an operator to represent an observable is that the operator must be Hermitian. Now, if you have two operators, say A and B, which represent two observables (so they are Hermitian), then their composition AB does give a new operator (i.e. a function on the Hilbert space), but this new operator is not always Hermitian:
(AB)† = B†A† = BA
Therefore, if A and B commute (i.e. if the observables they represent can be measured simultaneously), then AB can represent an observable, because it is then Hermitian.
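A quick numerical illustration of that point, using toy Hermitian matrices of my own choosing:

```python
import numpy as np

# (AB)^dagger = BA for Hermitian A and B, so the product AB is Hermitian
# exactly when A and B commute.
def dagger(M):
    return M.conj().T

A = np.array([[0, 1], [1, 0]], dtype=complex)   # sigma_x: Hermitian
B = np.array([[1, 0], [0, -1]], dtype=complex)  # sigma_z: Hermitian, does NOT commute with A
C = np.array([[2, 0], [0, 5]], dtype=complex)   # diagonal: Hermitian, DOES commute with B

assert np.allclose(dagger(A @ B), B @ A)        # (AB)^dagger = BA, as above
assert not np.allclose(A @ B, dagger(A @ B))    # AB not Hermitian -> not an observable
assert np.allclose(B @ C, C @ B)                # [B, C] = 0
assert np.allclose(B @ C, dagger(B @ C))        # BC Hermitian -> can represent an observable
```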
P: 296
 Quote by cosmic dust Therefore, if A and B commute (i.e. if the observables they represent can be measured simultaneously), then AB can represent an observable, because it is then Hermitian.
Could you elaborate on why A and B commuting means the observables they represent can be measured simultaneously? What I get from them commuting is that the order in which you observe doesn't matter, but that there's still an order... I don't see the simultaneity.
P: 123
 Quote by DocZaius Could you elaborate on why A and B commuting means the observables they represent can be measured simultaneously? What I get from them commuting is that the order in which you observe doesn't matter, but that there's still an order... I don't see the simultaneity.
One can measure two observables simultaneously if there is a state vector that carries DEFINITE information for both of them. This means that the same state vector is an eigenvector of both of the operators that represent the observables. But two operators cannot have a common eigenvector, i.e. a vector that satisfies both equations:
Aψ = aψ , Bψ = bψ
if they do not commute. Therefore, if they commute then they do share a set of common eigenvectors, and those vectors carry definite information for both of them. So, if the physical system is described by such a vector, then a measurement can give definite information for both of them.
P: 296
 Quote by cosmic dust One can measure two observables simultaneously if there is a state vector that carries DEFINITE information for both of them. This means that the same state vector is an eigenvector of both of the operators that represent the observables. But two operators cannot have a common eigenvector, i.e. a vector that satisfies both equations: Aψ = aψ , Bψ = bψ if they do not commute. Therefore, if they commute then they do share a set of common eigenvectors, and those vectors carry definite information for both of them. So, if the physical system is described by such a vector, then a measurement can give definite information for both of them.
Although two operators not commuting does mean they can't have a common eigenvector, I don't see why two operators commuting necessarily means they have a common eigenvector...only that they could. Could you show why they must necessarily have common eigenvector(s)?

Here's what I mean:

A and B commute so AB-BA=0
Let's consider an eigenfunction of A, call it f, with eigenvalue a (so Af = af). Apply both sides of the equation to f:
ABf - BAf = 0
ABf - aBf = 0
i.e. A(Bf) = a(Bf).

Now here it is obvious that f could also be an eigenfunction of B, but I don't see why it must. All I see here is that Bf is an eigenfunction of A (with the same eigenvalue a).
 P: 123 I’m having a problem with my tex, so I will outline what you have to do in order to prove it. First, consider that the set of the eigenvectors of A is a complete set. Therefore you can expand Bf on this set of eigenvectors. Act with A on this expansion: on one side you should have ABf. Act with B on the eigenvalue equation of A: on one side you should have BAf while on the other side you use again the expansion of Bf. Now, subtract the two relations: one side should be zero because of the commutation relation. Since the eigen-vectors of A are linearly independent, each coefficient of this expansion should be zero. You have two possibilities: 1) the eigen-values are the same, so the coefficient of the expansion of Bf can have any value, 2) the eigen-values are different so the coefficient of the expansion of Bf should be zero. Therefore, you see that the expansion of Bf contains only f times a constant, i.e. f is an eigen-vector of B.
P: 123
 Quote by cosmic dust I’m having a problem with my tex, so I will outline what you have to do in order to prove it.
Problem solved! Let me show you what I mean. Consider the eigenvalue equation for the operator A: $$A{{\psi }_{n}}={{a}_{n}}{{\psi }_{n}}$$
Assuming that the set of the eigenvectors of A is a complete set, we can expand $$B{{\psi }_{n}}$$ as:
$$B{{\psi }_{n}}=\sum\limits_{m}{{{c}_{nm}}{{\psi }_{m}}}$$
Acting on this relation with A we get:
$$AB{{\psi }_{n}}=\sum\limits_{m}{{{c}_{nm}}{{a}_{m}}{{\psi }_{m}}}$$
Acting with B on the eigenvalue equation of A, we get:
$$BA{{\psi }_{n}}={{a}_{n}}B{{\psi }_{n}}=\sum\limits_{m}{{{c}_{nm}}{{a}_{n}}{{\psi }_{m}}}$$
Subtract the last two relations in order to get:
$$\left[ A,B \right]{{\psi }_{n}}=\sum\limits_{m}{{{c}_{nm}}\left( {{a}_{m}}-{{a}_{n}} \right){{\psi }_{m}}}=0$$
assuming that the operators commute. Since the eigenvectors of A are linearly independent, this expression will hold iff:
$${{c}_{nm}}\left( {{a}_{m}}-{{a}_{n}} \right)=0$$
for all n,m. There are two possibilities: $$\left. 1 \right)\,\,n=m\to {{a}_{n}}-{{a}_{n}}=0$$ so cnn can have any finite value, say bn $$\left. 2 \right)\,\,m\ne n\to {{a}_{m}}-{{a}_{n}}\ne 0\Rightarrow {{c}_{nm}}=0$$ (in case 2 I am assuming the spectrum of A is nondegenerate, so that m ≠ n implies am ≠ an)
The two possibilities can be summarized in the expression:$${{c}_{nm}}={{\delta }_{mn}}{{b}_{n}}$$
So, the expansion of $$B{{\psi }_{n}}$$ will eventually be:
$$B{{\psi }_{n}}=\sum\limits_{m}{{{\delta }_{nm}}{{b}_{n}}{{\psi }_{m}}}={{b}_{n}}{{\psi }_{n}}$$
i.e. ψn is an eigen-vector of B .
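For what it's worth, the conclusion can be checked numerically with a small toy example of my own (A chosen with a nondegenerate spectrum, the case the argument above covers):

```python
import numpy as np

# Two commuting Hermitian matrices: every eigenvector of A (nondegenerate
# spectrum) is also an eigenvector of B, as the proof above concludes.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # Hermitian, eigenvalues 1 and 3
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])              # Hermitian, eigenvalues -1 and 1

assert np.allclose(A @ B, B @ A)        # [A, B] = 0

vals, vecs = np.linalg.eigh(A)          # columns of vecs are eigenvectors of A
for k in range(2):
    v = vecs[:, k]
    b = v @ (B @ v)                     # candidate eigenvalue b_n of B
    assert np.allclose(B @ v, b * v)    # psi_n is an eigenvector of B too
```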
P: 12,974
 Quote by DocZaius Although two operators not commuting does mean they can't have a common eigenvector, I don't see why two operators commuting necessarily means they have a common eigenvector...only that they could. Could you show why they must necessarily have common eigenvector(s)?
I think the previous answer was not as precise as you'd like ...

if ψ is a state vector, and A and B are (Hermitian) operators, then the eigenvectors of those operators may be used as a (complete) basis: ψ may be expanded in terms of the eigenvectors of either of them.

It is possible to prepare the system so that it is in an eigenstate of either operator, but ψ is not necessarily an eigenvector of either. In general, however, an eigenvector of A need not be an eigenvector of B. If it is not, then a measurement of B on an eigenvector of A will destroy information about that state.

If [A,B]=0, then it is possible to prepare the system so that it is in an eigenstate of both. The system is not necessarily in such a state. But if it were, then it is possible to measure A and B's observables, in principle, at the same time.

[A,B]=0 does not imply that all eigenvectors of A are also eigenvectors of B.

However, there must be at least one shared eigenvector.

You can see this by expanding the eigenvectors of A in terms of the eigenvectors of B, and then using the completeness of the sets of eigenstates of A and B respectively... I think this is the proof you were looking for(?)

Also see:
S. J. L. van Eijndhoven and J. de Graaf (1986); A Mathematical Introduction to Dirac's Formalism, pp. 301-2 [Elsevier Science Publishers B.V.]
... and others. It is a standard proof in texts of matrix math.

Note - in general, for two matrices A and B, I don't think [A,B]=0 is sufficient to ensure there are simultaneous eigenvectors. But that is not the situation under discussion - here we are considering operators corresponding to observables... these are Hermitian, and we have a complete basis for each.
 P: 296 Thank you that was helpful.
 P: 12,974 No worries. Since we were only dealing with observables, there was no need to worry whether a stated relation was true in general (the math folk can sort it out). But, since you have expressed an interest, it may be an interesting exercise for you: if A and B are two commuting matrices for which a complete set of eigenvectors is available, do they need to be Hermitian to guarantee at least one shared eigenvector between them?
