Question on expectation value.

  • #1
LogicX

Homework Statement



Given an observable quantity A, when will it happen that the same value for A will be measured every time?

What is the relationship between the operator [itex]\hat{A}[/itex] and [itex]\Psi[/itex] for this case?

and

What is the relationship between [itex]\widehat{A}[/itex] and [itex]\widehat{H}[/itex], the hamiltonian, for this case?

Homework Equations



Expectation value equation:

[itex]\langle A \rangle = \int\Psi^*\,\widehat{A}\,\Psi\,d\tau[/itex]

Also I feel like it may have something to do with the Schrödinger equation.

The Attempt at a Solution



I'm not sure if the first part is asking for a mathematical answer or not. The measured value would be the same every time only if there was a single allowed value for that physical quantity.

I'm not sure about the last two parts either. Energy is still quantized, except that only one solution exists... what does this mean for the relationship between eigenfunction and operator? Does it mean that the operator gives the same function back, like with an eigenvalue of 1? That is just a shot in the dark, I'm kinda clueless over here.

I'm not even sure how to relate A to H with an equation, let alone comment on this specific case.

Thanks.
 
  • #3
First, the problem statement has nothing to do with the expectation value of ##\hat{A}##. It has to do with making a measurement of the observable. Those are two different things.

Second, the problem has to do with what happens when you make a measurement. Read up on that in your QM textbook, and this problem will make more sense to you.
 
  • #4
Edit: deleted my post while I work on the problem a bit more.
 
  • #5
vela said:
First, the problem statement has nothing to do with the expectation value of ##\hat{A}##. It has to do with making a measurement of the observable. Those are two different things.

Second, the problem has to do with what happens when you make a measurement. Read up on that in your QM textbook, and this problem will make more sense to you.

Sorry! Correct - the expectation value isn't relevant here :-)
 
  • #6
So... I can't find anything on the Ehrenfest theorem in my notes or textbook. I realize it may be a way of solving this, but my professor would not expect us to have to find it on our own. Any hints on other methods I could use?
 
  • #7
The Ehrenfest theorem doesn't apply here because the problem isn't about expectation values. It's more about understanding how states evolve with time. It's a conceptual problem. Don't assume ##\hat{A}## has only one allowed measurement result.
 
  • #8
vela said:
The Ehrenfest theorem doesn't apply here because the problem isn't about expectation values. It's more about understanding how states evolve with time. It's a conceptual problem. Don't assume ##\hat{A}## has only one allowed measurement result.

Ok then I'm back to square one. I'm glad it is a conceptual problem, though. This is literally the only problem left on my practice test that I haven't been able to figure out, and it is worrying me.

How else could the same value be measured every time unless the wavefunction gave a probability of 1 at that point? If other results were possible, wouldn't that mean that the measurement would not be the same every time? Or am I approaching this from the wrong direction?

From your last post... yes I know that the wavefunction "collapses" when you make a measurement (not from my textbook which doesn't really mention such a thing, but rather from reading these forums). What does this have to do with the specific case in the above problem?
 
  • #9
Ok so I went to talk to my professor today and he wouldn't help me with this question because he basically said it was going to be on the test, but that it should be a simple answer. So now I'm really desperate because I can't get help except from here, and it is on the test.

A) If you get the same value for A every time, that means that ΔA=0. Is this the answer to the first part?

B) [itex]\hat{A}\Psi = a\Psi[/itex]. This is how you relate [itex]\hat{A}[/itex] and [itex]\Psi[/itex]. What can I say about this relationship?

C) If ΔA=0, does this tell you something about the commutator relationship [itex][\widehat{A},\widehat{H}][/itex]? I know for, say, position and momentum, if the uncertainty in x is 0, then the uncertainty in p is infinite because they don't commute. But I don't know if A and H commute, so I seem to be stuck.

Also, let's say they didn't commute. When position and momentum don't commute I say "there is a Heisenberg uncertainty relationship between the observables position and momentum." If A and H don't commute, would I say "there is a Heisenberg uncertainty relationship between the observable A and the energy of the system"? I'm just checking I understand this correctly.

Help me physics forums, you're my only hope.
 
  • #10
What is the result of a measurement of A for a general state, and what is the new state after that?
 
  • #11
bloby said:
What is the result of a measurement of A for a general state,

The wavefunction collapses into a single possible eigenstate.

and what is the new state after that?

Same as above? A single eigenstate?
 
  • #12
And how does this eigenstate evolve until the next measurement?
 
  • #13
bloby said:
And how does this eigenstate evolve until the next measurement?

Does it go back to being a set of possible states again? This is really stretching my grasp of this concept. Like I said, I've never actually seen "wavefunction collapse" in my textbook, only Wikipedia...

Obviously you are trying to get at something here. Could you show me how this relates to my question? I need a concrete hint to go on. So far I have been thinking about this for days and I can't figure it out.

EDIT: If I get the same measurement for A each time, does it mean that [itex]\hat{A}\Psi = a\Psi[/itex] has only one possible eigenstate?

EDIT 2: It would be helpful also if someone could give me a large enough hint so that I could actually try to solve the problem on my own rather than short posts with long stretches in between where I am no closer to an answer... not trying to be rude, I'm just very anxious and frustrated with this "simple" question.
 
  • #14
I'm not an expert... No, it means that the state is an eigenstate each time. The evolution is given by the Schrödinger equation. You must find a Hamiltonian such that this state remains an eigenstate.
 
  • #15
bloby said:
I'm not an expert... No, it means that the state is an eigenstate each time. The evolution is given by the Schrödinger equation. You must find a Hamiltonian such that this state remains an eigenstate.

I thought no matter what, it is an eigenstate when you measure something?

I'm dying over here folks.

Does (B) mean that Psi is an eigenfunction of the operator A?
 
  • #16
Before the measurement, generally not; just after, surely yes.

As long as you do not measure, you can follow your state with the Schrödinger equation and predict the probability of getting one value or another by projecting the state onto the respective eigenvectors of A. But by measuring A you pick out a value, and the state becomes the corresponding eigenvector. Then you can again follow the state, which may or may not remain an eigenvector.
 
  • #17
bloby said:
Before the measurement, generally not; just after, surely yes.

As long as you do not measure, you can follow your state with the Schrödinger equation and predict the probability of getting one value or another by projecting the state onto the respective eigenvectors of A. But by measuring A you pick out a value, and the state becomes the corresponding eigenvector. Then you can again follow the state, which may or may not remain an eigenvector.

So... how does this answer my question? Again, not trying to be rude, I just feel a little obtuse because I can't figure this out.
 
  • #18
Say the observable ##\hat{A}## has eigenstates ##|a_1\rangle##, ##|a_2\rangle##, ##|a_3\rangle##, …. such that ##\hat{A}|a_i\rangle = a_i |a_i\rangle##. For simplicity, let's assume these eigenstates are non-degenerate. We can express the state of the system ##|\psi(t)\rangle## at time t as a linear combination of these eigenstates.
$$|\psi(t)\rangle = c_1(t) |a_1\rangle + c_2(t) |a_2\rangle + \cdots $$ You're given that every time you measure ##\hat{A}##, you get the same result. What does that imply about the form ##|\psi(t)\rangle## must take?
 
  • #19
vela said:
Say the observable ##\hat{A}## has eigenstates ##|a_1\rangle##, ##|a_2\rangle##, ##|a_3\rangle##, …. such that ##\hat{A}|a_i\rangle = a_i |a_i\rangle##. For simplicity, let's assume these eigenstates are non-degenerate. We can express the state of the system ##|\psi(t)\rangle## at time t as a linear combination of these eigenstates.
$$|\psi(t)\rangle = c_1(t) |a_1\rangle + c_2(t) |a_2\rangle + \cdots $$ You're given that every time you measure ##\hat{A}##, you get the same result. What does that imply about the form ##|\psi(t)\rangle## must take?

When you say you measure A and get the same result each time, does this mean the eigenstate is the same each time?

Ok so ##\hat{A}## has various eigenstates. ψ at a certain time is a linear combination of these eigenstates. Given that the eigenstate is the same each time (?), does this mean that ##a_1##, ##a_2##, etc. all specify the same state, i.e. they are equal?

Sorry if my responses don't make much sense. But I really appreciate you trying to help me.
 
  • #20
Let's backtrack a bit. Suppose the state is as given above. What's the probability of measuring ##a_i## at time t?

The eigenstates are distinct. In fact, I suggested we look at the non-degenerate case, so we can do even better and say all the eigenvalues are distinct.
 
  • #21
vela said:
Let's backtrack a bit. Suppose the state is as given above. What's the probability of measuring ##a_i## at time t?

The probability of measuring a particular eigenvalue is ##|a_i|^2##.
 
  • #22
No... The ##a_i##'s are the eigenvalues. They're the possible results of a measurement of ##\hat{A}##.
 
  • #23
vela said:
No... The ##a_i##'s are the eigenvalues. They're the possible results of a measurement of ##\hat{A}##.

Sorry! I meant ##|c_k|^2##

EDIT: I'll be here all night constantly hitting refresh by the way :D
 
  • #24
Good.

Now when the problem says a measurement of ##\hat{A}## always results in ##a_i##, it's telling you the probability of obtaining ##a_i## is 1. So now you should be able to write down what the state of the system is and, more importantly, deduce how it must evolve over time. If you get this, you'll have the answers to the first two questions.
 
  • #25
vela said:
Good.

Now when the problem says a measurement of ##\hat{A}## always results in ##a_i##, it's telling you the probability of obtaining ##a_i## is 1. So now you should be able to write down what the state of the system is and, more importantly, deduce how it must evolve over time. If you get this, you'll have the answers to the first two questions.

So, if the probability of obtaining ##a_i## is 1, and the probability of obtaining ##a_i## is proportional to ##|c_k|^2##, that means that ##|c_k|^2## must also equal 1. But we are talking about one specific ##c_k## here, which is why I am confused, because I feel like this should mean one coefficient, say ##c_1##, is 1, and the rest would be 0. Any other value than 0, and those eigenstates would also have a probability of being chosen.

EDIT: When you say ##a_i##, that corresponds to the ##a_1##, ##a_2##, etc. in the linear combination, right? Because I'm thinking of this in terms of, say, ##c_1 = 1##, so therefore ##a_1## is the ##a_i## that will be chosen each time. But I feel like I may be misinterpreting notations again...

One day left...
 
  • #26
Yes, that's essentially right. You can't quite conclude ##c_1 = 1##, but you can say ##|c_1| = 1## and the rest are 0.
 
  • #27
vela said:
Yes, that's essentially right. You can't quite conclude ##c_1 = 1##, but you can say ##|c_1| = 1## and the rest are 0.

So how do I use this to directly relate [itex]\hat{A}[/itex] and ψ? On a test, would I just write that a wavefunction can be written as a linear combination of eigenstates and then go through the aforementioned reasoning? This would be the reasoning for part A, where they ask "under what circumstance would [itex]\hat{A}[/itex] have the same measurement each time."

Also, was my previous answer that ΔA=0 for this case correct? If you measure the same value each time, then ##\langle A\rangle^2## and ##\langle A^2\rangle## are the same, leading to ΔA=0. If this is correct, I think I could just use this as my condition for the circumstance of the measurement being the same each time.

Now in the second part, the question asked specifically for the relationship between [itex]\hat{A}[/itex] and the wavefunction though, and while I now understand what form the wavefunction is in, I'm not sure I can directly relate the two other than what I already used to show how A influences the wavefunction. Is what I have found out really the answer for the second part or do you think more reasoning is required?



On to the third part:

The Schrödinger equation is:

[itex]\widehat{H}\Psi = E\Psi[/itex]

Now that I know what I know about [itex]\Psi[/itex], does this tell me something about the Hamiltonian based on this equation? Like, E would be the same each time so the Hamiltonian is an operator that acts on [itex]\Psi[/itex] to give the same function back again or something? I have to relate [itex]\widehat{H}[/itex] to [itex]\widehat{A}[/itex] though.

However, I remember my prof telling us to consider the operator relationship between A and H for this part, which just screams commutator to me. He said we would have to use the results from part A and B to solve the third part.
 
  • #28
YOU CAN PROBABLY IGNORE MOST OF MY LAST POST!

I think I may have gotten the answer for part C.

Since [itex]\Psi[/itex] is an eigenfunction of both H and A, this means that both H and A can have sharply defined values, and thus they commute?

I have this written in my notes that because ψ is an eigenfunction of ##\widehat{L}_{z}## and ##\widehat{L}^{2}##, they both have defined values. Is this case analogous to H and A? Is the fact that ψ is an eigenfunction of H contained in the Schrödinger equation, and the fact that ψ is an eigenfunction of A proven by the fact that the system is always in the eigenstate corresponding to the operator A?

I'm not sure of the proof of this reasoning, but it does answer the question. Also, I think I would need to show that H and A have the same set of eigenfunctions.

So overall, here is my answer:

A) The system must be in the eigenstate Ψ associated with that eigenvalue of the operator (as we proved in this thread).

B) This means ψ is an eigenfunction of A.

C) ψ is an eigenfunction of H as well, so H and A commute.
 
  • #29
You need to think about how the state ψ evolves with time, which has to do with H and the Schrödinger equation. In general, if the system is in an eigenstate of some operator A, it won't stay in that eigenstate as the state evolves over time. In this problem, however, it does.
 

