# Expectation Value of Hamiltonian with Superposition

1. Feb 21, 2016

### BOAS

1. The problem statement, all variables and given/known data

Particle in one dimensional box, with potential $V(x) = 0 , 0 \leq x \leq L$ and infinity outside.

$\psi (x,t) = \frac{1}{\sqrt{8}} (\sqrt{5} \psi_1 (x,t) + i \sqrt{3} \psi_3 (x,t))$

Calculate the expectation value of the Hamiltonian operator $\hat{H}$. Compare it with the energy eigenvalues $E_1$, $E_2$, and $E_3$.

2. Relevant equations

3. The attempt at a solution

The subscript refers to the different eigenvalue solutions.

$\psi_1 = e^{-i \frac{E_1 t}{\hbar} } \sqrt{\frac{2}{L}} \sin \frac{\pi}{L} x$

and

$\psi_3 = e^{-i \frac{E_3 t}{\hbar} } \sqrt{\frac{2}{L}} \sin \frac{3 \pi}{L} x$

Using the fact that $\hat{H} \phi = E \phi$ I find that

$\hat{H} \phi = -\frac{\hbar^2}{2m} \frac{\partial^2 \phi}{\partial x^2}$ and, cancelling out the stationary wavefunction, I get that

$E = \frac{\hbar^2 \pi^2}{2m L^2 \sqrt{8}} (\sqrt{5} + 9 \sqrt{3})$

and since the expectation value of the Hamiltonian is the total energy, that should be what I'm looking for.

I am confused about how I compare this value to $E_n$ when I have a superposition of stationary states. Does the expression $E_n = \frac{\hbar^2}{2m} (\frac{n \pi}{L})$ apply to all stationary states?

i.e. is it literally a case of plugging in $n = 1, 2, 3$?

I'm not sure if I'm explaining myself clearly. How does the case where $n = 1$ affect my $\psi_3$?

2. Feb 21, 2016

### blue_leaf77

That's not correct, you should calculate $\langle \psi | H| \psi \rangle$ instead.

3. Feb 21, 2016

### BOAS

Is that equivalent to $\langle \hat{H} \rangle = \int^{\infty}_{-\infty} dx \, \psi^* \hat{H} \psi$ ?

4. Feb 21, 2016

### blue_leaf77

Yes, it's equivalent to that integral form.
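That integral can be sanity-checked numerically. A minimal sketch, assuming natural units $\hbar = m = L = 1$ and evaluating at $t = 0$ (the time phases cancel in the diagonal terms anyway), and using the fact that $\hat{H}$ acts on each eigenstate by multiplication by $E_n$:

```python
import numpy as np
from scipy.integrate import quad

# Natural units hbar = m = L = 1 (an assumption of this sketch)
E1 = (np.pi ** 2) / 2          # E_n = (n*pi)^2 / 2 in these units
E3 = (3 * np.pi) ** 2 / 2

def psi(x):
    # Superposition at t = 0: (1/sqrt(8)) * (sqrt(5) psi_1 + i sqrt(3) psi_3)
    return (np.sqrt(5) * np.sqrt(2) * np.sin(np.pi * x)
            + 1j * np.sqrt(3) * np.sqrt(2) * np.sin(3 * np.pi * x)) / np.sqrt(8)

def H_psi(x):
    # H psi_n = E_n psi_n, so H acts term by term on the superposition
    return (E1 * np.sqrt(5) * np.sqrt(2) * np.sin(np.pi * x)
            + 1j * E3 * np.sqrt(3) * np.sqrt(2) * np.sin(3 * np.pi * x)) / np.sqrt(8)

# <H> = integral of psi* H psi over the box (the imaginary part vanishes)
expectation, _ = quad(lambda x: (np.conj(psi(x)) * H_psi(x)).real, 0.0, 1.0)
```

In these units the integral comes out to $2\pi^2$, matching $\frac{5}{8} E_1 + \frac{3}{8} E_3$.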

5. Feb 21, 2016

### PeroK

$E_n = \frac{\hbar^2}{2m} (\frac{n \pi}{L})^2$
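For reference, the first few eigenvalues from this formula can be tabulated quickly; a sketch assuming units where $\hbar = m = L = 1$:

```python
import numpy as np

# E_n = (hbar^2 / 2m) * (n*pi/L)^2, with hbar = m = L = 1 assumed
E = lambda n: (n * np.pi) ** 2 / 2

energies = {n: E(n) for n in (1, 2, 3)}
# The eigenvalues scale as n^2: E_2 = 4 E_1 and E_3 = 9 E_1
```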

6. Feb 21, 2016

### BOAS

Ok,

$\hat{H} \psi = - \frac{\hbar^2}{2m} \frac{\partial^2 \psi}{\partial x^2} + V \psi= \frac{\hbar^2}{2m} \sqrt{\frac{2}{L}} (\frac{\sqrt{5}}{\sqrt{8}} e^{\frac{-i E_1 t}{\hbar}} \frac{\pi^2}{L^2} \sin \frac{\pi}{L}x + i \frac{\sqrt{3}}{\sqrt{8}} e^{\frac{-i E_3 t}{\hbar}} \frac{9 \pi^2}{L^2} \sin \frac{3 \pi}{L} x)$

This simplifies a little

$\hat{H} \psi = \sqrt{\frac{2}{L}} ( E_1\frac{\sqrt{5}}{\sqrt{8}} e^{\frac{-i E_1 t}{\hbar}} \sin \frac{\pi}{L}x + i E_3 \frac{\sqrt{3}}{\sqrt{8}} e^{\frac{-i E_3 t}{\hbar}} \sin \frac{3 \pi}{L} x)$

$\hat{H} \psi = \frac{1}{2 \sqrt{L}} ( E_1 \sqrt{5} e^{\frac{-i E_1 t}{\hbar}} \sin \frac{\pi}{L}x + i E_3 \sqrt{3} e^{\frac{-i E_3 t}{\hbar}} \sin \frac{3 \pi}{L} x)$

$\psi^* = \frac{1}{2 \sqrt{L}} (\sqrt{5} e^{\frac{i E_1 t}{\hbar}} \sin \frac{\pi}{L} x - i \sqrt{3} e^{\frac{i E_3 t}{\hbar}} \sin \frac{3 \pi}{L} x)$

$\psi^* \hat{H} \psi = \frac{1}{4L} ( E_1 \sqrt{5} e^{\frac{-i E_1 t}{\hbar}} \sin \frac{\pi}{L}x + i E_3 \sqrt{3} e^{\frac{-i E_3 t}{\hbar}} \sin \frac{3 \pi}{L} x)(\sqrt{5} e^{\frac{i E_1 t}{\hbar}} \sin \frac{\pi}{L} x - i \sqrt{3} e^{\frac{i E_3 t}{\hbar}} \sin \frac{3 \pi}{L} x)$

$\psi^* \hat{H} \psi = \frac{1}{4L} (5 E_1 \sin^2 \frac{\pi}{L}x + 3 E_3 \sin^2 \frac{3 \pi}{L}x + i \sqrt{3} \sqrt{5} E_3 e^{\frac{-i(E_3 - E_1)t}{\hbar}} \sin \frac{3 \pi}{L}x \sin \frac{\pi}{L} x - i \sqrt{3} \sqrt{5} E_1 e^{\frac{-i(E_1 - E_3)t}{\hbar}} \sin \frac{\pi}{L} x \sin \frac{3 \pi}{L} x)$

When I integrate this, am I correct in thinking I can apply the orthogonality theorem to get rid of all but the sine squared terms?

7. Feb 21, 2016

### blue_leaf77

Yes you are, because the eigenstates of a Hamiltonian are orthogonal.
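That orthogonality (and the normalization of the $\sin^2$ terms) is easy to confirm numerically; a quick sketch, assuming $L = 1$:

```python
import numpy as np
from scipy.integrate import quad

L = 1.0  # box length (an assumed value for this sketch)

# Cross term: integral of sin(pi x / L) sin(3 pi x / L) over the box -> 0
cross, _ = quad(lambda x: np.sin(np.pi * x / L) * np.sin(3 * np.pi * x / L), 0, L)

# Diagonal terms: integral of sin^2(n pi x / L) over the box -> L / 2
sq1, _ = quad(lambda x: np.sin(np.pi * x / L) ** 2, 0, L)
sq3, _ = quad(lambda x: np.sin(3 * np.pi * x / L) ** 2, 0, L)
```

So only the two $\sin^2$ terms survive, each contributing a factor $L/2$.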

8. Feb 21, 2016

### PeroK

There's no advantage in replacing $\psi_n$ with the appropriate $\sin$ function (unless you were trying to prove orthogonality). It is better and simpler to keep it general for as long as possible.

Your original mistake was to use the coefficients $a_1, a_3$ rather than $|a_1|^2, |a_3|^2$ when you took your shortcut.

It's also simpler to keep it as $E_1, E_3$ for as long as possible. That last little part of the question is almost a hint to do this!

9. Feb 21, 2016

### BOAS

I can definitely appreciate this after getting myself into plenty of trouble with accounting for all the terms.

Thanks for the tips.

10. Feb 21, 2016

### jimbododge

Isn't the integral over the $\sin^2$ terms divergent?

11. Feb 21, 2016

### blue_leaf77

The sine functions involved in this calculation are actually only within the interval $0<x<L$. Outside this interval the wavefunctions vanish.

12. Feb 21, 2016

### BOAS

I have found that $\langle \hat{H} \rangle = \frac{2 \hbar^2 \pi^2}{m L^2}$ which is the same expression as for $E_2$.

I think it makes sense that my expression lies between $E_1$ and $E_3$ (after all, it contains components of both, so isn't purely one or the other), and since the energy is quantised, it can only be $E_2$.
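This result can be double-checked directly from the eigenvalue expansion $\langle \hat{H} \rangle = \frac{5}{8} E_1 + \frac{3}{8} E_3$; a minimal sketch, assuming $\hbar = m = L = 1$:

```python
import numpy as np

# Infinite-well eigenvalues E_n = (n*pi*hbar)^2 / (2 m L^2), with hbar = m = L = 1
E = lambda n: (n * np.pi) ** 2 / 2

# Weights |c_1|^2 = 5/8 and |c_3|^2 = 3/8 from the given superposition
expectation = (5 * E(1) + 3 * E(3)) / 8
# In these units expectation = 2*pi^2, the same number as E(2)
```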

13. Feb 21, 2016

### PeroK

Yes and no. That's the right answer, but if the coefficients were different, you could get any answer between $E_1$ and $E_3$. There's no obligation on the expected energy to be precisely $E_2$. Any specific measurement of energy will give either $E_1$ or $E_3$, but the average (expected) value doesn't have to be $E_2$.

14. Feb 21, 2016

### BOAS

Hmm, this is confusing.

$\langle \hat{H} \rangle = \sum_n E_n |c_n|^2$ describes what you're saying about a specific measurement giving either $E_1$ or $E_3$. After all, $E_2$ has no associated probability.

So the numbers in this question were chosen to give $E_2$. That seems somewhat misleading. Ok, so depending on the weighting given to each wavefunction, you might expect the expectation value to shift towards the 'heavier' one.

15. Feb 21, 2016

### PeroK

Whoever set the question probably didn't anticipate your interpretation! Yes, weighting is exactly how to look at it. If $a_1$ were small and $a_3$ nearly one, then $\langle H \rangle$ would be nearly $E_3$; and vice versa.
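That weighting behaviour can be sketched numerically by sweeping the probability $|a_1|^2$ (again assuming $\hbar = m = L = 1$):

```python
import numpy as np

# E_n for hbar = m = L = 1 (units assumed for this sketch)
E = lambda n: (n * np.pi) ** 2 / 2

# Sweep |a_1|^2 from nearly 1 down to nearly 0; |a_3|^2 = 1 - |a_1|^2
weights = [0.99, 0.5, 0.01]
mean_energies = [p * E(1) + (1 - p) * E(3) for p in weights]
# As |a_1|^2 shrinks, <H> slides from near E_1 up toward E_3
```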

16. Feb 21, 2016

### blue_leaf77

Either there is a typo or you were calculating it wrong; the factor of two should be another number.
EDIT: Sorry, I forgot that you must have cancelled the 4 in the numerator with the 2 in the denominator. Your answer is right.

17. Feb 21, 2016

### BOAS

No problem - I appreciate your help.

18. Feb 21, 2016

### jimbododge

Shouldn't the probability of state 2 be zero, seeing as it was not involved in the original sum of functions? How can the probabilities of each state vary between different functions if they're given by the equation for $E_n$? I found that the total energy is given by $\frac{5}{8} E_1 + \frac{3}{8} E_3$... Seeing as these probabilities add to 1, would these represent the relative probabilities of the two functions?

Thanks so much!

19. Feb 21, 2016

### blue_leaf77

I don't get what you intend to say there.
"Average energy" is a more accurate term than "total energy". The numerical value above is correct, though.
Yes, the probabilities should indeed add up to one, and they also represent the probabilities of finding the system in the corresponding states. What is it that confuses you?