# A Hamiltonian identity

etotheipi
**Homework Statement:** Show that $$\langle H \rangle = \sum_{n=1}^{\infty} E_n |c_n|^2$$
**Relevant Equations:** N/A
If we can identify ##|c_n|^2## as the probability of measuring the energy ##E_n##, then that equation is just the bog-standard formula for an expectation value. But the book hasn't proved this yet, so I assumed the problem wants a derivation from scratch.

I tried \begin{align*} \Psi(x,t) = \sum_n c_n \psi_n(x)e^{-\frac{iE_n}{\hbar}t} \implies |\Psi|^2 &= \left(\sum_n c_n \psi_n(x)e^{-\frac{iE_n}{\hbar}t}\right)\left(\sum_m c_m^* \psi^*_m(x)e^{\frac{iE_m}{\hbar}t}\right) \end{align*}And since the ##\psi_n##'s are orthonormal, i.e. ##\int \psi_n \psi^*_m \, dx = \delta_{mn}##, I then did$$\int |\Psi|^2 dx = \sum_n |c_n|^2 \int |\psi_n(x)|^2 dx = \sum_n |c_n|^2 = 1$$using the fact that all of the ##\psi_n(x)##'s are normalised. That's consistent with ##|c_n|^2## being a probability, but it's still not a proof of anything. I wondered if someone could point me in the right direction? I thought to try $$\langle H \rangle = -\frac{\hbar^2}{2m}\int \Psi^* \left(\frac{\partial^2}{\partial x^2}\right) \Psi \, dx$$(ignoring the potential term) but that doesn't seem too helpful here... thanks!
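As a numerical aside (not needed for the derivation), the orthonormality and normalisation claims above are easy to check for a concrete basis. Here's a minimal sketch using the infinite square well eigenfunctions as a stand-in example; the box width, grid size, and coefficients are made up for illustration:

```python
import math

# Toy check of orthonormality, using the infinite square well on [0, L]
# as a stand-in example (not the system from the book).
L = 1.0
N = 2000                              # integration grid points

def psi(n, x):
    """Normalised eigenfunction sqrt(2/L) sin(n pi x / L)."""
    return math.sqrt(2.0 / L) * math.sin(n * math.pi * x / L)

def overlap(m, n):
    """Midpoint-rule approximation of the integral of psi_m psi_n over [0, L]."""
    dx = L / N
    return sum(psi(m, (k + 0.5) * dx) * psi(n, (k + 0.5) * dx)
               for k in range(N)) * dx

print(overlap(1, 1))                  # ~ 1 (normalised)
print(overlap(1, 2))                  # ~ 0 (orthogonal)

# With normalised coefficients, sum |c_n|^2 = 1 as in the post
c = [0.6, 0.8]
print(sum(abs(cn) ** 2 for cn in c))  # ~ 1
```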


PeroK
Homework Helper
Gold Member
2020 Award
What is the defining property of the functions ##\psi_n(x)##, vis-a-vis the Hamiltonian?

etotheipi
What is the defining property of the functions ##\psi_n(x)##, vis-a-vis the Hamiltonian?

All that I can think of would be ##\hat{H} \psi_n(x) = E_n \psi_n(x)## (i.e. they're eigenfunctions of the Hamiltonian). Maybe I should try putting that into the expectation equation?

etotheipi
(Edit: Cleaned up some of the LaTeX)

OK, I think I've got it:$$\langle H \rangle = \int \left(\sum_m c_m^* \psi^*_m(x)e^{\frac{iE_m}{\hbar}t}\right) \hat{H} \left(\sum_n c_n \psi_n(x)e^{-\frac{iE_n}{\hbar}t}\right) dx$$ $$= \sum_m \sum_n \left(c_m^* c_n \, e^{\frac{it}{\hbar}(E_m - E_n)} E_n \int \psi_m^* \psi_n \, dx \right)$$ $$=\sum_m \sum_n \left( c_m^* c_n \, e^{\frac{it}{\hbar}(E_m - E_n)} E_n \delta_{mn} \right)$$ $$= \sum_n |c_n|^2 E_n$$using the eigenvalue relation ##\hat{H} \psi_n(x) = E_n \psi_n(x)## and then the orthonormality to kill the cross terms in the integral. Gosh, it's tricky to write this up!
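Out of curiosity I also checked the identity numerically, again using the infinite square well as a toy example (not from the book), in units where ##\hbar = m = L = 1## so that ##E_n = n^2\pi^2/2##: applying ##-\tfrac{1}{2}\partial_x^2## to ##\Psi## with finite differences reproduces ##\sum_n E_n |c_n|^2##. The grid size and coefficients are arbitrary choices:

```python
import math

# Numerical check that <H> = sum_n E_n |c_n|^2 for the infinite square
# well (toy example), with hbar = m = L = 1 so E_n = (n pi)^2 / 2.
N = 4000                              # grid points on [0, 1]
dx = 1.0 / N
xs = [k * dx for k in range(N + 1)]

def psi(n, x):
    """Normalised eigenfunction sqrt(2) sin(n pi x)."""
    return math.sqrt(2.0) * math.sin(n * math.pi * x)

def E(n):
    return (n * math.pi) ** 2 / 2.0

c = [0.6, 0.8]                        # c_1, c_2 with |c_1|^2 + |c_2|^2 = 1
Psi = [sum(cn * psi(n + 1, x) for n, cn in enumerate(c)) for x in xs]

# <H> = integral of Psi * (-1/2 d^2/dx^2) Psi, second derivative by
# central differences (Psi vanishes at the endpoints)
H_exp = 0.0
for k in range(1, N):
    d2 = (Psi[k + 1] - 2.0 * Psi[k] + Psi[k - 1]) / dx ** 2
    H_exp += Psi[k] * (-0.5) * d2 * dx

# Spectral side: sum_n E_n |c_n|^2
spectral = sum(abs(cn) ** 2 * E(n + 1) for n, cn in enumerate(c))
print(H_exp, spectral)                # the two agree closely
```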
etotheipi
@PeroK one more question did come to mind; is the above sufficient to deduce (or only suggestive) that ##|c_n|^2## are the probabilities? I have seen that ##c_n = \langle e_n | \Psi \rangle##, i.e. it is the ##|e_n \rangle## component of ##|\Psi\rangle##, from which it would follow from Born's rule that ##P = \langle e_n | \Psi \rangle^* \langle e_n | \Psi \rangle = |c_n|^2##, but I thought Born's rule was formally a postulate?

PeroK
Homework Helper
Gold Member
2020 Award
@PeroK one more question did come to mind; is the above sufficient to deduce (or only suggestive) that ##|c_n|^2## are the probabilities? I have seen that ##c_n = \langle e_n | \Psi \rangle##, i.e. it is the ##|e_n \rangle## component of ##|\Psi\rangle##, from which it would follow from Born's rule that ##P = \langle e_n | \Psi \rangle^* \langle e_n | \Psi \rangle = |c_n|^2##, but I thought Born's rule was formally a postulate?
The key point is to identify ##\langle \hat H \rangle## as the expected value, which is equivalent to the Born rule. Otherwise, it's just some number. Alternatively, if you identify the coefficients as probability amplitudes, then that justifies the above as the expectation value; and vice versa.

etotheipi
Yeah that does make sense, thanks!

On a more mundane note (last question, I promise!) I noticed that you used the notation ##\langle \hat H \rangle## (instead of ##\langle H \rangle##), which was also used in those Cresser notes you mentioned a while back, which I'm also just starting to work through. Since we often use ##q## to denote an observable and ##\hat{q}## to denote the operator associated with that observable, is there any difference between ##\langle q \rangle## and ##\langle \hat{q} \rangle## - i.e. is the expectation of an observable synonymous with the expectation of an operator?

PeroK
Homework Helper
Gold Member
2020 Award
Yeah that does make sense, thanks!

On a more mundane note (last question, I promise!) I noticed that you used the notation ##\langle \hat H \rangle## (instead of ##\langle H \rangle##), which was also used in those Cresser notes you mentioned a while back, which I'm also just starting to work through. Since we often use ##q## to denote an observable and ##\hat{q}## to denote the operator associated with that observable, is there any difference between ##\langle q \rangle## and ##\langle \hat{q} \rangle## - i.e. is the expectation of an observable synonymous with the expectation of an operator?
I don't think it makes much difference. You could write ##\langle E \rangle## or ##\langle H \rangle## to mean the expected value of the numerical quantity you are measuring. And, you could also write ##\langle \hat H \rangle## as the "expected value" of the operator, which is defined as ##\langle \Psi | \hat H | \Psi \rangle##, where in this case the inner product is given by the integral.

etotheipi
I don't think it makes much difference. You could write ##\langle E \rangle## or ##\langle H \rangle## to mean the expected value of the numerical quantity you are measuring. And, you could also write ##\langle \hat H \rangle## as the "expected value" of the operator, which is defined as ##\langle \Psi | \hat H | \Psi \rangle##, where in this case the inner product is given by the integral.