Is the energy density normalized differently in the quantum case?

Hypersphere
Hi all,

This is all in the context of the interaction between (two-level) atoms and an electromagnetic field, basically the Wigner-Weisskopf model. In particular, I tried to derive the value of the atom-field interaction constant and show that it satisfies
|g_\mathbf{k}|^2=\frac{\omega_\mathbf{k}}{2\hbar \epsilon_0 V} \left( d^2 \cos^2 \theta \right)
where d is the dipole moment and \theta is the angle between the dipole moment and the polarization vector.
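For reference, here is a minimal sketch of how I got there, assuming the usual box quantization with mode volume V and the dipole coupling H_{int}=-\hat{\mathbf{d}}\cdot\hat{\mathbf{E}}. The quantized single-mode field is
\hat{\mathbf{E}} = \sqrt{\frac{\hbar \omega_\mathbf{k}}{2\epsilon_0 V}} \, \hat{\boldsymbol{\epsilon}}_\mathbf{k} \left( \hat{a}_\mathbf{k} + \hat{a}_\mathbf{k}^\dagger \right)
so the coupling constant is
\hbar g_\mathbf{k} = -\sqrt{\frac{\hbar \omega_\mathbf{k}}{2\epsilon_0 V}} \, d \cos\theta
and squaring gives the expression above.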

The notes at http://www.stanford.edu/~rsasaki/AP387/chap6 claim that the vacuum field amplitude satisfies the normalization
\int \epsilon_0 E^2 d^3r = \frac{\hbar \omega}{2}
which does lead to the above form of |g|^2, but from classical electrodynamics (e.g. eq. (6.106) in Jackson, 3rd ed.) I'm used to defining the energy density of the electric field as
u_E=\frac{1}{2} \epsilon_0 E^2

Now, the notes seem to use an energy density that is 2u_E. Is there a good explanation for this, or does it just boil down to a difference in conventions? Thanks in advance.
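To make the factor of 2 concrete (taking, for simplicity, a single mode with uniform amplitude E_0 over the quantization volume V): the notes' normalization gives
\epsilon_0 E_0^2 V = \frac{\hbar \omega}{2} \quad \Rightarrow \quad E_0 = \sqrt{\frac{\hbar \omega}{2 \epsilon_0 V}}
whereas integrating my u_E gives only \epsilon_0 E_0^2 V / 2 = \hbar \omega / 4, half the zero-point energy.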
 
Actually, the author of those notes probably just switched to a complex field
E_V=\sqrt{\frac{\epsilon_0}{2}}E + i\frac{B}{\sqrt{2\mu_0}}
in which case the total field energy comes out as
U=\int |E_V|^2 d^3 r = \int \left( \frac{\epsilon_0}{2}E^2 + \frac{B^2}{2\mu_0} \right) d^3 r
as it should.
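Explicitly, for a plane-wave mode |B|=|E|/c, and with c^2=1/(\epsilon_0 \mu_0) the magnetic term equals the electric one:
\frac{B^2}{2\mu_0} = \frac{E^2}{2\mu_0 c^2} = \frac{\epsilon_0}{2} E^2
So the normalization \int \epsilon_0 E^2 d^3r = \hbar\omega/2 in the notes is really the total (electric plus magnetic) zero-point energy of the mode; nothing about u_E itself has changed.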
 