Suppose an atom emits a photon by jumping from one quantum state to
another. The photon that's emitted does not have a definite energy.
Its energy is intrinsically uncertain, in accordance with (one version
of) the Heisenberg uncertainty principle. If you know enough about
the emission process, you can calculate the probability distribution
of energies. In particular, you can calculate the average value
(a.k.a. the expected value) of the energy. You can also calculate the
standard deviation of the distribution of energies, a quantity that is
usually called the "uncertainty" in the energy. You can't calculate
"the energy," because the photon doesn't have a definite energy.
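As a toy numerical illustration, here is how those two numbers, the average energy and the "uncertainty," fall out of a distribution of energies. The Gaussian line shape and the 0.01 eV width are arbitrary choices for the demo, not properties of any particular atom:

```python
import random
import statistics

# Toy model: sample photon energies from an assumed Gaussian line
# shape centered at 1.000 eV with width 0.01 eV (both values are
# made up for illustration).
random.seed(0)
MEAN_EV, WIDTH_EV = 1.000, 0.01
energies = [random.gauss(MEAN_EV, WIDTH_EV) for _ in range(100_000)]

avg = statistics.fmean(energies)   # the "average" (expected) energy
unc = statistics.stdev(energies)   # the "uncertainty" in the energy

print(f"average energy ~ {avg:.4f} eV, uncertainty ~ {unc:.4f} eV")
```

Note that the two outputs are properties of the whole distribution; there is no line of code that computes "the energy," because no such number exists.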

Now, it's certainly possible to produce a bunch of photons that all
have the same average energy, but that are not all the same. In
particular, you can produce one photon with a pretty precisely
specified energy and another with a much less well-known energy. (Say
the first one has an energy of 1.000 eV plus or minus 0.001 eV, and
the second one has an energy of 1.000 eV plus or minus 0.1 eV.) Those
two photons will have the same average wavelength, but different
coherence lengths. (The coherence length is, more or less, the
physical extent of the photon's wavefunction -- i.e., how many
"wiggles" it has.) In particular, the photon with the
precisely-specified energy has a longer coherence length (more
wiggles) than the one with the vague energy. This is just a statement
of (one version of) the uncertainty principle.
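To put rough numbers on those two photons: a standard order-of-magnitude estimate (only an estimate, since the exact relation depends on the line shape) is that the coherence length is about hbar*c / delta-E, and the average wavelength is h*c / E. A quick sketch:

```python
HBAR_C_EV_NM = 197.327   # hbar*c in eV*nm (CODATA value, rounded)
H_C_EV_NM = 1239.84      # h*c in eV*nm

def wavelength_nm(energy_ev):
    """Average wavelength of a photon with the given average energy."""
    return H_C_EV_NM / energy_ev

def coherence_length_nm(delta_e_ev):
    """Order-of-magnitude coherence length, L ~ hbar*c / delta-E."""
    return HBAR_C_EV_NM / delta_e_ev

for delta_e in (0.001, 0.1):
    lam = wavelength_nm(1.000)
    length = coherence_length_nm(delta_e)
    print(f"delta-E = {delta_e} eV: wavelength ~ {lam:.0f} nm, "
          f"coherence length ~ {length:.0f} nm, "
          f"~{length / lam:.0f} wiggles")
```

With delta-E = 0.001 eV this gives a coherence length of roughly 0.2 mm (about 160 wiggles at ~1240 nm), while delta-E = 0.1 eV gives only about 2 microns (a wiggle or two), consistent with the description above.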

You can, as a purely theoretical construct, consider a photon whose
energy is precisely specified: E = 1.000...000... eV, with
absolutely no uncertainty. A photon like that would have
an infinitely long coherence length (infinitely many wiggles).
Its wavefunction would be a sine wave, extending all the way
from x = - infinity to x = + infinity.

I know that you don't believe me about this. You think I'm lying, or
oversimplifying, or something. I don't know how to convince you I'm
not, except to repeat myself, and put the statement into a box
for good measure:
----------------------8<---CLIP-'N'-SAVE---8<------------------------
| |
| A photon with a definitely-specified energy has a wavefunction |
| that is a literal, honest-to-god sine wave, extending from |
| x = literal, honest-to-god minus infinity to x = literal, |
| honest-to-god plus infinity. |
| |
---------------------------------------------------------------------

Now, a photon that is emitted by some realistic process (like an atom
jumping from one state to another) doesn't produce such a sine wave,
because it doesn't produce a photon of definite energy. It produces a
photon with some uncertainty in the energy, or in other words with
some finite coherence length. That coherence length depends on the
details of exactly how the photon was produced. For any given average
energy, you can imagine photons that have very long coherence lengths
or very short coherence lengths. Which one you get in any particular
situation depends on the exact details of the process by which your
photon was produced, which is why there's no single answer to the
question "how long is the wavetrain of a single photon?"
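The underlying Fourier fact can be checked numerically: truncate a pure sine wave to a finite number of wiggles and look at how wide its spectrum is. This is a generic signal-processing sketch (the sample rate, padding, and cycle counts are arbitrary demo choices), not a model of any specific emission process:

```python
import numpy as np

def fwhm(n_cycles, samples_per_cycle=32, pad_factor=64):
    """Spectral full-width-at-half-max of a sine truncated to n_cycles cycles.

    Frequencies are in units of the sine's own frequency, so an
    infinitely long wavetrain would be a single spike at 1.0 with
    zero width.
    """
    n = n_cycles * samples_per_cycle
    t = np.arange(n) / samples_per_cycle        # time in units of the period
    wave = np.sin(2 * np.pi * t)                # the finite wavetrain
    # Zero-pad the FFT to resolve the shape of the spectral peak.
    power = np.abs(np.fft.rfft(wave, n=n * pad_factor)) ** 2
    freqs = np.fft.rfftfreq(n * pad_factor, d=1 / samples_per_cycle)
    main_lobe = freqs[power >= power.max() / 2]
    return main_lobe.max() - main_lobe.min()

for cycles in (5, 50, 500):
    print(f"{cycles:4d} wiggles -> spectral width ~ {fwhm(cycles):.5f}")
```

The spread of frequencies (and hence of energies, via E = hf) shrinks as the wavetrain gets longer, and would go to zero only for an infinitely long sine wave, which is the boxed claim above in Fourier language.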

Your statement above (the one that inspired this rant) is more or less
true, if you interpret "same energy" to mean "same definite energy."
Two photons that both have the same infinitely-precisely-specified
energy are the same (up to differences like direction of propagation,
phase, and polarization): they both look like pure, infinitely long
sine waves. But not all photons (in fact, no photons in the real
world) are emitted with definite energy, so this statement, while
true, isn't very helpful.