I have a puzzle I seem unable to answer satisfactorily. It concerns the uncertainty principle: if you can locate a particle (any kind of quantum) at a given location within an uncertainty deltaX, then the uncertainty in its momentum deltaP must be at least h-bar/deltaX. (Please don't quibble about factors of 2 or pi; they don't matter here.)

Here is the gedanken-experiment setup. We prepare some nearly pure material and implant a single atom of a known impurity in the crystal. Being nearly pure, the crystal does not have many natural impurities, and we will also assume that those it does have are of a different type from the atom we purposely implanted. Using techniques such as X-ray crystallography or neutron scattering, the location of the implanted atom can be determined to an accuracy comparable to the size of a typical atom in the crystal. So let's take deltaX = 5.0E-8 cm.

Now we are ready to do the experiment: we shine light on this material, with the wavelength carefully chosen. The chosen wavelength must NOT excite the atoms of the crystal, or the other impurities in it, but it will excite the implanted atom when absorbed. Most likely the photon energy will be a few eV, i.e. an optical or at most an ultraviolet photon. After shining the light for sufficiently long, the impurity will absorb one of the photons, jump to an excited state, and then make a cascade of transitions down to its ground state, which we can observe. In other words, when the implanted atom jumps to the excited state we know for sure that it absorbed one of the photons we have been hitting the crystal with. Since we know the impurity's location to an accuracy of 5.0E-8 cm, we also know that the uncertainty in the position of the absorbed photon is 5.0E-8 cm. (No other atom in the crystal can absorb it.)
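To make the numbers concrete, here is a quick sanity check of the deltaP that follows from deltaX = 5 angstroms, using the rounded CODATA value h-bar*c ~ 1973 eV*angstrom (a minimal sketch; the constant and variable names are my own):

```python
# Momentum uncertainty from the position uncertainty, deltaP >= h-bar / deltaX.
# Working in eV and angstroms avoids tiny SI numbers: h-bar*c ~ 1973.27 eV*angstrom.
HBAR_C_EV_ANGSTROM = 1973.27     # rounded CODATA value of h-bar*c
delta_x = 5.0                    # angstrom, i.e. 5.0E-8 cm

delta_p = HBAR_C_EV_ANGSTROM / delta_x   # momentum in units of eV/c
print(f"deltaP >= {delta_p:.0f} eV/c")   # ~395 eV/c, i.e. roughly 400 eV/c
```

This reproduces the ~400 eV/c figure quoted below (up to the factors of 2 and pi we agreed to ignore).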
Therefore the uncertainty in the momentum of the photon, absorbed within atomic dimensions, must be deltaP >= h-bar/(5.0E-8 cm), or deltaP >= 400 eV/c. But this is sheer nonsense, because for a massless photon E = pc, so the uncertainty in its energy would also be 400 eV, far larger than the few-eV energy of the photon itself.

The problem is perfectly OK for electrons. You can locate an electron within a few angstroms with the same deltaP of 400 eV/c; because of the electron mass, this translates into only about 0.16 eV of kinetic energy (p^2/2m), not 400 eV. But we cannot locate a photon as accurately as 5 angstroms without incurring an uncertainty of hundreds of eV in its energy. And yet this gedanken-experiment should work in reality, right? So where did I go wrong?
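The electron/photon contrast above can be checked numerically. The sketch below (constants are rounded CODATA values; variable names are my own) converts the same deltaP into an energy spread for each particle, non-relativistically for the electron and via E = pc for the photon:

```python
# Energy cost of deltaP ~ 400 eV/c for an electron vs. a photon.
HBAR_C = 1973.27        # h-bar*c in eV*angstrom (rounded)
M_E_C2 = 511_000.0      # electron rest energy m*c^2 in eV (rounded)
delta_x = 5.0           # position uncertainty in angstroms
delta_p = HBAR_C / delta_x                 # ~395 eV/c

# Non-relativistic electron: KE = p^2/(2m) = (p*c)^2 / (2*m*c^2)
ke_electron = delta_p**2 / (2 * M_E_C2)    # a fraction of an eV

# Massless photon: E = p*c, so the energy spread equals delta_p * c directly
delta_e_photon = delta_p                   # hundreds of eV

print(f"electron: KE ~ {ke_electron:.2f} eV")
print(f"photon:   deltaE ~ {delta_e_photon:.0f} eV")
```

The electron pays only a fraction of an eV for being localized to 5 angstroms, while the photon's energy spread is comparable to or larger than the few-eV photon energy itself, which is exactly the tension the question is about.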