I read somewhere that Heisenberg described his uncertainty principle by saying that you can't measure position more accurately than the wavelength of the light you use (which makes sense), so Δx > λ. Here is what I don't get. He then says that since p = h/λ, the uncertainty in momentum is Δp > (h/λ²)Δλ. He then multiplies the two together and sets Δλ ≈ λ to get:

ΔxΔp > h

Why does the initial momentum of the photon, p = h/λ, determine the uncertainty of the momentum of the object the light scatters off? And what if you knew the momentum of the photon exactly, would that make Δp = 0?
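To make the middle step explicit (the part I'm paraphrasing from memory): the Δp expression seems to come from differentiating p = h/λ with respect to λ, which would go like this:

```latex
p = \frac{h}{\lambda}
\quad\Longrightarrow\quad
\left|\frac{dp}{d\lambda}\right| = \frac{h}{\lambda^{2}}
\quad\Longrightarrow\quad
\Delta p \approx \frac{h}{\lambda^{2}}\,\Delta\lambda .

% Setting \Delta\lambda \approx \lambda and \Delta x \gtrsim \lambda:
\Delta p \approx \frac{h}{\lambda},
\qquad
\Delta x\,\Delta p \gtrsim \lambda \cdot \frac{h}{\lambda} = h .
```

So the algebra itself checks out; my question is about the physical justification for treating the photon's momentum spread as the object's momentum uncertainty.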