I started out with a video of someone "measuring" the speed of light by microwaving a chocolate bar. Not surprisingly, this turned out to be partly bogus: http://morningcoffeephysics.com/measuring-the-speed-of-light-with-chocolate-and-a-microwave-oven/

This sparked some further questions. Let's (falsely) assume that our chocolate bar is heated by a single standing microwave. If I understand correctly, the probability that a photon will be absorbed at any particular location is proportional to the square of a sine of position, i.e., it follows the sin² intensity profile of the standing wave. Is it true that this probability does not vary from moment to moment?

I hope that this is clear.
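
To make the question concrete, here is a small numerical sketch (my own toy model, not from the video) of an idealized standing wave E(x, t) = sin(kx)·cos(ωt) with arbitrary units. The instantaneous intensity E² oscillates in time everywhere, but its average over one period is ½·sin²(kx), which is fixed in time, with permanent nodes and antinodes:

```python
import numpy as np

# Idealized standing wave E(x, t) = sin(k x) * cos(w t); arbitrary units,
# wavelength and period both set to 1.
k, w = 2 * np.pi, 2 * np.pi
x = np.linspace(0, 1, 201)  # positions across one wavelength

def intensity(t):
    """Instantaneous intensity E^2 at every position x."""
    return (np.sin(k * x) * np.cos(w * t)) ** 2

# Average E^2 over one full period of oscillation.
times = np.linspace(0, 1, 1000, endpoint=False)
avg = np.mean([intensity(t) for t in times], axis=0)

# The time average collapses to a fixed 0.5 * sin^2(k x) profile:
# nodes never heat, antinodes heat the most, so the chocolate melts in spots.
assert np.allclose(avg, 0.5 * np.sin(k * x) ** 2, atol=1e-3)
```

So in this idealized picture the instantaneous absorption probability does vary within each microwave cycle, but the cycle-averaged probability at each point is constant, which is what the melted spots on the chocolate trace out.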