# Probability in relation to distance

## Main Question or Discussion Point

This is a very easy question. I just finished reading Feynman's book QED, and I was wondering, the probability of a photon going from one place to another is dependent on the distance between the two points, right? Is there any underlying mechanism which explains this? I know that the probability is determined by the difference between all the possible routes the photon can take, but I don't see how this has anything to do with the distance.
Thanks.

Calculating the 'difference' between possible routes requires integration of some quantity over the whole path. That's how the distance comes in.

It's just distance = speed of light × time interval.

Feynman explains how the photon acts a bit like a clock- its 'phase' change is proportional to the time interval- though of course after the dial has gone all the way around the clock its phase goes back to zero.

The time interval is given by distance / speed of light. Thus, since we know what the speed of light is, and it's a constant- the relative phase change only depends on the distance travelled.

The probabilities are proportional to the square of the length of the total arrow once all the contributions are added. You've got to add the phases over all paths in the right way- such that 10 o'clock + 4 hours = 2 o'clock, or 1 o'clock - 2 hours = 11 o'clock, etc.

Thus, the probabilities are really only a function of location- once we sum over all possible paths and take into account the phase change over each path.
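The clock picture above can be put into numbers. A minimal sketch- the wavelength is an assumed value, since the posts don't specify one- showing that once c is fixed, the phase depends only on the distance travelled, and wraps around like a clock face:

```python
import math

# A sketch of the 'photon clock'. The wavelength is an assumed value.
c = 3.0e8            # speed of light, m/s
wavelength = 600e-9  # assumed: orange light, metres

def phase(distance):
    """Phase accumulated over `distance`: the time interval is
    distance / c, the dial turns once per wavelength travelled, and
    the reading wraps around just like a clock face."""
    time_interval = distance / c
    turns = c * time_interval / wavelength   # = distance / wavelength
    return (2 * math.pi * turns) % (2 * math.pi)

# One full wavelength brings the hand back to where it started
# (a reading of 0, or equivalently a whisker under 2*pi):
print(phase(wavelength))
# Half a wavelength turns it halfway round the dial:
print(phase(wavelength / 2))
```

Note that the time interval cancels out of the final answer- exactly the point made above: with c constant, only the distance matters.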

Ok, so I think I understand- it's built in, in a sense. Just to clarify my understanding: if you were to have 4 electrons, say, very roughly lined up like so:
0 - A
0 - B
0 - C
0 - D
So B and D were about the same distance from C, and A was a fair bit further off in the direction of B. And suppose C emits a photon. There would be a probability of the photon going to each electron, correct? And it would be about the same for B and D. But would the fact that there is a tendency for the photon to go to A increase the likelihood of it going to B, because they are both in the same general direction?

Not sure I completely understand your Q.

I think you'd have to know how the photon is scattered off the electron- about which I'm not very knowledgeable. It may be that the intensity at B is a little increased because of back-scattering from A.

Incidentally, in the case of one photon- the probabilities must also depend on time (as well as distance). There's no probability for the photon to go faster than the speed of light!

It's only when an average is taken over time that the probability becomes a function of position only. It's almost always the case that we don't know the departure time of a single photon from a light source. We usually only know the average number of photons per second- in which case we're forced into time averaging.

Basically what I am wondering is: I know there is interference in the probability wave when you're talking about which path a photon takes from point A to B- one path can enhance or cancel out another. Is there also interference when you're talking about the probability of the same photon going from A to point B or C? Do the two interfere with each other, or are they separate?

Yes- interference occurs. Look up the two-slit experiment (e.g. on Wikipedia)

> Is there also interference when you're talking about the probability of the same photon going from A to point B or C
You have to add up (integrate) the number of ways from A->B and A->C. The more ways there are the greater the amplitude.

> You have to add up (integrate) the number of ways from A->B and A->C. The more ways there are the greater the amplitude.
Not necessarily. There are an infinite number of possible paths between any two points.

Also- the integral is oscillatory- so you have to consider whether paths are interfering constructively or destructively.

The two slit expt. shows that in some places the probability drops when the second slit is opened- due to destructive interference.
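That drop can be seen with just two arrows. A minimal sketch- the distances from the slits to the screen point are assumed numbers, measured in units of the wavelength, so a path of length L contributes a unit arrow turned through 2πL:

```python
import cmath
import math

def arrow(path_length):
    """Unit arrow for a path, with lengths measured in wavelengths."""
    return cmath.exp(2j * math.pi * path_length)

# Assumed distances from each slit to a point P on the screen:
L1 = 100.0   # slit 1 -> P : a whole number of wavelengths
L2 = 100.5   # slit 2 -> P : half a wavelength longer

one_slit = abs(arrow(L1)) ** 2               # only slit 1 open
two_slits = abs(arrow(L1) + arrow(L2)) ** 2  # both open: arrows oppose

print(one_slit, two_slits)
```

With only slit 1 open the probability at P is nonzero; opening the second slit makes the two arrows point in opposite directions, and the probability at P drops almost to zero- destructive interference, as described above.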

> There are an infinite number of possible paths between any two points.
This is no barrier to integration or summation. But I should have said 'the greater the integral'.
> the integral is oscillatory- so you have to consider whether paths are interfering constructively or destructively
Obviously. I was keeping it (too ?) simple.
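The oscillatory character of the sum can be sketched numerically. Assume a source at (0, 0), a detector at (1, 0), a wavelength, and a family of paths that each detour through a sideways offset y at the halfway mark (all values are illustrative assumptions). Arrows from paths near the straight line barely change phase and add coherently; arrows from far-off paths spiral round the clock and largely cancel:

```python
import cmath
import math

# Assumed geometry: source at (0, 0), detector at (d, 0).
wavelength = 0.05
d = 1.0

def arrow(y):
    """Unit arrow for the path that detours through (d/2, y)."""
    length = 2.0 * math.hypot(d / 2.0, y)  # out to the offset and back in
    return cmath.exp(2j * math.pi * length / wavelength)

n, width = 200, 0.1
# Offsets near the straight line: path lengths barely change, so the
# arrows point nearly the same way and add up coherently.
near = abs(sum(arrow(k * width / n) for k in range(n)))
# The same number of offsets far from the straight line: lengths change
# fast, the arrows spiral round the dial and largely cancel.
far = abs(sum(arrow(1.0 + k * width / n) for k in range(n)))

print(near, far)
```

The near-straight bundle dominates the total- which is why, despite the infinity of paths, the sum is finite and the classical straight-line route carries most of the amplitude.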

> This is a very easy question. I just finished reading Feynman's book QED, and I was wondering, the probability of a photon going from one place to another is dependent on the distance between the two points, right? Is there any underlying mechanism which explains this? I know that the probability is determined by the difference between all the possible routes the photon can take, but I don't see how this has anything to do with the distance.
> Thanks.
I think the question to ask is whether the most probable route is also the shortest route. Then the path integral which determines the most probable route might also serve as a metric.