Single-photon wave-like propagation

  • Thread starter Edi
  • #1
Edi

Main Question or Discussion Point

So if a single photon is emitted and it travels as a wave, it should lose energy with the square of the distance. But how can it possibly do that? It's a single photon, so its intensity can't really change, can it? What, you'd have half a photon? What would that even mean? How can it work like that? It seems the only way it could lose energy is by decreasing its frequency... but that doesn't seem to be the case.
.. We do see redshifts, and they get stronger the farther away stars/galaxies are, which would be consistent with this: the farther the light travels, the less energy per point (more points in space to cover), hence a lower frequency. And it wouldn't require an expanding universe or anything like that... which would be kind of a big deal if it were right. Which it shouldn't be, because the rate at which such a decrease in frequency would happen should be much faster, if it happens at all.

Now, one answerable question might be: how fast would such a decrease in frequency happen? (the single-photon "expansion", i.e. the decrease in energy per point as the points in space multiply)
Another: can it even work like that?
And: how DOES a single photon propagate in space, and why don't all EM waves travel the same way, even as a group? Sooner or later there will be just one photon per point (whatever that means), and it will continue to propagate just like a single photon would from the start.

And the answer can't just be: they don't travel as waves. They must, because otherwise there would be quite a few more problems.

Oh, I have so many questions about photons. I don't even know where to start. :O
Oh wait, I just did.
I just cannot get my head around it. Photons are gonna drive me crazy!
 

Answers and Replies

  • #2
Cthugha
Science Advisor
So if a single photon is emitted and it travels as a wave, it should lose energy with the square of the distance.
No, it should not. Classical electromagnetic waves do not lose energy over distance either. However, because the area of the wavefront grows, the energy density at each point of the wavefront decreases, which keeps the total energy constant.

For a single photon (assuming isotropic emission geometry) this just means that the probability of detecting it at any well-defined position becomes smaller as the distance to the light source increases, because the probability of detection is distributed equally among all possible points on the wavefront.
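For concreteness, here is a toy Python sketch of that picture (the detector area and energy units are made-up assumptions): the photon's energy is fixed, but the chance of a small detector catching it falls off as 1/r² because the probability is spread over a sphere of area 4πr².

```python
import math

PHOTON_ENERGY = 1.0   # arbitrary units; the photon's energy never changes
DETECTOR_AREA = 1e-4  # hypothetical small detector area, m^2

def detection_probability(r):
    """Chance that a small detector at distance r catches the photon.

    For isotropic emission the probability is spread uniformly over a
    sphere of area 4*pi*r^2, so the detector's share is its own area
    divided by the sphere's area.
    """
    return DETECTOR_AREA / (4.0 * math.pi * r**2)

# Doubling the distance quarters the detection probability ...
p1, p2 = detection_probability(1.0), detection_probability(2.0)
print(p2 / p1)  # 0.25

# ... but the energy delivered *if* the photon is detected is unchanged.
print(PHOTON_ENERGY)  # 1.0
```

Note that nothing about the photon itself "weakens" with distance in this sketch; only the detector's share of the probability shrinks.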
 
  • #3
Edi
OK, I misstated it. What I meant was that the energy density per point decreases. But for a SINGLE photon's energy density to decrease... the photon itself would have to get weaker...

Hmm, OK, say the probability decreases. But in that case, when something does detect it, it detects the full energy? Like a single photon shot from a laser (although there is no laser beam, since there's just one photon, but you get the idea).

So let me get this straight: the energy density of a wave seems to decrease the farther from the source it gets (as we all know from energy transmission, signal strength...) because the probability of detection decreases, and the number of actually detected photons drops in proportion to that probability. (?) Yes, that actually makes sense... as far as quantum mechanics can make sense.
 
  • #4
Edi
Hmm, but explain these probabilities to me. It really means that something, no matter how unlikely, CAN happen. There is a chance of flipping a coin and getting tails 100 times in a row. Very low, but it can happen. So, in this wave picture, there IS a CHANCE of detecting/absorbing all the photons coming from a source at a single measurement point.
In other words, there is a CHANCE to send a signal/energy from A to B wirelessly with no energy/photon loss at all (minus practical engineering losses). An incredibly low chance, but still... or not?
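A quick back-of-the-envelope sketch of just how low that chance is (the per-photon probability and photon count here are made-up assumptions): if each photon independently has probability p of landing on the receiver, the chance that all N of them do is p to the power N, which underflows an ordinary float almost immediately, so it has to be computed in log space.

```python
import math

p = 1e-6   # assumed per-photon detection probability at the receiver
N = 100    # assumed number of photons making up the "signal"

# Naive p**N underflows double precision straight to zero:
print(p ** N)  # 0.0

# Work in log space instead: log10 of the joint probability.
log10_prob = N * math.log10(p)
print(log10_prob)  # -600.0, i.e. the probability is ~10^-600
```

So yes, the probability is nonzero in principle, but even for a hundred photons it is of order 10^-600, which is zero for every practical purpose.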
 
  • #5
If you use a highly collimated laser beam, the beam area doesn't spread out, so the energy isn't "diluted" at all. In fact, energy is not "lost" in either case, contrary to what you wrote.
 
  • #6
Edi
OK, OK, I misspoke, all right. I didn't mean the total energy is lost; the energy per point is, though. The total isn't (in the case of a wave spreading in all directions).
 
  • #7
Classical electrodynamics doesn't know about photons. There's just the energy density of the electromagnetic field, which does spread out as you get further away.

In quantum mechanics, the photon's energy density doesn't spread out--it's the probability of detecting the photon that does. That is, it becomes less and less likely that you will detect the photon in any given volume the farther away you get--but if you do detect it there, you always detect the whole thing.
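That all-or-nothing character can be illustrated with a tiny seeded Monte Carlo sketch (the detection probability and energy units are arbitrary assumptions, not measured values): each trial either records the photon's full energy or nothing at all, yet the average energy per trial still reproduces the smoothly "diluted" classical energy density.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

PHOTON_ENERGY = 1.0   # one quantum, h*f in arbitrary units
P_DETECT = 0.01       # assumed detection probability at this distance
TRIALS = 100_000

recorded = []
for _ in range(TRIALS):
    # Quantum rule: detect the whole photon with probability P_DETECT,
    # otherwise detect nothing. Never a fraction of a photon.
    if random.random() < P_DETECT:
        recorded.append(PHOTON_ENERGY)

# Every single detection event carries the full photon energy ...
assert all(e == PHOTON_ENERGY for e in recorded)

# ... while the *average* energy per trial matches the classical
# energy density, P_DETECT * PHOTON_ENERGY.
mean_energy = sum(recorded) / TRIALS
print(mean_energy)  # close to 0.01
```

The classical 1/r² falloff lives entirely in P_DETECT; the individual clicks are always whole photons.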
 
