# Does light travel forever?

TL;DR Summary
If light emitted from a candle falls off in intensity the further you get away from it, at some distance the observation of the light should reach the minimal possible energy and disappear.
I don't think this is what is observed, so where is my understanding wrong?


Hmm... let's try an analogy.
I've got 1000 dollars.
I give 10 people around me an equal amount of money.
Now there's 10 people around me with 100 dollars each.
Now each of them gives 10 people around each other an equal amount of money.
Now there's 100 people with 10 dollars each.
And so on...
Does the money finally disappear?
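For fun, the analogy can be run in a few lines of Python (the dollar amounts are just the ones in the example above, nothing physical): the share per person keeps shrinking, but the total never changes.

```python
# Toy version of the money analogy: the total "money" is conserved
# while the share per person keeps shrinking.
total = 1000.0
people = 1

for step in range(4):
    people *= 10                 # each person splits their money among 10 others
    per_person = total / people  # the individual share keeps shrinking
    print(f"{people} people with {per_person} dollars each "
          f"(total: {people * per_person})")
```

The share can get arbitrarily small; nothing in the arithmetic makes the money disappear.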
Hmm... let's try an analogy.
I've got 1000 dollars.
I give 10 people around me an equal amount of money.
Now there's 10 people around me with 100 dollars each.
Now each of them gives 10 people around each other an equal amount of money.
Now there's 100 people with 10 dollars each.
And so on...
Does the money finally disappear?

I want to be one of these people, give me the dollar!

Summary:: If light emitted from a candle falls off in intensity the further you get away from it, at some distance the observation of the light should reach the minimal possible energy and disappear.

I don't think this is what is observed, so where is my understanding wrong?
Intensity is the quantity of light detected per unit time. There is no minimum for that. One photon every million years? One photon every billion years?

Hmm... let's try an analogy.
I've got 1000 dollars.
I give 10 people around me an equal amount of money.
Now there's 10 people around me with 100 dollars each.
Now each of them gives 10 people around each other an equal amount of money.
Now there's 100 people with 10 dollars each.
And so on...
Does the money finally disappear?

At some point, you can no longer divide it amongst other people; you will have 1000 people with 1 dollar and it will stop at that point. Does that not mean light will stop traveling at that point?

Intensity is the quantity of light detected per unit time. There is no minimum for that. One photon every million years? One photon every billion years?

That makes sense!

The key term here is in vacuum.

Here on Earth light loses intensity very quickly.

Does the money finally disappear?

They say money talks, but all it ever said to me was goodbye.

so where is my understanding wrong

Here (in addition to what else has been said)

light should reach the minimal possible energy

There is no minimum possible energy.

Interesting that this was asked in the QM forum, yet many answers are sort of classically framed.

You could say that the photon is either detected or not, but with a fixed energy. But the probability of an interaction occurring decreases with distance. There doesn't have to be any minimum probability.

Interesting that this was asked in the QM forum, yet many answers are sort of classically framed.

I'm guessing the OP put the question in the QM forum because of the "minimum possible energy" assumption, which, as @Vanadium 50 has pointed out, is not correct. The answer to the OP question is actually pretty much the same whether you use classical or quantum electrodynamics.

Summary:: If light emitted from a candle falls off in intensity the further you get away from it, at some distance the observation of the light should reach the minimal possible energy and disappear.

Imagine a single particle of light was emitted in outer space in some direction away from our solar system. Conceivably, it would never interact with matter were it to head to deep space. It might travel forever, and would not otherwise decay.

Summary:: If light emitted from a candle falls off in intensity the further you get away from it, at some distance the observation of the light should reach the minimal possible energy and disappear.

I don't think this is what is observed, so where is my understanding wrong?
The energy density falls off with ##1/r^2##. The total energy is conserved. Taking quantization into account means that radiation of a given frequency can only be detected by the detector absorbing at least one photon. If you have a true one-photon state then you can either detect exactly one photon (with some probability depending on the location of the detector and the specific state of the photon) or no photon at all.

Imagine a single particle of light was emitted in outer space in some direction away from our solar system. Conceivably, it would never interact with matter were it to head to deep space. It might travel forever, and would not otherwise decay.
It might travel forever (for all time), but there are places it will never reach.

Nobody has mentioned the expansion of space. No laser, however intense, will ever reach past the event horizon, nor will light emitted there ever reach here (the location in space where Earth would be, were its worldline to continue for eternity, which it will, one way or another). The event horizon is currently about 15 BLY (comoving distance) away and shrinking, which means all kinds of things are continuously passing through it, never to be seen again. Eventually there will be nothing left except our local supercluster.

So for an observer close to our event horizon, light from Earth will reach him in finite time, but will probably be redshifted to the point of not interacting. Sufficiently long wavelength light will bypass matter, which is why they use IR (and lower) telescopes to view the center of our galaxy which is completely obscured at visible wavelengths.

Nobody has mentioned the expansion of space. No laser, however intense, will ever reach past the event horizon, nor will light emitted there ever reach here

To be precise, you are talking about two different horizons. The "no light emitted there will ever reach here" horizon is the event horizon. The "no light emitted here will ever reach there" horizon is a different one, which does not appear to have a commonly used name.

Also, the presence of these horizons is not due just to "expanding space" (the expansion of the universe), but due to the accelerating expansion due to dark energy. In a universe without dark energy, there would be no cosmological horizons of the kind you describe.

To be precise, you are talking about two different horizons. The "no light emitted there will ever reach here" horizon is the event horizon. The "no light emitted here will ever reach there" horizon is a different one, which does not appear to have a commonly used name.
Agreed. Both are a function of the peculiar motion of the receiving (light-destination) worldline, and depend only on the emitting events, not on the emitting worldlines. For a pair of worldlines both with zero peculiar velocity (and similar gravitational potential), the two lines cross each other at a point of cosmological time simultaneous with 'now'.

Also, the presence of these horizons is not due just to "expanding space" (the expansion of the universe), but due to the accelerating expansion due to dark energy. In a universe without dark energy, there would be no cosmological horizons of the kind you describe.
Yes, I should have worded it more precisely. No acceleration means no horizons of these sorts. Light would get anywhere given enough time. There's always been such a horizon despite the acceleration only going back so far. Hence the finite size (r=~46 BLY currently) of the visible universe, or finite reach of our particle horizon, the two corresponding in a way to the two kinds of event horizons above: One incoming and one outgoing.

There's always been such a horizon despite the acceleration only going back so far.

Event horizons are only defined globally in a spacetime; it would make no sense to say they were there only "before" or "after" a certain time. They're not "things" that can exist at certain times but not others. They are global boundaries.

Hence the finite size (r=~46 BLY currently) of the visible universe, or finite reach of our particle horizon, the two corresponding in a way to the two kinds of event horizons above: One incoming and one outgoing.

No, the particle horizon and the observable universe are not event horizons and do not "correspond" to the two kinds of event horizons in the way you are saying.

Summary:: If light emitted from a candle falls off in intensity the further you get away from it, at some distance the observation of the light should reach the minimal possible energy and disappear.

I don't think this is what is observed, so where is my understanding wrong?
I think you're thinking:
"Light intensity decreases ##\sim\frac{1}{r^2}##. But wait! A single photon has a fixed energy ##E=h\nu##. When the light intensity (loosely speaking) falls off below such an energy, the photon must disappear (either you have one photon with energy ##E=h\nu## or you have no photon, no half-photons allowed!)."
If that's what you're thinking, I suppose you also think that the intensity drops ##\sim\frac{1}{r^2}## but in a discrete way (more and more photons disappearing the further light travels).

The mistake here is that it's not the energy of the photons to drop ##\sim\frac{1}{r^2}##. Photons will travel forever if they do not interact with anything. The law ##\sim\frac{1}{r^2}## for light intensity simply comes from geometry!
A point-like light source "shoots" photons in every direction, so that the wave-front is a sphere getting bigger and bigger the more the light travels. The total energy of the wave-front stays constant forever if there is nothing interacting with the emitted light. But the energy is diluted across the whole spherical surface of the wave-front. The more the light travels, the bigger the spherical surface, so the energy density decreases. Energy stays constant but the surface of the sphere increases ##\sim r^2##, so the energy density decreases ##\sim\frac{1}{r^2}##.
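That geometric argument is easy to check numerically. A minimal sketch (the 1 W source power is just an assumed number): at every radius, the intensity times the sphere's area gives back the same total power, while the intensity itself falls off as ##1/r^2##.

```python
import math

P = 1.0  # assumed total power of the source, in watts

for r in [1.0, 10.0, 100.0]:           # distances in meters
    area = 4 * math.pi * r**2          # surface area of the wave-front sphere
    intensity = P / area               # energy flow per unit area ~ 1/r^2
    print(f"r = {r:6.0f} m: intensity = {intensity:.3e} W/m^2, "
          f"total through sphere = {intensity * area:.3f} W")
# Intensity drops 100-fold for each 10-fold increase in distance,
# but the total power through every sphere stays 1 W.
```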

According to Wiki (https://en.wikipedia.org/wiki/Cosmic_microwave_background), the light from the Big Bang was stretched to longer wavelengths by the expansion of the Universe and now forms the Cosmic Microwave Background. So I presume that, as the Universe expands, light emitted today will eventually acquire a longer wavelength. If we radiated microwave radiation into space today, I presume that for practical purposes it would quickly become undetectable against the noise of the CMB.

• vanhees71
I think you're thinking:
"Light intensity decreases ##\sim\frac{1}{r^2}##. But wait! A single photon has a fixed energy ##E=h\nu##. When the light intensity (loosely speaking) falls off below such an energy, the photon must disappear (either you have one photon with energy ##E=h\nu## or you have no photon, no half-photons allowed!)."
If that's what you're thinking, I suppose you also think that the intensity drops ##\sim\frac{1}{r^2}## but in a discrete way (more and more photons disappearing the further light travels).

The mistake here is that it's not the energy of the photons to drop ##\sim\frac{1}{r^2}##. Photons will travel forever if they do not interact with anything. The law ##\sim\frac{1}{r^2}## for light intensity simply comes from geometry!
A point-like light source "shoots" photons in every direction, so that the wave-front is a sphere getting bigger and bigger the more the light travels. The total energy of the wave-front stays constant forever if there is nothing interacting with the emitted light. But the energy is diluted across the whole spherical surface of the wave-front. The more the light travels, the bigger the spherical surface, so the energy density decreases. Energy stays constant but the surface of the sphere increases ##\sim r^2##, so the energy density decreases ##\sim\frac{1}{r^2}##.

Yes, this is exactly what I am stuck on. So if the energy is diluted overall across the sphere, won't the total energy at some point reach less than the energy of one photon?

if the energy is diluted overall across the sphere

The energy per unit area (more precisely the rate of energy flow per unit area) is diluted, but the total energy is not. Note, however, that this is a classical way of viewing it, in which the concept of "photon" is meaningless. For the quantum view, see below.

won't the total energy at some point reach less than the energy of one photon?

From a "photon" point of view, the dilution is not in energy (or more precisely energy flow) per unit area, as above; it is in the probability per unit area of detecting a photon. That probability per unit area can get arbitrarily small; there is no minimum required value for it. The total probability, integrated over the entire area of a sphere, remains constant: it is always 1; this is the quantum analogue of the total energy remaining constant.

Note that even here we are leaving out further complications: for example, you have been implicitly assuming that the source is emitting photons of one particular energy, but a candle is not that kind of light source; the light a candle emits has a distribution of possible energies per photon, not just one. A more precise treatment would have to evaluate the probability of detecting photons of the whole range of possible energies, and that range includes energies that can get arbitrarily small (more precisely, the range is limited by the detector--any real detector will have a finite lower limit to the photon energies it can detect--rather than by the radiation itself).
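As a toy numerical version of that probabilistic picture (the detector area and the isotropic single-photon source are my assumptions, and all the complications just mentioned are ignored): the detection probability is simply the detector's share of the sphere at radius ##r##, and it shrinks like ##1/r^2## without ever reaching a minimum.

```python
import math

def detection_probability(r, detector_area=1.0):
    """Toy model: probability that an isotropically emitted single photon
    hits a detector of the given area (m^2) at distance r (m)."""
    return min(1.0, detector_area / (4 * math.pi * r**2))

for r in [1e3, 1e6, 1e9]:
    print(f"r = {r:.0e} m: P(detect) = {detection_probability(r):.3e}")
# The probability falls like 1/r^2 but is positive at every finite distance.
```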

Yes, this is exactly what I am stuck on. So if the energy is diluted overall across the sphere, won't the total energy at some point reach less than the energy of one photon?
It is much easier than you think. I'll give a semi-classical explanation. For a more formal and accurate quantum explanation please see the above answer by PeterDonis. By the way, we're all missing the fact that in the real world a candle needs oxygen around it to burn, so the photons emitted by the candle are not free from interactions!
Anyway, here's how it works from a semi-classical point of view:
• From an ideal point-like light source, photons are emitted "spherically" (in all directions, uniformly)
• Let's assume that 1000 photons are emitted at the same moment
• Those 1000 photons constitute the "spherical wave-front" of the emitted light
• Photons do not lose energy once emitted if they do not interact with anything, but they get "spread out" over the spherical surface as it gets bigger
• After a certain distance traveled by the light, let's suppose the surface area of the wave-front sphere is now 1000 square meters. This means that the "photon density" is now 1 photon per square meter (1000 photons distributed over 1000 m2). I think you accept this, but see the next point.
• After some more distance traveled by the light, let's suppose the surface area of the wave-front sphere is now 2000 square meters. This means that the "photon density" is now 0.5 photons per square meter (1000 photons distributed over 2000 m2). I think this is what you think is not allowed ("fractional photons"). But that doesn't really mean there are 0.5 photons on every square meter; rather, on a certain square meter you may find a photon, and on the square meter next to it you may find zero photons.
As an analogy, suppose you toss a coin 4 times and you get heads 3 times and tails 1 time. You would say that you got 75% heads (which is 75 out of 100) and 25% tails (which is 25 out of 100). But did you really toss the coin 100 times and get 75 heads and 25 tails? No, you didn't!
A density of 0.5 photons per square meter means you have 50% probability of finding a photon inside one certain square meter.
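The "0.5 photons per square meter" statement can be illustrated with a small Monte Carlo (the 1000 photons and 2000 one-square-meter patches are just the numbers from the example above): every patch ends up with a whole number of photons; 0.5 is only the average.

```python
import random

random.seed(0)           # fixed seed so the sketch is reproducible
n_photons = 1000
n_patches = 2000         # one-square-meter patches on the wave-front sphere

counts = [0] * n_patches
for _ in range(n_photons):
    counts[random.randrange(n_patches)] += 1  # each photon lands on ONE patch

occupied = sum(1 for c in counts if c > 0)
print(f"average photons per patch: {n_photons / n_patches}")
print(f"patches containing at least one photon: {occupied} of {n_patches}")
# No patch ever holds half a photon; the 0.5 is only an average,
# related to the chance of finding a photon in any given patch.
```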

So I presume that, as the Universe expands, light emitted today will eventually acquire a longer wavelength.

It would be fair to say that it appears to some observers as having a longer wavelength. Redshift/blueshift is more of a description of the relative frames of the emitter and the observer. From a quantum mechanical perspective, the photon does not lose energy during its journey.

Yes this is exactly what I am stuck on. So if the energy is diluted overall across the sphere, won't the total energy at some point reach that of less then the energy of one photon?
It's the energy flow (the Poynting vector, i.e., the energy per unit area per unit time) that falls like ##1/r^2## with ##r## the distance from the source. This doesn't mean that a single photon loses energy; its energy always stays the same as long as nothing disturbs the photon (e.g., some matter where it gets absorbed). On the contrary, because energy is conserved the energy current must fall off by a factor ##1/r^2##, since the total flow through any closed surface must stay constant and a spherical shell's area goes like ##r^2##. So for the total energy to stay constant you must have the energy current density go like ##1/r^2##.

You cannot, of course, measure a part of the photon's energy "smeared" over a sphere of radius ##r##: you either measure the full photon's energy, ##\hbar \omega## (i.e., you register the full photon), or nothing (i.e., you don't register the photon at all). The famous conclusion by Born is that the energy density is proportional to the probability of finding the photon at a given place. Now the energy flow gives the energy per unit area per unit time going through a surface. Since this energy current goes like ##1/r^2##, the probability current that the photon hits a point on a spherical shell falls off like ##1/r^2##, i.e., the farther away you get with your detector, the smaller the probability of registering the photon.

It has been briefly alluded to, but you can see with the naked eye light, i.e. lots of photons, that has been traveling uninterrupted (from the Andromeda galaxy) for 2 million years. Astronomers of course can see much older light, until we get to radio astronomers who 'see' light that has been traveling since the birth of the Universe minus a trivial 300,000 years or so (the cosmic microwave background).
And lots of them - detected as continuous microwave, not individual photons.

When they started, those photons had much shorter wavelengths; they have been stretched out by the expansion of the universe. On the face of it this corresponds to a loss of energy - I will leave the physicists to explain this. But note they still have more energy per photon than the ones you are employing for your TV, radio, Wi-Fi, etc.! And those are the most common ones; there must be rarer, more energetic ones. I don't know if any individual CMB photons have been detected, but maybe 'detected' is an inappropriate word; it would rather be a question of being 'distinguished' from other photons, and probably not.

Now that I've said it, you probably realize you knew most of it already.

To me it would seem the question is "Do photons decay - do they have a half life?"

To me it would seem the question is "Do photons decay - do they have a half life?"

I don't think that's the question the OP is asking, but the answer to it is no.
