Does light propagate as a wave of little bullets?

  • #1
According to Einstein light would be a particle and a wave.

So I infer that it propagates in a vacuum in the form of waves of little bullets (particles).

This explanation is very insufficient.

So tell me: how does a wave increase in size if it is made of little bullets (particles)? A wave gets bigger and bigger as it moves away from its original spot. So does the number of particles in the wave increase? How?

Please tell me what you think about this.
 
  • #2
Light travels as a wave. The "bullets" are photons which are the excitation of whatever the wave hits. If it doesn't hit anything, it just stays traveling as a wave forever.
 
  • Like
Likes sophiecentaur
  • #3
phinds said:
Light travels as a wave. The "bullets" are photons which are the excitation of whatever the wave hits. If it doesn't hit anything, it just stays traveling as a wave forever.

There is a high chance of you being right, but in high school teachers tell us to do exercises drawing lines of little bullets reflecting off mirrors. Is this wrong?
 
  • #4
Power of observation. Double Slit Experiment
 
  • #5
bdrobin519 said:
Power of observation. Double Slit Experiment

Is it wrong when high school students draw lines of little bullets reflecting on mirrors?
 
  • #6
lordoftheselands said:
According to Einstein light would be a particle and a wave.
Einstein said no such thing and light does not propagate like little bullets. As @phinds said above, light propagates like a wave, and that's because it IS a wave, the electromagnetic waves that are predicted by Maxwell's equations.

Two things contribute to the misunderstanding here.
First, we will often hear that there are things called "photons" and that these are the "particles of light". This is true, but the way quantum physicists use the word "particle" has nothing to do with small bullets: a flash of light is not a bunch of photons moving together the way a river is a bunch of water molecules moving together.
Second, Einstein did explain the otherwise-mystifying photoelectric effect by suggesting that electromagnetic radiation encountering matter would always deliver its energy to the matter in discrete chunks called photons. Once you hear that it is almost impossible to rid yourself of the little bullet picture, because that's exactly how bullets behave. But it's still wrong.
 
  • Like
Likes davenn, tech99, russ_watters and 3 others
  • #7
lordoftheselands said:
Is it wrong when high school students draw lines of little bullets reflecting on mirrors?
No. That's geometric optics, which is an excellent approximation to how visible light works when it interacts with mirrors. This approximation is so good and so much easier to work with that no one would ever seriously consider solving their problem any other way if the geometric approximation is good enough.

To go beyond that approximation, or to take on the problems that can't be solved using geometric optics we might use the wave equations - we'll learn how to do that in our second year of college physics if we're in a hard and fast-moving program - or we can do an exact solution using quantum electrodynamics while we're working on our PhD.
 
  • #8
Nugatory said:
No. That's geometric optics, which is an excellent approximation to how visible light works when it interacts with mirrors. This approximation is so good and so much easier to work with that no one would ever seriously consider solving their problem any other way if the geometric approximation is good enough.

To go beyond that approximation, or to take on the problems that can't be solved using geometric optics we might use the wave equations - we'll learn how to do that in our second year of college physics if we're in a hard and fast-moving program - or we can do an exact solution using quantum electrodynamics while we're working on our PhD.

In case light propagates in the form of waves, do you agree that the waves get bigger and bigger as they move away from their source? Do you agree that this is impossible unless we say that the wave would one day get "thinner" (maybe "weaker" would be the word) and at some point vanish, or at least stop increasing in size? In case it stops increasing in size, wouldn't you agree that someone who is too far away might not be able to see the light, in case the light reaches the person at a blind spot? It seems illogical to say that light is a wave that propagates eternally in all directions, getting bigger and bigger in size and reaching all possible observers.
 
  • #9
lordoftheselands said:
Do you agree that this is impossible unless we say that the wave would one day get "thinner" (maybe "weaker" would be the word) and at some point vanish, or at least stop increasing in size?
The wave gets weaker and weaker as it spreads more widely. But it never stops spreading out and never vanishes. At least not in the classical model (Maxwell's equations). The wave will eventually be undetectably weak. But not gone.

If you factor in an expanding universe (and choose a co-moving coordinate system) then the light both spreads out and loses energy to cosmological redshift. But it still never goes away. Again, it will eventually be undetectably weak. But not gone.

If you do not choose a co-moving coordinate system, the same observable things happen. But the explanations may vary.

The photon interpretation is weird. I do not want to pontificate. But there is no distance limit.
 
  • Like
Likes dextercioby
  • #10
lordoftheselands said:
Is it wrong when high school students draw lines of little bullets reflecting on mirrors?
What you are drawing are rays of light. This so-called ray approximation works when conditions are such that the wave properties of light are negligible. Note, though, that the ray approximation only works if each ray consists of a large number of photons. The same is true of the wave approximation: it is valid only when you have a large number of photons.

Note that the work Einstein did with the photoelectric effect established that light can behave like particles. But that was in 1905. Twenty or so years later, quantum theory was developed much more fully into what we now call the new quantum theory. Things like wave-particle duality are part of the old quantum theory.
 
  • Like
Likes dextercioby
  • #11
lordoftheselands said:
Is it wrong when high school students draw lines of little bullets reflecting on mirrors?
No, if those are all the variables you want to consider.
 
  • #12
jbriggs444 said:
The wave gets weaker and weaker as it spreads more widely. But it never stops spreading out and never vanishes. At least not in the classical model (Maxwell's equations). The wave will eventually be undetectably weak. But not gone.

If you factor in an expanding universe (and choose a co-moving coordinate system) then the light both spreads out and loses energy to cosmological redshift. But it still never goes away. Again, it will eventually be undetectably weak. But not gone.

If you do not choose a co-moving coordinate system, the same observable things happen. But the explanations may vary.

The photon interpretation is weird. I do not want to pontificate. But there is no distance limit.

Is there enough energy in a light wave to make it spread forever? So it gets bigger and bigger, reaching all spots in the universe? It seems impossible because of the lack of energy for this. I don't understand why people do not question this.
 
  • #13
lordoftheselands said:
Is there enough energy in a light wave to make it spread forever? So it gets bigger and bigger, reaching all spots in the universe? It seems impossible because of the lack of energy for this. I don't understand why people do not question this.
As the wave spreads out its intensity decreases accordingly. It is "weaker" but covers more area. Energy is conserved. You can see this yourself if you look at a car headlight from 1m, 10m, 100m, etc.
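As a rough numerical sketch of that headlight example (assuming a hypothetical 60 W source treated as an isotropic radiator), the same total power just spreads over a larger and larger sphere:

```python
import math

def intensity(power_w, distance_m):
    """Intensity (W/m^2) of an isotropic source: total power spread over a sphere."""
    return power_w / (4 * math.pi * distance_m ** 2)

power = 60.0  # hypothetical 60 W source, radiating equally in all directions
for d in [1, 10, 100]:
    print(f"{d:>4} m: {intensity(power, d):.6f} W/m^2")
```

The intensity drops by a factor of 100 for each factor of 10 in distance, but the total power passing through any enclosing sphere stays 60 W, so energy is conserved.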
 
  • #14
DaveE said:
As the wave spreads out its intensity decreases accordingly. It is "weaker" but covers more area. Energy is conserved. You can see this yourself if you look at a car headlight from 1m, 10m, 100m, etc.

It works for short distances, but when I ask myself about a light source in a distant galaxy creating light waves... I'm not sure it would reach us. Seems impossible for the waves to cover an absurdly bigger area without any consequence. How can there be enough energy for this? You can't assume that the wave covers an increasingly bigger area without thinking about the energy necessary for this.
 
  • #15
It does not matter that it seems impossible.
What is "necessary energy"? These are statements devoid of meaning. Physics is a quantitative science, and numbers can be very large or very small.
 
  • #16
lordoftheselands said:
Seems impossible for the waves to cover an absurdly bigger area without any consequence. How can there be enough energy for this?
It's always the same amount of energy (assuming we're not getting to cosmological distances here), just spread out over more area. Classically this would just make the source dimmer and dimmer. In a quantum model (which you do need for a sufficiently faint source) you eventually start to see shot noise from the arrival time statistics of photons, and the received energy in any region fluctuates around the classically predicted average.
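A minimal sketch of that shot noise, assuming a hypothetical faint source with a mean of 5 photons per exposure: photon counts follow Poisson arrival statistics (sampled here with Knuth's method), so individual exposures fluctuate but average out to the classical prediction.

```python
import math
import random

random.seed(0)

def poisson_sample(lam):
    """Draw one Poisson-distributed photon count (Knuth's method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        p *= random.random()
        k += 1
    return k - 1

mean_rate = 5.0  # hypothetical: 5 photons expected per exposure on average
counts = [poisson_sample(mean_rate) for _ in range(10_000)]
avg = sum(counts) / len(counts)
print(f"average photons per exposure: {avg:.2f} (classical prediction: {mean_rate})")
print(f"example exposures: {counts[:10]}")
```

Any single exposure may catch 2 photons or 9, but the mean over many exposures converges on the classically predicted value.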
 
  • Like
Likes hutchphd, Drakkith, russ_watters and 1 other person
  • #17
lordoftheselands said:
According to Einstein light would be a particle and a wave.

So I infer that it propagates in vacuum in form of waves of little bullets (particles).

This explanation is very insuficient.

So tell me how do waves increase in size since it's made of little bullets (particles)... a wave gets bigger and bigger as far as it moves away from its original spot. So the number of particles of a wave increases? How?

Please tell me what you think about this.
You should note that Einstein's idea of light quanta as "little bullets" was superseded only 20 years after it was conjectured. Ironically, though Einstein got the Nobel prize particularly for his theory of "light quanta", it is the only one of his works that has been found to be explicitly wrong.

The modern notion of "photons" as particular states of the quantized electromagnetic field tells us that photons, which are described by a quantized massless field of spin one, do not even have a position observable in the usual sense, and thus are not "localizable" at all; and all this indeed builds on Einstein's work on the (special) theory of relativity! If you try to "localize" photons, i.e., the electromagnetic field, you can think of using a cavity made of ideally reflecting walls. That's pretty easily done: treating the electromagnetic field with the appropriate boundary conditions, you find that it's described by a set of independent harmonic oscillators, which are quantized in the way you may know from an introductory quantum-mechanics lecture. The corresponding "cavity photons" are excited states of the electromagnetic field referring to the eigenmodes of this field, determined by the boundary conditions of the cavity. They are not localized at all! If you take the energy density as a measure for detecting "a photon", you see that it's spread over the entire cavity. So the photon is in fact "minimally localized"!

The same holds for the concept of "wave-particle duality". There is no such thing in modern quantum theory. All you can know about a (massive) particle, which indeed has a position observable and which can to a certain extent be localized, is the probability for finding it at any given place, given the state of this particle. The probabilities are described by wave equations (at least in non-relativistic quantum mechanics) and thus show also properties of fields and wave solutions for such fields.

To make a long story short: The "old quantum theory" a la Einstein, de Broglie, and Bohr is outdated and substituted by the most successful theory of all times, "modern quantum theory", including relativistic quantum field theory.
 
  • Like
Likes dextercioby, sophiecentaur, hutchphd and 1 other person
  • #18
lordoftheselands said:
It works for short distances, but when I ask myself about a light source in a distant galaxy creating light waves... I'm not sure it would reach us. Seems impossible for the waves to cover an absurdly bigger area without any consequence. How can there be enough energy for this? You can't assume that the wave covers an increasingly bigger area without thinking about the energy necessary for this.
If you look directly at the Sun it will blind you. Meanwhile, distant galaxies are not visible to the naked eye.
 
  • Like
Likes Ibix
  • #19
lordoftheselands said:
How can there be enough energy for this?
Light does not "use up" any energy while travelling. It has unbounded range.

If this defies your understanding then the problem is not with light. It is with your understanding.
 
  • Like
Likes vela, Ibix and phinds
  • #20
@lordoftheselands , it looks like you've been asking this same question for at least a year now and have been getting the same answer. What exactly isn't satisfactory about it? Light propagates like a wave. When the wavelength is small compared to the optical elements it encounters, the wave properties are not so important and light can be treated as continuous rays that travel in straight lines (we call this the geometric optics limit). "Little bullets" is always wrong, and comes from a common misunderstanding of what photons are.
 
  • Like
Likes dextercioby, DrClaude, russ_watters and 1 other person
  • #21
lordoftheselands said:
Seems impossible for the waves to cover an absurdly bigger area without any consequence.
But there is a consequence. The light gets dimmer. If you consider the limit of very large distances the light can be so dim that you need a telescope to see it.
 
  • Like
Likes vanhees71, russ_watters and Ibix
  • #22
lordoftheselands said:
Is there enough energy in a light wave to make it spread forever? So it gets bigger and bigger, reaching all spots in the universe? It seems impossible because of the lack of energy for this. I don't understand why people do not question this.
Easy math tells us an inverse square function never crosses zero, so I don't see the problem.
 
  • Like
Likes vanhees71
  • #23
lordoftheselands said:
Is there enough energy in a light wave to make it spread forever? So it gets bigger and bigger, reaching all spots in the universe? It seems impossible because of the lack of energy for this. I don't understand why people do not question this.
You're correct in the theory of observation. However, how can you possibly measure all the energy reactions involved by double-slit-experiment standards?
 
  • #24
It continues forever as long as it's perceived
 
  • Skeptical
Likes davenn
  • #25
lordoftheselands said:
I don't understand why people do not question themselves about this.

Because we learn physics properly from textbooks, using all the math that is needed.
 
  • Like
Likes vanhees71
  • #26
Ibix said:
It's always the same amount of energy (assuming we're not getting to cosmological distances here), just spread out over more area. Classically this would just make the source dimmer and dimmer. In a quantum model (which you do need for a sufficiently faint source) you eventually start to see shot noise from the arrival time statistics of photons, and the received energy in any region fluctuates around the classically predicted average.
Are there stars which we can see only due to this fluctuation, and we shouldn't see according to the classical model?
 
  • #27
A.T. said:
Are there stars which we can see only due to this fluctuation, and we shouldn't see according to the classical model?
I don't know, but I doubt it. You don't just need to receive a photon to see a star - you need to integrate for a while to know that there's a source there, not a random photon in your detector. And once you're doing that you'd be integrating up to detection levels in a classical model too.

So I guess you might detect a photon from something you would not expect to be able to see classically. I don't think you'd ever be able to tell that you had done so. And given the complexity of the photon model it may not even make sense to talk about whether an individual photon detection is actually traceable to a given source - I'll defer to others on that.
 
  • #28
bdrobin519 said:
You're correct in the theory of observation. However how can you possibly measure all energy reactions relating through the double slit experiment standards?
You can't. When unobserved, the reach of the radiation is greatly increased. Expect that.
 
  • #29
Ibix said:
It's always the same amount of energy (assuming we're not getting to cosmological distances here), just spread out over more area. Classically this would just make the source dimmer and dimmer. In a quantum model (which you do need for a sufficiently faint source) you eventually start to see shot noise from the arrival time statistics of photons, and the received energy in any region fluctuates around the classically predicted average.
This is the key to understanding how light works as it spreads out, in my opinion. As those of us who do astrophotography know, light from very distant sources, like stars and galaxies, is so dim that you can open the shutter of your telescope's camera for 20 minutes and receive perhaps a few dozen to a few hundred photons from your target, or fewer.

The energy of a light wave is quantized into photons. A very, very bright source that is very close will deposit its energy in huge numbers of discrete events (photons), but they are still quantized interactions. This does not change as the source moves to greater and greater distances. The only thing that changes is the number of events per unit of time, which eventually falls to below one. In other words, the number of photons detected per second goes from many millions or billions (or more) per second for something like the Sun, down to fractions of a photon per second for distant light sources. Of course we don't detect fractional photons; we just receive one every minute on average, for example.

So no, a light wave never runs out of energy during its travel (unless all of its energy is deposited into objects by the time it reaches some distance). We just detect fewer and fewer photons per second as the source gets dimmer or the light wave travels farther.
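A back-of-the-envelope sketch of that photon-rate picture, using hypothetical numbers (green light at 550 nm and an assumed collected power of 6×10⁻²¹ W for a very faint source):

```python
h = 6.626e-34        # Planck constant, J*s
c = 3.0e8            # speed of light, m/s
wavelength = 550e-9  # green light, m

photon_energy = h * c / wavelength  # energy per photon, ~3.6e-19 J

collected_power = 6e-21  # assumed power collected by the telescope, W
rate = collected_power / photon_energy  # mean photons per second

print(f"energy per photon: {photon_energy:.2e} J")
print(f"mean arrival rate: {rate:.4f} photons/s, i.e. one photon every {1 / rate:.0f} s")
```

With these assumed numbers the detector receives roughly one photon per minute on average: the wave hasn't lost anything per photon, there are simply very few arrival events per unit time.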

A.T. said:
Are there stars which we can see only due to this fluctuation, and we shouldn't see according to the classical model?
The 'fluctuation' is nothing more than the random arrival of photons around some mean value, which happens regardless of the intensity or distance of the source. In fact, the fluctuations are actually larger for a brighter source. They just don't increase linearly with increasing source intensity, which is why brighter sources are easier to see in images.

And it's funny you should ask about the classical model. As far as I understand, in radio astronomy the energy of each photon is so low and the numbers so high for most sources that we can treat it classically anyway. There's still noise to deal with, but the incoming signal is virtually continuous, not discrete.

The only real difference between the two is that in the classical case you don't have to worry about shot noise, which is the random variation in the number of photons that arrive per unit of time. That is, if you take a picture and you expect to see 100 photons over the course of the exposure from your target, you can say that the signal is 100 but the shot noise is the square root of that signal, or 10, for a signal-to-noise ratio of 100/10 = 10. So your picture might contain 100 photons from that target, or it might have 90, or 110, or almost any other amount. If you were to measure the number of photons in a series of images the number would vary in each image, but with enough images they would average out to 100. A source with an expected signal of 1000 would have an expected noise of the square root of 1000, for a signal-to-noise ratio of 1000/31.6 ≈ 31.6. So an increase in the signal by 10x results in only a ~3.2x gain in noise, raising the SNR.

This random variation is one of the main reasons dim objects are so hard to detect. A slightly brighter pixel might be a star or it might be the shot noise of the background. You just don't know until you've taken enough images to 'beat the noise'. In classical physics the signal is continuous, so there is no shot noise. For a signal with sufficiently large numbers of photons the noise is so small relative to the signal that we can virtually ignore it.
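The square-root scaling described above can be sketched in a few lines (the signal levels here are the hypothetical ones from the example):

```python
import math

def shot_noise_snr(expected_photons):
    """Shot-noise-limited SNR of a photon count: N / sqrt(N), i.e. sqrt(N)."""
    return expected_photons / math.sqrt(expected_photons)

for n in [100, 1000, 10000]:
    print(f"signal {n:>6}: shot noise {math.sqrt(n):7.2f}, SNR {shot_noise_snr(n):6.1f}")
```

Each 10x increase in signal buys only a sqrt(10) ≈ 3.16x increase in noise, so the SNR grows as the square root of the signal, which is why stacking many exposures slowly "beats the noise".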
 
  • #30
bdrobin519 said:
You can't. When unobserved the reach of the radiation is greatly increased. Expect that.
It takes an expert experimentalist to detect the increased reach of unobserved radiation.
 
  • Like
Likes sophiecentaur, bdrobin519 and nasu
  • #31
jbriggs444 said:
It takes an expert experimentalist to detect the increased reach of unobserved radiation.
If you can even slightly believe it's possible, there just might exist a variable unaccounted for. Now how to measure it is the next question.
 
  • Skeptical
Likes sophiecentaur
  • #32
bdrobin519 said:
If you can even slightly believe it's possible; there just might exist a variable unaccounted for. Now how to measure it is the next question.
How to measure radiation without measuring it? That is, indeed, a question.
 
  • Haha
  • Like
Likes sophiecentaur, bdrobin519 and jbriggs444
  • #33
lordoftheselands said:
There is a high chance of you being right, but in High School teachers tell us to make exercices showing lines of little bullets reflecting on mirrors, is this wrong?
A late comment about this, but we need to be careful in the use of the word "wrong". Ray Optics assumes that those narrow beams have zero width and do not converge or diverge on themselves. If high school teachers covered themselves by qualifying everything they try to tell kids, then they'd never get to the end of any Physics course. Just see how wide this thread has become, just because PF is trying to tie up all the loose ends of this 'straightforward' topic.
Ray Optics works; end of.
 
  • Like
Likes dextercioby and Frabjous
  • #34
sophiecentaur said:
Ray Optics works
...within limits, of which you sometimes need to be aware. But yes - Asimov's The Relativity of Wrong should be required reading the first time any student encounters a science instructor introducing a better model.
 
  • #35
Ibix said:
...within limits, of which you sometimes need to be aware. But yes - Asimov's The Relativity of Wrong should be required reading the first time any student encounters a science instructor introducing a better model.
OK then. Ray Optics works for Ray Optics problems.??
 
  • Haha
  • Like
Likes DrClaude and vanhees71
