Energy degradation of light

  • #1

Main Question or Discussion Point

hi,
since light is energy moving through the fabric of space time at velocity c , it should lose energy due to gravitational waves. doesn't this mean that the light from a distant galaxy would keep decreasing in frequency? further more this would imply that light that was originally emitted as blue light would be observed by the observer as (say) red assuming the observer and galaxy are at relative rest.
 

Answers and Replies

  • #2
Since light is energy moving through the fabric of spacetime at velocity c, it should lose energy due to gravitational waves.
Gravitational radiation is quadrupole. An object moving at constant speed does not emit gravitational waves.
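As a sketch of why (my own addition, standard leading-order GR, not from the post itself): the gravitational-wave luminosity is governed by the quadrupole formula, and for a source in uniform motion the relevant time derivative vanishes:

```latex
% Quadrupole formula for gravitational-wave luminosity (leading order):
P = \frac{G}{5c^5}\,\left\langle \dddot{Q}_{ij}\,\dddot{Q}_{ij} \right\rangle
% For a point mass m in uniform motion, x_i(t) = x_i(0) + v_i t, the
% trace-free quadrupole moment
Q_{ij}(t) = m\left( x_i x_j - \tfrac{1}{3}\,\delta_{ij}\,|\mathbf{x}|^2 \right)
% is quadratic in t, so its third time derivative vanishes,
% \dddot{Q}_{ij} = 0, and hence P = 0.
```

So a freely propagating wave packet at constant velocity has no time-varying quadrupole moment and radiates nothing at this order.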
 
  • #3
Matterwave
Science Advisor
Gold Member
Light does get redshifted due to the expansion of the universe, however.
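A minimal sketch (my own illustration, not from the thread) of how cosmological redshift answers the OP's blue-to-red question: expansion stretches wavelength by a factor (1 + z), so frequency falls by the same factor. The 450 nm / 650 nm values below are rough, assumed stand-ins for "blue" and "red" light.

```python
# Cosmological redshift: lambda_obs = (1 + z) * lambda_emit,
# equivalently nu_obs = nu_emit / (1 + z).

def redshifted_wavelength(lambda_emit_nm: float, z: float) -> float:
    """Observed wavelength after cosmological redshift z."""
    return (1.0 + z) * lambda_emit_nm

def redshift_for(lambda_emit_nm: float, lambda_obs_nm: float) -> float:
    """Redshift z needed to stretch lambda_emit to lambda_obs."""
    return lambda_obs_nm / lambda_emit_nm - 1.0

blue_nm = 450.0  # roughly blue light (assumed illustrative value)
red_nm = 650.0   # roughly red light (assumed illustrative value)
z = redshift_for(blue_nm, red_nm)
print(f"z = {z:.2f}")  # blue light from a source at this redshift arrives red
```

So light emitted blue at z ≈ 0.44 or beyond would indeed be observed as red, but due to expansion, not gravitational-wave emission.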
 
  • #4
While the OP's premise, that straight-line photon propagation gives gravitational wave emission, is not per se correct, there is something else here that surely needs a good answer. The expected production of HFGWs (high-frequency gravitational waves) owing to the thermal motion of matter within, say, the Sun, white dwarf stars, neutron stars, etc. has been calculated, e.g. http://arxiv.org/abs/0708.3343.
I cannot find a single reference to any contribution from radiation (photons), which seems strange because in large hot stars, for instance, thermal radiation pressure dominates. A simple appeal to relativistic beaming, by analogy with Bremsstrahlung radiation (http://en.wikipedia.org/wiki/Bremsstrahlung), would suggest an infinitely large contribution to HFGWs from photon emission/absorption/scattering events within stellar cores. This doesn't happen. So does this have something to say about the applicability of treating photons as point particles that propagate through space at c? One school of thought in QED views them as such.
 