
Light Wavelength increases over a distance

  1. Mar 6, 2009 #1
    The wavelength of light increases over large distances right?

    It seems that there is no account for this when determining the redshift of a distant star.
     
  3. Mar 6, 2009 #2

    f95toli (Science Advisor, Gold Member)

    No, not if the light is traveling through a linear medium (e.g. vacuum).
     
  4. Mar 6, 2009 #3
    I seem to remember something about microwave radiation affecting the first satellite communications we tried to build, and it was found to be light from the birth of the universe whose wavelength had increased over the large distance it had travelled.
     
  5. Mar 6, 2009 #4
    Light wavelength increases over a large distance, DUE TO the redshift factor, or Hubble parameter.
    This increase is caused by the expansion of the space metric the light is travelling in.
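    The scaling described here can be sketched in Python (a minimal illustration, not from the thread; the hydrogen-alpha wavelength and the redshift value are example numbers):

    ```python
    # Cosmological redshift stretches wavelength in proportion to the
    # expansion of space: lambda_obs / lambda_emit = a_obs / a_emit = 1 + z.
    def observed_wavelength(lambda_emit_nm, z):
        """Wavelength observed after cosmological redshift z."""
        return lambda_emit_nm * (1.0 + z)

    # Hydrogen-alpha line emitted at 656.3 nm, seen at redshift z = 0.5:
    print(observed_wavelength(656.3, 0.5))  # stretched by a factor 1.5, to ~984 nm
    ```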
     
  6. Mar 6, 2009 #5

    So when we determine the speed of a distant star from its redshift, how can that be accurate? No wonder the further away a star is, the faster it "appears" to be going.
     
  7. Mar 6, 2009 #6
  8. Mar 6, 2009 #7
    The wavelength is determined by the energy of the light. So in the absence of external effects such as gravitational red shift, scattering etc, it's quite impossible for the wavelength to increase (conservation of energy).
     
  9. Mar 6, 2009 #8

    Since light emitted by stars comes from the same basic elements (hydrogen, helium, and oxygen), we know what the frequency and wavelength should be. Then we can calculate the speed with the formulas Naty1 mentions. The harder question, which you may not realize, is: how do you know how far away something is? If I have a light, you can't tell how far away I am from its intensity without knowing what its actual intensity is. There happens to be a particular body that burns with the same intrinsic brightness in almost all cases where it has been observed (the type Ia supernova). So it makes the perfect consistent light bulb to measure by. Then you can tell that the farther away a galaxy is, the faster it is going. But the farther away you go, the farther back in time you are looking.

    http://en.wikipedia.org/wiki/High-z_Supernova_Search_Team
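    The "standard candle" idea above is just the inverse-square law: if a source's intrinsic luminosity L is known, its measured flux F fixes the distance via d = sqrt(L / (4·pi·F)). A minimal sketch (the numbers in the example are illustrative, not from the thread):

    ```python
    import math

    def distance_from_flux(luminosity_w, flux_w_m2):
        """Distance (m) to a source of known luminosity from its measured flux,
        via the inverse-square law: F = L / (4 * pi * d**2)."""
        return math.sqrt(luminosity_w / (4 * math.pi * flux_w_m2))

    # A source emitting 4*pi watts, measured at a flux of 1 W/m^2, is 1 m away:
    print(distance_from_flux(4 * math.pi, 1.0))
    ```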
     
  10. Mar 6, 2009 #9
    xepma: No! Normally the frequency of light determines the energy. The wavelength then depends on the speed. For example, the wavelength is shorter in a medium like glass or water because of the reduced speed of light.
    You admit that gravitational redshift occurs, so that defeats your argument.
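    This is standard optics: in a medium of refractive index n, light slows to v = c/n while the frequency is unchanged, so the wavelength becomes lambda/n. A quick sketch (the sodium-line wavelength and index of water are example values):

    ```python
    # In a medium the speed drops to v = c/n, the frequency stays fixed,
    # so the wavelength shortens: lambda_medium = lambda_vacuum / n.
    def wavelength_in_medium(lambda_vacuum_nm, n):
        return lambda_vacuum_nm / n

    # 589 nm sodium light entering water (n ~ 1.33) shortens to ~443 nm:
    print(wavelength_in_medium(589.0, 1.33))
    ```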

    Hubble redshift gives us a rough figure for the distance of a star or galaxy because the amount of redshift is proportional to distance. The effect is very small and only shows at cosmological distances. Calcium absorption spectrum lines are often used to calibrate. For example, Virgo at 59 Mly shows recession at 1,200 km/s; Hydra at 3 Bly shows recession at 61,000 km/s.
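    The proportionality is Hubble's law, v = H0 · d. A quick check (a sketch using the figures quoted above, with 1 Mpc ≈ 3.262 Mly) shows both examples give roughly the same Hubble constant:

    ```python
    MLY_PER_MPC = 3.262  # million light-years per megaparsec

    def hubble_constant(v_km_s, d_mly):
        """H0 in km/s per Mpc implied by a recession speed and a distance."""
        return v_km_s / (d_mly / MLY_PER_MPC)

    print(hubble_constant(1_200, 59))      # Virgo:  ~66 km/s/Mpc
    print(hubble_constant(61_000, 3_000))  # Hydra:  ~66 km/s/Mpc
    ```

    Both data points landing near the same H0 is exactly what "redshift proportional to distance" means.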

    If you want a paper on "Redshift and Energy Conservation" look for Alasdair MacLeod on the arXiv website.
     