
Confirmation, please.

  1. Jan 10, 2006 #1
    Are stars far away the same age as stars closer, or are they younger or older?

    And wouldn't the speed of light make the planets farther away less than estimates say they are?
     
  3. Jan 10, 2006 #2
    Light from astronomical bodies just makes them appear younger than they are. Photons do not age on their way to us: traveling at the speed of light, they are frozen at the moment they were emitted from their source until the moment they impinge on our retina.
     
  4. Jan 10, 2006 #3

    DaveC426913

    Gold Member

    The age of stars can be assumed to be evenly distributed with respect to distance from us. There are lots of reasons why stars are clumped by age (such as stellar nurseries) but in general, there's no reason why a young star couldn't be right next door.

    I think you're talking about telescopes being a sort of time machine looking into the past. This is true. Stars that are 4 ly away are seen as they were 4 years ago. Stars that are 2 million ly away are seen as they were 2My ago.

    This fact has not escaped astronomers, and they take it into account.



    "And wouldn't the speed of light make the planets farther away less than estimates say they are?"

    Yes, our view of the planets is delayed by a few light-minutes to light-hours.

    All well-accounted for. Our "estimates" are pretty accurate. We know a LOT about where the planets are and where they're going: we can predict their positions years, decades, and to some degree even centuries into the past and future, and we know their present positions accurately enough to land spacecraft on them.
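    For a rough sense of the scale of those delays, here is a quick numerical sketch (the closest-approach distances below are approximate, illustrative values, not precise ephemeris data):

```python
# Rough light-travel delays from some planets to Earth near closest
# approach -- distances are approximate, just to show the scale.
C_KM_S = 299_792.458           # speed of light, km/s
AU_KM = 149_597_870.7          # one astronomical unit, km

# Approximate closest Earth-planet distances in AU (illustrative)
planets = {"Mars": 0.52, "Jupiter": 4.2, "Neptune": 29.1}

for name, dist_au in planets.items():
    delay_min = dist_au * AU_KM / C_KM_S / 60.0
    print(f"{name}: ~{delay_min:.0f} light-minutes")
```

    The delays come out to a few minutes for Mars and roughly four hours for Neptune, which is exactly the "light-minutes to light-hours" range mentioned above.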
     
    Last edited: Jan 10, 2006
  5. Jan 10, 2006 #4

    DaveC426913

    Gold Member

    With all due respect, this is an odd and rather misleading pseudo-explanation. It seems to suggest that the distance=peering-into-the-past is some sort of illusion, instead of a very real and fundamental part of our universe.
     
    Last edited: Jan 10, 2006
  6. Jan 10, 2006 #5
    If the universe is expanding and the recession speed relative to us is [tex]v=HR[/tex], where H is the Hubble constant (around 70 km/s per megaparsec, if I recall correctly),

    then apart from the fact that light takes time to reach us from distant objects, stars that are far from us seem to run slower because of their relative speed, and the ones at the edge (v approaching c) appear almost frozen in time as far as we're concerned...
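    As a quick numerical sketch of Hubble's law (assuming the standard value H0 ≈ 70 km/s per megaparsec):

```python
# Hubble's law v = H0 * d -- recession speed grows linearly with
# distance, using H0 ~ 70 km/s per megaparsec.
C_KM_S = 299_792.458  # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (approximate)

for d_mpc in (10, 100, 1000, 4000):
    v = H0 * d_mpc    # recession velocity in km/s
    print(f"{d_mpc:5d} Mpc -> v = {v:8.0f} km/s ({v / C_KM_S:.3f} c)")
```

    Only at a few thousand megaparsecs does the naive v = H0·d approach c, which is also where the simple linear law (and special-relativistic intuition about "frozen" clocks) stops being a good description.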
     
  7. Jan 10, 2006 #6

    DaveC426913

    Gold Member

    I don't think that relativistic time dilation plays much of a part in this.

    The time-dilation factor remains effectively negligible until you get within a few percent of the speed of light. This means that, in the 13.7 Gly to the edge of the observable universe, only the last few 100 Mly or so (i.e. < the skin on a universe-sized apple) would exhibit any observable time dilation at all.
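    A quick sketch of the Lorentz factor γ = 1/√(1 − v²/c²) bears this out:

```python
import math

# Time-dilation factor gamma = 1 / sqrt(1 - (v/c)^2) at various
# speeds, showing how little dilation there is at low v/c.
for beta in (0.01, 0.05, 0.1, 0.5, 0.9, 0.99):
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    print(f"v = {beta:.2f} c -> gamma = {gamma:.5f}")
```

    At a few percent of c the factor differs from 1 by well under a tenth of a percent; it only becomes dramatic as v approaches c.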
     
  8. Jan 11, 2006 #7

    Chronos

    Science Advisor
    Gold Member
    2015 Award

    The Hubble flow is negligible until you reach megaparsec distance scales, at which point it is extremely difficult to resolve individual stars. On the other hand, every photon we observe is a time capsule that represents its personal surface of last scattering.
     
  9. Jan 11, 2006 #8
    Okay, so here's a simple idea that I'm pretty sure is wrong, but who knows?


    Anyway,

    Since stars farther away are seen moving faster than stars closer, and those stars are seen as younger, does that mean that as time passes the expansion slows, answering the question of what's happening to the universe?

    Please tell me how I'm wrong in this conjecture, for I am almost sure I am.
     
  10. Jan 12, 2006 #9

    SpaceTiger

    Staff Emeritus
    Science Advisor
    Gold Member

    Well, it depends on the scales you're talking about. The average age of the stellar populations tends to decrease with distance above the galactic plane. Within a bubble of 100 pc, the ages should be fairly evenly (though not smoothly) distributed. In a bubble of a kiloparsec, there would definitely be a noticeable gradient. On still larger scales, you'll see gradients in the radial direction as well.
     
  11. Jan 12, 2006 #10

    SpaceTiger

    Staff Emeritus
    Science Advisor
    Gold Member

    I'll assume you're talking about cosmological scales, on which we observe galaxies and quasars instead of individual stars. It turns out that more distant objects would appear to be moving more quickly (and appear younger) regardless of whether the universe were accelerating or decelerating.

    Think of it this way. The apparent speed of the receding galaxies is measured by the redshift of their light. The amount that light becomes redshifted is related to the amount by which the universe has expanded during its journey to us. Light coming from more distant objects will always have to travel further to reach us, so the universe will always have expanded more during its trip. Thus, light from more distant objects will always appear more redshifted and we'll always infer their recession velocities to be higher, as long as the universe is expanding. If the universe were contracting, then the opposite would be the case.
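    That relation can be sketched numerically: 1 + z is just the ratio of the universe's scale factor at observation to its scale factor at emission (taking today's scale factor as 1):

```python
# Cosmological redshift: 1 + z equals the factor by which the
# universe has expanded while the light was in flight.
def redshift(a_emit, a_obs=1.0):
    """Redshift of light emitted at scale factor a_emit and
    observed at scale factor a_obs."""
    return a_obs / a_emit - 1.0

# Light emitted when the universe was half its present size:
print(redshift(0.5))   # -> z = 1.0
# Light emitted earlier, when it was a quarter of its present size:
print(redshift(0.25))  # -> z = 3.0
```

    Earlier emission means more expansion during the trip, hence higher redshift, regardless of whether the expansion is speeding up or slowing down.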
     
  12. Jan 13, 2006 #11
    ^That's the kind of perfect explanation I love to see!

    Thank you! ^_^'
     