How does the speed of light affect expansion acceleration?

  1. Nov 16, 2017 #1
    I may have a fundamental misunderstanding of the concept, but I was wondering: how does the accelerating expansion of the universe account for the time dilation in light travel?

    From my understanding, we know that the universe's expansion is accelerating because the farthest galaxies that we can see are all redshifted. However, I also know that, due to the speed of light, any light we see from distant galaxies shows us a representation of what that galaxy looked like in the past. If the galaxies we see redshifted are far enough away, and closer to the time of the singularity, couldn't the redshift we see be left over from the initial expansion rather than something still happening now? If a galaxy was accelerating away from us several billion years ago, how do we know that it is still accelerating today? Is there an "escape velocity" within a redshifting galaxy?
     
  3. Nov 16, 2017 #2

    kimbyd

    Science Advisor
    Gold Member

    It's difficult for me to parse what you're saying here, but I think the answer is that there's no time dilation, at least not in the way you're thinking.

    As light is redshifted, it takes longer for the troughs and subsequent peaks of the incoming electromagnetic wave to arrive, which means that the image of the galaxy we observe is slowed down. This isn't the same effect as time dilation, however.

    It is true that we are only seeing a slice in time of our universe, where looking further away is looking further into the past. We try to make sense of this universe through a particular model, one that states that the universe is uniform and is expanding uniformly. This model is fully consistent with the slice of the universe we can see. When we apply this model to the portion of the universe we can observe, we find that all galaxies that aren't gravitationally bound to us have, for billions of years, been accelerating away from us (in the early universe, they were decelerating).
     
  4. Nov 16, 2017 #3
    Hello kimbyd, thank you very much for your reply, especially so quickly. I apologize if I'm not presenting my question very clearly; I'm fairly new to these fields of study and trying to learn as much as I can. Reading your response, I don't believe "time dilation" was the correct term for me to use.

    I think the core of my question ties to your last statement:

    How do we know that they were decelerating? Is there noticeably less redshift on the farthest/earliest galaxies? (Note: I'm assuming that the earliest galaxies are the farthest ones away, because their light would have taken the longest to reach us, meaning that we are seeing them at an earlier time than galaxies closer to us. Would this assumption be correct?)
     
  5. Nov 16, 2017 #4

    kimbyd

    Science Advisor
    Gold Member

    It's not that there is less redshift. Average* redshift increases with distance, always. What matters is precisely how that redshift increases with distance.

    What the redshift tells us is how much the universe has expanded since a photon was emitted. First, redshift is defined by ##\lambda_o = (z+1)\lambda_e##, where ##\lambda_e## is the wavelength at emission and ##\lambda_o## is the wavelength at observation. The ##z+1## is important: it means that at ##z=1##, the observed wavelength will be twice the emitted wavelength.

    In terms of the cosmological redshift, this means that the distances in the universe have doubled between the time that photon was emitted and the time it was observed.
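
    To make that concrete, here is a minimal Python sketch (the redshift value and variable names are mine, purely for illustration):

```python
# Cosmological redshift: 1 + z is the factor by which distances in the
# universe grew between emission and observation of the light.
lambda_emitted = 656.3   # nm; e.g. the H-alpha rest wavelength
z = 1.0                  # measured redshift

lambda_observed = (z + 1) * lambda_emitted   # wavelength stretches with space
expansion_factor = lambda_observed / lambda_emitted

print(f"observed wavelength: {lambda_observed:.1f} nm")          # 1312.6 nm
print(f"distances grew by a factor of {expansion_factor:.1f}")   # 2.0
```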

    So what cosmologists do is compare the distance to the object in question (measured via multiple different means) to how much expansion there has been since the photons we see were emitted (measured via redshift). Comparing these two parameters across a wide range of redshifts (e.g. by observing many supernovae at different distances) lets them determine how rapidly the universe has expanded at different points in time. Really far away, there's deceleration. More recently there's acceleration.
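
    Here's a rough sketch of that comparison, assuming flat FRW models with an illustrative H0 = 70 km/s/Mpc (a toy version of the real analysis, which fits many more parameters; the function name is mine):

```python
import numpy as np
from scipy.integrate import quad

C = 299792.458   # speed of light, km/s
H0 = 70.0        # Hubble constant, km/s/Mpc (illustrative value)

def luminosity_distance(z, omega_m, omega_lambda):
    """d_L in Mpc for a flat FRW universe: d_L = (1+z) * c * int_0^z dz'/H(z')."""
    integrand = lambda zp: 1.0 / (H0 * np.sqrt(omega_m * (1 + zp)**3 + omega_lambda))
    comoving, _ = quad(integrand, 0.0, z)
    return (1 + z) * C * comoving

# Two expansion histories with the same H0: matter-only (always decelerating)
# vs. Lambda-CDM (decelerating early, accelerating recently).
for z in (0.1, 0.5, 1.0):
    d_dec = luminosity_distance(z, omega_m=1.0, omega_lambda=0.0)
    d_acc = luminosity_distance(z, omega_m=0.3, omega_lambda=0.7)
    print(f"z={z}: matter-only d_L = {d_dec:6.0f} Mpc, LCDM d_L = {d_acc:6.0f} Mpc")
```

    At a given redshift, a supernova sits at a larger luminosity distance (looks dimmer) in the accelerating model; that extra dimming at moderate redshift is essentially what the supernova surveys detected.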

    * Average because individual galaxies do move relative to the overall expansion. But on average galaxies are moving away from one another.
     
  6. Nov 17, 2017 #5

    Grinkle

    Gold Member

    @Joe Fatuch Because we can model a supernova, we know the wavelength that photons emitted by a supernova will have at their time of emission. That may be well understood by you, but I recall that for me it was a key piece in understanding how we measure expansion rates / distances.

    https://en.wikipedia.org/wiki/Cosmic_distance_ladder#Galactic_distance_indicators
     
  7. Nov 17, 2017 #6

    kimbyd

    Science Advisor
    Gold Member

    That isn't accurate.

    The redshift is determined by measuring emission lines. Different atoms emit light at very specific wavelengths, enough to precisely determine the amount of redshift.

    The distance, for supernovae, stems from their brightness. There is a subclass of supernovae, Type Ia (type one-a), which all seem to be roughly the same brightness. There are some differences from event to event, but they're all really close, and some of the differences can be corrected for (e.g. brighter supernovae last longer, so by measuring how long one lasts we can partially correct the brightness).
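
    In code, the standard-candle step looks roughly like this (a sketch; M ≈ -19.3 is an approximate peak absolute magnitude for Type Ia supernovae, and the observed magnitude is made up):

```python
M_IA = -19.3  # approximate peak absolute magnitude of a Type Ia supernova

def distance_parsecs(m, M=M_IA):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10.0 ** ((m - M + 5.0) / 5.0)

d_pc = distance_parsecs(24.0)   # a supernova observed to peak at m = 24
print(f"luminosity distance ~ {d_pc / 1e6:.0f} Mpc")   # ~4600 Mpc
```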
     
  8. Nov 17, 2017 #7
    @kimbyd I think I see what you're saying. By measuring the brightness of a Type Ia supernova, they can determine the distance of a galaxy and compare its distance to another galaxy with a similar event. This allows them to measure the redshift from both galaxies and compare the results. (I hope I'm on the right path here.) With these measurements, they are finding that closer, and therefore more recently observed, galaxies are presenting a higher redshift (or acceleration) than the distant galaxies that we are seeing further back in time (which are decelerating).

    I hope that sums it up correctly in layperson's terms.
     
  9. Nov 17, 2017 #8

    kimbyd

    Science Advisor
    Gold Member

    That's still a bit off.

    The redshift and distance are two completely different measurements.

    Redshift is measured by looking carefully at the spectrum of light coming in, and identifying specific features of the spectrum which uniquely determine its redshift.
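
    A minimal sketch of that measurement, using the hydrogen H-alpha line as the example feature (the observed wavelength and the function name are mine, for illustration):

```python
H_ALPHA_REST = 656.281  # nm: laboratory (rest) wavelength of hydrogen H-alpha

def redshift(observed_nm, rest_nm=H_ALPHA_REST):
    """z from the shift of a known spectral line."""
    return observed_nm / rest_nm - 1.0

z = redshift(984.4)     # H-alpha observed at 984.4 nm
print(f"z = {z:.2f}")   # ~0.50, i.e. distances grew by a factor of ~1.5
```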

    The distance for supernovae is determined by measuring how bright the object appears in the telescope, and comparing that to the brightness of such events at the source (most Type-Ia supernovae are very close in brightness to one another).

    The redshift measures how much the universe has expanded since the supernova occurred. A specific model of the expansion of the universe predicts a relationship between distance and redshift. Observing many supernovae allows us to see how well a model fits the data. Models with decelerating expansion in the early universe and accelerating expansion in the late universe fit the data best.
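
    As a cartoon of that model comparison (the "data" below are synthetic, generated from an accelerating model with made-up scatter, purely to illustrate the fitting step; all parameter values are illustrative):

```python
import numpy as np
from scipy.integrate import quad

C, H0 = 299792.458, 70.0  # km/s and km/s/Mpc (H0 illustrative)

def mu(z, om, ol):
    """Distance modulus m - M predicted by a flat FRW model."""
    d_c = C * quad(lambda zp: 1.0 / (H0 * np.sqrt(om * (1 + zp)**3 + ol)), 0.0, z)[0]
    return 5.0 * np.log10((1 + z) * d_c * 1e6 / 10.0)  # d_L in pc, over 10 pc

# Synthetic "observations": an accelerating universe plus 0.05 mag of scatter.
rng = np.random.default_rng(0)
zs = np.linspace(0.05, 1.0, 20)
data = np.array([mu(z, 0.3, 0.7) for z in zs]) + rng.normal(0.0, 0.05, zs.size)

for name, om, ol in [("decelerating (matter only)", 1.0, 0.0),
                     ("accelerating (Lambda-CDM) ", 0.3, 0.7)]:
    chi2 = np.sum((data - np.array([mu(z, om, ol) for z in zs]))**2 / 0.05**2)
    print(f"{name}: chi^2 = {chi2:7.0f}")  # the accelerating model fits far better
```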
     
    Last edited: Nov 20, 2017
  10. Nov 20, 2017 #9

    Bandersnatch

    Science Advisor
    Gold Member

    I think @Joe Fatuch in his last post is still repeating the same misconception, though: that light from closer objects in a monotonically expanding universe can have a higher cosmological redshift than light from farther ones (and that that's what points to recent acceleration).
     
  11. Nov 25, 2017 #10
    I think that you are getting to the core of my question. Is there something about reading redshift that doesn't have to account for relativity? For example, as I understand it, if I see a galaxy 13 billion light-years away, I'm seeing the light that left that galaxy 13 billion years ago; therefore I am seeing how that galaxy was ~13 billion years ago, not how it currently is. If light from a galaxy 13 billion years ago presents higher redshift/acceleration values than light from a galaxy that is 10 billion light-years away, wouldn't that mean that galaxies closer to our relative time and position are slowing rather than accelerating? I would assume that the redshift would increase as the distance/time went down, showing more recent acceleration, until ~65 million light-years, when the gravitational force begins pulling in on our local cluster.
     
  12. Nov 25, 2017 #11

    PeterDonis

    2016 Award

    Staff: Mentor

    No, you're not. You would be in flat spacetime, but the spacetime of the universe isn't flat. Cosmologists don't help any with this confusion because they quote distances and light travel times without bothering to tell you that there is no such simple connection between them.

    It is true that you are not seeing the galaxy as it is "now"; you are seeing it as it was when the light was emitted. But the correlation between how long ago the light was emitted and the galaxy's distance is not as simple as you are assuming.

    No. The redshift doesn't mean what you think it means. It doesn't tell you how fast the galaxy is receding (or was when the light was emitted). It tells you by what factor the universe expanded while the light was traveling. So, for example, if we see a galaxy with a redshift of ##z = 1##, we know the universe has expanded by a factor of ##2## (##z + 1##) since that galaxy emitted the light we are seeing now.

    To know the distance to a galaxy, you don't look at the redshift; you look at other measurements, the brightness of the galaxy and its angular size on the sky. To see whether the expansion of the universe is accelerating or decelerating, you need to look at the relationship between all three of these measurements (redshift, brightness, and angular size); that relationship will be different for different "expansion histories" of the universe. But the redshift alone doesn't tell you that relationship.
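
    For what it's worth, in the standard FRW framework those observables are linked through the model's distance measures; here is a sketch, assuming a flat model with illustrative parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7; the function name is mine):

```python
import numpy as np
from scipy.integrate import quad

C, H0 = 299792.458, 70.0  # km/s and km/s/Mpc (illustrative H0)

def comoving_distance(z, omega_m=0.3, omega_lambda=0.7):
    """Line-of-sight comoving distance in Mpc for a flat FRW model."""
    f = lambda zp: 1.0 / (H0 * np.sqrt(omega_m * (1 + zp)**3 + omega_lambda))
    return C * quad(f, 0.0, z)[0]

z = 1.0
d_c = comoving_distance(z)
d_l = (1 + z) * d_c   # luminosity distance: sets how bright the object appears
d_a = d_c / (1 + z)   # angular-diameter distance: sets its angular size

print(f"z={z}: d_C = {d_c:.0f} Mpc, d_L = {d_l:.0f} Mpc, d_A = {d_a:.0f} Mpc")
```

    The expansion history fixes how brightness and angular size depend on redshift; checking many objects against those predictions is what discriminates between expansion histories.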
     
  13. Nov 25, 2017 #12

    Bandersnatch

    Science Advisor
    Gold Member

    @Joe Fatuch if it's not clear from Peter's and kimbyd's posts, the correct way to think of redshift is as the total amount of expansion experienced by the light as it travels through the expanding universe.

    Imagine light emitted from some faraway galaxy. It begins to travel towards the observer (us) through the expanding universe. As it does so, it experiences the expansion and accumulates redshift as it goes.

    At some point in time, it passes by some other galaxy, closer to us, which also emits some light. Now you have the two light beams travelling together, but the older beam has already accumulated some redshift by the time it meets the new one. The two, travelling together, will accumulate the same additional amount of redshift. For the new (closer, later) one, that additional redshift will be all it ever accumulates. But the older one adds the redshift it had already accumulated beforehand to the new, additional redshift on this later leg of the journey to the observer.
    Hence the farther galaxies always have higher redshifts.
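
    A toy numerical version of this (the per-step expansion factors are made up; only the multiplicative accumulation matters):

```python
# Toy model: redshift accumulates along the light's path.  Step the universe
# through discrete expansion intervals; a photon is stretched by every
# interval it is "in flight" for, regardless of what its source does later.
expansion_steps = [1.10, 1.08, 1.05, 1.04, 1.03]  # made-up per-step factors

def accumulated_redshift(steps_in_flight):
    stretch = 1.0
    for factor in steps_in_flight:
        stretch *= factor       # wavelength multiplies by each step's factor
    return stretch - 1.0

z_far = accumulated_redshift(expansion_steps)        # emitted before step 1
z_near = accumulated_redshift(expansion_steps[-2:])  # emitted two steps ago

print(f"far galaxy:  z = {z_far:.3f}")    # rode through every expansion step
print(f"near galaxy: z = {z_near:.3f}")   # only the last two steps
```

    As long as every step's factor is at least 1 (the universe never contracts), the photon emitted earlier always ends up with the higher accumulated redshift.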

    Or, to use an analogy:
    Let's say you have a bank, in which savings accounts are being opened every now and then (light is emitted from galaxies at various distances = at different times). Each new account must have exactly the same initial deposit (=~ spectral lines of emitted light are always the same), and no additional money is ever paid into any individual account. The bank pays interest every month, at the current monthly rate, which makes the savings grow over time (=~ light accumulates redshift as it travels over time).
    If an accountant looks at the total accumulated savings in each account today (total observed redshift of light from galaxies observed today), they will notice that older accounts always have more savings (=~ galaxies have more redshift) than newer ones.

    The accountant can say with certainty that, as long as all accounts opened later have less money in them (=~ galaxies observed in a later state/which emitted their light closer to us have less redshift), the bank has never deducted anything from the savings and has always been paying at least some interest (=~ as long as the redshifts grow with distance, the universe has always been expanding, and never contracting).

    If the accountant wants to determine how the monthly interest rate varied over time (=~ how the expansion varied over time), the curve of savings vs time must be fitted to some model. E.g. if the bank paid huge interest early on but reduced the rate later, then the accounts opened earlier must have much more money in them than you'd expect if the rate were constant (=~ if the universe was expanding much faster early on, before decelerating over time, then light emitted from galaxies early in the history of the universe = far away must have much higher redshift than would be expected if the expansion were steady).

    Similarly with fitting the curve to accelerated expansion (a progressively rising interest rate) in the latter epoch of the history of the universe.


    In other words, you know the redshift of all galaxies further back in time and space must be higher, but by figuring out how exactly it changes with distance/time you can find out whether there has been any acceleration/deceleration.
     
    Last edited: Nov 25, 2017
  14. Nov 25, 2017 #13
    Fairly certain this is where I'm getting confused.

    So would it be correct to say that the redshift isn't set when the light is emitted, but rather develops as a "stretch" of the light that increases as it travels through space, equal to the distance that its emitting galaxy recedes?

    Therefore, the farther away a galaxy is (measured separately by the angular size and supernova brightness), the more "stretch" that has had time to accumulate on the light. However, we only see the increased "stretch" because the galaxy has moved away in the time that the light took to reach us. For example, if the galaxy had stayed stationary, there would be no (or minimal) shift visible, and if it had moved towards us the light would have shifted into the blue spectrum. But only because the original galaxy shifted position after the time the light was emitted. So, as we look at farther away galaxies, we are seeing the shift amount double per Mpc, showing that the farther away a galaxy is, the faster and more extreme its shift is becoming.

    Is that any closer to a correct understanding? (Hope I'm not driving you guys crazy with this one)
     
  15. Nov 25, 2017 #14

    jbriggs444

    Science Advisor

    Any motion of the emitting galaxy after it has emitted the light is irrelevant.
     
  16. Nov 25, 2017 #15

    Bandersnatch

    Science Advisor
    Gold Member

    Yes!
    Here, you're not quite right, and somewhat contradict the previous statements.
    It doesn't matter what any individual galaxy does after it emits its light. We receive the light it emitted some time ago, and the redshift in this light is fully accounted for by what has been happening, expansion-wise, as the light was travelling, locally to wherever the light beam was at any given time (i.e. you integrate the local expansion over the whole light travel distance). It's all about what the light itself has experienced. The observed light doesn't by itself tell you anything about what the galaxy it came from looks like, or how it is receding, NOW, or during any of the time between the emission and reception of its light.

    In principle, the galaxy could have reversed its recession once it had emitted the light, and as long as there was expansion going on throughout the space the light itself was travelling, we'd still see redshift.

    You can infer the current state of that galaxy by analysing light from many sources and fitting redshift curves to a model of expansion, which then tells you where the galaxy should be.
    If the redshift data over the whole range of distances/times indicates that the universe expands uniformly at any given time, then one can infer that the galaxy whose light we observed has indeed receded farther away since emission (but that's not what causes the redshift; i.e. it's neither the recession speed at emission nor at reception, but the integrated expansion experienced by the light).
     
  17. Nov 25, 2017 #16
    True, but that would be very contrary to any feasible model of expansion, and it would need at least a century of observation to gather meaningful data.
    Galaxies arbitrarily changing their position is not on the table, unless galaxies can sometimes behave like quantum objects.
     