I have read several articles about this topic, on Wikipedia, but also for example this article: http://supernova.lbl.gov/PDFs/PhysicsTodayArticle.pdf, as well as several books that discuss the subject. Very briefly, my understanding is that:
- 1. When observing Type Ia supernovae (standard candles) at medium-range distances, they appear fainter than you would expect based on their redshift, because the expansion of the universe has been accelerating since the light we are observing left the supernovae (see the flux/distance relation sketched just after this list).
- 2. When observing Type Ia supernovae at very large distances, they appear brighter than you would expect based on their redshift, because the expansion of the universe was still decelerating when the light we are observing left the supernovae.
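For concreteness (this is my own paraphrase of the standard setup, not something taken from the article): "fainter than expected" can be quantified by comparing the observed flux $F$ of a standard candle of known intrinsic luminosity $L$ with the flux predicted from its redshift. The measured flux defines the luminosity distance $d_L$:

$$ F = \frac{L}{4\pi d_L^{2}} \quad\Longleftrightarrow\quad d_L = \sqrt{\frac{L}{4\pi F}}, $$

so a lower-than-expected flux at a given redshift translates directly into a larger-than-expected $d_L$.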
My question now is how such conclusions can be made from the redshift and brightness observations. As explained in the various texts, the redshift arises from the fact that the photons are stretched while on their way from the supernovae to Earth. I understand how this works, but if the photons are stretched because of the expansion, doesn't that also mean that (because of the same expansion) the distance between the photon and Earth becomes longer, is that right?
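To state the stretching picture as I understand it (this is the standard FLRW relation, so please correct me if I have it wrong): a photon's wavelength scales with the scale factor $a(t)$, so the redshift depends only on the ratio of the scale factor at observation to the scale factor at emission:

$$ 1+z \;=\; \frac{\lambda_{\text{obs}}}{\lambda_{\text{emit}}} \;=\; \frac{a(t_{\text{obs}})}{a(t_{\text{emit}})}. $$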
Now looking at point 1 above: the fact that the supernovae appear fainter than expected implies that they are more distant than expected (based on their redshift). However, if they are more distant than expected because the expansion accelerated, shouldn't this acceleration of the expansion have affected the redshift in exactly the same way, so that in the end the distance from the objects to Earth and the redshift of the photons traveling from the objects to Earth would be equally affected?
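In symbols, the implicit assumption behind my question (which may well be exactly where I go wrong) is that an acceleration-induced fractional change in the distance should come with the same fractional change in the redshift:

$$ \frac{\delta d}{d} \;\stackrel{?}{=}\; \frac{\delta(1+z)}{1+z}. $$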
I'm sure I'm missing something obvious, as in most texts I have read, points 1 and 2 above are presented as very logical conclusions based on the observations; I just don't see how, so please enlighten me :)