Xilor
Hello, I've been trying to internally conceptualize as much of physics as I can, and in doing so I realized that something I thought I understood at first no longer makes sense in my head. Since it doesn't, I'm afraid I might have misunderstood a lot of other concepts too, so I was hoping you folks could tell me where I'm making my mistake. What I don't understand is how the observed redshifts show that the expansion of the universe is accelerating rather than decelerating.
From what I gather, astronomers have looked at supernovas at very large distances, whose distances from us can be determined from their luminosity once it's known what kind of supernovas they are. The measured redshifts of these supernovas are compared to their distances, and from that it is seen that supernovas which are further away have a larger redshift than a linear expansion would predict. The further the supernova, the bigger the excess in redshift compared to linear expansion.
We see redshifts in all directions, and these redshifts are correlated with distance; some of the observed redshifts would imply that objects are moving away at superluminal speeds. For these reasons, the redshifts are considered not to be a Doppler-shift effect, but rather to be created by something happening in the space between the objects. The objects don't really move away; the distances to them just increase through expansion while the photons travel. These photons are redshifted during their travel because they are pulled apart by the expansion. Am I correct so far? If not, then that could already solve the confusion.
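If it helps, here is how I picture that mechanism in a tiny Python sketch (the numbers are invented; the scale-factor convention is just what I've read, namely that the stretching depends only on how much the universe expanded between emission and observation):

```python
# Toy illustration of my understanding (numbers are made up): the redshift a
# photon picks up depends only on how much the universe expanded between
# emission and observation, via the scale factor a(t):
#   1 + z = a(t_observed) / a(t_emitted)
a_emit = 0.5  # hypothetical scale factor when the light left the supernova
a_obs = 1.0   # scale factor today (conventionally set to 1)

z = a_obs / a_emit - 1
print(z)  # 1.0 -> the wavelength arrives stretched by a factor of 2
```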
In the case that that is correct (or correct enough) though, why would that show accelerated expansion?
These redshifts happen after the photon has left an object, so I'd think we should look at the photon rather than the object it was emitted from. What is happening to the photon during its travel?
With no expansion, a photon would arrive with no redshift.
With linear expansion, the photon will be continually and slowly redshifted at an equal rate throughout its travel. During the early part of its travel it will redshift just as much as in the later part.
With accelerated expansion, the photon should experience different rates of redshift throughout its travel. Accelerated expansion over time obviously means that there will be more expansion during the later part of its travel than during the early parts; that's what accelerated means. So initially the expansion would be closer to no expansion, and the photons won't be stretched as much. Near the end, however, once the expansion has increased, the stretching and the accompanying redshift should be greater.
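To make my picture of "more stretching later" concrete, here is a toy Python sketch (the expansion history a(t) = 1 + t² is invented purely for illustration) that tracks how much the wavelength is stretched in each equal time interval of the trip:

```python
# Toy sketch of the accumulation I'm describing (the expansion history below
# is invented purely for illustration). The photon's wavelength is stretched
# in proportion to the scale factor a(t), so the stretch picked up in each
# leg of the trip is the ratio of scale factors across that leg.

def per_leg_stretch(a, t_emit, t_obs, legs=4):
    """Wavelength stretch factor in each of `legs` equal time intervals."""
    ts = [t_emit + (t_obs - t_emit) * i / legs for i in range(legs + 1)]
    return [a(ts[i + 1]) / a(ts[i]) for i in range(legs)]

# A toy accelerating history: a(t) grows faster and faster.
a_accel = lambda t: 1.0 + t ** 2

stretches = per_leg_stretch(a_accel, 0.0, 1.0)
print(stretches)  # each successive leg stretches the photon more
z_total = a_accel(1.0) / a_accel(0.0) - 1.0
print(z_total)    # 1.0 -- the total redshift from the overall expansion
```

In this toy history the later legs do stretch the photon more than the earlier ones, which is exactly the intuition I'm describing; the product of all the per-leg stretches recovers the total factor of 2.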
So when observing this, shouldn't it be the other way around? The end part, i.e. the most recent stretch of the photon's journey, is the part with the strongest redshifting, and from the observer's point of view it is the closest region. So the closest region should show the highest expansion compared to linear expansion, and thus the strongest stretching of photons and redshift. A source of photons within this close region should have the highest redshift per distance traveled, and sources further away should have less redshift per distance traveled, because the photons from those further sources were traveling through space that wasn't expanding as much yet.
This seems like the complete opposite of the data to me: further supernovas show larger redshifts and closer ones show smaller redshifts. To me, the data seems to say that there has been a deceleration instead. Further and older sources have been redshifted more than linear redshift (calibrated with closer sources) would suggest, so the photons from them had to experience stronger redshift and expansion at the start. More expansion during early times, less expansion during later times = decelerating expansion.
So, I don't understand at all how I come to such a completely different conclusion, where is my logic going wrong?