etotheipi
Suppose light travels during a time interval of t2 - t1, where the scale factors at t1 and t2 are a(t1) and a(t2) respectively.
If we consider an infinitesimally small interval of time dt during this interval, without accounting for expansion we would expect light to travel a distance c dt. How do we adjust this quantity to account for the factor by which this distance has increased during this time interval?
The goal is to determine the proper distance traveled by the light between these two times, and I think this just amounts to integrating the adjusted version of the quantity c dt from t1 to t2, but I don't know how to find the expansion factor.
Please do let me know if I'm completely wrong, I'm still not at all comfortable with cosmological distances!
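The integration described above can be sketched numerically. This is only an illustration under an assumed scale factor: each step c dt taken at time t is stretched by a(t2)/a(t) by the time t2, so the proper distance at t2 is a(t2) · ∫ c dt/a(t). The matter-dominated form a(t) = (t/t0)^(2/3) used here is a hypothetical choice just to make the integral concrete, and the units (c = 1, t0 = 1) are arbitrary.

```python
import numpy as np

# Assumed scale factor for illustration: matter-dominated, a(t) = (t/t0)^(2/3).
def a(t, t0=1.0):
    return (t / t0) ** (2.0 / 3.0)

def proper_distance(t1, t2, c=1.0, n=200_000):
    """Proper distance, measured at time t2, covered by light emitted at t1.

    The stretch c dt traveled at time t has grown by a(t2)/a(t) by time t2,
    so we integrate c dt / a(t) (the comoving distance) and multiply by a(t2).
    """
    t = np.linspace(t1, t2, n)
    integrand = c / a(t)
    dt = t[1] - t[0]
    comoving = np.sum((integrand[:-1] + integrand[1:]) / 2.0) * dt  # trapezoid rule
    return a(t2) * comoving

# For this particular a(t) the integral has a closed form:
#   D = 3 c t2^(2/3) (t2^(1/3) - t1^(1/3))   (with t0 = 1)
t1, t2 = 0.1, 1.0
numeric = proper_distance(t1, t2)
analytic = 3.0 * t2 ** (2.0 / 3.0) * (t2 ** (1.0 / 3.0) - t1 ** (1.0 / 3.0))
print(numeric, analytic)
```

The numerical result should match the closed-form answer closely, which is a quick way to check that the stretching factor has been applied correctly.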