hairygary
Wouldn't the most important reason for time itself needing to "slow" stem from the assertion that the speed of light is the same in every reference frame, regardless of how fast one frame is moving relative to another? So if an observer were traveling at c/2 away from a light source, the light would still move at c relative to that observer, while also moving at c relative to a second observer at rest with respect to the source. At first this seems contradictory: the two frames measure different Δx between the same two events, and velocity = Δx/Δt. The only way the speed of light can come out the same in both frames is if Δt differs between them as well. I'm relatively new to the concept though, so there might be a mistake somewhere in my reasoning.
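
To make that last step concrete, here is the arithmetic I have in mind, assuming the standard Lorentz transformation for a frame moving at speed v along x (I'm just quoting the textbook formulas, so take this as a sketch rather than a derivation):

$$
\Delta x' = \gamma\,(\Delta x - v\,\Delta t),\qquad
\Delta t' = \gamma\!\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right),\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
$$

$$
\text{For a light pulse, } \Delta x = c\,\Delta t,\ \text{so}\quad
\frac{\Delta x'}{\Delta t'}
= \frac{\gamma\,(c\,\Delta t - v\,\Delta t)}{\gamma\,\bigl(\Delta t - v\,c\,\Delta t/c^{2}\bigr)}
= \frac{c - v}{1 - v/c}
= c,
$$

for any v, including v = c/2. So, if this is right, both Δx and Δt change between the frames, and they change in exactly the ratio needed to keep Δx/Δt equal to c.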