patapat
I was reading about time dilation. Say we have an inertial frame S that moves with velocity v in the x-direction with respect to an inertial frame S'. In S' we fire a light pulse at a mirror and measure the time from the original flash until the pulse returns to its origin, giving us \Delta t' = 2D/c. They said that since an observer in inertial frame S would measure these events to take longer than they do in S', an observer in S can conclude that time passes more slowly in S'. Is this correct?

When I think about it, if the round trip took 7 seconds in S and 5 seconds in S', wouldn't time run "slow" in S, since the same event took longer there than it did in S'? (I've written out the algebra below.) Clarification needed, thanks.
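For reference, here is the light-clock algebra as I understand it (assuming, as in the usual setup, that the mirror sits a perpendicular distance D from the source, at rest in S'). In S' the light goes straight up and back, so \Delta t' = 2D/c. In S the clock moves at speed v, so each leg of the light's path is a diagonal:

c \, \frac{\Delta t}{2} = \sqrt{D^2 + \left(\frac{v \, \Delta t}{2}\right)^2} \quad \Rightarrow \quad \Delta t = \frac{2D/c}{\sqrt{1 - v^2/c^2}} = \gamma \, \Delta t' \geq \Delta t'

Taking my 7 s / 5 s numbers at face value, \gamma = 7/5, which would correspond to v = c\sqrt{1 - 1/\gamma^2} = c\sqrt{24}/7 \approx 0.70c.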
-Pat