anarine24
I have a question regarding time dilation: An astronaut travels at a speed of 7800 m/s relative to the Earth. According to a clock on Earth, the trip lasts 15 days. Determine the difference (in seconds) between the time recorded by the Earth clock and the astronaut's clock.
Now I took the formula t = t0/sqrt(1 - v^2/c^2), with v = 7800 m/s and t = 15 days, i.e. about 1.3E6 s. I plugged those in (with c = 3E8 m/s) to solve for t0. Since the question asks for the difference between the two times, I computed t - t0 and got 0 s, because my t0 came out identical to t. The problem states that the answer is 4.4E-4 seconds, but I'm not seeing how. I know the time measured on Earth (t) should be larger than the time measured by the astronaut (t0), so there must be some difference between the times.
Anyway, I'm stuck and it's probably easy but I just can't see it right now. Any help would be greatly appreciated!
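For what it's worth, here is a quick numerical sketch of the arithmetic (Python, using the same rounded c = 3E8 m/s as above). Because v/c is tiny, sqrt(1 - v^2/c^2) is extremely close to 1, which is presumably why a calculator with limited digits reports t - t0 = 0; the binomial approximation Δt ≈ t·v²/(2c²) avoids the near-cancellation entirely:

```python
import math

c = 3.0e8              # speed of light, m/s (rounded, as in the problem)
v = 7800.0             # astronaut's speed relative to Earth, m/s
t = 15 * 24 * 3600.0   # 15 days expressed in seconds (1.296e6 s)

beta2 = (v / c) ** 2   # v^2/c^2, about 6.8e-10 -- very small

# Direct subtraction: works in double precision, but a low-precision
# calculator rounds sqrt(1 - beta2) to 1 and returns 0.
t0 = t * math.sqrt(1 - beta2)
direct = t - t0

# Binomial approximation: sqrt(1 - x) ~ 1 - x/2 for small x,
# so t - t0 ~ t * v^2 / (2 c^2), with no catastrophic cancellation.
approx = t * beta2 / 2

print(direct, approx)  # both come out around 4.4e-4 s
```

The two methods agree, and the approximation reproduces the quoted answer of 4.4E-4 s.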