This is a pretty classical problem. A plane goes from city A to city B. The distance between the two cities (measured on the ground) is 500 km. The plane is travelling at 0.2c. How long does the trip take for the pilot and what is the distance between the two cities for the pilot?
I know that
[tex]L = L_0/\gamma[/tex]
[tex]\Delta T = \gamma \Delta T_0[/tex]
The Attempt at a Solution
The proper distance is 500 km, so I can work out that the pilot measures a contracted distance of approximately 489.9 km. My problem is with the time. I know that I can divide 489.9 km by 0.2c to get the time for the pilot: 0.00817 seconds. This is the correct answer. But why is it wrong for me to divide 500 km by 0.2c (= 0.00833 s), call that the proper time, and then multiply it by gamma (which gives 0.008505 s)? I noticed that if I instead treat the time on the ground as the "moving" time and the time for the pilot as the proper time, the formula gives the correct answer. But that doesn't make sense to me, since I started by taking the ground as the rest frame. Why the switch? My guess would be that it has something to do with needing two clocks on the ground but only one in the plane, but I still don't really understand the asymmetry.
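For reference, here is a quick numerical check of the values above (just a sketch, assuming c = 3×10⁸ m/s; the variable names are my own):

```python
import math

c = 3.0e8    # speed of light in m/s (approximate value assumed here)
v = 0.2 * c  # plane's speed
L0 = 500e3   # proper distance between the cities (ground frame), in m

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)  # Lorentz factor, ~1.0206

L = L0 / gamma             # contracted distance in the pilot's frame
t_pilot = L / v            # time elapsed on the pilot's clock
t_ground = L0 / v          # time elapsed on the ground clocks
t_wrong = gamma * t_ground # the incorrect "dilate the ground time" answer

print(f"gamma     = {gamma:.6f}")       # ~1.020621
print(f"L (pilot) = {L / 1e3:.1f} km")  # ~489.9 km
print(f"t_pilot   = {t_pilot:.6f} s")   # ~0.008165 s
print(f"t_ground  = {t_ground:.6f} s")  # ~0.008333 s
print(f"t_wrong   = {t_wrong:.6f} s")   # ~0.008505 s
```

This reproduces the discrepancy I'm asking about: t_pilot (correct) versus t_wrong (ground time multiplied by gamma).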
Thanks for your help.