I've seen explanations across the internet of why (ignoring wind) the time an airplane takes to travel from one place to another on Earth is the same regardless of its direction of flight.
These explanations usually rely on reference frames. But I thought of one that I think is simpler, because it doesn't use reference frames explicitly.
Just consider that the time is the distance traveled over the Earth divided by the speed relative to the Earth.
If the airplane has speed V and travels a distance D, the time is D / V. If it travels in the opposite direction, its velocity is -V, but the displacement is also -D, so the two sign flips cancel and the time is the same.
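Written out, with $t$ for the travel time:

$$ t = \frac{D}{V} = \frac{-D}{-V} $$

Both directions give the same positive time.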
Is this explanation valid?
Of course, by talking about "speed" and "distance" I'm implicitly using reference frames. But I don't need to bring them up in the discussion, so I think this is easier to follow for people who don't know physics.