1. The problem statement, all variables and given/known data
The Concorde traveled 8000 km between two places in North America and Europe at an average speed of 375 m/s. What is the total difference in time between two similar atomic clocks, one on the airplane and one at rest on Earth, during a one-way trip? Consider only time dilation and ignore other effects such as Earth's rotation.

2. Relevant equations
Δt_E = γ Δt_S, where γ = [1 − (v/c)²]^(−1/2)

3. The attempt at a solution
Here's my thinking: I can find the elapsed time during a one-way trip, as measured in the Earth frame, by dividing the distance (8000 km) by the speed (375 m/s). Then, since the airplane and its clock are moving relative to Earth, the elapsed time measured by the airplane's clock will be smaller than the Earth-frame interval. So I use the equation above. The problem is that (v/c)² is WAY too small even for a scientific calculator, which just treats γ as 1. What can I do with this problem? I tried a binomial approximation, but that didn't help much.
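For what it's worth, here is a minimal numerical sketch of the binomial-approximation route (my own illustration, not something from the problem): for v ≪ c, expanding γ gives γ − 1 ≈ ½(v/c)², so the clock difference is approximately Δt · (v/c)²/2, and the tiny numbers are no longer a problem.

```python
# Binomial approximation for the Concorde time-dilation problem.
# For v << c:  gamma - 1 ~ (1/2)(v/c)^2, so
#   clock difference ~ Delta_t_Earth * (v/c)^2 / 2
c = 299_792_458.0   # speed of light, m/s
v = 375.0           # Concorde's average speed, m/s
d = 8_000_000.0     # one-way distance, m (8000 km)

dt_earth = d / v               # Earth-frame trip time, s
beta2 = (v / c) ** 2           # (v/c)^2, about 1.6e-12
dt_diff = dt_earth * beta2 / 2 # (gamma - 1) * dt_earth, approximately

print(dt_earth)   # ~ 21333 s (about 5.9 hours)
print(dt_diff)    # ~ 1.7e-8 s, i.e. roughly 17 nanoseconds
```

So the two clocks differ by only a few tens of nanoseconds over the whole flight, which is why a calculator rounding γ to 1 wipes the effect out entirely.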