1. The problem statement, all variables and given/known data

Two atomic clocks are synchronized. One is placed on a satellite which orbits the earth at high speed for a whole year. The other is placed in a lab and remains at rest with respect to the earth. You may assume both clocks can measure time accurately to many significant digits.

a) Will the two clocks still be synchronized after one year?

b) Imagine the speed of light were much lower than its actual value. How would the results of this experiment change if the speed of light were only twice the average speed of the satellite? Explain your reasoning using a calculation.

2. Relevant equations

Δt_m = Δt_s/√(1 − v²/c²)

3. The attempt at a solution

a) I calculated Δt_m using a theoretical velocity (3×10³ m/s) and a theoretical Δt_s = 3.1×10⁷ s (about the number of seconds in a year). When I plug these into Δt_m = Δt_s/√(1 − v²/c²), I seem to find no time dilation:

3.1×10⁷/0.9999999999 ≈ 3.1×10⁷ s

But the fact that the clocks can measure to many significant digits worries me. Keeping more digits: v/c = 10⁻⁵, so v²/c² = 10⁻¹⁰ and Δt_m − Δt_s ≈ Δt_s · (v²/2c²) ≈ (3.1×10⁷ s)(5×10⁻¹¹) ≈ 1.6×10⁻³ s. So I think the clocks will not be synchronized after the experiment: the time dilation is tiny but nonzero, and atomic clocks should resolve it. Any input here would be awesome!

b) Using the same numbers but changing the speed of light, I find much more time dilation, as expected: as an object's speed approaches the speed of light, time dilation becomes very significant. With c = 2v, the ratio is v²/c² = 1/4, so

Δt_m = 3.1×10⁷/√(1 − 0.25) ≈ 3.58×10⁷ s

a difference of roughly 4.8×10⁶ s (almost two months). Time dilation becomes significant as the speed approaches the speed of light, or in this case as the speed of light comes down toward the speed of the satellite.

Does everything seem logical and ok? Thanks!
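
As a numerical sanity check, both cases can be computed with full precision (a minimal sketch; the satellite speed v = 3×10³ m/s and Δt_s = 3.1×10⁷ s are the assumed values from the attempt above, not given in the problem):

```python
import math

def dilated_time(dt_s, v, c):
    """Lab-frame time Δt_m for proper time dt_s on the satellite: Δt_s/√(1 − v²/c²)."""
    return dt_s / math.sqrt(1 - (v / c) ** 2)

v = 3e3        # assumed average satellite speed, m/s
dt_s = 3.1e7   # roughly one year, in seconds

# a) real speed of light: the difference is tiny but nonzero
c_real = 3e8
diff_a = dilated_time(dt_s, v, c_real) - dt_s   # ~1.6e-3 s per year

# b) hypothetical universe where c is only twice the satellite speed
c_slow = 2 * v
diff_b = dilated_time(dt_s, v, c_slow) - dt_s   # ~4.8e6 s, almost two months

print(diff_a, diff_b)
```

This makes the contrast explicit: the same formula gives a millisecond-scale drift in case a) but a months-scale drift in case b), which is why atomic-clock precision matters for the first part.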