1. The problem statement, all variables and given/known data

Calculate the difference in elapsed time after one year between a clock at Earth's surface and a clock on a satellite orbiting 300 km above the surface.

2. Relevant equations

Gravitational time dilation: T = T0 / (1 - 2GM/(Rc^2))^0.5

That is, this: http://hyperphysics.phy-astr.gsu.edu/hbase/relativ/imgrel/gtim3.gif

(Note: I originally wrote the factor as 2gR/c^2. Since g = GM/R^2, that equals 2GM/(Rc^2) at the surface, but the GM/R form is the one to use when R changes.)

3. The attempt at a solution

I don't understand how to use this equation to get the difference between the clock on the satellite and the clock on the surface. Do I just take the value of T (with T0 = one year) at R = Earth's radius, and again at R = Earth's radius + 300 km, and take the difference?
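The approach proposed in the attempt (evaluate the dilation factor at the two radii and subtract) can be sketched numerically. This is only an illustration of that idea, not a checked answer; the constants are standard textbook values, and the final subtraction is my assumption about what the problem wants:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # mass of Earth, kg
c = 2.998e8        # speed of light, m/s
R_earth = 6.371e6  # mean radius of Earth, m
h = 300e3          # satellite altitude, m
year = 3.156e7     # one year, s

def dilation_factor(r):
    # T / T0 = 1 / sqrt(1 - 2GM/(r c^2)) from the relevant equation:
    # a clock at radius r runs slow by this factor relative to a far-away clock.
    return 1.0 / math.sqrt(1.0 - 2.0 * G * M / (r * c * c))

# Elapsed far-away time corresponding to one local year at each radius
T_surface = year * dilation_factor(R_earth)
T_orbit = year * dilation_factor(R_earth + h)

# Positive: the surface clock is deeper in the potential, so it runs slower,
# i.e. the orbiting clock gains roughly a millisecond per year (gravity only)
delta = T_surface - T_orbit
print(delta)
```

This gives a difference on the order of 1 ms per year from gravity alone. Note that the equation quoted above covers only the gravitational part; a full treatment of an orbiting satellite would also include the special-relativistic dilation from its orbital speed, if the problem expects that.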