I am trying to understand how relativity explains the fact that clocks all over the globe count at about the same rate at sea level. One of the papers I have looked at is "The global positioning system, relativity, and extraterrestrial navigation" at http://www.nist.gov/customcf/get_pdf.cfm?pub_id=904814. There, the usual expression for the gravitational potential including the quadrupole term is given as eq. 1.4, and the coordinate time increment for a clock is then given as eq. 1.5 (see attached eqs.jpg).

If I use these equations in a spreadsheet to compare two clocks at different latitudes, I find that they count at the same rate regardless of latitude, to within a fraction of a nanosecond per day. So far so good.

However, if I attempt to back-calculate the value of gravitational acceleration implied by equation 1.4, I get a value different from the actual measured value. This seems to be because the potential in equation 1.4 is based on the distance r from a point mass (V = GM/r) rather than from a body like the Earth, whose mass is distributed over a large volume; as far as I can tell, the quadrupole term corrects only for the Earth's oblateness. So here is where I see a problem: based on the V from equation 1.4, we would get an implied gravitational acceleration at the north pole of 9.854 m/s², whereas the actual measured value of g at the north pole is 9.832 m/s². If you instead substitute the real values for g (or GM/r) into equation 1.5, you end up with a clock at the north pole counting slower than a clock at the equator by 92.59 ns per day, which is inconsistent with the data.

So how does this calculation account for the fact that the Earth is not a point mass? Thanks.
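To show what my spreadsheet comparison does, here is a minimal Python sketch of the eq. 1.4/1.5 calculation, comparing a clock at the pole against one at the equator. The numerical constants (GM, J2, the equatorial and polar radii, the rotation rate) are standard WGS-84-style values I am assuming; the paper's own inputs may differ slightly in the last digits.

```python
import math

# Assumed WGS-84-style constants (not taken from the paper itself)
GM = 3.986004418e14   # m^3/s^2, Earth's gravitational parameter
J2 = 1.0826300e-3     # quadrupole moment coefficient
A1 = 6378137.0        # m, equatorial radius (a_1 in the paper)
B = 6356752.3         # m, polar radius
OMEGA = 7.2921150e-5  # rad/s, Earth's rotation rate
C = 299792458.0       # m/s, speed of light

def potential(r, theta):
    """Eq. 1.4-style potential: monopole plus J2 (quadrupole) term.
    theta is the colatitude (angle measured from the rotation axis)."""
    p2 = 0.5 * (3.0 * math.cos(theta) ** 2 - 1.0)  # Legendre P2(cos theta)
    return -GM / r * (1.0 - J2 * (A1 / r) ** 2 * p2)

def fractional_rate(r, theta):
    """Eq. 1.5-style fractional rate (dtau/dt - 1) for a clock fixed to
    the rotating Earth: potential term plus velocity (time-dilation) term."""
    v = OMEGA * r * math.sin(theta)  # speed due to Earth's rotation
    return potential(r, theta) / C ** 2 - v ** 2 / (2.0 * C ** 2)

equator = fractional_rate(A1, math.pi / 2.0)  # clock on the equator
pole = fractional_rate(B, 0.0)                # clock at the north pole

diff_ns_per_day = (pole - equator) * 86400.0 * 1e9
print(f"pole - equator: {diff_ns_per_day:+.2f} ns/day")
```

With these inputs the pole/equator difference comes out well under the 92.59 ns/day discrepancy I describe below: the smaller potential at the pole (clock deeper in the well) is almost exactly cancelled by the equatorial clock's extra velocity time dilation.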
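For the back-calculation step, here is a sketch of how I arrive at the implied g at the pole, assuming the point-mass shortcut g = |V|/r (which is exact only when V = GM/r) is applied to the eq. 1.4 potential; the constants are the same assumed WGS-84-style values as above.

```python
# Assumed WGS-84-style constants (not taken from the paper itself)
GM = 3.986004418e14  # m^3/s^2, Earth's gravitational parameter
J2 = 1.0826300e-3    # quadrupole moment coefficient
A1 = 6378137.0       # m, equatorial radius (a_1 in the paper)
B = 6356752.3        # m, polar radius

# Eq. 1.4 potential at the north pole: colatitude 0, so P2(cos 0) = 1
v_pole = -GM / B * (1.0 - J2 * (A1 / B) ** 2)

# Point-mass shortcut: for V = GM/r we have |V|/r = GM/r^2 = g, so applying
# the same shortcut to the eq. 1.4 potential yields an "implied" g
g_implied = abs(v_pole) / B
print(f"implied g at the pole: {g_implied:.3f} m/s^2")  # reproduces ~9.854

g_measured = 9.832  # m/s^2, the measured polar value I quote above
print(f"measured g at the pole: {g_measured:.3f} m/s^2")
```

This reproduces the roughly 0.02 m/s² mismatch between the implied and measured polar g that prompts my question.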