I'm having a problem with a project I've been kicking around for some time now. It arose out of a discussion with a co-worker who wondered how much less one would weigh at various altitudes above the surface of the Earth. I set out to build a spreadsheet to demonstrate: it takes the mass of the person and the desired altitude as inputs, and shows the weight at that altitude.

I built the spreadsheet around Newton's law of gravitational attraction, but I noticed a peculiar thing: if I set the altitude to zero, i.e. remaining at mean sea level, the field showing weight at altitude does not give back the original starting weight. I've searched the web for the highest-precision values I could find for the variables, so I'm supposing the problem can only be due to some inaccuracy in one or more of the values in the equation (or I messed up somewhere else). Here's what I have:

F = G * m1 * m2 / r^2

G  = gravitational constant = 6.67259 × 10^-11
m1 = mass of Earth = 5.9736 × 10^24 kg
m2 = mass of object on the surface of the Earth (at mean sea level) = 50 kg
r  = mean radius of Earth = 6.3674425 × 10^6 m
g  = standard gravity = 9.80665 m/s^2 (so weight in newtons = mass in kg × 9.80665)

(6.67259 × 10^-11) × (5.9736 × 10^24) × 50 = 1.9929692 × 10^16
(1.9929692 × 10^16) / (6.3674425 × 10^6)^2 = 491.5532 N = F

BUT:

50 kg × 9.80665 m/s^2 = 490.3325 N

That's a difference of 1.2207 N. With this kind of error, I have to input an altitude of about 7920 m before the result equals the starting weight.

Can anyone see what I'm doing wrong here? Or am I just trying to get more accurate results than is possible with the available data? Thanks for any insight.
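In case it helps anyone check my numbers, here is a minimal sketch of the same calculation, using exactly the constants quoted above (the function names are just mine, not from any library):

```python
# Constants as quoted in the post above.
G = 6.67259e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.9736e24    # mass of Earth, kg
R_EARTH = 6.3674425e6  # mean radius of Earth, m
G_STANDARD = 9.80665   # standard gravity, m/s^2 (exact by definition)

def weight_newton(mass_kg, altitude_m=0.0):
    """Weight in newtons from Newton's law of gravitation,
    at a given altitude above mean sea level."""
    r = R_EARTH + altitude_m
    return G * M_EARTH * mass_kg / r**2

def weight_standard(mass_kg):
    """Weight in newtons using standard gravity."""
    return mass_kg * G_STANDARD

m = 50.0
print(weight_newton(m))        # ~491.55 N at sea level
print(weight_standard(m))      # 490.3325 N
print(weight_newton(m, 7920))  # ~490.33 N: only matches ~7920 m up
```

Running this reproduces the ~1.22 N gap at zero altitude, and confirms that Newton's formula with these inputs only agrees with mass × 9.80665 at roughly 7920 m.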