The exact problem reads: "The acceleration of gravity g is a constant only for a limited range of height differences. A better approximation, one that might hold over a larger range of height differences, is that g decreases linearly with height, g = g0 - g'h, where h is the height measured from the ground surface and g' is a (small) constant of the appropriate dimensions. (a) Find the speed of a dropped object as a function of height, assuming it was dropped from rest from a height h0. (b) Find the speed of a dropped object as a function of time, assuming it was dropped from rest from a height h0."

For part (a), what I've tried so far is to integrate the acceleration with respect to h. I'm not sure that's even allowed, but I ended up getting v = g0*h - 0.5*g'*h^2.

On part (b), I got stuck trying to find a formula for h in terms of t, and I'm not sure where to start. I may just be thinking about this incorrectly, so any input on the best method for starting this problem would be welcome.
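For checking whatever closed form I (or an answerer) arrive at, here is a minimal numerical sanity check I put together. It assumes the equation of motion for the height coordinate is h'' = -(g0 - g'*h), i.e. the downward acceleration has magnitude g0 - g'*h while the object is above the ground, which is my reading of the problem statement. The parameter values g0, gp (standing in for g'), h0, and the step size dt are illustrative assumptions, not given in the problem.

```python
# Numerical sanity check: integrate h'' = -(g0 - g'*h) from rest at h0.
# All parameter values below are illustrative, not from the problem.

g0 = 9.81     # ground-level gravity, m/s^2 (assumed value)
gp = 1e-3     # the small constant g', units 1/s^2 (assumed value)
h0 = 100.0    # initial drop height, m (assumed value)

dt = 1e-4     # time step, s
h, v, t = h0, 0.0, 0.0   # dropped from rest

while h > 0.0:
    a = -(g0 - gp * h)   # acceleration of the height coordinate
    # semi-implicit (symplectic) Euler step: update v first, then h
    v += a * dt
    h += v * dt
    t += dt

print(f"hit ground at t = {t:.3f} s with speed |v| = {abs(v):.3f} m/s")
```

Tabulating (t, h, v) along the way would let me compare a candidate v(h) for part (a) or v(t) for part (b) against the numerics, so at least I'd know whether a derived formula is plausible before trusting it.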