1. The problem statement

This isn't a homework problem exactly; I'm attempting to teach myself introductory physics and there's a question that's bugging me. The kinematics equations I've seen so far for falling bodies all assume that little g is constant, i.e. the object's fall from its initial height above the Earth is short enough that the change in g is negligible. My question is: how would the equation change for free fall from a starting height great enough that little g changes appreciably during the fall?

2. Relevant equations

y(t) = (1/2)gt^2, F = G*M1*M2/r^2, g = G*M/r^2

3. The attempt at a solution

My best guess is that instead of treating little g as a constant, I'd integrate F = G*M1*M2/r^2 over the distance traveled and somehow plug that into the equation instead, but my knowledge of both calculus and physics is too shaky to know whether I'm on the right track. Can anyone help? Many thanks in advance.
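
Edit: to check my intuition numerically, here's a rough sketch that compares the constant-g answer with a step-by-step integration of the inverse-square law for a drop from 1000 km up. The values for G, Earth's mass, and Earth's radius are just the textbook figures; the drop height and time step are my own arbitrary choices.

```python
import math

# Physical constants (standard textbook values)
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # mass of the Earth, kg
R = 6.371e6     # radius of the Earth, m

h0 = 1.0e6                  # drop height above the surface: 1000 km (arbitrary)
g0 = G * M / (R + h0)**2    # little g evaluated at the starting height

# Constant-g estimate: y = (1/2) g t^2  =>  t = sqrt(2 h / g)
t_const = math.sqrt(2 * h0 / g0)

# Variable-g: integrate r'' = -G*M/r^2 with a small fixed time step,
# updating velocity first, then position (semi-implicit Euler)
r, v, t, dt = R + h0, 0.0, 0.0, 0.01
while r > R:
    a = -G * M / r**2   # acceleration grows as r shrinks
    v += a * dt
    r += v * dt
    t += dt

print(f"constant g: {t_const:.1f} s, variable g: {t:.1f} s")
```

The variable-g fall comes out a bit faster than the constant-g estimate, which makes sense: g only grows as the object gets closer to the Earth, so holding it fixed at its initial value underestimates the acceleration for the whole trip.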