# Average Acceleration (gravity question)

Hello. I was doing some physics and I came across a problem involving gravity.
The problem was to find how long (ignoring the atmosphere) it would take for a body to reach the Earth. I'm familiar with the derivatives of distance, which give me:
$$t=\sqrt{\frac{2D}{g}}$$
where $t$ = time, $D$ = distance, and $g$ = gravitational acceleration.

Although using $g = 9.8\ \mathrm{m/s^2}$ is fine for short distances, it wouldn't really work at, say, 2500 km above the surface of the Earth (where $g \approx 5\ \mathrm{m/s^2}$).
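(For reference, that value comes from the inverse-square law, taking the textbook values $GM_\oplus \approx 3.986\times10^{14}\ \mathrm{m^3/s^2}$ and $R_\oplus \approx 6371\ \mathrm{km}$:
$$g(h) = \frac{GM_\oplus}{(R_\oplus + h)^2}, \qquad g(2500\ \mathrm{km}) \approx \frac{3.986\times 10^{14}}{(8.871\times 10^6)^2} \approx 5.1\ \mathrm{m/s^2}.)$$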

I'm curious how I can find the time it will take the object to fall. I'm not worried about what happens in between, and I'm not taking other bodies in the universe into account (it's just a hypothetical physics problem). Since $g$ depends on distance, and since $D(t)$ depends on $g$, the simple formula doesn't really work. I'm wondering if I can use the average value of $g$: using $g_{\text{avg}}$ would make the body too fast during the first part of the fall, but too slow during the last part, so maybe the errors cancel out. Can I do this?

Does
$$t=\sqrt{\frac{2D}{g_{\text{avg}}}}$$
work?

Can I do this for any varying acceleration? I know how to find the average value of a function. Please help.
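One way to sanity-check the idea is to integrate the fall numerically and compare it with the average-$g$ estimate. Here is a rough sketch (the values of $G$, Earth's mass, and Earth's radius are textbook constants I'm assuming, and `fall_time` is just a helper name I made up):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2 (assumed textbook value)
M = 5.972e24      # mass of Earth, kg (assumed textbook value)
R = 6.371e6       # radius of Earth, m (assumed textbook value)

def fall_time(height, dt=0.01):
    """Integrate a fall from rest at `height` above the surface,
    with the local acceleration g(r) = GM/r^2, until it hits the surface."""
    r, v, t = R + height, 0.0, 0.0
    while r > R:
        a = G * M / r**2   # local gravitational acceleration
        v += a * dt        # semi-implicit Euler step
        r -= v * dt
        t += dt
    return t

D = 2.5e6  # the 2500 km drop from the question

# 1) naive constant-g estimate with surface gravity
g_surface = G * M / R**2
t_const = math.sqrt(2 * D / g_surface)

# 2) average of g over the *distance* fallen:
#    (1/D) * integral of GM/r^2 from R to R+D  =  GM / (R * (R + D))
g_avg = G * M / (R * (R + D))
t_avg = math.sqrt(2 * D / g_avg)

# 3) "true" time from direct numerical integration
t_numeric = fall_time(D)

print(f"constant g: {t_const:.0f} s")
print(f"average  g: {t_avg:.0f} s")
print(f"numerical : {t_numeric:.0f} s")
```

Running this suggests the distance-averaged $g$ is an improvement over surface $g$ but still underestimates the fall time: the body spends most of its *time* high up where $g$ is weakest, so the time-weighted acceleration is lower than the distance-weighted average.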