
## Homework Statement

An object is brought to a distance of 2R (where R is the Moon's radius) from the Moon's center of mass and dropped. The starting velocity is 0.

Calculate the velocity of the object when it hits the Moon's surface.

Calculate the time it takes to reach the surface.

## Homework Equations

Moon radius R = 1740000 m

Moon mass M = 7.35*10^22 kg

Surface gravitational acceleration g = G*M/R^2, with G = 6.67*10^-11 N m^2/kg^2

Conservation of energy in a gravitational field.

Conservation of energy in gravitational field.

## The Attempt at a Solution

Using conservation of energy, I have managed to calculate the speed at impact with the surface: 1679 m/s.
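As a quick check of that number (this is a sketch of my own, not part of the assignment): energy conservation from rest at 2R to the surface gives (1/2)v^2 = GM(1/R - 1/(2R)) = GM/(2R), so v = sqrt(GM/R).

```python
import math

G = 6.67e-11   # gravitational constant, N m^2/kg^2
M = 7.35e22    # Moon mass, kg
R = 1.74e6     # Moon radius, m

# Energy conservation from rest at r0 = 2R down to r = R:
#   (1/2) v^2 = G*M*(1/R - 1/(2R)) = G*M/(2R)  =>  v = sqrt(G*M/R)
v_impact = math.sqrt(G * M / R)
print(round(v_impact), "m/s")  # ~1679 m/s
```

This agrees with the 1679 m/s obtained above.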

The problem now is finding the time it takes. It would have been easy if the acceleration were constant, but it isn't!

I tried to calculate it as if the acceleration were constant and got 2073 seconds. This is probably somewhere near the correct answer, but it's not exact.

If I have done it right, the acceleration varies like this:

(Graph of acceleration vs. distance: http://img31.imageshack.us/img31/3249/grafjd.jpg — link broken)

How can I calculate the fall time when the acceleration varies with distance from the Moon?
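In the meantime, a numerical sanity check is possible (this is my own sketch, assuming a simple velocity-Verlet integration of the equation of motion r'' = -GM/r^2, starting from rest at r = 2R):

```python
G = 6.67e-11   # gravitational constant, N m^2/kg^2
M = 7.35e22    # Moon mass, kg
R = 1.74e6     # Moon radius, m

r, v, t = 2 * R, 0.0, 0.0   # start at rest at r = 2R
dt = 0.05                   # time step, s
a = -G * M / r**2           # inward acceleration at the start

# Velocity-Verlet integration of r'' = -G*M/r^2 until the surface is reached
while r > R:
    r += v * dt + 0.5 * a * dt**2
    a_new = -G * M / r**2
    v += 0.5 * (a + a_new) * dt
    a = a_new
    t += dt

print(round(t), "s")  # ≈ 2665 s
```

This suggests the constant-acceleration estimate of 2073 s is an underestimate, which makes sense since the acceleration is weakest (g/4) at the starting point; any analytic answer should land near this value.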

Thanks for all help :)
