dbogen

## Homework Statement

An object is dropped from a distance of 2*R (where R is the moon's radius) from the moon's center.

How many seconds does it take until impact with the moon, and at what speed will it hit?

The distance of the fall will be 1.74*10^6 m. The problem is the not-so-constant acceleration... ;)

gravitational constant: G := 6.67*10^(-11) N*m^2/kg^2

moon mass: M := 0.0735*10^24 kg (= 7.35*10^22 kg)

moon radius: R := 1.74*10^6 m

## Homework Equations

The gravitational acceleration at the surface is

g = G*M/R^2

So a(x) = (G*M)/(2*R - x)^2, where x is the number of meters fallen, x in [0, R).
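As a quick sanity check on a(x) (in Python rather than Maple; the constants are the ones given above), the acceleration at the release point should be a quarter of the surface value, and the value at the surface should come out as the moon's familiar surface gravity of about 1.62 m/s^2:

```python
# Sanity check of a(x) = G*M / (2*R - x)^2 at the endpoints of the fall.
G = 6.67e-11   # gravitational constant, N*m^2/kg^2
M = 7.35e22    # moon mass, kg
R = 1.74e6     # moon radius, m

def a(x):
    """Acceleration after falling x meters from the release point at r = 2*R."""
    return G * M / (2*R - x)**2

print(a(0))   # release point: G*M/(2R)^2 ≈ 0.405 m/s^2
print(a(R))   # surface: G*M/R^2 ≈ 1.62 m/s^2
```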

## The Attempt at a Solution

I've calculated the speed at impact:

Average acceleration (over the distance fallen):

> A:=(G*M)/(2*R - x)^2

> Aa := (int(A, x = 0 .. R))/R;

0.8096264368
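That integral has a simple closed form: (1/R) * int(G*M/(2*R - x)^2, x = 0..R) = G*M/(2*R^2), i.e. exactly half the surface gravity. A short Python cross-check of the Maple result (not from the original post):

```python
G, M, R = 6.67e-11, 7.35e22, 1.74e6

# Closed form of (1/R) * ∫_0^R G*M/(2R - x)^2 dx
Aa_exact = G * M / (2 * R**2)

# Midpoint-rule cross-check of the same integral
n = 100_000
dx = R / n
Aa_num = sum(G * M / (2*R - (i + 0.5)*dx)**2 for i in range(n)) * dx / R

print(Aa_exact)  # ≈ 0.8096
```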

Time, using x = 0.5*Aa*t^2:

> T := solve(R = 0.5*Aa*t^2, t);

-2073.229031, 2073.229031

This time assumes constant acceleration, so it isn't the correct one...

but the distance-averaged acceleration does give the correct impact speed, because v^2 = 2*Aa*R is exactly the work-energy result.

Speed

> V := T*Aa;

-1678.541033, 1678.541033

So it will hit at 1678.54 m/s,
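The impact speed can be confirmed independently with energy conservation between r = 2R and r = R; the sketch below (Python, not from the original post) shows it collapses to v = sqrt(G*M/R):

```python
from math import sqrt

G, M, R = 6.67e-11, 7.35e22, 1.74e6

# (1/2)*v^2 = G*M*(1/R - 1/(2R))  =>  v = sqrt(2*G*M*(1/R - 1/(2R))) = sqrt(G*M/R)
v = sqrt(2 * G * M * (1/R - 1/(2*R)))

print(v)  # ≈ 1678.5 m/s
```

This agrees with the average-acceleration shortcut above, as it must: both are the same work-energy statement.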

but for how long will it fall?
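For the time, one option (not in the original post) is to integrate r'' = -G*M/r^2 numerically from rest at r = 2R down to r = R; here is a velocity-Verlet sketch in Python:

```python
# Fall time under variable gravity: integrate r'' = -G*M/r^2
# from r = 2R (at rest) until r = R, using velocity Verlet.
G, M, R = 6.67e-11, 7.35e22, 1.74e6

r, v, t, dt = 2*R, 0.0, 0.0, 0.01
a = -G * M / r**2
while r > R:
    r += v*dt + 0.5*a*dt*dt      # position update
    a_new = -G * M / r**2        # acceleration at the new position
    v += 0.5 * (a + a_new) * dt  # velocity update with averaged acceleration
    a = a_new
    t += dt

print(t)        # fall time, s (≈ 2665)
print(abs(v))   # impact speed, m/s (≈ 1678.5)
```

This matches the closed-form radial free-fall result t = sqrt((2R)^3/(2*G*M)) * (1/2 + pi/4) ≈ 2665 s. Note that the prefactor sqrt((2R)^3/(2*G*M)) is algebraically identical to sqrt(2R/Aa), i.e. the 2073.2 s constant-acceleration estimate above; the true time is that estimate times (1/2 + pi/4) ≈ 1.285.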