**1. The problem statement**

An object is released from rest into free fall at a distance of 3 Earth radii from the center of the Earth. How long does it take the object to travel half the distance to the surface (that is, one Earth radius, from 3 Earth radii down to 2), and how long until it hits the surface of the Earth?

**Variables and given/known data**

EarthRadius = 6.37*10^6 m

EarthMass = 5.97*10^24 kg

Gamma = 6.67*10^(-11) N*m^2/kg^2

Gravity at EarthRadius = 9.8 m/s^2

Gravity at 2*EarthRadius = (1/4) * 9.8 m/s^2

Gravity at 3*EarthRadius = (1/9) * 9.8 m/s^2
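The three gravity values are just the inverse-square law g(r) = Gamma*EarthMass/r^2 evaluated at 1, 2 and 3 Earth radii. A quick check with the constants above (SI units assumed):

```python
# Check the listed gravity values against g(r) = G*M / r^2
# (the 1/4 and 1/9 factors are the inverse-square law at 2R and 3R).
G = 6.67e-11   # N*m^2/kg^2
M = 5.97e24    # kg
R = 6.37e6     # m

def g(r):
    """Gravitational acceleration (m/s^2) at distance r from the center."""
    return G * M / r**2

print(g(R), g(2 * R), g(3 * R))
```

g(R) comes out at about 9.81 m/s^2, matching the 9.8 used in the problem.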

## Homework Equations

F = m*a

m * a = (Gamma * EarthMass * m) / r^2, where r is the distance from the center of the Earth

## The Attempt at a Solution

I've tried to set up a differential equation, but I don't get the correct answer, so I'm pretty stuck.

m * a = -(Gamma * EarthMass * m) / r^2

(divide by m on both sides and get this)

a = -(Gamma * EarthMass) / r^2

Then I substitute a with d^2 r / dt^2, where r(t) is the object's distance from the center of the Earth (the minus sign is there because gravity pulls toward decreasing r),

and finally get:

d^2 r / dt^2 = -(Gamma * EarthMass) / r^2

Then I set the initial conditions:

i) the object starts from rest -> r'(0) = 0

ii) the object starts at three Earth radii -> r(0) = 3 * EarthRadius

and I want the time t when r(t) = 2 * EarthRadius (halfway) and when r(t) = EarthRadius (the surface).

But I can't get the answer right, so I'm pretty stuck.
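As a numerical cross-check (not the intended pen-and-paper route), the equation d^2 r / dt^2 = -Gamma*EarthMass/r^2 with r(0) = 3*EarthRadius and r'(0) = 0 can be integrated step by step; this sketch uses a velocity Verlet integrator with the constants given above:

```python
# Numerical cross-check: integrate d^2 r / dt^2 = -G*M / r^2
# with r(0) = 3*R and r'(0) = 0, using velocity Verlet (dt = 0.1 s).
# A sketch to sanity-check an analytic answer, not a derivation.
G = 6.67e-11   # N*m^2/kg^2
M = 5.97e24    # kg
R = 6.37e6     # m

def fall_time(r_target, dt=0.1):
    """Seconds for the object to fall from r = 3*R down to r = r_target."""
    r, v, t = 3.0 * R, 0.0, 0.0
    a = -G * M / r**2
    while r > r_target:
        r += v * dt + 0.5 * a * dt**2   # position update
        a_new = -G * M / r**2           # acceleration at the new position
        v += 0.5 * (a + a_new) * dt     # velocity update
        a = a_new
        t += dt
    return t

t_half = fall_time(2 * R)      # fallen one Earth radius
t_surface = fall_time(R)       # reached the surface
print(f"halfway: {t_half:.0f} s, surface: {t_surface:.0f} s")
```

With these constants the halfway time comes out around 3.2*10^3 s and the surface time around 4.2*10^3 s, which is what an analytic solution should reproduce.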

I've also bracketed the time for the first Earth radius of the fall (from 3*EarthRadius down to 2*EarthRadius) using the constant-acceleration formula t = sqrt(2d/a) with d = EarthRadius: it must take at least about 1100 s (taking a = 9.8 m/s^2, the largest acceleration anywhere on the way down) and at most about 3400 s (taking a = (1/9)*9.8 m/s^2, the acceleration at 3*EarthRadius from the center of the Earth).

Any help and tips would be appreciated :)