littlebhawk
OK, I am new to Physics Forums, but I am sure that if I am going to get help anywhere, I'll get it here. This problem has gone unsolved by everyone who has tried it, including my physics and calculus teachers.
Here is the problem:
There is an object 10 Earth radii from the CENTER of the Earth. It is released with zero initial velocity. How long will it take to hit the surface of the Earth, assuming no atmosphere?
Obviously gravity increases as the object moves closer to Earth, so I integrated the force of gravity with respect to distance, from one Earth radius (the surface) out to 10 Earth radii, to get the work done on the object. Setting that equal to the kinetic energy at the moment of impact gave me a final velocity of 10608.1 m/s. I plugged this into Vf^2 = Vi^2 + 2ad to solve for what I thought would be a sort of "average" acceleration and got a = 0.9799 m/s^2, then plugged that into Vf = Vi + at, solved for t, and got 10825.6 seconds, or about 3 hours.
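In case it helps anyone check me, here is a quick numerical sketch of exactly the steps above. I'm assuming the standard values GM = 3.986e14 m^3/s^2 and R = 6.371e6 m for Earth, so the last digits may differ slightly from the constants I used on paper:

```python
import math

GM = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2 (assumed value)
R = 6.371e6          # mean Earth radius, m (assumed value)

# Work-energy step: the work gravity does falling from 10R to R equals
# the kinetic energy at impact, (1/2) v_f^2 = GM (1/R - 1/(10R))
v_f = math.sqrt(2.0 * GM * (1.0 / R - 1.0 / (10.0 * R)))

# Constant-"average"-acceleration step over the full drop distance d = 9R:
d = 9.0 * R
a = v_f**2 / (2.0 * d)   # from v_f^2 = v_i^2 + 2 a d with v_i = 0
t = v_f / a              # from v_f = v_i + a t

print(f"v_f = {v_f:.1f} m/s")
print(f"a   = {a:.4f} m/s^2")
print(f"t   = {t:.1f} s (~{t / 3600.0:.2f} hours)")
```

The only step I'm shaky on is treating the acceleration as constant over the whole fall, which is why I'm asking here.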
I don't know if I am even close to right, as I've never tried a question like this before. Any help would be appreciated.