I have this problem where the Earth instantly loses all orbital velocity and begins to fall straight toward the Sun, and I need to find the time it takes for the Earth to hit it.
Seemed straightforward enough.
Started with the work k.e. theorem,
½mv(x)² − ½mv₀² = ∫[from x₀ to x] F(x)dx.
Here m is the mass of the Earth, v₀ = 0, and F(x) = F(r) = −GMm/r², where M is the mass of the Sun.
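So I'd have something to compare my algebra against, I also evaluated the work integral numerically (plain Python, constants are the usual textbook values; the sample endpoint r = 0.5 AU is just an arbitrary check point):

```python
import math

G = 6.674e-11     # m^3 kg^-1 s^-2
M = 1.989e30      # kg, mass of the Sun
m = 5.972e24     # kg, mass of the Earth
r_AU = 1.496e11   # m, Earth's starting distance
r = 0.5 * r_AU    # arbitrary point inside the starting radius

# midpoint-rule evaluation of W = ∫[from r_AU to r] (-GMm/s^2) ds
N = 100_000
h = (r - r_AU) / N   # negative step, since the Earth moves inward
W = sum((-G * M * m / (r_AU + (i + 0.5) * h) ** 2) * h for i in range(N))

closed_form = G * M * m * (1 / r - 1 / r_AU)
print(W, closed_form)   # the two should agree, and both come out positive
```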
The Attempt at a Solution
So I translated the bounds from x₀→x to r_AU→r(t). Solved the integral and got
½mv(r)² = GMm(1/r_AU − 1/r).
Seeing as how r(t) is never going to be larger than 1 AU, this doesn't make any sense: the speed is already imaginary, and I haven't even gotten to the integral for t(r) yet. Anybody know what I did wrong?
Note: in case it wasn't obvious, I'm using the initial position of the Earth as r_AU, and the final point I'm trying to get the Earth to is the radius of the Sun, which I'll just write as r₀. The center of the Sun is the origin of the coordinate system.
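For reference, here's the rough numerical estimate I'm expecting to end up near, assuming the energy equation ultimately works out with the opposite sign, i.e. ½v(r)² = GM(1/r − 1/r_AU), so that t = ∫ dr/v(r) from r₀ up to r_AU:

```python
import math

G = 6.674e-11     # m^3 kg^-1 s^-2
M = 1.989e30      # kg, mass of the Sun
r_AU = 1.496e11   # m, Earth's starting distance
r_sun = 6.957e8   # m, radius of the Sun (my r_0)

def v(r):
    # speed from energy conservation: 1/2 v^2 = GM (1/r - 1/r_AU)
    return math.sqrt(2 * G * M * (1 / r - 1 / r_AU))

# t = ∫[from r_0 to r_AU] dr / v(r); the integrand diverges at r = r_AU,
# so stop a hair inside the upper bound (midpoint rule)
N = 1_000_000
a, b = r_sun, r_AU * (1 - 1e-9)
h = (b - a) / N
t = sum(h / v(a + (i + 0.5) * h) for i in range(N))
print(t / 86400)   # fall time in days, ~65
```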