jobendorfer

## Homework Statement

A comet is released from rest at a distance r_max from the sun.

Therefore it has no angular momentum.

We can assume the sun is a point mass (i.e., the sun's radius is zero).

How long does it take for the comet to hit the sun?

Let m = comet mass

Let M_s = sun's mass

Let G = gravitational constant

Let r = distance of the comet from the sun, with the origin at the sun.

## Homework Equations

t = sqrt( m/2 ) * ∫ dr / sqrt( E - U(r) )

## The Attempt at a Solution

Since the comet is released from rest, it initially has kinetic energy T = 0 and potential energy

U = -G*M_s*m/r_max.

E = T + U = -G*M_s*m/r_max.

U(r) = -G*M_s*m/r

E - U(r) = G*M_s*m*( 1/r - 1/r_max )

After grinding a bit I got:

t = 1/sqrt( 2*G*M_s ) ∫ dr / sqrt( 1/r - 1/r_max )

with the limits of integration running from 0 to r_max. (The comet moves inward, so dr/dt < 0; reversing the original r_max-to-0 limits absorbs the minus sign and keeps t positive. Note that 1/r - 1/r_max > 0 for r < r_max, so the quantity under the root stays positive.)
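Before attacking the integral analytically, it can help to sanity-check the setup numerically. Here is a quick sketch in plain Python (constants and the 1 AU test case are my assumptions, not part of the problem). With U(r) = -G*M_s*m/r the integrand is 1/sqrt(1/r - 1/r_max), which is real on (0, r_max); the midpoint rule avoids both endpoints, though the integrable 1/sqrt singularity at r = r_max means we need a fairly fine grid:

```python
import math

# Assumed physical constants and test case (not from the problem statement)
G = 6.674e-11        # m^3 kg^-1 s^-2
M_sun = 1.989e30     # kg
r_max = 1.496e11     # m, roughly 1 AU

# t = 1/sqrt(2*G*M_s) * integral_0^{r_max} dr / sqrt(1/r - 1/r_max)
# Midpoint rule: samples at cell centers, so neither endpoint is evaluated.
N = 400_000
h = r_max / N
integral = 0.0
for i in range(N):
    r = (i + 0.5) * h
    integral += h / math.sqrt(1.0 / r - 1.0 / r_max)

t = integral / math.sqrt(2.0 * G * M_sun)
print(t / 86400.0, "days")   # a fall from 1 AU should take roughly 64-65 days
```

The 64-ish-day figure is a useful benchmark: it is the well-known free-fall time from Earth's orbital distance, so if your closed-form answer disagrees wildly with it, something went wrong.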

This is where my eyes glazed over, since it's been 25 years since I've been near a calculus course. Any hints about how to proceed?
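One classic way forward (a hint rather than a full solution): substitute r = r_max*sin²θ. Then dr = 2*r_max*sinθ*cosθ dθ and 1/r - 1/r_max = cos²θ/(r_max*sin²θ), so the square root collapses and the integral becomes 2*r_max^(3/2) ∫_0^(π/2) sin²θ dθ = (π/2)*r_max^(3/2), giving t = π*sqrt( r_max³/(8*G*M_s) ). A quick numerical check of that substitution, sketched in plain Python:

```python
import math

r_max = 1.0   # arbitrary units; this check is purely about the substitution

# Original integral: I = integral_0^{r_max} dr / sqrt(1/r - 1/r_max).
# With r = r_max*sin^2(theta) it becomes
#   I = integral_0^{pi/2} 2 * r_max**1.5 * sin(theta)**2 d(theta),
# a smooth integrand, so Simpson's rule converges rapidly.
def f(theta):
    return 2.0 * r_max**1.5 * math.sin(theta)**2

N = 1000                       # even number of Simpson subintervals
a, b = 0.0, math.pi / 2.0
h = (b - a) / N
s = f(a) + f(b)
for i in range(1, N):
    s += (4 if i % 2 else 2) * f(a + i * h)
I_numeric = s * h / 3.0

I_closed = math.pi * r_max**1.5 / 2.0   # closed form from the substitution
print(I_numeric, I_closed)              # the two should agree very closely
```

As a cross-check, t = π*sqrt( r_max³/(8*G*M_s) ) is exactly 1/(4*sqrt(2)) times the period of a circular orbit of radius r_max, which is what Kepler's third law predicts for a degenerate ellipse of semi-major axis r_max/2.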

This is Taylor, _Classical Mechanics_, problem 8.21, part (c).

I am NOT in a class. I'm a software grunt trying to educate myself, so any help offered would be greatly appreciated. This problem has had me tearing my hair out for about three days.

Thanks,

John

(jobendorfer@cyberoptics.com)