The problem: assume Andromeda and the Milky Way are going to collide, with a relative speed of 10^6 m/s. Assume each star has the radius of our Sun (6.955*10^8 m) and the average distance between stars is 3.1*10^18 m. How long will it take for a star to collide with the Sun?
I believe it can be solved with the mean free path, using mean free time = lambda/v, where lambda = 1/(n*sigma), n being the number density of stars and sigma the cross-sectional area of the object in question.
The Attempt at a Solution
I've tried a few things with this problem. First I divided the radius by the distance, then divided by the velocity, but that doesn't work because of the units. I tried the mean-free-path approach, but then the distance isn't involved, and it seems like it should be. The other thing I tried was to take the area of a disc whose radius is the distance, subtract the area of the star, and use that for the mean free time. That seemed all right, but I still have a unit problem, and the answers seemed low. Only the first way gave an answer that felt right, but again, unit problems.
I'm now thinking maybe I have to use 1/distance, then divide the radius by that and then by the velocity, but I'm not sure why that would be justified.
I'm stuck, please help.
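One way to make the units come out, assuming the standard kinetic-theory formula: take the number density of stars as n ≈ 1/d^3 (one star per cube of side d), the collision cross-section as σ = π(2R)^2 (two stars of radius R touch when their centers are 2R apart), then λ = 1/(nσ) and t = λ/v. This is only a sketch of that assumption, not a checked answer:

```python
import math

d = 3.1e18      # average star separation, m
R = 6.955e8     # stellar radius (Sun), m
v = 1.0e6       # relative speed, m/s

n = 1.0 / d**3                 # number density of stars, 1/m^3 (assumed: one star per d^3)
sigma = math.pi * (2 * R)**2   # collision cross-section, m^2 (centers within 2R collide)
lam = 1.0 / (n * sigma)        # mean free path, m
t = lam / v                    # mean free time, s

print(f"lambda = {lam:.2e} m, t = {t:.2e} s = {t / 3.156e7:.2e} yr")
```

Note the units: (1/m^3 * m^2) inverts to m, and m divided by m/s gives seconds, which is where the earlier attempts broke down. The enormous result is consistent with the usual statement that individual stars essentially never collide when galaxies merge.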