**1. The problem statement, all variables and given/known data**

(We are to solve this with Monte Carlo programming, based on the universe from Olbers' paradox.)

Suppose you are in an infinitely large, infinitely old universe in which the average density of stars is n = 10^9 Mpc^-3 and the average stellar radius is equal to the Sun's radius R = 7 × 10^8 m. How far, on average, could you see in any direction before your line of sight struck a star? (Assume standard Euclidean geometry holds true in this universe.)

We are allowed to assume:

- The universe is static
- The stars are roughly homogeneously distributed
- Every star has radius = solar radius

**2. Relevant equations**

My thoughts:

l = 1/(n * sigma)

where sigma is the cross-sectional area of interaction, i.e. Pi * R_sun^2

**3. The attempt at a solution**

I have working code and get a decent answer (after a LOT of waiting...), but I wanted to verify it. We had a similar 2D problem in which we calculated the MFP of an arrow shot into a forest with an average tree density of 0.005 trees per m^2 and a tree radius of 1 m.

I was easily able to verify my results using the 2D analogue of the formula from section 2 (in 2D the "cross section" is a length, sigma = 2r), which gives 1/(0.005 × 2 × 1) = 100 m. With a large enough forest and enough runs, my Mathematica program also converged to ~100 m.
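For reference, the 2D forest check can be sketched independently in Python (this is not the original Mathematica code, just a minimal re-implementation). Only trees whose centers lie within a perpendicular distance r of the flight line can block the arrow, and those centers arrive along the line as a Poisson process with rate n × 2r per metre, so we can sample them one at a time with exponential gaps and record the first disk entry point:

```python
import math
import random

def forest_mfp(n=0.005, r=1.0, trials=20000, seed=1):
    """Monte Carlo mean free path of a ray through a random forest.

    Tree centers that can block the ray (|y| < r) arrive along x as a
    Poisson process with rate n * 2r per metre; y is uniform in (-r, r).
    The hit distance is the smallest positive entry point
    x0 - sqrt(r^2 - y^2) over all such trees.
    """
    rng = random.Random(seed)
    rate = n * 2 * r                        # blocking centers per metre
    total = 0.0
    for _ in range(trials):
        best = math.inf
        x0 = 0.0
        # A later center can only beat the current best hit if it lies
        # within r of it, so we can stop once x0 - r passes best.
        while x0 - r < best:
            x0 += rng.expovariate(rate)     # next blocking tree center
            y = rng.uniform(-r, r)
            s = x0 - math.sqrt(r * r - y * y)   # ray enters the disk here
            if 0 < s < best:                # ignore trees straddling x = 0
                best = s
        total += best
    return total / trials

print(forest_mfp())   # analytic value: 1/(n * 2r) = 100 m
```

With 20,000 trials the estimate lands within a couple of metres of the 100 m analytic value.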

For this 3D problem, I wanted to verify the answer (by hand) to confirm my program's results.

If the formula from section 2 can be applied here, with n = 10^9 Mpc^-3 and sigma = Pi * R_sun^2, I get:

**MFP = 6.26 x 10^23 pc**
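The by-hand value can be double-checked with a straight unit conversion (taking 1 pc ≈ 3.0857 × 10^16 m); with these constants the result comes out near 6.2 × 10^23 pc, consistent with the quoted figure to within the rounding of the conversion factors:

```python
import math

PC_M = 3.0857e16              # metres per parsec
n = 1e9 / (1e6 * PC_M) ** 3   # stellar density, converted to m^-3
sigma = math.pi * (7e8) ** 2  # cross section pi * R_sun^2 in m^2
mfp_m = 1 / (n * sigma)       # mean free path in metres
mfp_pc = mfp_m / PC_M
print(f"{mfp_pc:.3g} pc")     # ~6.2e23 pc
```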

(Keep in mind this is a homogeneously distributed universe, meaning no galaxies, no clusters, etc)

Anyway, my program gets around 5 x 10^24, so I'm wondering if my calculation is wrong, or if my program just needs more samples or higher precision.
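As an independent cross-check (a sketch under my own assumptions, not the original program): working in Mpc so the numbers stay manageable, only stars whose centers lie within a perpendicular distance R of the line of sight can block it, and exactly as in the 2D forest those entry points form a Poisson process along the ray with rate n π R² per Mpc, so the same exponential-gap sampling applies:

```python
import math
import random

def universe_mfp(n=1e9, r_star=7e8 / 3.0857e22, trials=20000, seed=1):
    """Monte Carlo line-of-sight distance in a Poisson universe (units: Mpc).

    Star centers within perpendicular distance d < r_star of the ray
    arrive along it with rate n * pi * r_star^2 per Mpc; each blocks
    the ray starting at x0 - sqrt(r_star^2 - d^2).
    """
    rng = random.Random(seed)
    rate = n * math.pi * r_star ** 2      # blocking centers per Mpc
    total = 0.0
    for _ in range(trials):
        best = math.inf
        x0 = 0.0
        while x0 - r_star < best:         # later stars can't beat best
            x0 += rng.expovariate(rate)   # next center along the ray
            # perpendicular offset, uniform over a disk of radius r_star
            d = r_star * math.sqrt(rng.random())
            s = x0 - math.sqrt(r_star ** 2 - d ** 2)
            if 0 < s < best:
                best = s
        total += best
    return total / trials                 # mean free path in Mpc

print(universe_mfp() * 1e6, "pc")
```

With these assumed constants the sketch lands near 6 × 10^23 pc, in line with the hand calculation; if a program returns ~5 × 10^24 instead, a unit conversion somewhere in it would be the first thing I'd re-check.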

Any help is appreciated!
