Suppose we have a cosmological model where condensed objects of characteristic mass ##m## occupy some fraction ##\Omega## of the critical density ##\rho_c##. Take a QSRS to be at a cosmological distance ##L \sim 1/H_0## (with ##c=1##). It's easy to see that the number density of the condensed masses is then ##n = \Omega \rho_c / m##.
QUESTION: why will the expectation value ##\bar l## of the distance of the closest mass to the line of sight (between us and the QSRS) be
$$ \bar l \sim (m H_0/\Omega \rho_c)^{1/2}\,?$$
I've read this claim in "Method for detecting a cosmological density of condensed objects" (1973) by Press and Gunn, and would very much like to see an argument for why it is true.
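My own rough guess at the argument (assuming the condensed objects are distributed uniformly at random, which may not be exactly what Press and Gunn assume): with number density ##n = \Omega \rho_c / m##, the expected number of objects inside a cylinder of radius ##l## around a line of sight of length ##L \sim 1/H_0## is
$$ N \sim n\, \pi l^2 L \sim \frac{\Omega \rho_c}{m}\,\frac{\pi l^2}{H_0}. $$
Demanding ##N \sim 1## for the nearest object then gives ##l \sim (m H_0/\pi \Omega \rho_c)^{1/2}##, which reproduces the quoted scaling up to a factor of order unity. Is this essentially the intended reasoning, or is there a more careful derivation of the expectation value ##\bar l##?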