What exactly rules out a uniform distribution on infinite sets? To be more precise, why are distributions and limits like
\int_{-\infty}^{+\infty}dx\,\lim_{\sigma\to\infty}f_{\mu,\sigma}(x) = 1
\int_{-\infty}^{+\infty}dx\,\lim_{\Lambda\to\infty}\frac{1}{\Lambda}\,\chi_{[a,a+\Lambda]}(x) = 1
not allowed or not reasonable in probability theory? What prevents us from interpreting these as uniform distributions on infinite intervals?
(f_{μ,σ} is the normal density with mean μ and standard deviation σ; χ is the characteristic function of the interval [a, a+Λ].)
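For reference, writing out the densities makes the two limits explicit. Pointwise in x,
\lim_{\sigma\to\infty}f_{\mu,\sigma}(x) = \lim_{\sigma\to\infty}\frac{1}{\sigma\sqrt{2\pi}}\,e^{-(x-\mu)^2/(2\sigma^2)} = 0
\lim_{\Lambda\to\infty}\frac{1}{\Lambda}\,\chi_{[a,a+\Lambda]}(x) = 0
while for every finite σ and Λ
\int_{-\infty}^{+\infty}dx\,f_{\mu,\sigma}(x) = \int_{-\infty}^{+\infty}dx\,\frac{1}{\Lambda}\,\chi_{[a,a+\Lambda]}(x) = 1,
so the "= 1" above amounts to taking the limit after the integral rather than before it.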