It seems to me that if we consider different methods of generating a distribution from an infinite number of samples of an unbounded real number, we get some distinct results:

1) If we randomly sample a value, then its probability must be non-zero, and the same is true of any other value.
2) We pick a random starting value, then iteratively increase and decrease it by a finite, random amount.
3) We take repeated independent random samples, resulting in infinite spacing between them.

Obviously I'm not the first to think of this, so my question is: what is the mathematical formalism for these constructions? What should I look for to find out more?