Is there a mathematical method for determining the optimal sampling of data when estimating probabilities? Flip a coin: simplistically, from everyday experience, it has a 1/2 chance of landing on either side. But what if it can land on its edge? What if it can fall through a crack? What if lava from a fissure invading the room envelops and melts the coin? What if it can quantum-mechanically flip itself after landing?

Other examples of probability, such as the nonlinear trajectory of a particle, exhibit determinism that is not immediately apparent. Even an electronic random number generator driven by a quantum computer is susceptible to decoherence between the device and the observer.

It seems we must first have extensive practical knowledge of the system under observation, and then apply Occam's razor, in order to determine the set of data required. But how can this be done systematically?
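For the narrower version of the question, where a model is fixed in advance (i.i.d. Bernoulli trials, with edge landings, cracks, lava, and quantum re-flips assumed away), there is a standard answer: concentration inequalities relate sample size to estimation accuracy. A minimal sketch using Hoeffding's inequality, which is one common choice rather than the unique "optimum":

```python
import math

def hoeffding_sample_size(eps: float, delta: float) -> int:
    """Number of flips so that the empirical frequency lies within
    eps of the true probability with confidence at least 1 - delta,
    assuming i.i.d. Bernoulli trials (Hoeffding's inequality):
        n >= ln(2 / delta) / (2 * eps**2)
    """
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

# Estimate a coin's bias to within +/-1% at 95% confidence.
print(hoeffding_sample_size(0.01, 0.05))  # 18445 flips
```

Note that the bound is only as good as the modeling assumptions behind it, which is exactly the difficulty raised above: no sample size protects against events the model excludes.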