Is there a mathematical method to determine the optimum sampling of data for probabilities?

Flip a coin. Speaking simplistically from experience, it has a 1/2 chance of landing on either side. But what if it can land on its edge? What if it can fall through a crack? What if lava from a fissure invading the room can envelop and melt the coin? What if it can quantum mechanically flip itself after landing? In other probabilistic examples, such as the nonlinear trajectory of a particle, the underlying determinism is not immediately apparent.

Even an electronic random number generator run by a quantum computer is susceptible to decoherence between the device and the observer. It seems that we must have extensive practical knowledge about the system under observation, then apply Occam's razor, if we are to determine the set of data required. But how may this be done systematically?
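For the simplest case the question raises — repeated yes/no trials like coin flips — one standard systematic answer (a sketch I am adding for illustration, not something from the original post) is a concentration inequality such as Hoeffding's bound: given a desired margin of error and confidence level, it tells you in advance how many independent samples suffice, with no assumptions about the coin's true bias.

```python
import math
import random

def required_samples(epsilon, delta):
    """Hoeffding bound: number of i.i.d. Bernoulli trials needed so that the
    empirical frequency lies within epsilon of the true probability with
    confidence at least 1 - delta, regardless of the true probability."""
    return math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))

def estimate_probability(trial, n):
    """Estimate P(trial() is truthy) as the empirical frequency over n runs."""
    return sum(1 for _ in range(n) if trial()) / n

if __name__ == "__main__":
    # How many flips to pin the bias down to +/- 1% with 95% confidence?
    n = required_samples(epsilon=0.01, delta=0.05)
    print(n)  # 18445

    # Simulate a fair coin and estimate its bias from that many flips.
    rng = random.Random(0)
    p_hat = estimate_probability(lambda: rng.random() < 0.5, n)
    print(round(p_hat, 3))  # close to 0.5
```

Note that this bound is worst-case and distribution-free; it says nothing about the rare events in the question (edge landings, melted coins), whose probabilities are so small that pinning them down empirically would require correspondingly enormous sample sizes — which is arguably where prior physical knowledge and Occam's razor must enter.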

**Physics Forums | Science Articles, Homework Help, Discussion**


# Prioritizing statistical data
