In physics, entropy is typically described as a measure of disorder in a physical system; however, I have also seen it referred to as a measure of statistical uncertainty for a set of data. I recall that the function defining this statistical uncertainty takes the form of an integral and is used in information theory. Could anyone shed some light on this idea of entropy representing a statistical measure of uncertainty?
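The integral form you recall is presumably differential entropy, \(H = -\int p(x)\log p(x)\,dx\); its discrete counterpart is Shannon entropy, \(H = -\sum_i p_i \log p_i\). As a rough illustration of "entropy as uncertainty" (a minimal sketch, not part of the original thread), here is the discrete version in Python — note how a uniform distribution, where the outcome is least predictable, maximizes it:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    Terms with p = 0 contribute nothing (the limit p*log p -> 0).
    With base=2 the result is in bits.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(shannon_entropy([0.5, 0.5]))

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))
```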

**Physics Forums - The Fusion of Science and Community**

# Entropy is typically referred to as a measure of disorder

