Entropy is typically referred to as a measure of disorder

AI Thread Summary
Entropy is commonly understood in physics as a measure of disorder within a physical system, but it also serves as a measure of statistical uncertainty for a set of data. This dual interpretation makes it relevant to both thermodynamics and information theory. The statistical form of entropy is often written as an integral, reflecting its use in quantifying the uncertainty, information content, and variability of data. Entropy's role as a statistical measure of uncertainty thus bridges physics and information theory.
Watts
In physics, entropy is typically referred to as a measure of disorder in a physical system; however, I have also seen it referred to as a measure of statistical uncertainty for a set of data. I recall that the function defining this statistical uncertainty took the form of an integral and that it is used in information theory. Could anybody shed some light on this idea of entropy representing a statistical measure of uncertainty?
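The integral form being recalled is presumably Shannon's entropy from information theory. For a discrete probability distribution with probabilities ##p_i## it is the sum $$H = -\sum_i p_i \log p_i,$$ and for a continuous random variable with probability density ##p(x)## it becomes the integral (the differential entropy) $$h = -\int p(x)\,\log p(x)\,dx.$$ Both quantify the average uncertainty of the distribution: a sharply peaked distribution has low entropy, a spread-out one has high entropy.

As a minimal sketch of the discrete formula applied to a data set (assuming Python with NumPy; the function name ``shannon_entropy`` and the histogram binning are illustrative choices, not something from this thread):

```python
import numpy as np

def shannon_entropy(samples, bins=32, support=(0.0, 1.0)):
    """Estimate Shannon entropy (in bits) of a 1-D data set by
    binning the samples into a histogram of discrete probabilities."""
    counts, _ = np.histogram(samples, bins=bins, range=support)
    p = counts / counts.sum()   # empirical probabilities p_i
    p = p[p > 0]                # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
# A spread-out sample is "more uncertain" than a tightly clustered one,
# so its estimated entropy is higher.
print(shannon_entropy(rng.uniform(0.0, 1.0, 10_000)))   # near log2(32) = 5 bits
print(shannon_entropy(rng.normal(0.5, 0.01, 10_000)))   # much lower
```

Note that the estimate depends on the choice of bins; this discretization is exactly what separates the discrete sum from the continuous integral form.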