
Refuting the idea of entropy equalling "disorder"

  1. Aug 14, 2016 #1
    Hi.

    I have heard that entropy is often called "disorder" but _isn't really so_. I am even more puzzled by the connection, and the difference, between entropy from the "information theory" point of view and from the "thermodynamic" point of view. I see stuff like this:

    http://arstechnica.com/civis/viewtopic.php?f=2&t=3122

    See the posts by "kmellis": he says that entropy in thermodynamics is NOT informational disorder while simultaneously advocating an "information theoretic" basis for physics. How do you hold both positions? If thermodynamic entropy != informational entropy, how do the two relate (or fail to relate) in such a framework?

    I'm curious. How can one prove mathematically that the entropy change in converting an "ordered" stack of identical objects into a "messy"-looking pile (a simplified version of an often-given and apparently invalid example) is identically zero, if you assume all other variables (temperature, etc.) are held constant, so that literally nothing but the rearrangement is going on? I know this is highly idealized, but that's the point: to isolate the "disordering" in the common man's sense and show that it has absolutely zero effect on the entropy of the whole system of objects. What I am wondering is why there couldn't be some immeasurably small but nonzero entropy change, since after all you are rearranging the matter in the system, just not by very much when you think of things on a microscopic scale.
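    To put a rough number on what I mean by "immeasurably small", here is a back-of-the-envelope estimate I tried (my own assumption: treat the stack as N distinguishable rigid objects, so the most the rearrangement itself could contribute is a configurational term of at most k_B ln(N!)):

    Code (Python):
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    # Hypothetical stack of N = 1000 macroscopic objects.
    N = 1000

    # Upper bound on the "configurational" entropy available from
    # rearranging the objects themselves: S_config <= k_B * ln(N!)
    S_config = k_B * math.lgamma(N + 1)   # lgamma(N + 1) = ln(N!)

    # Reference scale: the thermal entropy of one mole of a monatomic
    # ideal gas at room temperature is of order 1e2 J/K.
    S_thermal = 150.0  # J/K, order-of-magnitude reference

    print(f"S_config  ~ {S_config:.1e} J/K")          # ~ 8e-20 J/K
    print(f"ratio     ~ {S_config / S_thermal:.1e}")  # ~ 5e-22

    If that estimate is right, then even granting the rearrangement its own entropy term, it sits some twenty orders of magnitude below anything a thermodynamic measurement could register.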
     
  3. Aug 14, 2016 #2
    Research macrostates and microstates. The macrostate interpretation of entropy does not involve disorder, while the microstate interpretation does, via Boltzmann's entropy formula S = k_B ln W. Both interpretations are equally correct and important, each applying to its respective part of thermodynamics (i.e. classical thermodynamics and statistical mechanics).
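    As a minimal illustration of the microstate picture (my own toy example, not anything specific to this thread): take N two-state particles with exactly n of them excited. The multiplicity W of that macrostate is C(N, n), and S = k_B ln W comes out the same no matter *which* n particles are excited, which is exactly why shuffling identical constituents around does not change the entropy.

    Code (Python):
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(N, n):
        """S = k_B * ln(W) for N two-state particles with n excited.

        W = C(N, n) is the number of microstates compatible with the
        macrostate (N, n)."""
        W = math.comb(N, n)
        return k_B * math.log(W)

    # Same macrostate (N, n) -> same W -> same S, regardless of which
    # particular n particles happen to be the excited ones.
    print(boltzmann_entropy(100, 50))   # ~ 9.2e-22 J/K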
     
  4. Aug 15, 2016 #3

    Andy Resnick


    Part of the difficulty is that there are multiple ways to define the information content of a system. One way, Shannon entropy, is more suited to communications: how much information is required to digitally communicate (possibly including 'measure') the microstate of a system. In this sense, entropy is related to how random the bit stream is: if you can predict the value of an incoming bit before receiving it, the entropy associated with that bit is zero. Thus, in the Shannon context, 'negentropy' is usually more useful than 'entropy'. Data compression is especially well-suited to this context.
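    A minimal sketch of that point (my own illustration): the empirical Shannon entropy of a perfectly predictable bit stream is zero bits per symbol, while a fair-coin stream comes out close to one bit per symbol.

    Code (Python):
    import math
    import random
    from collections import Counter

    def shannon_entropy(bits):
        """Empirical Shannon entropy in bits per symbol: H = -sum p*log2(p)."""
        counts = Counter(bits)
        n = len(bits)
        return sum(-(c / n) * math.log2(c / n) for c in counts.values())

    predictable = [0] * 10_000                                   # every bit known in advance
    coin_flips  = [random.randint(0, 1) for _ in range(10_000)]  # fair coin

    print(shannon_entropy(predictable))  # 0.0 bits/symbol
    print(shannon_entropy(coin_flips))   # ~ 1.0 bits/symbol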

    Another way to define information content is how much information is needed to create the state: the Kolmogorov complexity. In this context, one can pose questions about how much information is required to build a factory that makes things (including other factories). Assigning numbers to the Kolmogorov complexity is not trivial, AFAIK.
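    Kolmogorov complexity is uncomputable in general, but a common computable stand-in (used here purely as an illustration, and an assumption on my part) is compressed length: a highly patterned string compresses to far fewer bytes than a random one of the same size.

    Code (Python):
    import os
    import zlib

    # Compressed length as a crude proxy for Kolmogorov complexity.
    patterned = b"01" * 5_000        # 10,000 bytes with an obvious pattern
    random_   = os.urandom(10_000)   # 10,000 random bytes, essentially incompressible

    print(len(zlib.compress(patterned)))  # a few dozen bytes
    print(len(zlib.compress(random_)))    # ~ 10,000 bytes: no shorter description found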

    There are other measures of information, but I am only familiar with those two. Does this help?
     
  5. Aug 17, 2016 #4
    I guess it depends on how fundamental you want to go. If you are talking about the entropy of a gas or an engine, then you can use a chemistry definition and it's all very objective and we can all agree on what the entropy is. And this is consistent with the laws of thermodynamics. But then you get into statistical mechanics and macrostates and it all sounds very subjective, since a macrostate seems to depend on how much information you have. (I am grouping the statistical mechanics definition and information theoretical definition together, since they seem to be compatible to me.)

    For familiar systems like ideal gases, we can use the convention that a macrostate refers to a set of states with known volume, energy, and particle number. And then entropy is just a state variable, S(V,U,N), and the two notions agree. But if we don't know the volume, does that mean we don't know the entropy? Or does it mean the entropy is greater (because we are referring to a larger set of states with known energy and particle number)?
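    For concreteness, here is what that S(V, U, N) looks like for a monatomic ideal gas via the Sackur-Tetrode equation (the helium-like numbers below are just my own illustrative choices):

    Code (Python):
    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    h   = 6.62607015e-34 # Planck constant, J*s

    def sackur_tetrode(V, U, N, m):
        """Entropy S(V, U, N) of a monatomic ideal gas of particle mass m."""
        term = (V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5
        return N * k_B * (math.log(term) + 2.5)

    # Illustrative numbers: roughly one mole of helium at ~300 K and 1 atm.
    N = 6.022e23             # particles
    m = 6.6e-27              # kg, approximate mass of a helium atom
    V = 0.0248               # m^3, molar volume at ~300 K, 1 atm
    U = 1.5 * N * k_B * 300  # J, since U = (3/2) N k_B T for a monatomic gas

    print(sackur_tetrode(V, U, N, m))  # ~ 1.3e2 J/K, close to tabulated values for He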

    Practically speaking, we don't measure the energy. We measure the temperature and infer the energy. But temperature is also defined in terms of the entropy.
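    (Concretely, the usual relation is 1/T = (∂S/∂U) at fixed V and N, so the temperature we measure is itself read off from how the entropy depends on the energy.)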

    It seems like the chemistry definition is more useful, but the information-theoretical definition extends more readily to exotic systems whose state variables are not volume, energy, and particle number. For example, in a big bang nucleosynthesis experiment, particle number might not be a useful state variable. When we report a value of entropy for an exotic system, we have to define what we consider a macrostate to be.
     