Is there a difference between statistical and thermodynamic entropy?

  • Thread starter zeromodz
  • #1
I read that entropy is a measure of disorder, or in other words, unpredictability. So if I have a fair 50/50 coin, it is at maximum entropy (unpredictability). If I have a two-headed coin, I have zero entropy. It seems that the more possible unpredictable states there are, the higher the entropy.

Now compare this notion to thermodynamic entropy and the heat death of the universe. There will come a point in time when the universe reaches maximum entropy, allowing no more work to be done, because everything will be not just in thermal equilibrium but in complete equilibrium with respect to gravity, electromagnetism, and every other means of extracting work. At this state of maximum entropy, there is no more unpredictability. The universe is dead and every future event is predictable. There is no more chaos and everything is at rest. How does this correspond to the notion of maximum entropy and disorder in the coin example?

Thank you
 

Answers and Replies

  • #2
Andrew Mason
Science Advisor
Homework Helper
I read that entropy is a measure of disorder, or in other words, unpredictability.
Neither statement is correct. In fact, the second law of thermodynamics says that the macroscopic state of a microscopically chaotic system is very predictable.

So if I have a fair 50/50 coin, it is at maximum entropy (unpredictability). If I have a two-headed coin, I have zero entropy. It seems that the more possible unpredictable states there are, the higher the entropy.
Thermodynamic entropy is a concept that only applies to systems with large numbers of particles. It does not apply to a single entity that can have only a few states.

Now compare this notion to thermodynamic entropy and the heat death of the universe. There will come a point in time when the universe reaches maximum entropy, allowing no more work to be done, because everything will be not just in thermal equilibrium but in complete equilibrium with respect to gravity, electromagnetism, and every other means of extracting work. At this state of maximum entropy, there is no more unpredictability. The universe is dead and every future event is predictable. There is no more chaos and everything is at rest. How does this correspond to the notion of maximum entropy and disorder in the coin example?
Your example helps to demonstrate why entropy is not a measure of unpredictability. (That is not to say that your prediction of the final state of the universe is correct. It is by definition an unverifiable theory.)

AM
 
  • #3
Neither statement is correct. In fact, the second law of thermodynamics says that the macroscopic state of a microscopically chaotic system is very predictable.

Thermodynamic entropy is a concept that only applies to systems with large numbers of particles. It does not apply to a single entity that can have only a few states.

Your example helps to demonstrate why entropy is not a measure of unpredictability. (That is not to say that your prediction of the final state of the universe is correct. It is by definition an unverifiable theory.)

AM
I quote

"Entropy is a measure of disorder, or more precisely unpredictability. For example, a series of coin tosses with a fair coin has maximum entropy, since there is no way to predict what will come next. A string of coin tosses with a two-headed coin has zero entropy, since the coin will always come up heads. Most collections of data in the real world lie somewhere in between. It is important to realize the difference between the entropy of a set of possible outcomes, and the entropy of a particular outcome. A single toss of a fair coin has an entropy of one bit, but a particular result (e.g. "heads") has zero entropy, since it is entirely "predictable"."

How do you explain this definition of entropy in information theory then?

My source is
http://en.wikipedia.org/wiki/Entropy_(information_theory)
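
Just to make the quoted coin example concrete, here is a minimal sketch of the Shannon formula H = -sum(p * log2(p)) that the article is describing (the biased-coin case is my own added illustration, not from the article):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss (maximum for two outcomes)
print(shannon_entropy([1.0, 0.0]))  # two-headed coin: 0.0 bits (completely predictable)
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, "somewhere in between"
```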
 
  • #4
You are into an area of physics that I believe is one of the hardest to understand...at least for me. No matter how I look, I just can't get an intuitive feel for it....just like my first thermodynamics professor!!! (That's also what Richard Feynman felt about his first exposure.)

I don't think anybody really understands entropy yet....like gravity, maybe....That's why John von Neumann suggested to Claude Shannon, when Shannon was developing information theory at Bell Labs, that he use "entropy" instead of "uncertainty" in his explanations....opponents would be intimidated!

[A good book with many perspectives is Charles Seife's Decoding the Universe....lots of examples and only a little basic math. My copy is from 2006; you can buy it used online cheap. That last historical tidbit came from there.]

I do believe that one perspective on entropy is that it IS a measure of disorder.

One of the confusing aspects of information and thermodynamic versions of entropy is that nature usually moves to increase entropy, say in the universe, and that means moving towards a more uniform distribution of whatever property you study....at the same time nature moves to dissipate information. Seife says the ideas are exactly the same. At the end no more useful information will be available.

Most discussions of entropy are based on non-gravitational situations, such as the increasingly uniform distribution of gas molecules pumped into a container. But with gravity present, our thinking needs a BIG change in perspective: clumpiness tends to ensue, and with gravity present THAT is increasing entropy....less order. So the big bang itself, where gravity dominated, was a very LOW entropy state. At the end of the universe uniformity will prevail without gravity...entropy will be high. That is strange!!!! (at least to me.)

Another weird aspect of entropy is that only erasing bits of information causes an increase in entropy...you can add, multiply, or copy bits without, in principle, consuming any energy...but erasure costs energy and increases entropy. It's called Landauer's principle. Apparently erasing a bit releases heat and therefore increases the entropy of the universe. And yet another weird aspect of entropy is that reversible operations don't increase the entropy of the universe; irreversible operations do...
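
Landauer's principle puts a concrete number on that heat cost: at least k_B * T * ln(2) per erased bit. A minimal sketch (the 300 K room temperature is just an assumed example value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def landauer_limit(temperature_kelvin):
    """Minimum heat released when one bit is erased at temperature T: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At an assumed room temperature of 300 K, erasing one bit must release at least ~2.9e-21 J.
print(landauer_limit(300.0))
```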


If you want to go further, try studying Maxwell's demon...that misconception lasted 111 years!! [I don't really get that either!] You could also consider reading about information in messages: the sending machine should have maximum entropy, the most information, so it's useful at the other end, yet the receiving signal should have minimal entropy, so there is no ambiguity. The streams that look random, that are least predictable, are likely to carry the most information per symbol. Predictable streams, like 1,1,1,1,1,1,1,1,..., are redundant and probably carry less information.
 
  • #5
Andrew Mason
Science Advisor
Homework Helper
I quote

"Entropy is a measure of disorder, or more precisely unpredictability. For example, a series of coin tosses with a fair coin has maximum entropy, since there is no way to predict what will come next. A string of coin tosses with a two-headed coin has zero entropy, since the coin will always come up heads. Most collections of data in the real world lie somewhere in between. It is important to realize the difference between the entropy of a set of possible outcomes, and the entropy of a particular outcome. A single toss of a fair coin has an entropy of one bit, but a particular result (e.g. "heads") has zero entropy, since it is entirely "predictable"."

How do you explain this definition of entropy in information theory then?

My source is
http://en.wikipedia.org/wiki/Entropy_(information_theory)
The term "entropy" in information theory has nothing to do with thermodynamic entropy.

And, unless you provide a very specific definition of "disorder", it is very misleading to say that thermodynamic entropy is a measure of disorder. Consider a litre of boiling water (100 °C) and an iceberg at -20.000000 °C. Does disorder increase when I pour the litre of boiling water over the iceberg and it all turns to ice at -19.9999999 °C? Why?
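
For reference, the entropy bookkeeping for that scenario can be sketched roughly as follows (treating the iceberg as a reservoir at -20 °C and using rounded textbook values for the heat capacities and latent heat; this is only an estimate, not part of the argument above):

```python
import math

# Rounded textbook values (my own assumptions, not from the post above)
m = 1.0               # kg of boiling water, roughly one litre
c_water = 4186.0      # specific heat of liquid water, J/(kg K)
c_ice = 2100.0        # specific heat of ice, J/(kg K)
L_fusion = 334000.0   # latent heat of fusion, J/kg
T_hot, T_freeze, T_berg = 373.15, 273.15, 253.15  # 100 C, 0 C, -20 C in kelvin

# Entropy change of the poured water as it cools, freezes, and cools further as ice (all negative)
dS_water = (m * c_water * math.log(T_freeze / T_hot)   # cooling 100 C -> 0 C
            - m * L_fusion / T_freeze                   # freezing at 0 C
            + m * c_ice * math.log(T_berg / T_freeze))  # cooling the new ice 0 C -> -20 C

# The iceberg, treated as a huge reservoir at essentially constant -20 C, absorbs all that heat
Q_into_berg = m * c_water * (T_hot - T_freeze) + m * L_fusion + m * c_ice * (T_freeze - T_berg)
dS_berg = Q_into_berg / T_berg

# The water's entropy drops by ~2700 J/K, the iceberg's rises by ~3100 J/K: the total still increases.
print(dS_water, dS_berg, dS_water + dS_berg)
```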

AM
 
  • #6
The term "entropy" in information theory has nothing to do with thermodynamic entropy.

And, unless you provide a very specific definition of "disorder", it is very misleading to say that thermodynamic entropy is a measure of disorder. Consider a litre of boiling water (100 °C) and an iceberg at -20.000000 °C. Does disorder increase when I pour the litre of boiling water over the iceberg and it all turns to ice at -19.9999999 °C? Why?

AM
Your question depends on what you are calling the closed system.

And I think the two entropy terms are ultimately related.
 
  • #7
Andrew Mason
Science Advisor
Homework Helper
Your question depends on what you are calling the closed system.
The closed system is the iceberg and the one litre of water. Ignore the surrounding air.
And I think the two entropy terms are ultimately related.
How?

AM
 
  • #8
The closed system is the iceberg and the one litre of water. Ignore the surrounding air.
How?

AM
Then the entropy has increased.
 
  • #9
Mute
Homework Helper
I think the answer to the question "Are statistical entropy and thermodynamic entropy the same thing?" is not at all trivial. In fact, I was even considering starting a related discussion of it myself.

First, regarding information entropy: this is not quite the same as statistical mechanical entropy. The formulas are the same, but that is because statistical mechanical entropy is an application of information theory to physical-mechanical systems. A shuffled deck of cards has informational entropy but you can't extract work from it (unless you consider actual mechanical degrees of freedom).

I believe the thermodynamic definition of entropy can be derived from the statistical definition; this means there is at least a regime where the two entropies give the same result.
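
The formal connection is just a change of units: the Gibbs/statistical entropy S = -k_B * sum(p_i * ln p_i) is k_B * ln(2) times the Shannon entropy measured in bits. A minimal sketch (the two-level probabilities are arbitrary example values):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def shannon_bits(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs/statistical entropy S = -k_B * sum(p * ln p), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

probs = [0.25, 0.75]  # arbitrary occupation probabilities for a two-level system
print(gibbs_entropy(probs))                     # entropy in J/K
print(K_B * math.log(2) * shannon_bits(probs))  # identical value: only the units differ
```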

Now, that said, E. T. Jaynes was of the opinion that entropy is not a physical quantity, and is only information. His reasoning comes from a thought experiment involving two experimenters with two identical boxes of gas separated by a piston that they use to do work. However, one experimenter knows that the gas on one side of the piston is actually a mixture of two different types, and has chosen a piston that is permeable to one of them. This experimenter can use the gases to do more work than the other. What the other experimenter perceives as a violation of the second law is actually due to his lack of knowledge about other degrees of freedom the system has. At the level of the microcanonical ensemble, we only put in as much information as we have; by adding more information about the system we can always generate new ways to manipulate the entropy and do work.

Jaynes thus concludes that entropy is really only our information about the system. I've never quite convinced myself he was 100% correct. I agree that statistical entropy is an application of information theory to the physical-mechanical systems, but I'm not sure it implies that thermodynamic entropy is necessarily the same and that entropy is only information and not a physical quantity in its own right.

If I can I'll look up the papers later and post the links.
 
  • #10
Andrew Mason
Science Advisor
Homework Helper
Then the entropy has increased.
Of course the entropy has increased. Has the amount of disorder increased? If you think so, please explain.

AM
 
  • #11
Of course the entropy has increased. Has the amount of disorder increased? If you think so, please explain.

AM
If the temperature of the iceberg has risen, then the molecules' positions are less predictable, and so I would say yes, the disorder has increased.

Energy can be put into a system to decrease order, as in your example, or it can be put into a system to increase order, as in a refrigeration system.
 
  • #12
atyy
Science Advisor
It seems that the more possible unpredictable states there are, the higher the entropy.

Now compare this notion to thermodynamic entropy and the heat death of the universe. There will come a point in time when the universe reaches maximum entropy, allowing no more work to be done, because everything will be not just in thermal equilibrium but in complete equilibrium with respect to gravity, electromagnetism, and every other means of extracting work. At this state of maximum entropy, there is no more unpredictability. The universe is dead and every future event is predictable. There is no more chaos and everything is at rest. How does this correspond to the notion of maximum entropy and disorder in the coin example?
Let's just make the universe a box of gas with a partition. At the start, the gas is all on one side of the partition. At some point the partition is removed; this is just removal of a constraint, so neither work nor heat is done on the system, which is still "technically" closed (is it too terrible if we imagine this as the expansion of the universe?). The gas expands to fill the container. This famous scenario is called Joule free expansion. The thermodynamic entropy has increased, and the final "state" is clearly "unique". The relation to increased disorder is that the final state refers to the macroscopic state, not the microscopic state consisting of the position and velocity of every single molecule of gas. There are many more microscopic positions and momenta of individual gas molecules consistent with the one final macroscopic state than with the one initial macroscopic state.

http://www.physics.nus.edu.sg/~phybeb/core/node9.html (Eq 8.34)
http://www.nyu.edu/classes/tuckerman/stat.mech/lectures/lecture_6/node2.html
http://ocw.mit.edu/courses/physics/8-044-statistical-physics-i-spring-2008/lecture-notes/lec14.pdf
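
As a rough numerical illustration of the free expansion above, the ideal-gas result dS = nR ln(V_final/V_initial) gives the following (one mole doubling its volume is an assumed example, not taken from the links):

```python
import math

R = 8.314  # ideal gas constant, J/(mol K)

def free_expansion_entropy(n_moles, volume_ratio):
    """Entropy increase of an ideal gas in Joule free expansion: dS = n * R * ln(V_final / V_initial).
    No heat flows and no work is done, yet the entropy of the gas goes up."""
    return n_moles * R * math.log(volume_ratio)

# One mole of gas whose volume doubles when the partition is removed: dS = R * ln 2, about 5.8 J/K.
print(free_expansion_entropy(1.0, 2.0))
```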
 
  • #13
Andrew Mason
Science Advisor
Homework Helper
If the temperature of the iceberg has risen, then the molecules' positions are less predictable, and so I would say yes, the disorder has increased.
What about the litre of boiling water? Has its disorder increased? If not, can you say that the disorder associated with the boiling water has decreased by less than the amount the disorder of the molecules in the iceberg has increased? How are you measuring disorder?

AM
 
  • #14
What about the litre of boiling water? Has its order increased? If not, can you say that the order associated with the boiling water has decreased by less than the amount the order of the molecules in the iceberg has increased? How are you measuring disorder?

AM
You said the system was the water and the iceberg combined, not just the water, so you can't talk about the order of the water alone unless you change your system, and then the answer for what the entropy is changes as well.

Also, I am confused by the way you worded your second question. The order of the molecules in the iceberg has not increased. When entropy increases, order is lowered.


Here is a whole article on their relationship:
http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
 
  • #15
atyy
Science Advisor
First, regarding information entropy: this is not quite the same as statistical mechanical entropy. The formulas are the same, but that is because statistical mechanical entropy is an application of information theory to physical-mechanical systems. A shuffled deck of cards has informational entropy but you can't extract work from it (unless you consider actual mechanical degrees of freedom).
Now, that said, E. T. Jaynes was of the opinion that entropy is not a physical quantity, and is only information.
I think the author of http://arxiv.org/abs/0810.1019, who does post on PF, doesn't like Jaynes's view.

The paper at http://arxiv.org/abs/cond-mat/0603382 makes very interesting remarks related to this too: "Accordingly, entropy concerns the properties of system when it is manipulated by larger and external degrees of freedom. This reasoning suggests that coarse graining is naturally associated with the concept of entropy which is the property of a system with respect to certain types of external manipulations. The manipulations we are referring to include the active preparation of initial conditions, but exclude mere observations that do not interact with the particles. From this viewpoint, the partitioning or coarse graining we refer to is not subjective."
 
  • #16
Andrew Mason
Science Advisor
Homework Helper
You said the system was the water and the iceberg combined, not just the water, so you can't talk about the order of the water alone unless you change your system, and then the answer for what the entropy is changes as well.
The system is the water + iceberg. So if you are considering the change in "disorder", you can't just take into account the change in "disorder" of the iceberg.

Also, I am confused by the way you worded your second question. The order of the molecules in the iceberg has not increased.
You're right. I meant to say "disorder" where I said order (corrected above).
When entropy increases, order is lowered.
It depends on how you define "order". Entropy of the iceberg + water has increased but it seems to me that there has been a significant decrease in the motion of the molecules in the litre of boiling water and a barely perceptible increase in the motion of the molecules in the iceberg. How can you conclude that "disorder" has increased overall? How do you define disorder?
There is no obvious relationship between information entropy and thermodynamic entropy. Thermodynamic entropy concerns heat flow. It applies to the macroscopic state of matter. The internal organization of the molecules has no significance in thermodynamics. In information theory the microscopic organization of bits is all-important.

AM
 
  • #17
Andy Resnick
Science Advisor
Education Advisor
Insights Author
<snip>
Now compare this [statistical] notion to thermodynamic entropy and the heat death of the universe. There will come a point in time when the universe reaches maximum entropy, allowing no more work to be done, because everything will be not just in thermal equilibrium but in complete equilibrium with respect to gravity, electromagnetism, and every other means of extracting work. At this state of maximum entropy, there is no more unpredictability. The universe is dead and every future event is predictable. There is no more chaos and everything is at rest. How does this correspond to the notion of maximum entropy and disorder in the coin example?
There are some really deep questions in here; start with the notion of 'predictability'. When the universe is at the (global) state of maximum entropy, it may not be possible to have a well-defined notion of *time*; if predictability refers to events at some future time, the concept no longer has meaning.

As for the coin, one must be careful: a coin sitting on the table is something different from a binary string of digits corresponding to actions performed by the coin (and its environment). Defining the entropy of the binary string is a well-posed problem, but defining the entropy of the coin is not: how much energy was required to make the coin? Does the coin change if I remove an atom? If I swap a few atoms with each other? Assigning entropy in this way is not simple, but there are some tentative ideas on how:

http://en.wikipedia.org/wiki/Kolmogorov_complexity
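
Kolmogorov complexity itself is uncomputable, but a crude proxy is how much a general-purpose compressor can shrink the string. A minimal sketch of that idea (the zlib compressed length is only an upper-bound heuristic, and the strings are made-up examples):

```python
import random
import zlib

def compressed_size(bits: str) -> int:
    """Length in bytes of the zlib-compressed string: a crude upper bound on its Kolmogorov complexity."""
    return len(zlib.compress(bits.encode("ascii")))

random.seed(0)
constant = "1" * 10000                                     # record of a two-headed coin: very regular
fair = "".join(random.choice("01") for _ in range(10000))  # record of a fair coin: no pattern to exploit

# The regular string compresses to a few dozen bytes; the fair-coin string stays far larger.
print(compressed_size(constant), compressed_size(fair))
```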
 
  • #18
There is no obvious relationship between information entropy and thermodynamic entropy.
yet the quoted Wikipedia article states:

It is well known that a Shannon based definition of information entropy leads in the classical case to the Boltzmann entropy.
which is consistent with my post #4 from Charles Seife's book where Seife writes:

Shannon's measure of randomness is precisely the same function as Boltzmann entropy.
A related footnote in Seife's book has this:

...in most cases - especially those where messages are sufficiently large or where a stream represents a sufficiently large collection of messages- the entropy/information content of a stream of digits is precisely the same as the entropy/information capacity of the source of that message. The equivalence is statistical just as the second law of thermodynamics is statistical.
I'm depending on "experts" here; I'd love to claim I've personally thought through all the relationships and ramifications....but that ain't happening in this life! It is reassuring to see different opinions, however, expressed in the posts above because, as I posted, this IS a subject where getting an intuitive understanding is NOT easy.....nothing here is "obvious". And it's been a bit of a personal frustration for many years.....
 
