Questions about examples of why entropy is NOT disorder

  • #1
I understand the pedagogical concern that the idea of “disorder” should be discouraged when introducing entropy to new students (e.g. http://entropysite.oxy.edu/cracked_crutch.html).

But I also just read http://www.science20.com/train_thought/blog/entropy_not_disorder-75081 (link now broken) by Steve Donaldson, which provides some examples showing that entropy and disorder are not the same thing.

I have questions about two of his examples:

That the entropy of a shuffled deck of cards is no different from that of an ordered deck.
That the entropy of a living person is greater than that of a dried-up corpse.

In the example of the deck of cards: Isn’t it correct to say that the Gibbs entropy is the same but the Shannon entropy is greater in the shuffled deck?

In the example of a living person vs. a dried-up corpse: isn't it true that a living person has lower entropy than the same entire person dried up, including the 60% of his mass that is water (now dispersed into the environment)?

Also, would it be correct to say that "entropy" exists independent of human thought, whereas "disorder" is a cognitive perception? The same way that light at a wavelength of 580 nm is not the same thing as the color yellow? That entropy exists "in reality" and disorder exists "in our minds"?
 

Answers and Replies

  • #2
Andrew Mason
Science Advisor
Homework Helper
But I also just read http://www.science20.com/train_thought/blog/entropy_not_disorder-75081 (link now broken) by Steve Donaldson, which provides some examples showing that entropy and disorder are not the same thing.
Be careful with these kinds of articles on difficult subjects such as entropy. They are often misleading or very wrong. This one is a good example.

The author states:
Steve Donaldson said:
The entropy of system is the average heat capacity of the system averaged over its absolute temperature.
...

Entropy, S, is the heat content, Q, divided by the body's temperature, T.
S = Q/T
Stated another way, the heat, Q, stored in an object at temperature, T, is its entropy, S, multiplied by its temperature, T.
Q = T x S

In classical thermodynamics one starts with the definition of the change in entropy of a system in moving from one state to another: [itex]\Delta S = \int dQ/T[/itex] over a reversible path between the two states.

Then we define the entropy of a substance at [itex]T = 0[/itex] as zero: [itex]S(0) = 0[/itex].

So the absolute entropy of a system is [itex]S(T) - S(0) = S(T) = \int_0^T dQ/T[/itex]

If a mole of a gas is brought from a reference temperature [itex]T_0[/itex] to temperature T at constant pressure and has constant heat capacity, [itex]C_p[/itex], the entropy change of the gas is:

[itex]\Delta S = \int_{T_0}^T dQ/T = C_p\int_{T_0}^T dT/T = C_p\ln(T/T_0)[/itex]

(One cannot actually start the integral at T = 0 with a constant [itex]C_p[/itex]; real heat capacities vanish as T approaches zero, which is what keeps the absolute entropy finite.)

Its internal energy content is [itex]U = C_vT = (C_p-R)T = C_pT - RT[/itex]

So the entropy of the gas is not its "heat content" divided by T.
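
Here is a quick numerical check of that (a minimal sketch in Python, assuming a monatomic ideal gas with [itex]C_p = \frac{5}{2}R[/itex] and an arbitrary reference temperature of 1 K; only entropy differences matter here):

[code]
import math

R = 8.314            # gas constant, J/(mol K)
Cp = 2.5 * R         # molar heat capacity at constant pressure (monatomic ideal gas)
Cv = Cp - R          # molar heat capacity at constant volume
T0, T = 1.0, 300.0   # reference and final temperatures, K

# Entropy change warming from T0 to T at constant pressure (constant Cp)
S = Cp * math.log(T / T0)

# Heat absorbed along that path, and internal energy at T
Q = Cp * (T - T0)
U = Cv * T

print(f"S   = {S:.1f} J/(mol K)")      # about 118.6
print(f"Q/T = {Q / T:.1f} J/(mol K)")  # about 20.7, not equal to S
print(f"U/T = {U / T:.1f} J/(mol K)")  # about 12.5, not equal to S either
[/code]

Q/T coincides with an entropy change only when the heat is absorbed at a single fixed temperature; once T varies along the path, the integral is unavoidable.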

Re disorder

The second law of thermodynamics is statistical in nature. But entropy cannot be thought of as disorder unless disorder (or order) is defined in a particular way. It is not obvious that there is more disorder in two warm bodies than there is in a hot body and a cold body.
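
To make that concrete, here is a minimal sketch (my own numbers, assuming two identical bodies of constant heat capacity that exchange heat only with each other):

[code]
import math

C = 1000.0              # heat capacity of each body, J/K (assumed value)
Th, Tc = 400.0, 200.0   # initial temperatures of the hot and cold bodies, K
Tf = (Th + Tc) / 2      # final common temperature for equal heat capacities

# Each body's entropy change: integral of C dT/T from its start temperature to Tf
dS_hot = C * math.log(Tf / Th)    # negative: the hot body cools
dS_cold = C * math.log(Tf / Tc)   # positive: the cold body warms

print(f"dS_hot  = {dS_hot:+.1f} J/K")
print(f"dS_cold = {dS_cold:+.1f} J/K")
print(f"total   = {dS_hot + dS_cold:+.1f} J/K")  # positive, as the second law requires
[/code]

The total comes out positive (since [itex]T_f^2 \geq T_hT_c[/itex]), yet the calculation never mentions order or disorder.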

Also, be careful about thermodynamic entropy and information entropy. They both have a statistical nature and have mathematical similarities but they have very different origins and are based on quite different concepts.

AM
 
  • #3
Now I'm really confused. The Wikipedia article on the Third Law of Thermodynamics says

"The third law of thermodynamics is a statistical law of nature regarding entropy:
The entropy of a perfect crystal approaches zero as temperature approaches absolute zero."

and the Wikipedia article on perfect crystals (http://en.wikipedia.org/wiki/Perfect_crystal) says:
"Crystalline materials (mainly metals and alloys, but also stoichiometric salts and other materials) are made up of solid regions of ordered matter (atoms placed in one of a number of ordered formations called Bravais lattices). "

so here I am back to the concept of "order" again...

Can entropy be thought of in terms of order/disorder or not?
 
  • #4
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
Can entropy be thought of in terms of order/disorder or not?

If we answer "no", you will then say, "but so and so said...". If we answer "yes", you will then say, "but I am really confused." You should reread what Andrew said. This description, like most verbal descriptions, is not perfect, nor is it completely wrong. If you really want to understand, you need to be able to calculate.
 
  • #5
If we answer "no", you will then say, "but so and so said...". If we answer "yes", you will then say, "but I am really confused." You should reread what Andrew said. This description, like most verbal descriptions, is not perfect, nor is it completely wrong. If you really want to understand, you need to be able to calculate.

Fair enough, and thanks for the frank response.
 
  • #6
Order, in the sense used with the deck of cards example, measures a different quantity than entropy. A deck of cards can be arranged in 52! ways (I always confuse myself when doing this kind of thing, so let's just say a-whole-buncha). Just ONE of those arrangements has the additional quality of being in rank and suit order. The probability of that specific arrangement is _exactly_ the same as any other specific arrangement, just like the probability of flipping 4 heads in a row (HHHH) is identical to the probability of flipping HTHT.

What is different about the rank order of the deck, or the HHHH, is the mutual information between elements in the system. If you know the deck is sorted and you know just one card's position, you can predict where all the rest are. This is a sort of second-order (no pun intended, of course) entropy measurement, and, IMHO, what many people are trying to get at when they talk about entropy and order.
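
Here is the counting in Python, if it helps (a minimal sketch of my own; nothing beyond the standard library):

[code]
import math

# Number of ways to arrange a 52-card deck (exact integer arithmetic)
n = math.factorial(52)

# Shannon entropy of a uniform distribution over all arrangements
H = math.log2(n)   # about 225.6 bits

# Any specific arrangement, sorted or shuffled, is equally probable
p_sorted = 1 / n
p_random_looking = 1 / n

print(f"arrangements = {n:.3e}")
print(f"H = {H:.1f} bits")
print(f"same probability? {p_sorted == p_random_looking}")
[/code]

So the sorted deck is special because of the correlations between card positions, not because its probability differs from any other arrangement's.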
 
  • #7
Can entropy be thought of in terms of order/disorder or not?

Yes... it's a place to start. But don't think that you'll understand everything about entropy
by following this introductory concept.

Some good stuff here:
http://en.wikipedia.org/wiki/Entropy_(information_theory)

It turns out entropy can be considered a subset of information theory!
...and there are some STRANGE principles contained within, like Landauer's principle, also described in Wikipedia.
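
For a taste of how concrete that connection gets, Landauer's principle says erasing one bit must dissipate at least [itex]kT\ln 2[/itex] of heat. A minimal sketch of the number (my own, assuming room temperature):

[code]
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed room temperature, K

# Landauer's bound: erasing one bit dissipates at least kT ln 2 of heat
E_min = k * T * math.log(2)

print(f"Minimum heat to erase one bit at {T:.0f} K: {E_min:.2e} J")  # about 2.9e-21 J
[/code]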
 
  • #8
Yes... it's a place to start. But don't think that you'll understand everything about entropy
by following this introductory concept.

Some good stuff here:
http://en.wikipedia.org/wiki/Entropy_(information_theory)

It turns out entropy can be considered a subset of information theory!
...and there are some STRANGE principles contained within, like Landauer's principle, also described in Wikipedia.

The paper by Jaynes cited in that Wiki page is here:
http://bayes.wustl.edu/etj/articles/theory.1.pdf

It would be interesting to know if that work has withstood the test of time.
 
