Questions about examples of why entropy is NOT disorder

In summary: there is a pedagogical concern about using the term "disorder" when introducing entropy to new students, and there are examples showing that entropy and disorder are not the same thing. There is a difference between thermodynamic entropy and information entropy, and the concept of order can complicate the understanding of entropy. The Third Law of Thermodynamics states that the entropy of a perfect crystal approaches zero as temperature approaches absolute zero, but this does not necessarily mean that entropy can be thought of in terms of order/disorder. To truly understand entropy, one must be able to calculate.
  • #1
GeorgeWBush
I understand the pedagogical concern that the idea of “disorder” should be discouraged when introducing entropy to new students (e.g. http://entropysite.oxy.edu/cracked_crutch.html).

But I also just read http://www.science20.com/train_thought/blog/entropy_not_disorder-75081 by Steve Donaldson, which provides some examples showing that entropy and disorder are not the same thing.

I have questions about two of his examples:

That the entropy of a shuffled deck of cards is no different from that of an ordered deck.
That the entropy of a living person is greater than that of a dried-up corpse.

In the example of the deck of cards: Isn’t it correct to say that the Gibbs entropy is the same but the Shannon entropy is greater in the shuffled deck?

In the example of a living person vs. a dried-up corpse: isn't it true that a living person has lower entropy than the same entire person dried up, including the 60% of his mass that is water (now dispersed into the environment)?

Also, would it be correct to say that "entropy" exists independent of human thought, whereas "disorder" is a cognitive perception? The same way that light at a wavelength of 580 nm is not the same thing as the color yellow? That entropy exists "in reality" and disorder exists "in our minds".
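To make the Shannon side of the card question concrete, here is a minimal Python sketch. It assumes we model a fair shuffle as a uniform distribution over all 52! orderings; the numbers describe our uncertainty about the arrangement, not the thermodynamic entropy of the cards' molecules.

```python
import math

# Shannon entropy (in bits) of a discrete probability distribution.
def shannon_entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A deck known to be in rank-and-suit order: one outcome, probability 1.
print(shannon_entropy([1.0]))   # 0.0 bits -- no uncertainty

# A fairly shuffled deck: uniform over all 52! orderings. For a uniform
# distribution the entropy is just log2 of the outcome count, so we avoid
# summing over all ~8.07e67 terms.
n = math.factorial(52)
print(math.log2(n))             # ~225.6 bits of uncertainty
```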
 
  • #2
GeorgeWBush said:
But I also just read http://www.science20.com/train_thought/blog/entropy_not_disorder-75081 by Steve Donaldson, which provides some examples showing that entropy and disorder are not the same thing.
Be careful with these kinds of articles on difficult subjects such as entropy. They are often misleading or very wrong. This one is a good example.

The author states:
Steve Donaldson said:
The entropy of system is the average heat capacity of the system averaged over its absolute temperature.
...

Entropy, S, is the heat content, Q, divided by the body's temperature, T.
S = Q/T
Stated another way, the heat, Q, stored in an object at temperature, T, is its entropy, S, multiplied by its temperature, T.
Q = T x S

In classical thermodynamics one starts with the definition of the change in entropy of a system in moving from one state to another: [itex]\Delta S = \int dQ/T[/itex] over a reversible path between the two states.

Then we define the entropy of a substance at T=0 as zero: S(0) = 0.

So the absolute entropy of a system is [itex]S(T) - S(0) = S(T) = \int_0^T dQ/T[/itex]

If a mole of a gas is brought from 0 Kelvin to temperature T at constant pressure and has constant heat capacity, Cp, the entropy of the gas is:

[itex]S = \int_0^T dQ/T = C_p\int_0^T dT/T = C_p\ln T - S_0 = C_p\ln T[/itex]

Its internal energy content is [itex]U = C_vT = (C_p-R)T = C_pT - RT[/itex]

So the entropy of the gas is not its "heat content" divided by T.
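To put rough numbers on this, here is a minimal Python sketch of the constant-pressure case above. The gas (monatomic, so Cp = 5R/2) and the temperatures are illustrative choices, not values from the article:

```python
import math

R = 8.314               # gas constant, J/(mol K)
Cp = 2.5 * R            # constant-pressure heat capacity, monatomic ideal gas
T1, T2 = 100.0, 300.0   # illustrative start/end temperatures, K

# Heat absorbed in a reversible constant-pressure heating:
Q = Cp * (T2 - T1)

# Entropy change: dS = dQ/T = Cp dT/T, integrated over the path.
dS = Cp * math.log(T2 / T1)

print(Q / T2)   # "heat divided by temperature": ~13.9 J/K
print(dS)       # actual entropy change:         ~22.8 J/K -- not the same
```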

Re disorder

The second law of thermodynamics is statistical in nature. But entropy cannot be thought of as disorder unless disorder or order is defined in a particular way. It is not obvious that there is more disorder in two warm bodies than there is in a hot body and a cold body.
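One way to see that last point is to run the numbers for two identical bodies brought into thermal contact (constant heat capacity C, no work, heat exchanged only with each other). A minimal sketch, with illustrative values:

```python
import math

C = 100.0               # heat capacity of each body, J/K (illustrative)
Th, Tc = 400.0, 200.0   # initial hot and cold temperatures, K

Tf = (Th + Tc) / 2      # identical bodies equilibrate at the mean temperature

# Entropy change of each body, computed along a reversible path:
dS_hot = C * math.log(Tf / Th)    # negative: the hot body cools
dS_cold = C * math.log(Tf / Tc)   # positive: the cold body warms

print(dS_hot + dS_cold)  # ~ +11.8 J/K: the two warm bodies have MORE entropy
```

Whether the warm pair counts as "more disordered" than the hot/cold pair is exactly the kind of judgment the word leaves undefined.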

Also, be careful about thermodynamic entropy and information entropy. They both have a statistical nature and have mathematical similarities but they have very different origins and are based on quite different concepts.

AM
 
  • #3
Now I'm really confused. The Wikipedia article on the Third Law of Thermodynamics says

"The third law of thermodynamics is a statistical law of nature regarding entropy:
The entropy of a perfect crystal approaches zero as temperature approaches absolute zero."

and the Wikipedia article on the perfect crystal (http://en.wikipedia.org/wiki/Perfect_crystal) says:
"Crystalline materials (mainly metals and alloys, but also stoichiometric salts and other materials) are made up of solid regions of ordered matter (atoms placed in one of a number of ordered formations called Bravais lattices). "

so here I am back to the concept of "order" again...

Can entropy be thought of in terms of order/disorder or not?
 
  • #4
GeorgeWBush said:
Can entropy be thought of in terms of order/disorder or not?

If we answer "no", you will then say, "but so and so said...". If we answer "yes", you will then say, "but I am really confused." You should reread what Andrew said. This description, like most verbal descriptions, is not perfect, nor is it completely wrong. If you really want to understand, you need to be able to calculate.
 
  • #5
Vanadium 50 said:
If we answer "no", you will then say, "but so and so said...". If we answer "yes", you will then say, "but I am really confused." You should reread what Andrew said. This description, like most verbal descriptions, is not perfect, nor is it completely wrong. If you really want to understand, you need to be able to calculate.

fair enough, and thanks for the frank response.
 
  • #6
Order, in the sense used with the deck of cards example, measures a different quantity than entropy. A deck of cards can be arranged in 52! -- I always confuse myself when doing this kind of thing, so let's just say a-whole-buncha -- ways. Just ONE of those arrangements has the additional quality of being in rank and suit order. The probability (entropy) of that arrangement is _exactly_ the same as that of any other specific arrangement, just like the probability of flipping 4 heads in a row -- HHHH -- is identical to that of flipping HTHT.

What is different about the rank order of the deck, or the HHHH, is the Mutual Information between elements in the system. If you know the deck is sorted and just one card's position, you can predict where all the rest are. This is a sort of Second Order (no pun intended, of course) entropy measurement, and, IMHO, what many people are trying to get at when they talk about entropy and order.
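A quick sketch of the counting, assuming a fair shuffle and fair coins:

```python
import math

# Any one specific ordering of a fairly shuffled deck is equally likely:
n = math.factorial(52)   # the "a-whole-buncha": 52! ~ 8.07e67
p_sorted = 1 / n         # probability of the rank-and-suit ordering
p_other = 1 / n          # ...identical for any other fixed ordering

# Same point with coins: HHHH and HTHT are equally likely specific sequences.
p_HHHH = 0.5 ** 4
p_HTHT = 0.5 ** 4
print(p_HHHH, p_HTHT)    # 0.0625 0.0625 -- each specific sequence has p = 1/16

# What distinguishes the sorted deck is predictive structure, not probability:
# given "sorted" plus one card's position, every other position is determined.
```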
 
  • #7
Can entropy be thought of in terms of order/disorder or not?

Yes... it's a place to start. But don't think that you'll understand everything about entropy by following this introductory concept.

Some good stuff here:
http://en.wikipedia.org/wiki/Entropy_(information_theory)

It turns out entropy can be considered a subset of information theory!
...and there are some STRANGE principles contained within, like Landauer's principle... also described in Wikipedia...
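For a feel of how Landauer's principle ties the two together, here is a back-of-the-envelope sketch (room temperature is my illustrative choice):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K (illustrative)

# Landauer's bound: erasing one bit of information must dissipate
# at least k_B * T * ln(2) of heat into the environment.
E_per_bit = k_B * T * math.log(2)
print(E_per_bit)     # ~2.87e-21 J per erased bit
```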
 
  • #8
Naty1 said:
Yes... it's a place to start. But don't think that you'll understand everything about entropy by following this introductory concept.

Some good stuff here:
http://en.wikipedia.org/wiki/Entropy_(information_theory)

It turns out entropy can be considered a subset of information theory!
...and there are some STRANGE principles contained within, like Landauer's principle... also described in Wikipedia...

The paper by Jaynes cited in that Wiki page is here:
http://bayes.wustl.edu/etj/articles/theory.1.pdf

It would be interesting to know if that work has withstood the test of time.
 

1. Why is entropy not the same as disorder?

Entropy is a measure of the number of ways in which the atoms or molecules of a system can be arranged while maintaining the same energy. Disorder, on the other hand, is a subjective term that describes the perceived randomness or lack of organization in a system. While high entropy can often be associated with disorder, it is not the same concept.
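The counting definition in this answer is Boltzmann's S = k ln W. A minimal sketch with a toy system (the four-particle example is purely illustrative):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """Entropy from the number of equally likely microstates W: S = k_B ln W."""
    return k_B * math.log(W)

# Toy system: 4 two-state particles, macrostate "exactly 2 are excited".
W = math.comb(4, 2)              # 6 arrangements share this macrostate
print(W, boltzmann_entropy(W))   # 6, ~2.47e-23 J/K
```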

2. Can you provide an example of why increasing entropy does not always result in disorder?

An example of this is the process of crystallization. When a liquid solution is cooled and the molecules come together to form a crystal, the entropy of the system decreases as the molecules become more ordered. However, this process also results in an increase in the entropy of the surroundings, as heat is released. Therefore, even though the entropy of the system decreases, the overall entropy of the universe increases due to the increase in the entropy of the surroundings.
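The bookkeeping for a concrete case, freezing one mole of water, looks like this (a sketch using the textbook value ΔH_fus ≈ 6.01 kJ/mol; the surroundings temperature is an illustrative choice):

```python
dH_fus = 6010.0   # J/mol, approximate enthalpy of fusion of water
T_melt = 273.15   # K, melting point of water
T_surr = 263.15   # K, surroundings cold enough to drive freezing

# Crystallization lowers the entropy of the water itself...
dS_system = -dH_fus / T_melt    # ~ -22.0 J/(K mol)

# ...but the released heat raises the entropy of the colder surroundings more:
dS_surround = dH_fus / T_surr   # ~ +22.8 J/(K mol)

print(dS_system + dS_surround)  # > 0: total entropy still increases
```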

3. How does the second law of thermodynamics explain why entropy is not disorder?

The second law of thermodynamics states that the total entropy of an isolated system will always increase over time. This means that in an isolated system, the number of accessible microstates (ways in which the system can be arranged) will tend to increase, leading to an increase in entropy. This increase in entropy does not necessarily result in disorder, as the system may still exhibit a high degree of organization or structure.
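A toy count shows why the statistics push toward the macrostates with the most microstates (coins stand in for two-state particles):

```python
import math

# N two-state "particles"; the macrostate is how many are in the "up" state.
N = 20
for up in (0, 5, 10, 15, 20):
    print(up, math.comb(N, up))   # microstates per macrostate
# 0:1, 5:15504, 10:184756, 15:15504, 20:1 -- the middle dominates,
# so a system wanders toward the heavily populated macrostates.
```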

4. Are there any examples of systems with high entropy that exhibit a high degree of organization?

Yes, living organisms are a prime example of this. Despite having a high degree of entropy due to the large number of molecules and chemical reactions occurring within them, living organisms are highly organized and maintain a complex structure. This is because living organisms constantly take in energy from their surroundings and use it to maintain their organization and decrease their entropy.

5. How does the concept of negative entropy relate to the idea that entropy is not disorder?

Negative entropy, also known as negentropy, is a term used to describe systems that decrease in entropy over time. This concept is often used to explain how living organisms can maintain their high degree of organization despite the tendency for entropy to increase. It highlights the fact that entropy is not always disorder, as in the case of living systems where negative entropy is constantly being generated to maintain order and organization.
