Is entropy truly a measure of disorder or is it something else entirely?

In summary, the conversation discusses the relationship between entropy and disorder. Entropy is one measure of disorder, but the everyday notion of disorder does not always line up with the physical definition. The traditional thermodynamic definition of entropy can be difficult to understand, and Shannon's reinterpretation of entropy as a measure of missing information may be more comprehensive. The concept is explored further in A. Katz's book Principles of Statistical Mechanics, and recent experimental work on the "quantum Maxwell demon" supports the information-theoretical interpretation of entropy.
  • #1
LEELA PRATHAP KUMAR
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
 
  • #2
LEELA PRATHAP KUMAR said:
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
That topic is often discussed here. I suggest a forum search (always a good idea on basic questions). A good place to start is with the links at the bottom of the page (the forum does a brief search for you, based on your subject line).
 
  • Like
Likes DrClaude
  • #3
LEELA PRATHAP KUMAR said:
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
Certainly not "always." How do you define disorder?
 
  • #4
Most scientific areas take common words and give them a narrower technical meaning. There are lots of common uses and broader understandings of physics words that may differ from the strict physics definitions: energy, force, momentum, revolution, period, etc. Disorder is one of those words as well.

Entropy is one measure of disorder. There are other uses and non-technical understandings of the word disorder which do not exactly correspond with the physics definition of entropy.
 
  • #5
Well, this is something you often read, but to be honest I never understood it. The traditional approach of introducing temperature as an integrating factor to make ##\delta Q=T \mathrm{d} S##, which brings entropy into the game, didn't help me much either.

What I find most convincing is to use Shannon's reinterpretation of entropy as a measure of missing information (relative to a prior state defining complete information) for a given probability distribution, leading to the Shannon-Jaynes-von Neumann entropy in (quantum) statistical physics. The idea of course goes back to Szilard's famous paper on the Maxwell demon of 1928.
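
As a rough illustration of the missing-information reading, here is a minimal Python sketch (the probability distributions are made-up toy examples): the more sharply peaked the distribution over microstates, the less information is missing and the smaller the entropy.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i (in nats), skipping zero-probability states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Sharply peaked distribution: we are almost certain which microstate is realized,
# so little information is missing and the entropy is small.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.17 nats

# Uniform distribution over the same four states: maximal missing information.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln(4) ~ 1.39 nats
```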

A good introduction to statistical physics (both classical and quantum) using the information-theoretical approach is

A. Katz, Principles of Statistical Mechanics, W. H. Freeman and Company, San Francisco and London, 1967.

I also think that, with the recent experimental work on the "quantum Maxwell demon", it is almost empirically proven that the information-theoretical interpretation of entropy is the most comprehensive one.
 
  • Like
Likes DrClaude

1. What is the difference between entropy and disorder?

Entropy is a measure of the amount of energy in a system that is unavailable for doing work; in the statistical picture it counts how many microscopic arrangements are compatible with the system's macroscopic state. Disorder refers to the randomness or chaos of the arrangement of particles in a system, which is why entropy is often loosely described as a measure of disorder.
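
A toy illustration of the statistical picture, using Boltzmann's formula ##S = k_B \ln W## (the particles-in-boxes numbers below are made up for the example):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B ln W for a macrostate with W equally likely microstates."""
    return k_B * math.log(W)

# Toy system: 4 distinguishable particles distributed over 2 boxes.
# "Ordered" macrostate (all 4 in the left box): exactly 1 microstate.
# "Disordered" macrostate (2 in each box): C(4, 2) = 6 microstates.
print(boltzmann_entropy(1))  # 0.0 J/K       -> low entropy
print(boltzmann_entropy(6))  # ~2.5e-23 J/K  -> higher entropy
```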

2. How does entropy affect the behavior of a system?

As a system's entropy increases, its energy becomes more spread out, less of it is available for doing useful work, and the arrangement of its parts becomes more random. The statement that the total entropy of an isolated system never decreases is the Second Law of Thermodynamics.
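
A standard textbook example is the free expansion of an ideal gas into a vacuum: no heat flows and no work is done, yet the entropy rises and the opportunity to extract work from the pressure difference is lost. A short sketch (the mole number and volumes are made-up values):

```python
import math

R = 8.314  # ideal gas constant, J/(mol*K)

def free_expansion_entropy(n, V1, V2):
    """Entropy change dS = n R ln(V2/V1) for an ideal gas expanding freely from V1 to V2."""
    return n * R * math.log(V2 / V1)

# 1 mol of ideal gas doubling its volume with no heat or work exchanged.
print(free_expansion_entropy(1.0, 1.0, 2.0))  # ~ +5.76 J/K
```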

3. Is there a relationship between entropy and temperature?

Yes, the two are closely related. Adding heat reversibly to a system at temperature ##T## raises its entropy by ##\mathrm{d}S = \delta Q/T##, and for ordinary systems the entropy grows as the temperature rises: at higher temperatures the particles have more energy, can occupy more microstates, and the randomness of the system increases.
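
For a body with a roughly constant heat capacity, integrating ##\mathrm{d}S = \delta Q/T## gives ##\Delta S = m c \ln(T_2/T_1)##. A quick sketch with made-up numbers (1 kg of water heated from 300 K to 350 K):

```python
import math

def entropy_change_heating(m, c, T1, T2):
    """Delta S = integral of dQ/T = m c ln(T2/T1) for reversible heating at constant heat capacity c."""
    return m * c * math.log(T2 / T1)

# 1 kg of water, c ~ 4186 J/(kg*K), heated from 300 K to 350 K.
print(entropy_change_heating(1.0, 4186.0, 300.0, 350.0))  # ~ +645 J/K: entropy rises with temperature
```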

4. Can entropy be reversed?

Not globally. The entropy of a part of a system can be decreased (a refrigerator does exactly that), but only at the cost of a larger entropy increase elsewhere; the Second Law of Thermodynamics states that the total entropy of an isolated system never decreases.

5. How is entropy related to the concept of equilibrium?

Entropy is closely related to the concept of equilibrium. An isolated system evolves until its entropy reaches a maximum; that state of maximum entropy is equilibrium. At that point no further useful work can be extracted from internal differences, and the system is in its most probable, most disordered macrostate.
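
As an illustration of "entropy is maximal at equilibrium", consider two identical blocks exchanging heat until their temperatures equalize: the total entropy change is positive. A small sketch with made-up heat capacity and temperatures:

```python
import math

def equilibration_entropy_change(C, T_hot, T_cold):
    """Total entropy change when two identical blocks of heat capacity C (J/K)
    reach their common final temperature T_f = (T_hot + T_cold) / 2."""
    T_f = (T_hot + T_cold) / 2
    return C * math.log(T_f / T_hot) + C * math.log(T_f / T_cold)

# Two blocks, C = 100 J/K each, starting at 400 K and 300 K.
print(equilibration_entropy_change(100.0, 400.0, 300.0))  # ~ +2.1 J/K: entropy peaks once temperatures are equal
```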
