How can we best describe entropy other than with disorder?


Discussion Overview

The discussion revolves around various interpretations of entropy beyond the common notion of disorder. Participants explore theoretical perspectives, including statistical physics, energy distribution, and the implications of these interpretations in different contexts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants propose that entropy can be understood as the natural logarithm of the multiplicity of the macrostate, as described in statistical physics.
  • One participant suggests that entropy reflects the amount of ignorance about a system, emphasizing the distinction between macrostate and microstate knowledge.
  • Another viewpoint highlights that entropy can be described as a spreading of energy, indicating that as energy disperses, entropy increases.
  • A participant challenges the disorder interpretation by presenting a scenario involving chemical reactions, suggesting that in certain cases, entropy may lead to more order rather than disorder.
  • There is mention of multiple interpretations of entropy, including the Q/T formulation and the concept of negative information, raising questions about how these perspectives might be interconnected.

Areas of Agreement / Disagreement

Participants express differing views on the interpretation of entropy, with no consensus reached on a singular definition or understanding. Multiple competing perspectives remain present throughout the discussion.

Contextual Notes

Some claims depend on specific definitions of terms like "disorder" and "energy," and the discussion includes unresolved questions about the relationship between different interpretations of entropy.

Grajmanu
Entropy is often described as a measure of disorder in a system, or as the amount of energy in a system that is unavailable to do useful work. What other perspectives on entropy are there?
 
Statistical physics: entropy is the natural logarithm of the multiplicity of the macrostate.
 
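In symbols, this is Boltzmann's formula (stated here for reference, with the conventional factor of Boltzmann's constant that the one-line statement above leaves implicit):

$$ S = k_B \ln \Omega $$

where $\Omega$ is the multiplicity, i.e. the number of microstates compatible with the given macrostate.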
Simply state which forms of energy are disordered first, before looking up the definition of work energy in an engineering physics textbook.
 
I find the statistical physics interpretation more satisfying. Entropy can be considered as the amount of ignorance we have about a system. For example, we can know the system's macrostate (e.g. its temperature, internal energy, etc.) but not which microstate it is in. A microstate here means a particular assignment of quantum states, such as energy levels, or, classically, the individual positions and momenta of each particle.

The idea is that while we can always be given a macrostate, there are often many (very many!) microstates consistent with that macrostate. The entropy is then the logarithm of how many microstates are consistent with the macrostate (the multiplicity), and these microstates change with time (Liouville's theorem governs those changes). So the less we know about the positions and momenta of each particle, the more entropy there is. If we fix one particle in a given position, we have less entropy, because we have reduced the number of available microstates. Entropy is really a product of this statistical interpretation.
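As a concrete illustration of the multiplicity picture described above, here is a short Python sketch for a toy system of two-state spins; the parameters N and n_up are arbitrary choices for the example, not anything taken from the discussion.

```python
# Toy model: N two-state spins, with the macrostate "exactly n_up spins point up".
from math import comb, log

N = 100    # number of spins (arbitrary example value)
n_up = 50  # macrostate: how many spins point up

# Multiplicity: number of microstates (specific spin arrangements) realizing the macrostate.
omega = comb(N, n_up)
S = log(omega)  # dimensionless entropy, S / k_B = ln(omega)

# Fixing one particular spin to be "up" leaves fewer compatible microstates,
# so the entropy drops, as the post describes.
omega_fixed = comb(N - 1, n_up - 1)
S_fixed = log(omega_fixed)

print(f"ln(omega)            = {S:.3f}")
print(f"ln(omega), one fixed = {S_fixed:.3f}  (smaller: less ignorance, less entropy)")
```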
 
One generally speaks of two sets of distinguishable molecules (say, red and green ones) separated by a membrane. One removes the membrane, they get all mixed up, and the result is more disordered than when they were separated.

What about the case where the red molecules are oxygen and the green ones are hydrogen and taking away the membrane causes a spark? After the smoke clears away (and a lot of energy with it), the result is more ordered, it seems to me. In which case, the disorder idea does not work. Rather, one must speak in terms of diffusion of energy, which works very well.

One chemist, Frank Lambert, has taken on this subject on the Internet.

Any comments?
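For the non-reacting red/green case, the quantitative version of "more mixed up" is the standard ideal mixing entropy (stated here for reference, not quoted from the post):

$$ \Delta S_{\text{mix}} = -nR\,(x_1 \ln x_1 + x_2 \ln x_2) > 0, $$

where $x_1$ and $x_2$ are the mole fractions. In the reacting hydrogen/oxygen case, the entropy of the system alone can indeed decrease, but the heat released spreads into the surroundings, and the total entropy still increases; this is consistent with the "diffusion of energy" view raised above.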
 
Entropy can be described as a spreading of energy. Whenever energy spreads out entropy increases.
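As a small numerical illustration of this "spreading" view (a sketch with assumed numbers, not part of the original post), consider two identical blocks brought into thermal contact:

```python
# Two identical blocks with heat capacity C exchange heat until they share a
# common temperature; the total entropy change is positive.
from math import log

C = 10.0        # heat capacity of each block in J/K (assumed value)
T_hot = 400.0   # initial temperatures in kelvin (assumed values)
T_cold = 300.0

T_final = (T_hot + T_cold) / 2  # equal heat capacities -> arithmetic mean

# Integrate dS = C dT / T for each block:
dS_hot = C * log(T_final / T_hot)    # negative: the hot block loses entropy
dS_cold = C * log(T_final / T_cold)  # positive, and larger in magnitude
dS_total = dS_hot + dS_cold          # = C * ln(T_final**2 / (T_hot * T_cold)) > 0

print(f"dS_hot   = {dS_hot:+.4f} J/K")
print(f"dS_cold  = {dS_cold:+.4f} J/K")
print(f"dS_total = {dS_total:+.4f} J/K  (positive: the energy has spread out)")
```

The hot block's entropy drop is outweighed by the cold block's gain, so the total entropy rises as the energy spreads.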
 
Agreed. That makes four ways of looking at entropy: the Q/T of Carnot or Clausius or whoever it was, the k ln(W) of Boltzmann, disorder (same thing, really), and negative (?) information. Can all these be tied together somehow?

Personally, I don't get the information business, basically because I don't see why Maxwell's daemon has to keep records! In fact, that would keep him from making random decisions, would it not? But that is another subject which has probably been discussed.
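One standard way the pictures listed above connect (sketched here as background, not as a settled answer to the question raised in the post): the Gibbs form of the statistical entropy,

$$ S = -k_B \sum_i p_i \ln p_i, $$

reduces to Boltzmann's $S = k_B \ln W$ when all $W$ accessible microstates are equally probable, and, up to the factor $k_B$ and the base of the logarithm, it is the same expression as Shannon's information entropy, which is why "missing information" and "multiplicity" describe the same quantity. For reversible changes the statistical definition in turn reproduces the Clausius relation $dS = \delta Q_{\text{rev}} / T$, linking the Q/T picture to the other two.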
 
