How can we best describe entropy other than with disorder?

AI Thread Summary
Entropy is commonly viewed as a measure of disorder or the amount of unusable energy in a system, with statistical physics defining it as the natural logarithm of the multiplicity of a macrostate. This perspective highlights the difference between macrostates, which can be known, and microstates, which represent the specific configurations of particles within the system. The discussion also touches on how entropy can be seen as a measure of ignorance about a system's microstate, where fewer known positions and momenta lead to higher entropy. A counterpoint is raised regarding scenarios like the reaction between oxygen and hydrogen, suggesting that increased order can occur post-reaction, challenging the disorder concept. Ultimately, the conversation explores various interpretations of entropy, including its relation to energy diffusion and the potential connections between these views.
Grajmanu
Entropy is often described as a measure of disorder in a system, or as the amount of a system's energy that has become unusable. What other perspectives on entropy are there?
 
Statistical physics: entropy is the natural logarithm of the multiplicity of the macrostate.
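For reference, that is Boltzmann's formula, with the Boltzmann constant restored:

$$S = k_B \ln \Omega$$

where $\Omega$ is the multiplicity, i.e. the number of microstates compatible with the given macrostate.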
 
Simply state which forms of energy are disordered first, before looking up the definition of work energy in an engineering physics textbook.
 
I find the statistical physics interpretation more satisfying. Entropy can be considered as the amount of ignorance we have about a system. For example, we can know the system's macrostate (i.e. its temperature, internal energy, etc.) but not which microstate it is in. A microstate here means a particular assignment of quantum states (such as energy levels) or, classically, the individual positions and momenta of each particle. The idea is that, while we can always be given a macrostate, there are usually many (very many!) microstates consistent with that macrostate. The entropy is then the logarithm of the number of microstates consistent with the macrostate (the multiplicity), and the occupied microstate changes with time (Liouville's theorem governs these changes). So the less we know about the positions and momenta of each particle, the higher the entropy. If we fix one particle in position, the entropy drops, because we have reduced the number of available microstates. Entropy, in this view, is very much a product of the statistical interpretation.
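As a concrete toy example (numbers of my own, not from the post): take $N$ spin-1/2 particles with exactly $n$ of them "up". The multiplicity and the dimensionless entropy are

$$\Omega(N,n) = \binom{N}{n} = \frac{N!}{n!\,(N-n)!}, \qquad S = \ln \Omega.$$

For $N = 100$, $n = 50$ this gives $\Omega \approx 1.0 \times 10^{29}$ and $S \approx 66.8$; pinning one spin to "up" leaves $\Omega(99,49) \approx 5.0 \times 10^{28}$ and $S \approx 66.1$, so fixing part of the system really does lower the entropy.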
 
One generally speaks of two sets of distinguishable molecules (say, red and green ones) separated by a membrane. One removes the membrane, they all get mixed up, and the result is more disordered than when they were separated.
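For the inert mixing case this can be made quantitative. Assuming two ideal gases with mole fractions $x_1$ and $x_2$ and $n$ total moles, the standard entropy of mixing is

$$\Delta S_{\mathrm{mix}} = -nR\,(x_1 \ln x_1 + x_2 \ln x_2) > 0,$$

which is positive whenever both species are present, consistent with the "more disordered" picture for non-reacting gases.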

What about the case where the red molecules are oxygen and the green ones are hydrogen and taking away the membrane causes a spark? After the smoke clears away (and a lot of energy with it), the result is more ordered, it seems to me. In which case, the disorder idea does not work. Rather, one must speak in terms of diffusion of energy, which works very well.

One chemist, Frank Lambert, has taken on this subject on the Internet.

Any comments?
 
Entropy can be described as a spreading of energy. Whenever energy spreads out, entropy increases.
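A standard worked instance of that statement (textbook setup, not from this thread): let a small amount of heat $Q$ leak from a hot reservoir at temperature $T_h$ to a cold one at $T_c < T_h$. The total entropy change is

$$\Delta S = \frac{Q}{T_c} - \frac{Q}{T_h} > 0,$$

so energy spreading from hot to cold is exactly what the Clausius $Q/T$ bookkeeping registers as an entropy increase.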
 
Agreed. That makes four ways of looking at entropy: Q/T of Carnot or Clausius or whoever it was, k ln(W) of Boltzmann, disorder (same thing, really), and negative (?) information. Can all these be tied together somehow?
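One common thread, sketched rather than derived: the Gibbs/Shannon form

$$S = -k_B \sum_i p_i \ln p_i$$

reduces to Boltzmann's $k_B \ln W$ when all $W$ accessible microstates are equally likely ($p_i = 1/W$), and its change under quasi-static heat exchange reproduces the Clausius $dS = \delta Q_{\mathrm{rev}}/T$. "Disorder" and "missing information" are then two verbal readings of the same probabilities $p_i$.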

Personally, I don't get the information business, basically because I don't see why Maxwell's daemon has to keep records! In fact, that would keep him from making random decisions, would it not? But that is another subject which has probably been discussed.
 