How can we best describe entropy other than with disorder?

In summary, the perspectives on entropy raised in this thread are that it represents disorder in a system, or the amount of energy in a system that has become unusable; that, in statistical physics, it is the natural logarithm of the multiplicity of the macrostate; that it measures our ignorance of which microstate the system occupies; and that it describes how energy spreads out.
  • #1
Grajmanu
Entropy is often described as a representation of disorder in a system or as the amount of energy in a system that has become unusable. What are the other perspectives on entropy?
 
  • #2
Statistical physics: entropy is the natural logarithm of the multiplicity of the macrostate.
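
Written out (a standard form of Boltzmann's formula; the post above gives the dimensionless version, with the Boltzmann constant $k_B$ set to 1):

$$S = k_B \ln \Omega$$

where $\Omega$ is the multiplicity, i.e. the number of microstates consistent with the macrostate.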
 
  • #3
Simply state which forms of energy are disordered first, before looking up the definition of work energy in an engineering physics textbook.
 
  • #4
I find the statistical physics interpretation more satisfying. Entropy can be considered as the amount of ignorance we have about a system. For example, we can know the system's macrostate (i.e. its temperature, internal energy, etc.) but we do not know what microstate it is in. Microstate here means a certain distribution of either quantum states like energy levels, or classically you can think of microstates as being described by the individual positions and momenta of each particle. The idea is that while we can always be given a macrostate, there are often many (very many!) more microstates that are consistent with that macrostate. The entropy is then the logarithm of how many of these microstates are consistent with the macrostate (aka the multiplicity), and these microstates change with time (Liouville's theorem governs these changes). So the less we know about the positions and momenta of each particle, the more entropy there is. If we fix one particle in a position, we have less entropy, as we have reduced the number of available microstates. Entropy is more a product of the statistical interpretation.
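
To make the multiplicity counting concrete, here is a minimal sketch (my own toy example, not from the post above) for a system of N two-state spins: take the macrostate to be the number of "up" spins; the microstates are the individual spin configurations, and the dimensionless entropy is the log of how many configurations match.

```python
from math import comb, log

def entropy_of_macrostate(n_spins: int, n_up: int) -> float:
    """Dimensionless entropy S = ln(multiplicity) for a toy two-state spin system.

    The macrostate is the total number of 'up' spins; the microstates are the
    individual spin configurations consistent with that count.
    """
    multiplicity = comb(n_spins, n_up)  # number of consistent microstates
    return log(multiplicity)

# The half-up macrostate has by far the most microstates, hence the most entropy.
for n_up in (0, 1, 50, 100):
    print(n_up, round(entropy_of_macrostate(100, n_up), 2))
```

Pinning one spin "up", as in the fixed-particle example above, leaves only comb(n_spins - 1, n_up - 1) consistent configurations, so the entropy drops.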
 
  • #5
One generally speaks of two sets of distinguishable molecules (say, red and green ones) separated by a membrane. One removes the membrane, they get all mixed up, and the result is more disordered than when they were separated.

What about the case where the red molecules are oxygen and the green ones are hydrogen and taking away the membrane causes a spark? After the smoke clears away (and a lot of energy with it), the result is more ordered, it seems to me. In which case, the disorder idea does not work. Rather, one must speak in terms of diffusion of energy, which works very well.

One chemist, Frank Lambert, has taken on this subject on the Internet.

Any comments?
 
  • #6
Entropy can be described as a spreading of energy. Whenever energy spreads out entropy increases.
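
A standard textbook-style illustration of this (numbers chosen here just for the example): let an amount of heat $Q$ leave a hot reservoir at temperature $T_h$ and enter a cold one at $T_c$. Then

$$\Delta S = -\frac{Q}{T_h} + \frac{Q}{T_c} > 0 \quad \text{for } T_h > T_c,$$

e.g. $Q = 300\ \mathrm{J}$, $T_h = 600\ \mathrm{K}$, $T_c = 300\ \mathrm{K}$ gives $\Delta S = -0.5 + 1.0 = +0.5\ \mathrm{J/K}$: the energy has spread into the colder body and the total entropy has increased.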
 
  • #7
Agreed. That makes four ways of looking at entropy: Q/T of Carnot or Clausius or whoever it was, k ln(W) of Boltzmann, disorder (same thing, really), and negative (?) information. Can all these be tied together somehow?

Personally, I don't get the information business, basically because I don't see why Maxwell's daemon has to keep records! In fact, that would keep him from making random decisions, would it not? But that is another subject which has probably been discussed.
 

1. What is entropy?

Entropy is a scientific concept that measures the level of disorder or randomness in a system. It is often referred to as the degree of chaos or randomness in a system.

2. Why is entropy often associated with disorder?

This is because, in most cases, an increase in entropy results in a decrease in order. For example, when ice melts into water, the molecules become more disordered, and entropy increases.
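
As a rough numerical illustration (using the commonly quoted molar enthalpy of fusion of ice, about 6.0 kJ/mol):

$$\Delta S_{\text{fus}} = \frac{\Delta H_{\text{fus}}}{T_{\text{melt}}} \approx \frac{6.0\times 10^{3}\ \mathrm{J/mol}}{273\ \mathrm{K}} \approx 22\ \mathrm{J/(mol\,K)},$$

a positive entropy change accompanying the increase in molecular disorder.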

3. Is entropy the same as randomness?

No, entropy and randomness are not the same. While entropy measures the level of disorder in a system, randomness refers to the lack of a predictable pattern or sequence. Greater randomness in how a system's particles are arranged generally means higher entropy, but the two are not interchangeable: entropy is a precisely defined physical quantity, while randomness is a looser, qualitative description.

4. Can entropy be negative?

The statistical entropy S = k ln(W) cannot be negative, because the multiplicity W is at least 1. The entropy of a system can decrease when it gives up heat to its surroundings, but according to the Second Law of Thermodynamics the total entropy of an isolated system will either remain constant or increase over time; it cannot decrease.

5. How can we describe entropy other than with disorder?

Aside from disorder, entropy can also be described as a measure of energy dispersal or the amount of unavailable energy in a system. It is also closely related to the number of possible microstates in a system, which describes the different ways that particles can be arranged within a system.
