How are order and disorder defined for entropy?

In summary, entropy is often described as a measure of order and disorder, but "order" and "disorder" are subjective, relative notions. The idea is made precise by counting the number of possible ways to arrange particles consistent with a macroscopic state: fewer arrangements correspond to order, more to disorder. This link between entropy and microstates is sharpest for isolated systems, where entropy is proportional to the logarithm of the number of equally probable quantum mechanical microstates. To learn more about this, one can consult a book on Statistical Thermodynamics.
  • #1
GME
Entropy is the measure of order and disorder. But who decides what is order and what is disorder? Isn't that a relative or subjective thing? How can it be defined in general, or can it be defined only for thermodynamic systems?
 
  • #2
"order" is a subjective term which humans tend to associate with macroscopic states that correspond to relatively few microscopic states. To borrow a typical example, of all the possible ways of arranging grains of sand, very few lead to sandcastles, which we call "ordered". There are far more ways of arranging the grains which just lead to a pile of sand, which we call "disordered".

The entropy link arises from the disparity in the number of microstates which correspond to a given macrostate.

If we assume the temporal transition between microstates is random, and a disordered macrostate corresponds to more microstates than an ordered one, we're likely to observe a transition from order to disorder as time proceeds.
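The counting argument in the two paragraphs above can be sketched with a toy model (an illustration of my own, not from the thread): take N two-state "grains", and call the macrostate "the number k that are up". Its multiplicity is the binomial coefficient C(N, k), and the half-and-half macrostate dwarfs the fully aligned one.

```python
from math import comb

# Toy model: N two-state "grains" (up or down). A macrostate is the
# number k of grains that are up; its multiplicity (number of
# microstates) is the binomial coefficient C(N, k).
N = 100

ordered = comb(N, 0)          # all grains down: exactly one arrangement
disordered = comb(N, N // 2)  # half up, half down: ~1e29 arrangements

print(ordered, disordered)
```

If transitions between microstates are random, the system is overwhelmingly likely to be found in the macrostate with the most microstates, which is exactly the "pile of sand" outcome described above.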
 
  • #3
I'd like to elaborate on what MickyW said. For an isolated system, the amount of disorder can by quantified by the number of linearly independent quantum mechanical microstates that the system can exhibit. The larger the number of (equally probable) quantum mechanical microstates, the more disorder. The entropy is proportional to the natural log of the number of quantum mechanical microstates. If you want to learn more about this, get a book on Statistical Thermodynamics, such as Hill's book.
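The proportionality mentioned in post #3 is Boltzmann's formula, S = k_B ln W, where W is the number of equally probable microstates. A minimal sketch (the helper name `entropy` is my own) also shows a useful consequence of the logarithm: multiplying W by a fixed factor always adds the same amount of entropy.

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(W):
    """Boltzmann entropy S = k_B * ln(W) for W equally probable microstates."""
    return k_B * log(W)

# Doubling the number of microstates adds k_B * ln 2 of entropy,
# no matter how large W already is:
delta_small = entropy(2 * 10) - entropy(10)
delta_large = entropy(2 * 10**6) - entropy(10**6)
```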
 

1. What is the definition of entropy?

Entropy is a measure of the disorder or randomness in a system. It is a thermodynamic property that describes the amount of energy that is unavailable for work in a system.

2. How is entropy related to order and disorder?

In general, a system with high entropy is considered to be more disordered, while a system with low entropy is considered to be more ordered. This is because as a system becomes more disordered, the number of possible arrangements or microstates that it can exist in increases, leading to a higher entropy value.

3. What are some examples of systems with high entropy?

Some examples of systems with high entropy include a gas that has expanded to fill a large volume, a disorganized pile of books, or a cup of hot coffee that has cooled to room temperature.

4. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that natural processes tend to move toward states of higher entropy and disorder.

5. Can entropy be reversed or decreased in a system?

While it is possible to decrease the entropy of a system locally, the combined entropy of the system and its surroundings will still increase (or at best stay constant) due to the second law of thermodynamics. Overall, then, entropy cannot be reversed or decreased in an isolated system.
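The statistical tendency behind answers 4 and 5 can be sketched with an Ehrenfest-style urn model (a standard toy model, assumed here for illustration): N particles sit in two halves of a box, and at each step a randomly chosen particle hops to the other half. Starting from the low-entropy macrostate with everything on one side, random dynamics drives the count toward the many-microstate 50/50 split.

```python
import random

random.seed(0)
N = 1000
left = N  # low-entropy start: all particles in the left half

for _ in range(20 * N):
    # Pick a particle uniformly at random; it hops to the other side.
    if random.random() < left / N:
        left -= 1  # a left-side particle hops right
    else:
        left += 1  # a right-side particle hops left

print(left)  # fluctuates near N // 2, the highest-multiplicity macrostate
```

Nothing forbids a brief fluctuation back toward the ordered state; it is just astronomically unlikely for large N, which is why the second law holds in practice.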
