How Does Entropy Differ Between System Equilibrium and Individual States?

  • Context: Graduate
  • Thread starter: Thalion
  • Tags: Entropy, State, System
SUMMARY

This discussion focuses on the differences between entropy in system equilibrium and individual states within statistical mechanics and thermodynamics. It establishes that a system is in equilibrium when all micro-states are equally probable, and that entropy is defined by the natural logarithm of the number of available micro-states. The conversation highlights the confusion surrounding the concept of individual state entropy versus total system entropy, particularly in relation to the Second Law of Thermodynamics as presented in Kittel's Thermal Physics. The participants emphasize the importance of understanding the temporal dynamics of entropy changes and the implications of concepts like quasi-equilibrium and ergodicity.

PREREQUISITES
  • Statistical mechanics fundamentals
  • Thermodynamics principles, particularly the Second Law
  • Understanding of micro-states and macro-states
  • Familiarity with concepts of equilibrium and non-equilibrium systems
NEXT STEPS
  • Explore the concept of 'quasi equilibrium' in thermodynamics
  • Study the implications of ergodicity in statistical mechanics
  • Investigate the relationship between entropy and information in quantum mechanics
  • Learn about the practical applications of entropy changes in real-world thermodynamic processes
USEFUL FOR

Students and professionals in physics, particularly those studying statistical mechanics and thermodynamics, as well as researchers interested in the foundational concepts of entropy and its applications in various physical systems.

Thalion
I've started studying statistical mechanics and thermodynamics on my own, and I'm confused about a few basic concepts. From what I can gather, the following two statements are true about a system in equilibrium:

1. A system is in equilibrium (by definition) when all of its micro-states are equally probable.
2. The entropy of a system is defined by the natural log of the number of micro-states available to the system.


The second claim can be used to show that the entropy of a system increases when, for example, an energy constraint is removed that allows two subsystems to come into thermal contact, since the combined system has a greater number of available micro-states. Under this definition, it seems as if the entropy of a system in equilibrium is independent of the actual state of the system (and, in fact, that a system that is instantaneously in an odd state, say, all the gas in the corner of a room, can be in equilibrium provided this state occurs with the same frequency as all others).
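
To convince myself of that argument numerically, I wrote a small Python sketch (the Einstein-solid multiplicity formula is the textbook one, but the subsystem sizes and energies are just numbers I picked):

from math import comb, log

def g(N, q):
    # multiplicity of an Einstein solid: N oscillators sharing q energy quanta
    return comb(q + N - 1, q)

NA, NB = 50, 50   # oscillators in each subsystem (chosen only for illustration)
qA, qB = 10, 40   # energy quanta initially fixed in each subsystem

# with the constraint in place, each subsystem's energy is fixed separately
g_before = g(NA, qA) * g(NB, qB)

# with the constraint removed, only the total qA + qB is fixed, so every split contributes
g_after = sum(g(NA, q) * g(NB, qA + qB - q) for q in range(qA + qB + 1))

print("sigma before:", log(g_before))
print("sigma after: ", log(g_after))   # larger: the old term is just one of many in the sum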

How does this work with the idea of an individual state having a particular entropy? I have also seen references to the entropy of a state, defined as the natural log of the multiplicity of a particular state of the system, rather than the total number of states available to the system. Are these just two different types of entropy? It sometimes seems like books/articles are close to equivocating on the meaning of the word...
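
To make the two counts I mean concrete, here is a toy comparison (a plain two-state spin model of my own, not taken from any book):

from math import comb, log

N = 1000   # number of two-state spins, purely illustrative

# "entropy of the system": ln of ALL microstates accessible to the isolated system
sigma_system = N * log(2)              # ln(2**N), about 693.1

# "entropy of a state": ln of the multiplicity of one particular macrostate,
# here the macrostate with exactly N/2 spins up
sigma_state = log(comb(N, N // 2))     # about 689.5

print(sigma_system, sigma_state)
# for large N the most probable macrostate carries almost all of the microstates,
# which I suspect is why the two usages of the word get run together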
 
To elaborate:

I find the following statement of the Second Law of Thermodynamics in Kittel's Thermal Physics:

"If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time."

I have several problems with this. First, a system isn't in an equilibrium configuration at a given moment in time. The only way to know whether a system is in equilibrium is to watch it for an extended period and see if it spends on average the same length of time in every micro-state. Second, the entropy of the system is defined irrespective of the state of the system, so it shouldn't change in time. If they mean that the entropy of the state will increase, this contradicts the time-dispersed definition of equilibrium.
 
Thalion said:
From what I can gather, the following two statements are true about a system in equilibrium:

1. A system is in equilibrium (by definition) when all of its micro-states are equally probable.
2. The entropy of a system is defined by the natural log of the number of micro-states available to the system.
[snip]

How does this work with the idea of an individual state having a particular entropy?

Thalion said:
To elaborate:

I find the following statement of the Second Law of Thermodynamics in Kittel's Thermal Physics:

"If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time."

I have several problems with this. First, a system isn't in an equilibrium configuration at a given moment in time. The only way to know whether a system is in equilibrium is to watch it for an extended period and see if it spends on average the same length of time in every micro-state. Second, the entropy of the system is defined irrespective of the state of the system, so it shouldn't change in time. If they mean that the entropy of the state will increase, this contradicts the time-dispersed definition of equilibrium.

These sorts of questions commonly arise when trying to reduce thermodynamics to a purely mechanical theory.

"equilibrium" means the *macroscopic* properties of a system don't change in time. In terms of a statistical theory, you are led to the definition #1 above.

So naturally, we must ask "how long shall we wait to see if nothing changes"? A similar issue is raised when considering 'ergodic' systems: how do you know if your particular system is representative of all the possible allowed systems? Glasses, in particular, are nonergodic.
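
A toy illustration of the ergodicity point (the model and the numbers are entirely made up for the example): compare the long-time average of a simple observable along one trajectory with its average over all states, for a walker that can reach every state versus one confined to part of the state space.

import random
random.seed(0)

steps = 200_000
N = 10                      # "microstates" labelled 0..9, arranged on a ring

def time_average(allowed):
    # long-time average of the observable f(s) = s along one random-walk trajectory,
    # where the walker only ever visits the states listed in 'allowed'
    s, total = allowed[0], 0.0
    for _ in range(steps):
        i = allowed.index(s)
        s = allowed[(i + random.choice((-1, 1))) % len(allowed)]
        total += s
    return total / steps

ensemble_average = sum(range(N)) / N    # average of f over ALL N states

print("ensemble average:", ensemble_average)
print("ergodic walker, all 10 states:", time_average(list(range(N))))
print("trapped walker, states 0-4 only:", time_average(list(range(N // 2))))
# the first time average approaches the ensemble average; the trapped ("glassy")
# walker's never does, no matter how long we watch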

To get around this waiting-time problem, we've invented concepts like 'quasi-equilibrium', 'local equilibrium', 'detailed balance', and 'steady state', all of which bring in a comparison of timescales, e.g. t < τ, where τ is a relaxation time. As another example, although 'temperature' is an equilibrium property of a material, we can write T(t) or even T(x,t) and allow it to vary *slowly compared to some other process*, such as atomic collision times. This has greatly extended the usefulness of thermodynamics.
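
A crude numerical version of what 'slowly compared to some other process' buys you (every number here is made up purely for illustration):

import math

tau_relax = 1.0e-3        # assumed macroscopic relaxation time of the gas, in seconds
tau_coll = 1.0e-9         # assumed mean time between molecular collisions, in seconds
T_env, T0 = 300.0, 350.0  # assumed environment and initial temperatures, in kelvin

def T(t):
    # Newtonian cooling: a "temperature of the system" written as a function of time,
    # which is meaningful only because it drifts slowly on the scale of tau_coll
    return T_env + (T0 - T_env) * math.exp(-t / tau_relax)

# fractional change of T over a single collision time, evaluated at t = 0
dT = abs(T(tau_coll) - T(0.0)) / T(0.0)
print(dT)   # roughly 1e-7: the molecules re-equilibrate far faster than T changes,
            # which is what licenses writing T(t) at all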

So, you have a system in a specific microstate (for example, you measured everything you need to know). How can you assign an entropy? Here's where it gets interesting.

AFAICT, a system about which you have complete information (you completely specify the microstate) has an entropy of zero.

This does not cause problems: as soon as the system is allowed to evolve in time, it generally does not remain in the same microstate. You lose information, the system spreads out in state space, and the entropy increases.

In quantum mechanics, this picture is referred to as 'decoherence' or 'dephasing'.
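
A classical toy version of that spreading-out (the transition rule and the sizes are my own): start with the probability distribution concentrated on one known microstate, let it hop between neighbouring states, and watch the Gibbs entropy -sum p ln p rise from zero toward ln(number of states).

import numpy as np

N = 50                               # number of microstates, purely illustrative
P = np.zeros((N, N))                 # transition matrix: stay put or hop to a neighbour on a ring
for i in range(N):
    P[i, i] = 0.5
    P[i, (i - 1) % N] = 0.25
    P[i, (i + 1) % N] = 0.25

p = np.zeros(N)
p[0] = 1.0                           # complete information: the exact microstate is known

def sigma(p):
    # Gibbs/Shannon entropy -sum p ln p, with the convention 0 ln 0 = 0
    q = p[p > 0]
    return float(-(q * np.log(q)).sum())

for step in (0, 1, 10, 100, 1000):
    print(step, sigma(np.linalg.matrix_power(P.T, step) @ p))
# starts at exactly 0 and climbs toward ln(50), about 3.9, as the distribution spreads out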

In the end, it makes more sense to speak of the *change* in entropy as the system undergoes a process, just as it makes more sense to speak of the *change* in potential energy in mechanics, rather than the 'absolute' potential energy.
 
