How Does Entropy Differ Between System Equilibrium and Individual States?

In summary: if the microstate of a system is completely specified, its entropy is zero. If you only know that the system is in one of several possible microstates, the entropy is greater than zero; for example, if the system is equally likely to be in either of two microstates, its entropy is ln 2.
  • #1
Thalion
I've started studying statistical mechanics and thermodynamics on my own, and I'm confused about a few basic concepts. From what I can gather, the following two statements are true about a system in equilibrium:

1. A system is in equilibrium (by definition) when all of its micro-states are equally probable.
2. The entropy of a system is defined by the natural log of the number of micro-states available to the system.


The second claim can be used to show that the entropy of a system increases when, for example, an energy constraint is removed that allows two subsystems to come into thermal contact, since the combined system has a greater number of available micro-states. Under this definition, it seems as if the entropy of a system in equilibrium is independent of the actual state of the system (and, in fact, that a system that is instantaneously in an odd state, say all the gas in the corner of a room, can still be in equilibrium provided this state occurs with the same frequency as all the others).
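To check my reading of claim 2, here is a small numerical sketch I put together (two Einstein solids are just my stand-in example, and the particular numbers are arbitrary). Removing the constraint that fixes each subsystem's energy can only enlarge the set of accessible micro-states, so ln W goes up:

```python
from math import comb, log

def multiplicity(N, q):
    """Microstate count of an Einstein solid: N oscillators sharing q energy quanta."""
    return comb(q + N - 1, q)

# Two subsystems, initially isolated with fixed energies
N_A, q_A = 30, 10
N_B, q_B = 30, 20

W_separate = multiplicity(N_A, q_A) * multiplicity(N_B, q_B)

# Remove the energy constraint: the q_A + q_B quanta may now be shared freely
q_total = q_A + q_B
W_combined = sum(multiplicity(N_A, q) * multiplicity(N_B, q_total - q)
                 for q in range(q_total + 1))

print(log(W_separate))   # entropy before contact (in units of k_B)
print(log(W_combined))   # entropy after contact -- always >= the value before,
                         # since the old partition is just one term in the new sum
```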

How does this work with the idea of an individual state having a particular entropy? I have also seen references to the entropy of a state, defined as the natural log of the multiplicity of a particular state of the system, rather than the total number of states available to the system. Are these just two different types of entropy? It sometimes seems like books/articles are close to equivocating on the meaning of the word...
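For concreteness, here is how I currently understand the two usages, sketched with N independent two-state spins (again only my own toy example): the "entropy of the system" counts every accessible micro-state, while the "entropy of a state" counts only the micro-states belonging to one macrostate.

```python
from math import comb, log

N = 100  # independent two-state spins

# "Entropy of the system": log of the total number of accessible microstates
S_system = log(2 ** N)            # = N * ln 2, about 69.3

# "Entropy of a state" (macrostate labelled by the number of up spins):
# log of the multiplicity of that macrostate
def S_macrostate(n_up):
    return log(comb(N, n_up))

print(S_system)
print(S_macrostate(N // 2))       # most probable macrostate: close to, but below, S_system
print(S_macrostate(0))            # "all spins down" has multiplicity 1, so entropy 0
```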
 
  • #2
To elaborate:

I find the following statement of the Second Law of Thermodynamics in Kittel's Thermal Physics:

"If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time."

I have several problems with this. First, a system isn't in an equilibrium configuration at a given moment in time. The only way to know whether a system is in equilibrium is to watch it for an extended period and see if it spends on average the same length of time in every micro-state. Second, the entropy of the system is defined irrespective of the state of the system, so it shouldn't change in time. If they mean that the entropy of the state will increase, this contradicts the time-dispersed definition of equilibrium.
 
  • #3
Thalion said:
From what I can gather, the following two statements are true about a system in equilibrium:

1. A system is in equilibrium (by definition) when all of its micro-states are equally probable.
2. The entropy of a system is defined by the natural log of the number of micro-states available to the system.
[snip]

How does this work with the idea of an individual state having a particular entropy?

Thalion said:
To elaborate:

I find the following statement of the Second Law of Thermodynamics in Kittel's Thermal Physics:

"If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time."

I have several problems with this. First, a system isn't in an equilibrium configuration at a given moment in time. The only way to know whether a system is in equilibrium is to watch it for an extended period and see if it spends on average the same length of time in every micro-state. Second, the entropy of the system is defined irrespective of the state of the system, so it shouldn't change in time. If they mean that the entropy of the state will increase, this contradicts the time-dispersed definition of equilibrium.

These sorts of questions commonly arise when trying to reduce thermodynamics to a purely mechanical theory.

"equilibrium" means the *macroscopic* properties of a system don't change in time. In terms of a statistical theory, you are led to the definition #1 above.

So naturally, we must ask "how long shall we wait to see if nothing changes"? A similar issue is raised when considering 'ergodic' systems: how do you know if your particular system is representative of all the possible allowed systems? Glasses, in particular, are nonergodic.

To get around this, we've invented concepts like 'quasi equilibrium', 'local equilibrium', 'detailed balance', 'steady state'... all of which allow for considerations like 't < T, a relaxation time'. As another example, although 'temperature' is an equilibrium property of a material, we can write T(t) or even T(x,t) and allow it to vary *slowly as compared to some other process*, such as atomic collision times. This has greatly extended the usefulness of thermodynamics.

So, you have a system in a specific microstate (for example, you measured everything you need to know). How can you assign an entropy? Here's where it gets interesting.

AFAICT, a system about which you have complete information (you completely specify the microstate) has an entropy of zero.

This does not cause problems: as soon as the system is allowed to evolve in time, it generally does not remain in the same microstate. You lose information, the system spreads out in state space, and the entropy increases.

In quantum mechanics, this picture is referred to as 'decoherence' or 'dephasing'.

In the end, it makes more sense to speak of the *change* in entropy as the system undergoes a process, just as it makes more sense to speak of the *change* in potential energy in mechanics, rather than the 'absolute' potential energy.
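To put a number on the "spreading out in state space" picture, here is a rough sketch using the Gibbs entropy S = -sum p_i ln p_i; the smoothing rule below is only a toy stand-in for the real dynamics, chosen so that the probability distribution spreads over the states.

```python
import numpy as np

def gibbs_entropy(p):
    """S = -sum p ln p (in units of k_B), with the convention 0 * ln 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n_states = 101
p = np.zeros(n_states)
p[50] = 1.0                      # complete information: one microstate has probability 1
print(gibbs_entropy(p))          # entropy is zero

# Let the probability leak to neighbouring states (a toy 'diffusion' in state space)
for _ in range(200):
    p = 0.5 * p + 0.25 * np.roll(p, 1) + 0.25 * np.roll(p, -1)

print(gibbs_entropy(p))          # entropy has grown as information is lost...
print(np.log(n_states))          # ...approaching ln(101) ~ 4.6 once p is uniform over all states
```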
 

1. What is the difference between entropy of system and entropy of state?

The entropy of a system is the overall measure of disorder or randomness of the closed system, counted over all of the microstates accessible to it. The entropy of a state refers to one particular macrostate (arrangement) of the system and counts only the microstates consistent with that arrangement. In other words, the entropy of the system characterizes the system as a whole, while the entropy of a state characterizes one specific configuration.

2. How is the entropy of a system related to its thermodynamic properties?

The entropy of a system is closely related to its thermodynamic properties, such as temperature, pressure, and volume. It is central to the second law of thermodynamics, which states that the total entropy of a closed system always increases or stays constant over time.

3. Can the entropy of a system be negative?

No, the entropy of a system cannot be negative. In the statistical definition, entropy is the logarithm of the number of accessible microstates, and that number is at least one, so the entropy is never less than zero; a fully specified state has entropy exactly zero.

4. How does the entropy of a system change during a physical or chemical process?

The entropy of a system changes whenever the number of microstates effectively available to it changes, most commonly through heat transfer or through changes in volume. If heat is transferred into a system, the motion of the particles becomes more disordered and the entropy increases. Similarly, if the system expands, the particles have more positions available to them and the entropy increases.
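As a rough illustration of these two contributions (the numbers are arbitrary, and the volume term assumes an ideal gas):

```python
from math import log

R = 8.314          # gas constant, J/(mol K)

# Reversible heat transfer at (approximately) constant temperature: dS = Q / T
Q = 500.0          # J of heat absorbed
T = 300.0          # K
print(Q / T)       # ~1.67 J/K of entropy gained by the system

# Isothermal expansion of an ideal gas: dS = n R ln(V2 / V1)
n = 1.0            # mol
V1, V2 = 1.0, 2.0  # the gas doubles its volume (units cancel in the ratio)
print(n * R * log(V2 / V1))   # ~5.76 J/K: more positions available, higher entropy
```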

5. How is the entropy of a system related to the concept of equilibrium?

In a closed system, the entropy tends to increase until it reaches a maximum, the state known as thermodynamic equilibrium. This is the macrostate with the largest number of microstates and therefore the highest entropy. In this state there is no net flow of energy or particles, and the macroscopic properties no longer change.
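As a rough numerical illustration of why a large system is overwhelmingly likely to be found near its maximum-entropy macrostate (again using independent two-state spins as a stand-in model):

```python
from math import comb

N = 1000  # independent two-state spins

# Multiplicity of each macrostate, labelled by the number of 'up' spins
W = [comb(N, n) for n in range(N + 1)]
total = sum(W)                    # = 2**N, all microstates

# The maximum-entropy (most probable) macrostate
n_star = max(range(N + 1), key=lambda n: W[n])
print(n_star)                     # N // 2

# Fraction of all microstates whose macrostate lies within 5% of N around the peak
window = range(int(0.45 * N), int(0.55 * N) + 1)
print(sum(W[n] for n in window) / total)   # ~0.998: almost every microstate looks 'equilibrated'
```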
