
Entropy of System vs. State

by Thalion
Tags: entropy, state
Thalion
#1
Oct1-10, 12:38 AM
P: 2
I've started studying statistical mechanics and thermodynamics on my own, and I'm confused about a few basic concepts. From what I can gather, the following two statements are true about a system in equilibrium:

1. A system is in equilibrium (by definition) when all of its micro-states are equally probable.
2. The entropy of a system is defined by the natural log of the number of micro-states available to the system.
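
For concreteness, I read statement 2 as Boltzmann's formula,

S = k_B ln Ω,

with Ω the number of microstates accessible to the system (Kittel works with the dimensionless form σ = log g, where g is the multiplicity).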


The second claim can be used to show that the entropy of a system increases when, for example, an energy constraint is removed so that two subsystems can come into thermal contact, since the combined system has a greater number of available micro-states. Under this definition, it seems as if the entropy of a system in equilibrium is independent of the actual state of the system (and, in fact, that a system that is instantaneously in an odd state, say, all the gas in the corner of a room, can be in equilibrium provided this state occurs with the same frequency as all others).
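
To spell out that counting argument (standard reasoning, not specific to any one text): with the constraint in place, the subsystem energies U_1 and U_2 are fixed and the combined multiplicity is Ω_1(U_1)·Ω_2(U_2). Removing the constraint lets the total energy U = U_1 + U_2 redistribute, so

Ω_total = Σ_E Ω_1(E)·Ω_2(U − E) ≥ Ω_1(U_1)·Ω_2(U_2),

since the constrained count is just one term in the sum; ln Ω, and with it the entropy, can only go up.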

How does this work with the idea of an individual state having a particular entropy? I have also seen references to the entropy of a state, defined as the natural log of the multiplicity of a particular state of the system, rather than the total number of states available to the system. Are these just two different types of entropy? It sometimes seems like books/articles are close to equivocating on the meaning of the word...
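
Here is a toy calculation of the distinction I have in mind (my own Python sketch; the value N = 100 is made up for illustration):

# N distinguishable particles, each independently in the left or right
# half of a box. "Entropy of the system" counts all 2^N microstates;
# "entropy of a state" counts the multiplicity of one macrostate.
from math import comb, log

N = 100  # toy particle number

S_system = N * log(2)            # ln(total microstates) = ln(2^N)
S_half   = log(comb(N, N // 2))  # ln(multiplicity of the 50/50 macrostate)
S_corner = log(comb(N, 0))       # all gas in one corner: multiplicity 1, so 0

print(S_system, S_half, S_corner)  # ~69.3, ~66.8, 0.0

For large N the total count and the most probable macrostate give nearly the same entropy, which is presumably why texts can slide between the two definitions.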
Thalion
#2
Oct1-10, 12:46 AM
P: 2
To elaborate:

I find the following statement of the Second Law of Thermodynamics in Kittel's Thermal Physics:

"If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time."

I have several problems with this. First, a system isn't in an equilibrium configuration at a given moment in time. The only way to know whether a system is in equilibrium is to watch it for an extended period and see if it spends on average the same length of time in every micro-state. Second, the entropy of the system is defined irrespective of the state of the system, so it shouldn't change in time. If they mean that the entropy of the state will increase, this contradicts the time-dispersed definition of equilibrium.
Andy Resnick
#3
Oct1-10, 08:53 AM
Sci Advisor
P: 5,544
Quote by Thalion:
From what I can gather, the following two statements are true about a system in equilibrium:

1. A system is in equilibrium (by definition) when all of its micro-states are equally probable.
2. The entropy of a system is defined by the natural log of the number of micro-states available to the system.
[snip]

How does this work with the idea of an individual state having a particular entropy?
Quote by Thalion:
To elaborate:

I find the following statement of the Second Law of Thermodynamics in Kittel's Thermal Physics:

"If a closed system is in a configuration that is not the equilibrium configuration, the most probable consequence will be that the entropy of the system will increase monotonically in successive instants of time."

I have several problems with this. First, a system isn't in an equilibrium configuration at a given moment in time. The only way to know whether a system is in equilibrium is to watch it for an extended period and see if it spends on average the same length of time in every micro-state. Second, the entropy of the system is defined irrespective of the state of the system, so it shouldn't change in time. If they mean that the entropy of the state will increase, this contradicts the time-dispersed definition of equilibrium.
These sorts of questions commonly arise when trying to reduce thermodynamics to a purely mechanical theory.

"equilibrium" means the *macroscopic* properties of a system don't change in time. In terms of a statistical theory, you are led to the definition #1 above.

So naturally, we must ask "how long shall we wait to see if nothing changes"? A similar issue is raised when considering 'ergodic' systems: how do you know if your particular system is representative of all the possible allowed systems? Glasses, in particular, are nonergodic.

To get around this, we've invented concepts like 'quasi-equilibrium', 'local equilibrium', 'detailed balance', 'steady state'... all of which allow for considerations like 't < τ, where τ is a relaxation time'. As another example, although 'temperature' is an equilibrium property of a material, we can write T(t) or even T(x,t) and allow it to vary *slowly as compared to some other process*, such as atomic collision times. This has greatly extended the usefulness of thermodynamics.

So, you have a system in a specific microstate (for example, you measured everything you need to know). How can you assign an entropy? Here's where it gets interesting.

AFAICT, a system about which you have complete information (you completely specify the microstate) has an entropy of zero.

This does not cause problems: as soon as the system is allowed to evolve in time, it generally does not remain in the same microstate. You lose information, the system spreads out in state space, and the entropy increases.
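
A throwaway numerical illustration of that spreading (my own sketch, using the Gibbs formula S = -Σ p ln p on a toy ring of states; nothing rigorous):

# Probability distribution over M discrete states, starting with complete
# information (all probability on one state, so S = 0), then spreading by
# an unbiased hop left/right each step. The entropy rises monotonically.
import numpy as np

M = 51
p = np.zeros(M)
p[M // 2] = 1.0  # microstate fully specified

def entropy(p):
    q = p[p > 0]  # skip zero entries to avoid log(0)
    return -np.sum(q * np.log(q))

for step in range(6):
    print(step, entropy(p))
    p = 0.5 * np.roll(p, 1) + 0.5 * np.roll(p, -1)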

In quantum mechanics, this picture is referred to as 'decoherence' or 'dephasing'.

In the end, it makes more sense to speak of the *change* in entropy as the system undergoes a process, just as it makes more sense to speak of the *change* in potential energy in mechanics, rather than the 'absolute' potential energy.


