- #1


I can't understand why a system in thermodynamic equilibrium must be at maximum entropy.

Also, is it really 100% certain that within an isolated system entropy never decreases?

Thank you in advance for your reply.

- Thread starter UglyNakedGuy


- #2

CWatters

Science Advisor

Homework Helper

Gold Member


Imagine a nice regular array of unconstrained particles of some sort. If they are vibrating (aka hot), it's not difficult to imagine how they could bounce off each other and fly about until the nice regular structure is gone and disorder rules.

Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy.

It's much harder to imagine how a randomly distributed group of vibrating particles could somehow form a nice regular array. Clearly it's possible for crystals to form, but when a crystal forms, energy is released (so it's not an isolated system).
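The one-way tendency described above can be seen in a toy simulation. This is an illustrative sketch I'm adding (not part of the original posts), using the standard Ehrenfest urn model: all particles start in the left half of a box (an ordered, low-entropy state), and at each step one randomly chosen particle hops to the other half. The ordered state dissolves toward a 50/50 split and then just fluctuates around it; it essentially never spontaneously re-forms.

```python
import random

def ehrenfest(n_particles=100, n_steps=5000, seed=0):
    """Ehrenfest urn model: each step, one randomly chosen particle
    hops between the two halves of a box. All particles start in
    the left half (an ordered, low-entropy macrostate)."""
    rng = random.Random(seed)
    left = set(range(n_particles))  # everyone starts on the left
    history = []
    for _ in range(n_steps):
        p = rng.randrange(n_particles)
        if p in left:
            left.remove(p)  # hop left -> right
        else:
            left.add(p)     # hop right -> left
        history.append(len(left))
    return history

history = ehrenfest()
# Late in the run, the count fluctuates around the 50/50 equilibrium
# rather than returning to the initial all-left state.
late = history[-1000:]
print(sum(late) / len(late))
```

Nothing in the dynamics forbids a return to the ordered state; it is just overwhelmingly improbable, which is the statistical content of "entropy never decreases".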

- #3


The concept of entropy in statistical mechanics is stronger, in the sense that it carries significantly greater predictive power. Instead of just classifying processes as reversible or irreversible, statistical mechanics postulates that the entropy of a system at a certain energy is proportional to the logarithm of the number of ways in which the system can assume that energy (in classical physics, this means the size of a constant-energy surface in phase space). This establishes the functional relationship between entropy and energy, which is already assumed to be a function of the observable parameters such as volume, magnetization, etc.

It is important to note that entropy does not have an instantaneous meaning in the way that energy does. In principle you could measure the energy of a system at any time and get an answer that fluctuates around the mean value. Entropy, on the other hand, requires a distribution in order to be meaningful. It is possible to define a limiting notion of entropy as an average over a finite interval of time, and it is certainly possible for this notion of entropy to decrease in time. However, the "real" entropy discussed in statistical mechanics is the limit as the averaging time approaches infinity, and thus acquires a time-independent meaning. (If you know information theory, you may notice that the assumption of equal a priori probabilities provides an upper bound on the statistical entropy of a system with a conservative Hamiltonian.)
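The counting postulate above is easy to make concrete. As an illustrative sketch (my addition, not from the original posts), take N particles distributed between two halves of a box: the macrostate "n particles on the left" has multiplicity Ω = C(N, n), and the Boltzmann entropy S/k_B = ln Ω is maximized at the equal split, which is why the maximum-entropy macrostate is the equilibrium one.

```python
from math import comb, log

def log_multiplicity(N, n_left):
    """S / k_B = ln Omega for the macrostate 'n_left of N particles
    in the left half', where Omega = C(N, n_left)."""
    return log(comb(N, n_left))

N = 1000
entropies = [log_multiplicity(N, n) for n in range(N + 1)]

# The equal split has the largest number of microstates, so it is
# the maximum-entropy (equilibrium) macrostate.
n_max = max(range(N + 1), key=lambda n: entropies[n])
print(n_max)  # 500
```

For N = 1000 the equal split has about e^690 times as many microstates as the all-left state, so a random microstate is essentially certain to look like equilibrium.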
