# Entropy at lowest energy

1. Feb 10, 2014

### avistein

Why is entropy highest at lowest energy?
Entropy is disorder, so low energy should mean low disorder. Why would entropy be highest there?

2. Feb 10, 2014

### shreder

Can you quote the exact words from the book? I don't quite understand what you mean.

3. Feb 10, 2014

### avistein

I mean to say that more entropy relates to more stability, so more entropy should relate to less energy. But how? Shouldn't more disorder relate to more energy?

4. Feb 10, 2014

### BvU

No such thing. Entropy and energy are different beasts. Completely.
Entropy tends to be maximum because disordered states are much, much more probable than ordered states. At any given energy.

Simple example: black and white balls in a box. Start off with all white balls on the left, all black on the right. There is only one way to do that. Now shake. Energy = how hard you shake, i.e. the average kinetic energy of the balls. Entropy = something like the average color of, say, the left half of the box: a measure of the disorder.

Even with just a few balls, the average color is 50% gray with very small deviations. Imagine how small the deviations are with Avogadro's number of balls!
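A quick sketch of this box-of-balls picture (the ball count and the "average color" measure are my own choices, not anything precise):

```python
# Start fully ordered, shuffle once ("shake"), and watch the left
# half's black-ball fraction settle near 50%.
import random

random.seed(0)                # fixed seed so the run is repeatable
N = 1000                      # balls per color (my arbitrary choice)
balls = [0] * N + [1] * N     # 0 = white (left side), 1 = black (right side)

def left_half_fraction_black(b):
    # "average color" of the left half: fraction of black balls there
    half = b[: len(b) // 2]
    return sum(half) / len(half)

print(left_half_fraction_black(balls))   # 0.0 -- perfectly ordered start

random.shuffle(balls)                    # one good "shake"
f = left_half_fraction_black(balls)
print(round(f, 3))                       # close to 0.5, small deviation
```

The ordered arrangement is just one configuration; almost every shuffled configuration looks "50% gray", which is the whole point.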

Last edited: Feb 10, 2014
5. Feb 10, 2014

### Khashishi

Entropy is not disorderness. In normal systems, entropy increases with energy. According to the third law of thermodynamics, the entropy goes to zero at zero temperature, so your information is totally flawed.

6. Feb 10, 2014

### BvU

Can we help avi at class XI level? My exposé might not be what helps him best at this point.
avi: you touched upon a very important concept. Can you indicate what you picked up, and where?

7. Feb 10, 2014

### Useful nucleus

Do you mean energy or free energy?

8. Feb 10, 2014

### avistein

OK. I mean to ask: why is entropy highest at equilibrium?

9. Feb 10, 2014

### Andrew Mason

Entropy is only defined for equilibrium states. Perhaps you mean: why does entropy increase overall when heat flows from a high temperature body to a lower temperature body?

The answer is rather simple: while the higher temperature body that loses heat experiences a decrease in entropy, the lower temperature body that gains heat experiences a bigger increase in entropy. This is because $dS = dQ/T$: the same amount of heat divided by a higher temperature gives a smaller change in entropy, so the hot body's entropy decrease is smaller in magnitude than the cold body's entropy increase. Since that is the way heat flows (out of the hot and into the cold), entropy will always increase as heat spreads out.

AM
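A quick numerical check of the argument above ($dS = dQ/T$ for each body); the temperatures and heat are my own arbitrary numbers:

```python
# Heat Q leaving a hot body decreases its entropy by Q/T_hot; the same
# Q entering a cold body increases its entropy by Q/T_cold.
Q = 100.0        # joules transferred (assumed small compared to the heat
                 # capacities, so both temperatures stay roughly constant)
T_hot = 400.0    # kelvin
T_cold = 300.0   # kelvin

dS_hot = -Q / T_hot     # -0.25 J/K  (hot body loses entropy)
dS_cold = Q / T_cold    # +0.333 J/K (cold body gains more entropy)
dS_total = dS_hot + dS_cold

print(round(dS_total, 4))   # positive: entropy increases overall
```

Because $T_\text{hot} > T_\text{cold}$, the gain always outweighs the loss, so the total is positive for any $Q > 0$.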

10. Feb 11, 2014

### BvU

Another way to deal with this is to turn it around: an isolated system tends towards a situation of highest entropy. That's what we call equilibrium, because it stays there. On a macroscopic scale, that is. On a microscopic scale, small fluctuations occur at a dazzling tempo, with correspondingly tiny fluctuations in entropy. Noticeable changes in entropy are so utterly unlikely that we can consider them absent.

I'm not all that happy with Andy's statement that entropy is only defined for equilibrium states.
The expression on Boltzmann's grave, $S = k \log W$, is more general than that.

11. Feb 11, 2014

### Andrew Mason

You have to define a macrostate first. The entropy is proportional to the logarithm of the number of possible microstates that can exist within the body or system in question while it presents that same macrostate. If a body is not in equilibrium, how do you define the macrostate? (e.g. if temperature, pressure, or volume or a combination of these is undefined).

AM
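A toy count along these lines (my own choice of system, echoing the ball-in-box picture from earlier): define the macrostate as "$k$ black balls in the left half", tally the microstates $W$ for each, and compare logarithms.

```python
# S is proportional to ln(W), where W counts the microstates
# compatible with a given macrostate.
import math

n = 50                       # balls per color; 100 slots total (toy numbers)

def microstates(k):
    # ways to place k black balls among the 50 left slots,
    # times ways to place the remaining n-k among the 50 right slots
    return math.comb(n, k) * math.comb(n, n - k)

W = {k: microstates(k) for k in range(n + 1)}
best = max(W, key=W.get)

print(best)                                 # 25: the evenly mixed macrostate
print(W[0])                                 # 1: the fully ordered macrostate
print(round(math.log(W[best]), 1))          # entropy gap vs. ln(1) = 0
```

The fully ordered macrostate ($k = 0$) has exactly one microstate, so its entropy is $\ln 1 = 0$; the evenly mixed macrostate has overwhelmingly many.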

12. Feb 11, 2014

### BvU

This all goes over avi's head. No good. We need a gentle, class XI level introduction to the entropy concept, without letting him/her cultivate misunderstandings that will cause trouble later on.
Avi?
Avi ?

13. Feb 12, 2014

### Khashishi

Entropy is highest in (thermal) equilibrium, because entropy always wants* to increase, and it only stops increasing when it reaches a local maximum. An equilibrium is a stationary state. If entropy is increasing, then the state isn't stationary. Therefore, if the state is in equilibrium, then entropy isn't increasing, which means it must have hit a local maximum.

*Entropy is overwhelmingly likely to increase whenever energy is transferred between subsystems, which is constantly happening.
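A minimal sketch of that local-maximum picture, assuming a monatomic-ideal-gas form $S_i = \tfrac{3}{2} N_i \ln U_i + \text{const}$ for each subsystem (my own toy model, not anything from the thread): scan the ways a fixed total energy can be split between two subsystems and find where the total entropy peaks.

```python
# Two subsystems share a fixed total energy U_total. The stationary
# (equilibrium) split is the one that maximizes S1(U1) + S2(U_total - U1).
import math

N1, N2 = 2.0, 3.0        # relative sizes of the subsystems (arbitrary)
U_total = 10.0

def S_total(U1):
    # k_B and U-independent constants dropped: they don't move the maximum
    U2 = U_total - U1
    return 1.5 * N1 * math.log(U1) + 1.5 * N2 * math.log(U2)

# brute-force scan over energy splits
splits = [U_total * i / 1000 for i in range(1, 1000)]
best = max(splits, key=S_total)
print(round(best, 2))    # ~4.0 = U_total * N1/(N1+N2)
```

The maximum lands at $U_1 = U\,N_1/(N_1+N_2)$, which is exactly where the two temperatures ($\partial S_i/\partial U_i$) match, i.e. the stationary state.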

14. Feb 12, 2014

### Andrew Mason

When we speak about entropy increasing, we mean that there is a positive difference between the entropy of a system before and after the process (i.e. between two equilibrium states of the system). Unless the system is effectively in equilibrium throughout the process (i.e. a reversible process), entropy is not defined during the process.

AM

15. Feb 12, 2014

### Useful nucleus

From the classical thermodynamics point of view, this is a postulate that can be tested experimentally.

From the modern statistical mechanics point of view, it can be proven (provided that you postulate something else).

16. Feb 12, 2014

### jfizzix

I used to get very confused by thinking of entropy as disorder. I would use the word dispersion instead (i.e. how close to uniformly things are distributed).

Let's say you have a closed bottle of gas of volume $V$, containing a total of $N$ atoms with total energy $U$. There are many, many ways that $N$ atoms with total energy $U$ can be arranged in a closed bottle of volume $V$. The entropy $S(U,V,N)$ of this gas is a measure of how many ways there are to arrange all $N$ atoms so that the total energy is $U$ and the total volume is $V$.

One of the fundamental postulates of statistical mechanics states that each of these possible arrangements is equally likely. Because of this, we may say that a system always tends toward maximum entropy simply because that is (by far) the most likely state for the system to be in.

As far as why energy is minimized at maximum entropy, this is a mathematical statement of the following:
For a closed system,
At constant entropy $S$, the equilibrium state will be that of the minimum energy $U$.
This is saying the same thing as...
For a closed system,
At constant energy $U$, the equilibrium state will be that of the maximum entropy $S$

17. Feb 12, 2014

### BvU

Anybody notice poor avi is out of the picture?
Anybody notice entropy isn't even in the PF library?