# Ambiguity in second law of thermodynamics

1. Jun 17, 2013

### SarcasticSully

The second law of thermodynamics essentially states (paraphrased) that the universe always moves from order to disorder (increase in entropy). The problem I have with this, though, is that there is no thermodynamic definition of disorder that I am aware of. Is there such a thermodynamic definition of disorder?

2. Jun 17, 2013

### Haborix

In my experience with learning thermodynamics, I have found that instructors try very hard to avoid the word "disorder" when they first introduce the subject. It's an easy word to use in casual conversation, but it carries many connotations that probably shouldn't be tied to thermodynamics.

I think it's best to just stick with talking about more/less microstates. If I throw a bunch of particles in an infinite well, then they will occupy levels only as high as the total energy will allow. If I dump some heat in, then I have access to higher levels that particles can occupy, which leads to more microstates being available to the system. Would you call that a disordering of the system? To me it seems weird to use the word "disorder" in this context.
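A toy count makes this concrete (my own sketch, not standard notation: level energies in units of the ground-state energy, so E_n ∝ n² for an infinite well, and `energy_budget` is a hypothetical cap on the total). A larger energy budget admits more microstates:

```python
from itertools import product

def count_microstates(n_particles, max_level, energy_budget):
    """Brute-force count of level assignments (n_1, ..., n_N) for
    distinguishable particles in an infinite well, keeping only those
    whose total energy sum(n_i**2) fits within the budget."""
    count = 0
    for levels in product(range(1, max_level + 1), repeat=n_particles):
        if sum(n * n for n in levels) <= energy_budget:
            count += 1
    return count

low = count_microstates(3, 10, 20)   # modest total energy
high = count_microstates(3, 10, 40)  # after "dumping in heat"
print(low, high)  # more energy -> more accessible microstates
```

Nothing about the higher-energy configuration looks intuitively "messier"; there are simply more ways to arrange the particles.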

Last edited: Jun 17, 2013
3. Jun 17, 2013

### atyy

The technical term for "disorder" in the second law of thermodynamics is the "entropy". In statistical mechanics, entropy is monotonically related to the number of possibilities consistent with experimental observation (which has limited resolution). So things have high entropy or "disorder" if there are many possibilities we cannot rule out.
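To illustrate that monotonic relation (a sketch of my own, using Boltzmann's formula S = k_B ln Ω for a system where all Ω microstates are equally likely):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_microstates):
    """S = k_B * ln(Omega): entropy grows monotonically with the
    number of microstates (possibilities) consistent with what we
    can actually observe."""
    return K_B * math.log(n_microstates)

# Doubling the possibilities raises the entropy by the same amount,
# k_B * ln(2), no matter how many possibilities there were before.
for omega in (10, 1000, 10**6):
    print(boltzmann_entropy(2 * omega) - boltzmann_entropy(omega))
```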

4. Jun 17, 2013

### SarcasticSully

So if I understand you correctly, the second law of thermodynamics does not necessarily dictate an increase in disorder, but rather an increase in possibilities of what the system can be, thereby increasing the likelihood of a disorderly state.

5. Jun 17, 2013

### atyy

No, disorder is not a technical term. So if you don't find "number of possibilities" to be an intuitive substitute for what you ordinarily think of as disorder, then it is better to stick with "number of possibilities", as Haborix recommends.

I personally do find the "number of possibilities" to be a good substitute for "disorder", because when there are more possibilities consistent with my finite knowledge or finite measurement resolution, I am more ignorant. My intuition comes from white noise, which I intuitively consider "disordered", and the fact that there are many different white noise wave forms which sound identical to my ears. They all sound like "shhhhhhhhhhhhhhhh". Since there are many possible detailed wave forms consistent with what I call "white noise", my ignorance is great. This is why I associate higher ignorance, an increased number of possibilities, and disorder with one another.
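To put a rough number on that ignorance (my own toy model, not a standard result): if a noise burst is modeled as N independent ±1 samples, every one of the 2^N detailed waveforms is consistent with hearing "shhhh", so the ignorance, measured in bits, grows linearly with N:

```python
import math

def indistinguishable_waveforms(n_samples):
    """Number of distinct +/-1 sample sequences of length n_samples.
    Each is a different detailed waveform, yet to a coarse listener
    they are all just 'white noise'."""
    return 2 ** n_samples

def ignorance_bits(n_samples):
    """log2 of the number of possibilities we cannot rule out:
    an information-theoretic measure of our ignorance."""
    return math.log2(indistinguishable_waveforms(n_samples))

print(ignorance_bits(8), ignorance_bits(16))  # 8.0 16.0
```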

6. Jun 18, 2013

### Joey21

Classical thermodynamics makes no mention of 'disorder'. The second law of thermodynamics is an experimental result, and has two statements (I quote from the notes written by my professor):

Clausius' statement: 'There cannot exist a thermodynamic system whose only effect, when undergoing a cycle, is to extract heat from a system and transfer the same amount of heat to another system at a higher temperature.'

Kelvin-Planck statement: 'There cannot exist a thermodynamic system whose only effect, when undergoing a cycle, is to extract heat from a certain system and do the same amount of work on another.'

These two statements can be proven to be equivalent within the framework of thermodynamics. But as I said, there is no mention of disorder. In fact, there is no mention of entropy. The idea of entropy is a later development derived from Clausius' theorem, which in turn is derived from the second law, but I suppose the statement of the second law you wrote down is sometimes used instead of the two above because entropy is a theoretical concept obtained from the second law.

At least the way I was taught it (this is science, I could have been taught wrong, but I hope not), entropy hasn't the slightest thing to do with disorder, or at least, as I've said before, in the context of classical thermodynamics. That doesn't make it any less important, though, maybe just less interesting to the rookie student or the layman. Entropy is a thermodynamic potential and a function of state. That means that it is a function only of the configuration of the system in every given moment, not of how the system came to be that way: if c_1 and c_2 are two different curves in the thermodynamic space of the system, then:

S = S(T, a_1, a_2, ...) ; ΔS = $\int_{c_1} dS = \int_{c_2} dS$

At first sight it might not seem too significant, but it is. The principle of maximum entropy will tell you how a system will evolve, and when it will stop, and that is so deep that it more or less gives us our sense of time.
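As a concrete check of that path independence (my own hypothetical example: one mole of a monatomic ideal gas, for which dS = C_V dT/T + R dV/V), integrating dS along two different curves between the same pair of equilibrium states gives the same ΔS:

```python
import math

R = 8.314       # gas constant, J/(mol K)
CV = 1.5 * R    # heat capacity at constant volume, monatomic ideal gas

T1, V1 = 300.0, 0.010   # initial equilibrium state (K, m^3)
T2, V2 = 450.0, 0.025   # final equilibrium state

# Curve c_1: isothermal expansion at T1, then isochoric heating at V2.
# Both segments integrate in closed form.
dS_c1 = R * math.log(V2 / V1) + CV * math.log(T2 / T1)

# Curve c_2: a straight line in the (T, V) plane, with
# dS = CV dT/T + R dV/V integrated numerically (midpoint rule).
n = 100_000
dS_c2 = 0.0
for i in range(n):
    t = (i + 0.5) / n
    T = T1 + (T2 - T1) * t
    V = V1 + (V2 - V1) * t
    dS_c2 += CV * (T2 - T1) / n / T + R * (V2 - V1) / n / V

print(dS_c1, dS_c2)  # agree: Delta S depends only on the endpoints
```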

I have gone on at some length, but with a point: no, in thermodynamics there is no concept of disorder. It is a later development of statistical physics. The use of your statement of the second law could be (and is, I believe) misleading, especially if you are a layman or a second year student like myself. That may be the origin of the ambiguity you have observed. In any case, the second law of thermodynamics, and its consequences, deserve a lot more thought than anyone can probably afford to give them.

Hope I have been helpful. This is my first proper response.

You should have come to our college. If anyone even tried to steer the lecture towards disorder, our professor would stop whoever it was and put them back into context. He didn't mention or let anyone mention disorder once!! And he still made it clear that Entropy is as important as anything could get.

7. Jun 18, 2013

### atyy

Yes, entropy as related to the number of possibilities (one way of defining "disorder") is statistical mechanics, rather than pure classical thermodynamics.

Another place where there is a notion of "disorder" in a subject related to thermodynamics is kinetic theory, where Boltzmann used the assumption of "molecular chaos". I like the explanation given in Kardar's http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec9.pdf

"While the assumption of molecular chaos before (but not after) collisions is the key to the irreversibility of the Boltzmann equation, the resulting loss of information is best justified in terms of the coarse graining of space and time: The Liouville equation and its descendants contain precise information about the evolution of a pure state. This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components. In the Boltzmann equation the precise information of the pure state is lost at the scale of collisions. The resulting one body density only describes space and time resolutions longer than those of a two-body collision, becoming more and more probabilistic as further information is lost."

Statistical mechanics and kinetic theory have in common that they are attempts to combine mechanical pictures (eg. Newton's laws of motion) with classical thermodynamical concepts such as "entropy".

Last edited by a moderator: May 6, 2017
8. Jun 18, 2013

### fluidistic

I beg to differ: entropy is not, as far as I know, a thermodynamic potential.
Also, the claim that entropy is a function only of the configuration of the system "in every given moment" is false, at least the way I've been taught. If you replace "in every given moment" with "in every thermodynamic equilibrium state", then I believe the sentence is true.

9. Jun 18, 2013

### Joey21

It is a potential. Its Legendre transforms are potentials. Its variation when a system leaves one equilibrium state and settles into another is independent of the path in thermodynamic space. It is, however you look at it, a potential.

Now you are right about that. I shouldn't have left information unmentioned. I will correct myself:

Thermodynamic functions and properties, such as entropy, temperature or chemical potential, to name a few common examples, are defined for equilibrium states: if the system under study is not in equilibrium, there is no good way to determine which value of these functions to assign to it. Instead, if there are subsystems which are in equilibrium, then these quantities will be defined locally for each subsystem.

Having said that, fluidistic's correction is now in proper context.

10. Jun 18, 2013

### Haborix

This has been a nice little discussion. I should say that when I learned thermodynamics, the course began with statistical mechanics and then developed the thermodynamics from there. So my knowledge may not be as grounded in the historical development of thermodynamics as yours is, Joey21.

To the OP, there are two approaches to thermodynamics. There is the Classical Thermodynamics which Joey21, with fluidistic's corrections, has presented, and then there is Statistical Thermodynamics which my reply sort of got at.

Anyway, if you want to use the word disorder, then make sure that you and whoever you are talking to are both using the same definition for the word. To be precise, you should probably only talk in the language of the Clausius or Kelvin-Planck statements that Joey21 provided.

11. Jun 19, 2013

### DrDu

"Disorder" is highly misleading. I would also try to avoid speaking of the whole universe. When discussing entropy, one usually considers only some system which can somehow be separated from its environment.
There are several processes where one would say that both entropy and order increase, as in spontaneous crystallization.