What is the real second law of thermodynamics?

In summary: assuming you're asking about the entropy of a system in equilibrium, it is absolutely true that the entropy will increase over time.
  • #141
stevendaryl said:
But to me, the entropy is an artifact of how we're modeling that collection of particles; it's not objective. If there is no objective notion of entropy for 5 particles in a box, I can't see how there can be an objective notion of entropy for 50 trillion particles.

The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

The entropy of a tiny system of 45 molecules is not as well-defined, since the tiny system cannot be kept in equilibrium, and entropy is (in the textbook framework) an equilibrium property.
 
  • #142
A. Neumaier said:
The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.
 
  • #143
A. Neumaier said:
The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

The entropy of a tiny system of 45 molecules is not as well-defined, since the tiny system cannot be kept in equilibrium, and entropy is (in the textbook framework) an equilibrium property.

Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.

stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

In large systems (on the order of Avogadro's number of particles or larger), it's OK to assume you have a thermometer which is small with respect to the system (so it does not appreciably affect it) yet large enough to have negligible fluctuations. For a system of 45 particles, using a mercury thermometer, you don't measure the temperature, you set it to the temperature of the thermometer. I think it is better to use the system itself as a thermometer by allowing it to vary its volume under fixed pressure, both of which are measurable, in principle. If you know the equation of state (e.g. ideal gas PV=NkT) then you can calculate the temperature. That's what I did in the perhaps too-long post above.
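To make that last step concrete, here is a minimal sketch (plain Python; the measurement values and the function name are invented for illustration) of backing the temperature out of the ideal-gas equation of state once P, V, and N are known:

[code]
# Minimal sketch: infer the temperature of a small ideal-gas system from
# measured pressure and volume via PV = N k T.  Numbers are invented.

k_B = 1.380649e-23  # Boltzmann constant, J/K

def ideal_gas_temperature(pressure_pa, volume_m3, n_particles):
    """Return T = P V / (N k_B) for an ideal gas."""
    return pressure_pa * volume_m3 / (n_particles * k_B)

# Hypothetical measurements for a 45-particle system at fixed pressure:
P = 1.0e5      # Pa (about 1 atm)
V = 1.86e-24   # m^3 (a tiny cavity)
N = 45

print(f"Estimated temperature: {ideal_gas_temperature(P, V, N):.0f} K")
[/code]

Of course, in a real 45-particle system the instantaneous pressure fluctuates strongly, so an estimate like this only makes sense as a long-time average.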
 
  • #144
stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

Circular or not, it is objective, and thus entropy is also objective.

Moreover, the concepts of temperature and entropy were known long before the advent of statistical mechanics. Don't mistake modern presentations for the only way to set up things.
 
  • #145
Rap said:
Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.



In large systems (on the order of Avogadro's number of particles or larger), it's OK to assume you have a thermometer which is small with respect to the system (so it does not appreciably affect it) yet large enough to have negligible fluctuations. For a system of 45 particles, using a mercury thermometer, you don't measure the temperature, you set it to the temperature of the thermometer. I think it is better to use the system itself as a thermometer by allowing it to vary its volume under fixed pressure, both of which are measurable, in principle. If you know the equation of state (e.g. ideal gas PV=NkT) then you can calculate the temperature. That's what I did in the perhaps too-long post above.

We were originally talking about a gallon of crude oil, for which I made my assertion. Certainly temperature is well-defined there for scientific use.

With 45 molecules, you only have a nanosystem, which behaves quite differently from a system in equilibrium. The concept of temperature is hardly applicable there, at least not in the conventional sense.

The equation of state is a concept valid only in the thermodynamic limit, i.e., when the number of molecules is huge.
 
  • #146
stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

Temperature is not defined in terms of entropy; it is (usually) defined by the Carnot cycle, which makes no explicit reference to entropy.

A. Neumaier said:
Circular or not, it is objective, and thus entropy is also objective.

Circular definitions are not definitions at all. Entropy, as you define it, may be objective, but its definition cannot be circular.

A. Neumaier said:
Moreover, the concepts of temperature and entropy were known long before the advent of statistical mechanics. Don't mistake modern presentations for the only way to set up things.

Yes. Statistical mechanics explains thermodynamic temperature and entropy, but does not define them.
 
  • #147
Rap said:
Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.

If you think it can be kept in equilibrium, describe a process that does it!

In the thermodynamics of irreversible processes, one assumes local equilibrium, i.e., equilibrium in mesoscopic cells (with many more than 45 particles).

Re a comment in another post: Basic definitions are always circular, such as the definition of a group. You assume certain concepts and you describe their relations, but not what they are. It is the same in thermodynamics. It would be impossible to define anything if it were otherwise.
 
  • #148
A. Neumaier said:
If you think it can be kept in equilibrium, describe a process that does it!

Wouldn't a cavity in a block of steel containing 45 atoms of argon qualify? Wait a long time till equilibrium occurs, thermally insulate the block of steel, etc.


A. Neumaier said:
In the thermodynamics of irreversible processes, one assumes local equilibrium, i.e., equilibrium in mesoscopic cells (with many more than 45 particles).

Yes, point taken.

A. Neumaier said:
Re a comment in another post: Basic definitions are always circular, such as the definition of a group. You assume certain concepts and you describe their relations, but not what they are. It is the same in thermodynamics. It would be impossible to define anything if it were otherwise.

Maybe we have different definitions of circular? Can you show how the definition of a group is circular?
 
  • #149
Regarding the definition of entropy, I think that thermodynamic temperature ##T## and thermodynamic entropy ##S## are introduced at the same time; neither is derived from the other.

Consider a simple uniform one-component system. If ##U## is a function of the two variables ##p, V##, there always exists an integrating factor ##T(p, V)## which turns the expression

$$
dQ = dU + pdV
$$

into the total differential of a certain function ##S(p,V)##:


$$
\frac{dU}{T} + \frac{p}{T} dV = dS.
$$

The function ##T(p,V)## can be chosen in such a way that it has the value 273.16 at the triple point of water and 0 at the lowest temperature. Once that is done, the changes in temperature and entropy are definite and depend only on the changes of the macroscopic variables ##p, V##.

The entropy has an additional freedom in that any constant can be added to it; for example, we can choose the entropy function so that it has the value zero at a pressure of 1 atm and a volume of 1 liter. Once the value of entropy for one state is fixed, both temperature and entropy are definite functions of state. They do not depend on the knowledge of a human.
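To see the construction in a concrete case, here is a short worked example (a sketch assuming a monatomic ideal gas, for which ##U = \frac{3}{2}NkT## and ##pV = NkT##). Along any path,

$$
dQ = dU + p\,dV = \frac{3}{2}Nk\,dT + \frac{NkT}{V}\,dV,
$$

which is not an exact differential, but dividing by ##T## makes it one:

$$
\frac{dQ}{T} = \frac{3}{2}Nk\,\frac{dT}{T} + Nk\,\frac{dV}{V} = d\left(\frac{3}{2}Nk\ln T + Nk\ln V\right),
$$

so ##T## indeed acts as an integrating factor, and ##S = \frac{3}{2}Nk\ln T + Nk\ln V + \text{const}## is fixed only up to the additive constant mentioned above.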

There is another concept of entropy: information entropy, or Gibbs-Jaynes entropy; let's denote it by ##I##. This is a different thing; it is not a function of macroscopic quantities, but a function of a set of probabilities

$$
I(p_k) = -\sum_k p_k \ln p_k,
$$

It so happens that in statistical physics, the thermodynamic entropy is often calculated as the maximum value of I given the macroscopic quantities as constraints on the probabilities ##p_k##. But this does not mean that thermodynamic entropy is the same thing as information entropy.

One could use the information entropy in many diverse situations, even for 45-particle systems and also outside the realm of thermodynamics. Since the probabilities ##p_k## often depend on what one knows about the system, the value of ##I## is not a measurable physical quantity like volume, but rather an auxiliary concept. I think this was the entropy stevendaryl was talking about, so there is no disagreement with what Arnold has said.
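For what it's worth, here is a minimal numerical sketch (plain Python; the probabilities are invented) showing how ##I## depends only on the assigned distribution, not on any measured macroscopic variable:

[code]
# Minimal sketch: Gibbs-Jaynes information entropy I = -sum_k p_k ln p_k
# for two invented probability assignments over the same four outcomes.
import math

def information_entropy(probs):
    """Return -sum p_k ln p_k, skipping zero-probability outcomes."""
    return -sum(p * math.log(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximal ignorance about the system
peaked  = [0.97, 0.01, 0.01, 0.01]   # nearly complete knowledge

print(information_entropy(uniform))  # ln 4 ~ 1.386
print(information_entropy(peaked))   # ~ 0.168, much smaller
[/code]

Same system, different state of knowledge, different ##I##; this is exactly why it is an auxiliary quantity rather than a directly measurable one.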
 
  • #150
Rap said:
Wouldn't a cavity in a block of steel containing 45 atoms of argon qualify? Wait a long time till equilibrium occurs, thermally insulate the block of steel, etc.

What is the meaning of equilibrium here? Why can you assume that after a long time equilibrium will occur?
This is all ill-defined.


Rap said:
Maybe we have different definitions of circular? Can you show how the definition of a group is circular?

The definition of a group is noncircular in terms of ZFC, say. But the definition of ZFC is circular (as it needs a meta-ZFC or so to formulate its definition).

However, the operations product and inverse are circularly defined within a group, and this is what I had meant. Indeed, what are conventionally regarded as circular definitions are in fact just consistency requirements for an object that ''has'' the defined concepts, in the same way as a group ''has'' a product and an inverse. Thus there is no logical problem in circular definitions. Circularity is a problem only in logical arguments, as without an independent argument establishing one of the claims the reasoning is invalid.
 
  • #151
A. Neumaier said:
What is the meaning of equilibrium here? Why can you assume that after a long time equilibrium will occur? This is all ill-defined.

I can see no reason why, after a sufficiently long time, equilibrium would not occur. How do you propose that it may not? I say this in the sense of "conceptual" equilibrium, rather than empirical, or measured, equilibrium. In other words, it's one thing to say that, according to kinetic theory, the 45 atoms will eventually assume a Maxwell-Boltzmann distribution on average, with severe fluctuations (which I would still call equilibrium), and another to ask how to experimentally determine that equilibrium has occurred. Is your point that the fluctuations rule out the idea of equilibrium? Experimentally, I think, knowing the properties of steel and argon, you could calculate the time for the steel block and argon to equilibrate, and just wait that amount of time.

I think of thermal equilibrium for a closed system as the free mechanical parameters unchanging in time (more precisely over a time much longer than the "attention span" of the experiment). For example, the mechanical parameters for a monatomic gas are pressure, volume, and mass, the only directly measurable properties. Free means allowed to vary, rather than fixed by the experimenter.

Ok, fluctuations in small systems complicate things. You have to take an average. But I guess I have never followed this through - I have always assumed that it could be done without logical contradictions, just complications.

A. Neumaier said:
Thus there is no logical problem in circular definitions. Circularity is a problem only in logical arguments, as without an independent argument establishing one of the claims the reasoning is invalid.

That's not what I think of as circular definitions. An illustration of my idea of circular definitions is: If you define thermal equilibrium as temperature unchanging in time, and then apply the concept of equilibrium to define temperature, that's circular and illogical.
 
  • #152
I have just started studying thermodynamics and am confused about the 2nd law. If the entropy of the universe always increases, then how can non-spontaneous processes occur?
Considering the universe to be an isolated system, we have the equation TΔS(universe) = -ΔG.
For non-spontaneous changes, ΔG > 0.
Therefore ΔS(universe) < 0.
Please point out the flaw in my reasoning.
 
  • #153
missing link

Pythagorean said:
Susskind gives the classical description in his entropy lecture (available online) in which he uses chaos (fractalization of phase space) to resolve this inconsistency between entropy and determinism by partitioning the initial distribution. But this isn't valid if your partitions have an area less than Planck's constant.


addendum:

Perhaps you've heard the basis of the arguments before and already rejected them. I'm not sure, I know there's been several discussion motivated from information theory before here on physicsforums:

http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html

The linked page is missing.
Would you kindly specify in which lecture Susskind discusses this point?
 
  • #154
I believe it was statistical mechanics.
 
  • #155
Pythagorean said:
I believe it was statistical mechanics.

Do you know if anyone else has had this idea, that entropy cannot be predicted theoretically without the uncertainty principle?
 
  • #156
This thread was a couple of years ago. I'm not sure Susskind actually predicted anything; he was just lecturing, discussing concepts, and pointing to drawings. It wasn't a formal proof.

I've since learned not to cite lectures : )
There are often approaches in pedagogy that are just analogies, and sometimes outright lies, that allow for the initial absorption of a concept, to be morphed into a more sophisticated concept later... or something.
 
  • #157
Rap said:
Temperature is not defined in terms of entropy; it is (usually) defined by the Carnot cycle, which makes no explicit reference to entropy.

The definition of temperature used by statistical mechanics is:

[itex]\dfrac{1}{T} = \dfrac{\partial S}{\partial E}[/itex] at constant volume and number of particles.

The Carnot cycle can be used to give an operational definition of temperature, and the two are the same (for ideal gases, anyway).
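As an illustration of that agreement (a sketch; the standard monatomic ideal gas entropy is assumed, with constants omitted): taking [itex]S(E,V,N) = Nk\left[\ln\frac{V}{N} + \frac{3}{2}\ln\frac{E}{N}\right] + \text{const}[/itex] and applying the statistical definition gives

[tex]\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N} = \frac{3Nk}{2E} \;\;\Rightarrow\;\; E = \frac{3}{2}NkT,[/tex]

which is the familiar kinetic-theory result for the ideal gas, consistent with the operational (Carnot or thermometer-based) temperature.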
 
  • #158
Jano L. said:
Regarding the definition of entropy, I think that thermodynamic temperature ##T## and thermodynamic entropy ##S## are introduced at the same time; neither is derived from the other.

In statistical mechanics, [itex]S[/itex] can be defined independently of temperature, via:

[itex]S = k \ln W[/itex]

where [itex]W[/itex] is the number of states of a given energy, volume and number of particles. The problem, classically, is that this is an infinite number, but that can be handled by discretizing the volume of phase space.
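As a toy illustration of the counting (a sketch in Python; the "two-level sites" model and all numbers are made up purely for illustration): take N sites that can each be in one of two states, count the microstates W with exactly n excitations, and apply S = k ln W:

[code]
# Toy sketch: S = k ln W for N two-level sites with exactly n excitations.
# W is the binomial coefficient C(N, n); the numbers are for illustration.
import math

k_B = 1.380649e-23  # J/K

def boltzmann_entropy(n_sites, n_excited):
    """S = k ln W, with W the number of ways to place the excitations."""
    W = math.comb(n_sites, n_excited)
    return k_B * math.log(W)

for n in (0, 5, 22, 45):
    print(f"n = {n:2d}   S = {boltzmann_entropy(45, n):.3e} J/K")
[/code]

The same counting idea, with discretized phase-space cells instead of discrete sites, is what the classical discretization mentioned above amounts to.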
 
  • #159
Yes, but that is "statistical entropy" as opposed to "thermodynamic entropy" which I was talking about. There is a connection, but they are different things.
 
  • #160
FIRST LAW OF THERMODYNAMICS

Suppose that we have a closed system that at initial time [itex]t_i[/itex] is in an initial equilibrium state, with internal energy [itex]U_i[/itex], and at a later time [itex]t_f[/itex], it is in a new equilibrium state with internal energy [itex]U_f[/itex]. The transition from the initial equilibrium state to the final equilibrium state is brought about by imposing a time-dependent heat flow across the interface between the system and the surroundings, and a time-dependent rate of doing work at the interface between the system and the surroundings. Let [itex]\dot{q}(t)[/itex] represent the rate of heat addition across the interface between the system and the surroundings at time t, and let [itex]\dot{w}(t)[/itex] represent the rate at which the system does work on the surroundings at the interface at time t. According to the first law (basically conservation of energy),
[tex]\Delta U=U_f-U_i=\int_{t_i}^{t_f}{(\dot{q}(t)-\dot{w}(t))dt}=Q-W[/tex]
where Q is the total amount of heat added and W is the total amount of work done by the system on the surroundings at the interface.

The time variation of [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] between the initial and final states uniquely characterizes the so-called process path. There are an infinite number of possible process paths that can take the system from the initial to the final equilibrium state. The only constraint is that Q-W must be the same for all of them.
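A small numeric sketch of this path idea (Python; the heat and work rate profiles are invented and not tied to any particular experiment): two different process paths with different [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex], but the same Q-W, give the same ΔU.

[code]
# Sketch: two process paths with different heat/work rate histories but the
# same Q - W, hence the same Delta U.  All profiles and numbers are invented.

def integrate(rate, t_final, dt=0.001):
    """Crude rectangle-rule integral of rate(t) from 0 to t_final."""
    n = int(t_final / dt)
    return sum(rate(i * dt) for i in range(n)) * dt

t_final = 10.0  # s

# Path 1: steady heating and steady work output.
path1 = (lambda t: 3.0, lambda t: 1.0)                     # W
# Path 2: all the heating early, all the work late.
path2 = (lambda t: 6.0 if t < 5.0 else 0.0,
         lambda t: 0.0 if t < 5.0 else 2.0)                # W

for qdot, wdot in (path1, path2):
    Q = integrate(qdot, t_final)
    W = integrate(wdot, t_final)
    print(f"Q = {Q:.1f} J, W = {W:.1f} J, Delta U = Q - W = {Q - W:.1f} J")
[/code]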

If a process path is irreversible, then the temperature and pressure within the system are inhomogeneous (i.e., non-uniform, varying with spatial position), and one cannot define a unique pressure or temperature for the system (except at the initial and the final equilibrium state). However, the pressure and temperature at the interface can be measured and controlled using the surroundings to impose the temperature and pressure boundary conditions that we desire. Thus, [itex]T_I(t)[/itex] and [itex]P_I(t)[/itex] can be used to impose the process path that we desire. Alternatively, and even more fundamentally, we can directly control, by well-established methods, the rate of heat flow and the rate of doing work at the interface, [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex].

Both for reversible and irreversible process paths, the rate at which the system does work on the surroundings is given by:
[tex]\dot{w}(t)=P_I(t)\dot{V}(t)[/tex]
where [itex]\dot{V}(t)[/itex] is the rate of change of system volume at time t. However, if the process path is reversible, the pressure P within the system is uniform, and

[itex]P_I(t)=P(t)[/itex] (reversible process path)

Therefore, [itex]\dot{w}(t)=P(t)\dot{V}(t)[/itex] (reversible process path)

Another feature of reversible process paths is that they are carried out very slowly, so that [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] are both very close to zero over the entire process path. However, the amount of time between the initial equilibrium state and the final equilibrium state [itex](t_f-t_i)[/itex] becomes exceedingly large. In this way, Q-W remains constant and finite.

SECOND LAW OF THERMODYNAMICS

In the previous section, we focused on the infinite number of process paths that are capable of taking a closed thermodynamic system from an initial equilibrium state to a final equilibrium state. Each of these process paths is uniquely determined by specifying the heat transfer rate [itex]\dot{q}(t)[/itex] and the rate of doing work [itex]\dot{w}(t)[/itex] as functions of time at the interface between the system and the surroundings. We noted that the cumulative amount of heat transfer and the cumulative amount of work done over an entire process path are given by the two integrals:
[tex]Q=\int_{t_i}^{t_f}{\dot{q}(t)dt}[/tex]
[tex]W=\int_{t_i}^{t_f}{\dot{w}(t)dt}[/tex]
In the present section, we will be introducing a third integral of this type (involving the heat transfer rate [itex]\dot{q}(t)[/itex]) to provide a basis for establishing a precise mathematical statement of the Second Law of Thermodynamics.

The discovery of the Second Law came about in the 19th century, and involved contributions by many brilliant scientists. There have been many statements of the Second Law over the years, couched in complicated language and multi-word sentences, typically involving heat reservoirs, Carnot engines, and the like. These statements have been a source of unending confusion for students of thermodynamics for over a hundred years. What has been sorely needed is a precise mathematical definition of the Second Law that avoids all the complicated rhetoric. The sad part about all this is that such a precise definition has existed all along. The definition was formulated by Clausius back in the 1800's.

Clausius wondered what would happen if he evaluated the following integral over each of the possible process paths between the initial and final equilibrium states of a closed system:
[tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}[/tex]
where [itex]T_I(t)[/itex] is the temperature at the interface with the surroundings at time t. He carried out extensive calculations on many systems undergoing a variety of both reversible and irreversible paths and discovered something astonishing. He found that, for any closed system, the values calculated for the integral over all the possible reversible and irreversible paths (between the initial and final equilibrium states) were not arbitrary; instead, there was a unique upper bound (maximum) to the value of the integral. Clausius also found that this result was consistent with all the "word definitions" of the Second Law.

Clearly, if there was an upper bound for this integral, this upper bound had to depend only on the two equilibrium states, and not on the path between them. It must therefore be regarded as a point function of state. Clausius named this point function Entropy.

But how could the value of this point function be determined without evaluating the integral over every possible process path between the initial and final equilibrium states to find the maximum? Clausius made another discovery. He determined that, out of the infinite number of possible process paths, there existed a well-defined subset, each member of which gave the same maximum value for the integral. This subset consisted of what we call today the reversible process paths. So, to determine the change in entropy between two equilibrium states, one must first conceive of a reversible path between the states and then evaluate the integral. Any other process path will give a value for the integral lower than the entropy change.

So, mathematically, we can now state the Second Law as follows:

[tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}\leq\Delta S=\int_{t_i}^{t_f}{\frac{\dot{q}_{rev}(t)}{T(t)}dt}[/tex]
where [itex]\dot{q}_{rev}(t)[/itex] is the heat transfer rate for any of the reversible paths between the initial and final equilibrium states, and T(t) is the system temperature at time t (which, for a reversible path, is equal to the temperature at the interface with the surroundings). This constitutes a precise "variational calculus" statement of the Second Law of Thermodynamics.
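To put numbers on the inequality, here is a small sketch (Python; an ideal gas doubling its volume at 300 K is assumed, and all values are purely illustrative) comparing the Clausius integral for a reversible isothermal expansion with the one for a free expansion into vacuum between the same two equilibrium states:

[code]
# Numerical sketch of the Clausius inequality for one mole of ideal gas that
# doubles its volume at temperature T.  All values are for illustration only.
import math

k_B = 1.380649e-23   # J/K
N   = 6.022e23       # particles (one mole)
T   = 300.0          # K
Vi, Vf = 1.0, 2.0    # relative volumes

# Entropy change between the two equilibrium states (a state function):
dS = N * k_B * math.log(Vf / Vi)

# Reversible isothermal expansion: Q_rev = N k T ln(Vf/Vi), with T_I = T.
I_rev = N * k_B * T * math.log(Vf / Vi) / T

# Free expansion into vacuum: no heat crosses the interface, so the integral is 0.
I_irrev = 0.0

print(f"Delta S               = {dS:.3f} J/K")
print(f"Clausius integral rev = {I_rev:.3f} J/K  (equals Delta S)")
print(f"Clausius integral irr = {I_irrev:.3f} J/K  (strictly below Delta S)")
[/code]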
Chet
 
  • #161
Jano L. said:
Yes, but that is "statistical entropy" as opposed to "thermodynamic entropy" which I was talking about. There is a connection, but they are different things.

Aren't they empirically the same value?
 
  • #162
Chestermiller said:
See my Blog at https://www.physicsforums.com/blog.php?b=4300

Chet

That link seems broken.
 
  • #163
stevendaryl said:
That link seems broken.
I don't know how to make the Blog available to everyone. So, I have just copied what I wrote in the Blog, and edited it into my previous post.

Chet
 
  • #164
Equating thermodynamic entropy with information entropy

This isn't a particularly new paper (2011), but this discussion has come up before on Physicsforums, so I'd just like to see if any interesting discussion comes out of it. I'm essentially a layman in thermodynamics, so I want to make sure I'm not misinterpreting it. I know that there are two versions of thermodynamic entropy and this refers to the statistical one, but I'm not sure how significant a point that is.

In the case of conservative maps the Boltzmann-Gibbs entropy S(t) increases linearly in time with a slope equal to the Kolmogorov-Sinai entropy rate. The same result is obtained also for a simple case of a dissipative system, the logistic map, when considered in the chaotic regime. A very interesting result is found at the chaos threshold. In this case, the usual Boltzmann-Gibbs entropy is not appropriate and, in order to have a linear increase as in the chaotic case, we need to use the generalized q-dependent Tsallis entropy
(for q = 1 the Tsallis entropy reduces to the Boltzmann-Gibbs entropy)
http://arxiv.org/pdf/cond-mat/0007302.pdf
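For anyone who wants to play with the idea, here is a rough sketch (my own toy Python, not code from the paper; the map parameter, box count, and ensemble size are arbitrary) of the kind of calculation described: start an ensemble of logistic-map trajectories in one small cell, coarse-grain the unit interval into boxes, and watch the Boltzmann-Gibbs entropy of the box occupancies grow before it saturates.

[code]
# Rough sketch (not from the linked paper): coarse-grained Boltzmann-Gibbs
# entropy growth for logistic-map trajectories started in a single small cell.
import math
import random

r = 4.0          # fully chaotic logistic map x -> r x (1 - x)
n_boxes = 100    # coarse-graining of [0, 1]
n_traj = 20000   # ensemble size
steps = 15

# All trajectories start inside one box near x = 0.3.
xs = [0.300 + 0.01 * random.random() for _ in range(n_traj)]

def coarse_entropy(points):
    """-sum p_i ln p_i over occupied boxes, p_i = fraction of points in box i."""
    counts = [0] * n_boxes
    for x in points:
        counts[min(int(x * n_boxes), n_boxes - 1)] += 1
    return -sum((c / len(points)) * math.log(c / len(points))
                for c in counts if c > 0)

for step in range(steps):
    print(step, round(coarse_entropy(xs), 3))
    xs = [r * x * (1.0 - x) for x in xs]
[/code]

The early, roughly linear rise is the part the paper compares with the Kolmogorov-Sinai rate; the saturation just reflects the finite number of boxes.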
 
  • #165
The second law of thermodynamics is simple to understand in classical thermodynamics, where we are concerned with a system and its surroundings. It does not matter what the system contains or is composed of. The system is defined by specifying the values of certain properties in a given equilibrium state. If we are given two equilibrium states A, B of a system, the second law enables us to predict whether the process that takes the system from state A to state B is spontaneous or not.
The Kelvin and Clausius statements of the second law are the simplest and are already given in the above discussions.
Statistical thermodynamics does not provide us with the sort of assertive statements about processes between two given equilibrium states of a system that are provided by classical thermodynamics.
To resolve issues in statistical thermodynamics, one takes recourse to quantum mechanics and the issues get more complicated; then one can take recourse to information theories, and so on endlessly, which doesn't help much in appreciating the simplicity of nature.
 
