What is the real second law of thermodynamics?

AI Thread Summary
The second law of thermodynamics states that the entropy of a closed system tends to increase over time, reflecting a tendency toward disorder. While classical interpretations suggest that entropy cannot decrease, discussions highlight the Poincaré Recurrence theorem, which implies that systems may return to a previous state given sufficient time, albeit over impractically long durations. The conversation also emphasizes the statistical nature of entropy, where fluctuations can lead to temporary decreases in entropy, though these events are exceedingly rare. Additionally, the role of quantum mechanics introduces complexities in understanding entropy, as it involves inherent uncertainties that contribute to information loss. Overall, the second law serves as a fundamental guideline for physical processes, with entropy being a key measure of system behavior.
  • #151
A. Neumaier said:
What is here the meaning of equilibrium? Why can you assume that after a long time equilibrium will occur? This is all ill-defined.

I can see no reason why, after a sufficiently long time, equilibrium would not occur. How do you propose that it may not? I say this in the sense of "conceptual" equilibrium, rather than empirical, or measured, equilibrium. In other words, it's one thing to say that, according to kinetic theory, the 45 atoms will eventually assume a Maxwell-Boltzmann distribution on average, with severe fluctuations (which I would still call equilibrium), and another to ask how to experimentally determine that equilibrium has occurred. Is your point that the fluctuations rule out the idea of equilibrium? Experimentally, I think, knowing the properties of steel and argon, you could calculate the time for the steel block and argon to equilibrate, and just wait that amount of time.

I think of thermal equilibrium for a closed system as the free mechanical parameters unchanging in time (more precisely over a time much longer than the "attention span" of the experiment). For example, the mechanical parameters for a monatomic gas are pressure, volume, and mass, the only directly measurable properties. Free means allowed to vary, rather than fixed by the experimenter.

Ok, fluctuations in small systems complicate things. You have to take an average. But I guess I have never followed this through - I have always assumed that it could be done without logical contradictions, just complications.

A. Neumaier said:
Thus there is no logical problem in circular definitions. Circularity is a problem only in logical arguments, as without an independent argument establishing one of the claims the reasoning is invalid.

That's not what I think of as circular definitions. An illustration of my idea of circular definitions is: If you define thermal equilibrium as temperature unchanging in time, and then apply the concept of equilibrium to define temperature, that's circular and illogical.
 
  • #152
I have just started studying thermodynamics and am confused about the 2nd law. If the entropy of the universe always increases, then how can non-spontaneous processes occur?
Considering the universe to be an isolated system,
we have the equation ##T\Delta S_{universe} = -\Delta G##.
For non-spontaneous changes, ##\Delta G > 0##.
Therefore ##\Delta S_{universe} < 0##.
Please point out the flaw in my reasoning.
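
For reference, here is a standard sketch of where the hidden assumption enters (added here as an illustration; it is not part of the question above). The relation between ##\Delta G## and the entropy of the universe is itself derived assuming constant ##T## and ##P## and surroundings acting as a reservoir:
$$\Delta S_{univ}=\Delta S_{sys}+\Delta S_{surr}=\Delta S_{sys}-\frac{\Delta H_{sys}}{T}=-\frac{\Delta H_{sys}-T\Delta S_{sys}}{T}=-\frac{\Delta G}{T}\quad(\text{constant }T,P)$$
So ##\Delta G>0## describes a change that, on its own at constant ##T## and ##P##, would decrease the entropy of the universe; that is exactly why it does not occur spontaneously. When such a change is driven externally (by work, or by coupling to another process), the driver generates at least ##\Delta G/T## of entropy elsewhere, and the total entropy of the universe still increases.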
 
  • #153

Pythagorean said:
Susskind gives the classical description in his entropy lecture (available online), in which he uses chaos (fractalization of phase space) to resolve this inconsistency between entropy and determinism by partitioning the initial distribution. But this isn't valid if your partitions have an area less than Planck's constant.


addendum:

Perhaps you've heard the basis of the arguments before and already rejected them. I'm not sure; I know there have been several discussions motivated by information theory here on physicsforums before:

http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html

The linked page is missing.
Would you kindly specify in which lecture Susskind discusses this point?
 
  • #154
I believe it was statistical mechanics.
 
  • #155
Pythagorean said:
I believe it was statistical mechanics.

Do you know if anyone else has had this idea, that entropy cannot be predicted theoretically without the Uncertainty Principle?
 
  • #156
This thread was a couple of years ago. I'm not sure Susskind actually predicted anything; he was just lecturing, discussing concepts, and pointing to drawings. It wasn't a formal proof.

I've since learned not to cite lectures : )
There are often approaches in pedagogy that are just analogies, and sometimes outright lies, that allow for the initial absorption of a concept, to be morphed into a more sophisticated concept later... or something.
 
  • #157
Rap said:
Temperature is not defined in terms of entropy, it is (usually) defined by the Carnot cycle, which makes no explicit reference to entropy.

The definition of temperature used by statistical mechanics is:

$$\frac{1}{T} = \frac{\partial S}{\partial E}$$
at constant volume and number of particles.

The Carnot cycle can be used to give an operational definition of temperature, and the two are the same (for ideal gases, anyway).
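
As a quick check that the two agree (a standard textbook illustration, not something claimed in the quoted post): applying the statistical definition to the Sackur-Tetrode entropy of a monatomic ideal gas,
$$S(E,V,N)=Nk\left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi mE}{3Nh^{2}}\right)^{3/2}\right)+\frac{5}{2}\right]\ \Rightarrow\ \frac{1}{T}=\left(\frac{\partial S}{\partial E}\right)_{V,N}=\frac{3Nk}{2E}\ \Rightarrow\ E=\frac{3}{2}NkT,$$
which matches the ideal-gas temperature assigned by an operational (gas-thermometer or Carnot-cycle) definition.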
 
  • #158
Jano L. said:
Regarding the definition of entropy, I think that thermodynamic temperature ##T## and thermodynamic entropy ##S## are introduced at the same time - neither is derived from the other.

In statistical mechanics, ##S## can be defined independently of temperature, via:

$$S = k \ln W$$

where ##W## is the number of states at a given energy, volume, and number of particles. The problem, classically, is that this is an infinite number, but that can be handled by discretizing the volume of phase space.
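
A minimal counting example of this formula (added here for illustration; it is not in the post above): for ##N## independent two-state spins with ##n## pointing up,
$$W=\binom{N}{n},\qquad S=k\ln\binom{N}{n}\ \approx\ -Nk\left[p\ln p+(1-p)\ln(1-p)\right],\qquad p=\frac{n}{N},$$
using Stirling's approximation. The entropy is largest at ##p=1/2##, which is the most probable macrostate.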
 
  • #159
Yes, but that is "statistical entropy" as opposed to "thermodynamic entropy" which I was talking about. There is a connection, but they are different things.
 
  • #160
FIRST LAW OF THERMODYNAMICS

Suppose that we have a closed system that at an initial time ##t_i## is in an initial equilibrium state, with internal energy ##U_i##, and at a later time ##t_f## is in a new equilibrium state with internal energy ##U_f##. The transition from the initial equilibrium state to the final equilibrium state is brought about by imposing a time-dependent heat flow across the interface between the system and the surroundings, and a time-dependent rate of doing work at the interface between the system and the surroundings. Let ##\dot{q}(t)## represent the rate of heat addition across the interface between the system and the surroundings at time ##t##, and let ##\dot{w}(t)## represent the rate at which the system does work on the surroundings at the interface at time ##t##. According to the first law (basically conservation of energy),
$$\Delta U=U_f-U_i=\int_{t_i}^{t_f}\left(\dot{q}(t)-\dot{w}(t)\right)dt=Q-W$$
where ##Q## is the total amount of heat added and ##W## is the total amount of work done by the system on the surroundings at the interface.

The time variation of ##\dot{q}(t)## and ##\dot{w}(t)## between the initial and final states uniquely characterizes the so-called process path. There are an infinite number of possible process paths that can take the system from the initial to the final equilibrium state. The only constraint is that ##Q-W## must be the same for all of them.

If a process path is irreversible, then the temperature and pressure within the system are inhomogeneous (i.e., non-uniform, varying with spatial position), and one cannot define a unique pressure or temperature for the system (except at the initial and the final equilibrium state). However, the pressure and temperature at the interface can be measured and controlled using the surroundings to impose the temperature and pressure boundary conditions that we desire. Thus, ##T_I(t)## and ##P_I(t)## can be used to impose the process path that we desire. Alternatively, and even more fundamentally, we can directly control, by well-established methods, the rate of heat flow ##\dot{q}(t)## and the rate of doing work ##\dot{w}(t)## at the interface.

Both for reversible and irreversible process paths, the rate at which the system does work on the surroundings is given by:
$$\dot{w}(t)=P_I(t)\dot{V}(t)$$
where ##\dot{V}(t)## is the rate of change of system volume at time ##t##. However, if the process path is reversible, the pressure ##P## within the system is uniform, and

$$P_I(t)=P(t)\quad\text{(reversible process path)}$$

Therefore, $$\dot{w}(t)=P(t)\dot{V}(t)\quad\text{(reversible process path)}$$

Another feature of reversible process paths is that they are carried out very slowly, so that ##\dot{q}(t)## and ##\dot{w}(t)## are both very close to zero over the entire process path. However, the amount of time between the initial equilibrium state and the final equilibrium state ##(t_f-t_i)## becomes exceedingly large. In this way, ##Q-W## remains constant and finite.
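
To make the bookkeeping concrete, here is a standard worked case (an illustration added here, not part of the original post): a reversible isothermal expansion of ##n## moles of ideal gas from ##V_i## to ##V_f## at temperature ##T##:
$$W=\int_{V_i}^{V_f}P\,dV=\int_{V_i}^{V_f}\frac{nRT}{V}\,dV=nRT\ln\frac{V_f}{V_i},\qquad \Delta U=0\ \Rightarrow\ Q=W,$$
so ##Q-W=\Delta U=0## no matter how slowly the path is traversed, consistent with the path-independence of ##Q-W## noted above.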

SECOND LAW OF THERMODYNAMICS

In the previous section, we focused on the infinite number of process paths that are capable of taking a closed thermodynamic system from an initial equilibrium state to a final equilibrium state. Each of these process paths is uniquely determined by specifying the heat transfer rate ##\dot{q}(t)## and the rate of doing work ##\dot{w}(t)## as functions of time at the interface between the system and the surroundings. We noted that the cumulative amount of heat transfer and the cumulative amount of work done over an entire process path are given by the two integrals:
$$Q=\int_{t_i}^{t_f}\dot{q}(t)\,dt$$
$$W=\int_{t_i}^{t_f}\dot{w}(t)\,dt$$
In the present section, we will be introducing a third integral of this type (involving the heat transfer rate ##\dot{q}(t)##) to provide a basis for establishing a precise mathematical statement of the Second Law of Thermodynamics.

The discovery of the Second Law came about in the 19th century, and involved contributions by many brilliant scientists. There have been many statements of the Second Law over the years, couched in complicated language and wordy sentences, typically involving heat reservoirs, Carnot engines, and the like. These statements have been a source of unending confusion for students of thermodynamics for over a hundred years. What has been sorely needed is a precise mathematical definition of the Second Law that avoids all the complicated rhetoric. The sad part about all this is that such a precise definition has existed all along. The definition was formulated by Clausius back in the 1800s.

Clausius wondered what would happen if he evaluated the following integral over each of the possible process paths between the initial and final equilibrium states of a closed system:
$$I=\int_{t_i}^{t_f}\frac{\dot{q}(t)}{T_I(t)}dt$$
where ##T_I(t)## is the temperature at the interface with the surroundings at time ##t##. He carried out extensive calculations on many systems undergoing a variety of both reversible and irreversible paths and discovered something astonishing. He found that, for any closed system, the values calculated for the integral over all the possible reversible and irreversible paths (between the initial and final equilibrium states) were not arbitrary; instead, there was a unique upper bound (maximum) to the value of the integral. Clausius also found that this result was consistent with all the "word definitions" of the Second Law.

Clearly, if there was an upper bound for this integral, this upper bound had to depend only on the two equilibrium states, and not on the path between them. It must therefore be regarded as a point function of state. Clausius named this point function Entropy.

But how could the value of this point function be determined without evaluating the integral over every possible process path between the initial and final equilibrium states to find the maximum? Clausius made another discovery. He determined that, out of the infinite number of possible process paths, there existed a well-defined subset, each member of which gave the same maximum value for the integral. This subset consisted of what we call today the reversible process paths. So, to determine the change in entropy between two equilibrium states, one must first conceive of a reversible path between the states and then evaluate the integral. Any other process path will give a value for the integral lower than the entropy change.

So, mathematically, we can now state the Second Law as follows:

$$I=\int_{t_i}^{t_f}\frac{\dot{q}(t)}{T_I(t)}dt\ \leq\ \Delta S=\int_{t_i}^{t_f}\frac{\dot{q}_{rev}(t)}{T(t)}dt$$
where ##\dot{q}_{rev}(t)## is the heat transfer rate for any of the reversible paths between the initial and final equilibrium states, and ##T(t)## is the system temperature at time ##t## (which, for a reversible path, is equal to the temperature at the interface with the surroundings). This constitutes a precise "variational calculus" statement of the Second Law of Thermodynamics.
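
A standard example of the inequality in action (added here for illustration): the irreversible free expansion of ##n## moles of ideal gas into vacuum from ##V_i## to ##V_f## exchanges no heat with the surroundings, so the Clausius integral along the actual path vanishes, while ##\Delta S## must be evaluated along a reversible isothermal path between the same two equilibrium states:
$$I_{irrev}=\int_{t_i}^{t_f}\frac{\dot{q}(t)}{T_I(t)}dt=0\ <\ \Delta S=\frac{Q_{rev}}{T}=nR\ln\frac{V_f}{V_i}.$$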
Chet
 
  • #161
Jano L. said:
Yes, but that is "statistical entropy" as opposed to "thermodynamic entropy" which I was talking about. There is a connection, but they are different things.

Aren't they empirically the same value?
 
  • #162
Chestermiller said:
See my Blog at https://www.physicsforums.com/blog.php?b=4300

Chet

That link seems broken.
 
  • #163
stevendaryl said:
That link seems broken.
I don't know how to make the Blog available to everyone. So, I have just copied what I wrote in the Blog and edited it into my previous post.

Chet
 
  • #164
Equating thermodynamic entropy with information entropy

This isn't a particularly new paper (2011), but this discussion has come up before on Physicsforums, so I'd just like to see if any interesting discussion comes out of it. I'm essentially a layman in thermodynamics, so I want to make sure I'm not misinterpreting it. I know that there are two versions of thermodynamic entropy and this refers to the statistical one, but I'm not sure how significant a point that is.

In the case of conservative maps the Boltzmann-Gibbs entropy S(t) increases linearly in time with a slope equal to the Kolmogorov-Sinai entropy rate. The same result is obtained also for a simple case of a dissipative system, the logistic map, when considered in the chaotic regime. A very interesting result is found at the chaos threshold. In this case, the usual Boltzmann-Gibbs entropy is not appropriate and in order to have a linear increase, as for the chaotic case, we need to use the generalized q-dependent Tsallis entropy.
(For q = 1, the Tsallis entropy reduces to the Boltzmann-Gibbs entropy.)
http://arxiv.org/pdf/cond-mat/0007302.pdf
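
For anyone curious to see the linear-growth behaviour the abstract describes, here is a minimal numerical sketch (my own illustration, not code from the paper): it partitions [0, 1] into cells, starts an ensemble inside a single cell, iterates the fully chaotic logistic map, and prints the coarse-grained Boltzmann-Gibbs entropy at each step. The cell count, ensemble size, and initial cell are arbitrary illustrative choices.

```python
import numpy as np

r = 4.0            # logistic map x -> r*x*(1-x), fully chaotic regime
n_cells = 1000     # number of equal cells partitioning [0, 1]
n_points = 100000  # ensemble size

# Start every trajectory inside one narrow cell (a localized distribution).
rng = np.random.default_rng(0)
x = rng.uniform(0.300, 0.301, n_points)

for t in range(15):
    # Coarse-grain the ensemble over the partition and compute S(t) = -sum p ln p.
    counts, _ = np.histogram(x, bins=n_cells, range=(0.0, 1.0))
    p = counts[counts > 0] / n_points
    s = -np.sum(p * np.log(p))
    print(f"t = {t:2d}   S(t) = {s:.3f}")
    x = r * x * (1.0 - x)  # one iteration of the map for the whole ensemble
```

With these settings, ##S(t)## should grow roughly linearly at a rate close to ##\ln 2## per iteration (the Lyapunov/Kolmogorov-Sinai rate of the map at r = 4) before saturating once the distribution has spread over the whole interval.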
 
  • #165
The second law of thermodynamics is simple to understand in classical thermodynamics, where we are concerned with a system and its surroundings. It does not matter what the system contains or is composed of. The system is defined by specifying the values of certain properties in a given equilibrium state. If we are given two equilibrium states A, B of a system, the second law enables us to predict whether the process that takes the system from state A to state B is spontaneous or not.
Kelvin and Clausius statements of the second law are the simplest and are already given in the above discussions.
Statistical thermodynamics does not provide us with the sort of assertive statements, about processes between two given equilibrium states of a system, that are provided by classical thermodynamics.
To resolve issues in statistical thermodynamics, one takes recourse to quantum mechanics and the issues get more complicated; then one can take recourse to information theories, and so on endlessly, which doesn't help much in appreciating the simplicity of nature.
 