Development of 2nd Law Of Thermodynamics

The discussion revolves around the development and understanding of the Second Law of Thermodynamics, particularly its qualitative and quantitative aspects. The original qualitative postulates, such as the Kelvin-Planck and Clausius statements, are contrasted with the later quantitative formulations, including the efficiency of the Carnot cycle. The thread raises a critical question about the experimental validation of the Second Law, specifically whether there was a rigorous demonstration of its efficiency equation, eff = 1 - TL/TH, akin to Joule's demonstration of the First Law. Participants note that while Carnot made early measurements, the lack of precise experimental methods in the 19th century limited rigorous validation of the Second Law. The conversation highlights the evolution of thermodynamic principles and the ongoing exploration of their foundational experiments.
  • #91
Actually I meant post #71, not #72. If I specialize your considerations to a quasi-static process I find
C = \int (dU + p\,dV); hence K = C_V = \left(\frac{\partial U}{\partial T}\right)_V, the heat capacity at constant volume,
and \Lambda = T \left(\frac{\partial p}{\partial T}\right)_V.
That is, K\,dT + \Lambda\,dV cannot be written as the differential of a single function of T and V.
 
  • #92
Are you referring to the work by Caratheodory? I am not that familiar with it (yet), but what you are saying sounds vaguely familiar... something about Pfaffian forms?
 
  • #93
Dear Andy, no, I am not referring to Caratheodory here.
I just use that Q = dU/dt + p\,dV/dt, so that the integral over time, C, becomes
\int (dU + p\,dV); then I use that
dU(T,V) = \left(\frac{\partial U}{\partial T}\right)_V dT + \left(\frac{\partial U}{\partial V}\right)_T dV.
Using some standard expressions for partial derivatives, one shows that
\left(\frac{\partial U}{\partial V}\right)_T = \left(\frac{\partial U}{\partial V}\right)_S + \left(\frac{\partial U}{\partial S}\right)_V \left(\frac{\partial S}{\partial V}\right)_T = -p + T\left(\frac{\partial p}{\partial T}\right)_V.
The last step uses the definitions of p and T as derivatives of U with respect to V and S respectively, and a Maxwell relation to express the derivative of S with respect to V as a derivative of p with respect to T.
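A numerical sanity check on that identity (my own sketch, not from the post; the van der Waals constants below are CO2-like SI values): for a van der Waals gas both sides come out equal to a/V^2.

```python
# A numerical sanity check (my own sketch, not from the post): verify the
# identity (dU/dV)_T = -p + T*(dp/dT)_V for a van der Waals gas, where
# per mole U(T, V) = cv*T - a/V and p = R*T/(V - b) - a/V**2.
# The constants a, b below are CO2-like van der Waals values (SI units).

R = 8.314                 # J/(mol K)
a, b = 0.364, 4.27e-5     # Pa m^6/mol^2  and  m^3/mol
cv = 1.5 * R              # treat cv as constant for simplicity

def p(T, V):
    return R * T / (V - b) - a / V**2

def U(T, V):
    return cv * T - a / V

T0, V0 = 300.0, 1e-3
hV, hT = 1e-8, 1e-4

lhs = (U(T0, V0 + hV) - U(T0, V0 - hV)) / (2 * hV)                 # (dU/dV)_T
rhs = -p(T0, V0) + T0 * (p(T0 + hT, V0) - p(T0 - hT, V0)) / (2 * hT)

print(lhs, rhs)   # both are approximately a/V0**2 = 364000
```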

If your statement that \Lambda = \partial H/\partial V and K = \partial H/\partial T were true, then the right-hand side of your Carnot-Clapeyron equation would be zero, given that the derivatives with respect to V and T commute.
 
  • #94
I see what you mean.. I think I found the source of the error (it's mine), but I'm not sure how to reconcile the two results.

First, there's the Carnot-Clapeyron theorem:

\frac{\partial p}{\partial T}= \mu\Lambda,

But there's also this result from Clausius:

\frac{\partial p}{\partial T} = J\left(\frac{\partial \Lambda}{\partial T}-\frac{\partial K}{\partial V}\right)

I don't see how they can be the same. I think Clausius's result comes from considering a complete cycle, while the Carnot-Clapeyron theorem comes from a single process, but I can't really see why there are two different results like this.

Using Clausius' result, you will obtain the correct expressions for the potentials that you wrote down:

J\Lambda - p = \frac{\partial U}{\partial V}

and
JK = \frac{\partial U}{\partial T}

EDIT- the LaTeX engine is really starting to irritate me... These are not displaying properly, even though the code is correct. Sorry for any confusion...
 
Last edited:
  • #95
what is mu?
Did you include spaces between the \partial and the following p or T or whatever?

Ok, so mu is basically 1/T. Then both formulas seem to be correct, at least for a reversible process.
 
Last edited:
  • #96
Yeah, I think I get it now:

\frac{\partial p}{\partial T} = J\left(\frac{\partial \Lambda}{\partial T}-\frac{\partial K}{\partial V}\right)

is a constitutive expression valid for any body undergoing any *process*, while

\frac{\partial p}{\partial T} = \mu \Lambda

is valid for any body undergoing a Carnot cycle. There are some other implications as well, but those are secondary.

Now I understand what Caratheodory's work was addressing: we have assumed that any *real* cyclic process can be "constructed" out of differential elements (infinitesimal Carnot cycles). Caratheodory showed that given any neighborhood of points in p-V space, there are *always* at least two points that *cannot* be joined by an adiabat. Meaning, an infinitesimal Carnot cycle *cannot* be constructed in some neighborhood (since two of the points can't be connected), meaning finite cycles may not be able to be constructed out of Carnot cycles...

But I wonder if Caratheodory's results should instead be interpreted as a positive statement about the structure of the state space. For example, a torus has paths that cannot be shrunk to a point while a sphere does not. So our mental image of (how the P-V-T volume is projected onto) the p-V surface is insufficient- it's not a simply connected smooth surface, but something more complicated- an arbitrary loop on the surface cannot always be contracted to an infinitesimal loop.

Obviously, that's pure conjecture on my part... maybe it's time for a literature search.
 
Last edited:
  • #97
It is hard for me to judge when these relations are valid in general, mainly because I don't know the precise definitions of Lambda and K in irreversible situations where you claim (?) them to be defined, too. For me, this would be the historically interesting point when they started to realize that their expressions for heating are path dependent.

Caratheodory does not consider Carnot processes. It is not that two points cannot be joined by an adiabat, but that this adiabat can be run in only one direction. E.g., insofar as rubbing my hands together can be regarded as an adiabatic process, it is possible to raise the temperature of my palms, but not to cool them down.
 
  • #98
Your first question is the easy one: the specific heat K of a material relates how the heating leads to a change in temperature.

The latent heat (Lambda) is defined as "the quantity of heat which must be communicated to a body in a given state to convert it into another state without changing its temperature". Typically the volume changes, but the pressure or the 'state' (melting/freezing/etc.) can change as well. Without latent heat, a heat engine would not be possible.

I'm not sure I follow you on the second question- IIRC, Caratheodory's theorem relates to the inaccessibility of adiabatic processes to go from one arbitrary state to another arbitrary state. But I'm not as familiar with his work as I should be.
 
  • #99
Ok, I feel it's time to (sort of) wrap up this thread with a discussion of Clausius 4th, 5th, and 6th papers (http://www.humanthermodynamics.com/Clausius.html#anchor_116), in which he obtains the analytical expression for entropy we are all exposed to in introductory classes. This will conclude my analysis of the historical literature; we have covered quite a bit of material already!

First I'd like to thank everyone who participated in making this thread a useful one- I am taking a lot of the material here and turning it into some decent lecture notes.

Ok- in 1854 Clausius opens his fourth paper with

"In my memoir “On the Moving Force of Heat, &c.”, I have shown that the theorem of the equivalence of heat and work, and Carnot’s theorem, are not mutually exclusive, but that, by a small modification of the latter, which does not affect its principle, they can be brought into accordance. With the exception of this indispensable change, I allowed the theorem of Carnot to retain its original form, my chief object then being, by the application of the two theorems to special cases, to arrive at conclusions which, according as they involved known or unknown properties of bodies, might suitably serve as proofs of the truth of the theorems, or as examples of their fecundity."

Clausius doesn't cite work by anyone else (here or anywhere). He first states his 'first theorem':

"Mechanical work may be transformed into heat, and conversely heat into work, the magnitude of the one being always proportional to that of the other."

But we know that this is incomplete: the amount of work that heat may produce in a process is also proportional to the temperature; in a cycle, the temperature *difference* between hot and cold. Clausius continues:

"The forces which here enter into consideration may be divided into two classes: those which the atoms of a body exert upon each other, and which depend, of course, upon the nature of the body, and those which arise from the foreign influences to which the body may be exposed. According to these two classes of forces which have to be overcome, of which the latter are subject to essentially different laws, I have divided the work done by heat into interior and exterior work.

He writes this as (in modern notation)

\dot{E} = JQ - p\dot{V}

Clausius then considers p = p(V, T) and E = E(V, T), and obtains

J \Lambda - p = \frac{\partial E}{\partial V}
JK = \frac{\partial E}{\partial T}

And so his 'first theorem' is what we have seen several times already:

\frac{\partial p}{\partial T} = J(\frac{\partial \Lambda}{\partial T}-\frac{\partial K}{\partial V})
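A step the paper leaves implicit (my reconstruction, using only the two expressions just above): since E is a state function, its mixed second derivatives commute, and the 'first theorem' drops out directly:

```latex
\frac{\partial}{\partial T}\left(J\Lambda - p\right)
  = \frac{\partial^2 E}{\partial T\,\partial V}
  = \frac{\partial}{\partial V}\left(JK\right)
\quad\Longrightarrow\quad
\frac{\partial p}{\partial T}
  = J\left(\frac{\partial \Lambda}{\partial T}
          - \frac{\partial K}{\partial V}\right)
```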

Next, he states a 'second theorem':

"The theorem, as hitherto used, may be enunciated in some such manner as the following:

In all cases where a quantity of heat is converted into work, and where the body effecting this transformation ultimately returns to its original condition, another quantity of heat must necessarily be transferred from a warmer to a colder body; and the magnitude of the last quantity of heat, in relation to the first, depends only upon the temperature of the bodies between which heat passes, and not upon the nature of the body effecting the transformation.

"In deducing this theorem, however, a process is contemplated which is of too simple a character; for only two bodies losing or receiving heat are employed, and it is tacitly assumed that one of the two bodies between which the transformation of heat takes place is the source of the heat which is converted into work. Now by previously assuming, in this manner, a particular temperature of the heat converted into work, the influence which a change of this temperature has upon the relation between the two quantities of heat remains concealed, and therefore the theorem in the above form is incomplete.

[...]

"Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time.

"Everything we know concerning the interchange of heat between two bodies of different temperature confirms this; for heat everywhere manifests a tendency to equalize differences of temperature, and therefore to pass in contrary direction, i.e. from a warmer to colder bodies. Without further explanation, therefore, the truth of this principle will be granted."

Consider what Clausius just asserted: a certain process is forbidden to occur under any circumstance, and he does not offer any quantitative proof.

Clausius continues:

"On considering the results of such processes more closely, we find that in one and the same process heat may be carried from a colder to a warmer body and another quantity of heat transferred from a warmer to a colder body without any other permanent change occurring. In this case we have not a simple transmission of heat from a colder to a warmer body, or an ascending transmission of heat, as it may be called, but two connected transmissions of opposite characters, one ascending and the other descending, which compensate each other. It may, moreover, happen that instead of a descending transmission of heat accompanying, in the one and the same process, the ascending transmission, another permanent change may occur which has the peculiarity of not being reversible without either becoming replaced by a new permanent change of a similar kind, or producing a descending transmission of heat. In this case the ascending transmission of heat may be said to be accompanied, not immediately, but mediately, by a descending one, and the permanent change which replaces the latter may be regarded as a compensation for the ascending transmission.

"Now it is to these compensations that our principle refers; and with the aid of this conception the principle may be also expressed thus: an uncompensated transmission of heat from a colder to a warmer body can never occur. "

(emphasis mine)

Again, Clausius makes a definitive claim that a certain process can *never* occur, but does not provide any justification. However, he does go on to calculate something useful:

"If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat C of the temperature t from work has the equivalence-value:

C/T

"and the passage of the quantity of heat Q from the temperature t1 to the temperature t2, has the equivalence-value:

Q(1/T2 - 1/T1)

"wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.

That's confusing! He said T is a *function* of the temperature, not "the" temperature. As it happens (luckily for Clausius), his function coincides with Kelvin's definition of the absolute temperature T =J/\mu. But for now, 'T' is not 'temperature'.

Clausius then analyzes a series of thermal reservoirs and the flow of heat from the first to the final. This is simply \sum \frac{C}{T}. Passing to the continuum limit, and considering a cyclic process, Clausius obtains:

"The equation:
N = \int \frac{dC}{T} = 0
is the analytical expression, for all reversible cyclical processes, of the second fundamental theorem in the mechanical theory of heat."
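To make Clausius' claim concrete, here is a numerical illustration (my own sketch, not anything in the original papers): drive one mole of a monatomic ideal gas quasi-statically around a rectangular loop in the (T, V) plane. The net heat over the cycle is nonzero (it equals the work done), but the integral N = ∮ dQ/T vanishes.

```python
# A numerical illustration (my own sketch, not in Clausius): one mole of a
# monatomic ideal gas driven quasi-statically around a rectangular loop in
# the (T, V) plane.  Along the path dQ = cv dT + (R*T/V) dV, so the net
# heat (= net work) is nonzero, while Clausius' integral N = sum of dQ/T
# vanishes for this reversible cycle.

R, cv = 8.314, 1.5 * 8.314
T1, T2, V1, V2 = 300.0, 400.0, 0.01, 0.02
n = 100000  # midpoint-rule steps per leg

def leg(T_a, T_b, V_a, V_b):
    Q = N = 0.0
    dT, dV = (T_b - T_a) / n, (V_b - V_a) / n
    for i in range(n):
        T = T_a + (T_b - T_a) * (i + 0.5) / n
        V = V_a + (V_b - V_a) * (i + 0.5) / n
        dQ = cv * dT + (R * T / V) * dV
        Q += dQ
        N += dQ / T
    return Q, N

legs = [leg(T1, T2, V1, V1),   # heat up at constant V1
        leg(T2, T2, V1, V2),   # isothermal expansion at T2
        leg(T2, T1, V2, V2),   # cool down at constant V2
        leg(T1, T1, V2, V1)]   # isothermal compression at T1
net_Q = sum(q for q, _ in legs)
net_N = sum(s for _, s in legs)
print(net_Q)   # about R*(T2 - T1)*ln(2), clearly nonzero
print(net_N)   # about 0
```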

Clausius then considers irreversible processes:

"If we represent the transformations which occur in a cyclical process by these expressions, the relation existing between them can be stated in a simple and definite manner. If the cyclical process is reversible, the transformations which occur therein must be partly positive and partly negative, and the equivalence-values of the positive transformations must be together equal to those of the negative transformations, so that the algebraic sum of all the equivalence-values become = 0. If the cyclical process is not reversible, the equivalence values of the positive and negative transformations are not necessarily equal, but they can only differ in such a way that the positive transformations predominate. The theorem respecting the equivalence-values of the transformations may accordingly be stated thus: The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing."

Clausius then simply writes, for "every cyclical process which is in any way possible." (not just reversible):

\int\frac{dC}{T} \geq 0.

Hopefully, you can see why there has been so much confusion about what the entropy 'really is'. Specifically, Clausius has made a series of vague statements in order to write a formula, presented without proof or derivation. There really is no logical foundation to Clausius' work.

Skipping ahead to Clausius' 9th paper, he picks up where he left off:

"The other magnitude to be here noticed is connected with the second fundamental theorem, and is contained in equation (IIa). In fact if, as equation (IIa) asserts, the integral:

\int\frac{dC}{T}.

"vanishes whenever the body, starting from any initial condition, returns thereto after its passage through any other conditions, then the expression dC/T under the sign of integration must be the complete differential of a magnitude which depends only on the present existing condition of the body, and not upon the way by which it reached the latter. Denoting this magnitude by S, we can write

dS = dC/T

"or, if we conceive this equation to be integrated for any reversible process whereby this body can pass from the selected initial condition to its present one, and denote at the same time by So the value which the magnitude S has in that initial condition,

S = S_{0} + \int\frac{dC}{T}

"This equation is to be used in the same way for determining S as equation (58) was for defining U. The physical meaning of S has already been discussed in the Sixth Memoir.

"we obtain the equation:

S - S_{0} = \int\frac{dC}{T}

"We might call S the transformation content of the body, just as we termed the magnitude U its thermal and ergonal content. But as I hold it to be better to borrow terms for important magnitudes from the ancient languages, so that they may be adopted unchanged in all modern languages, I propose to call the magnitude S the entropy of the body, from the Greek word τροπη, transformation. I have intentionally formed the word entropy so as to be as similar as possible to the word energy; for the two magnitudes to be denoted by these words are so nearly allied in their physical meanings, that a certain similarity in designation appears to be desirable.
[...]

"For the present I will confine myself to the statement of one result. If for the entire universe we conceive the same magnitude to be determined, consistently and with due regard to all circumstances, which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat:

1. The energy of the universe is constant.
2. The entropy of the universe tends to a maximum."

Here (finally) are the laws of thermodynamics written down in a form that most of us have seen. I also want to note that, although I have read the 6th paper as closely as I could, I cannot guess what Clausius meant by "The physical meaning of S has already been discussed in the Sixth Memoir." I believe he meant "uncompensated" processes, but I wouldn't exactly call that a 'physical meaning'.

So, we have covered the development of thermodynamics from 1822 (Carnot's initial report) through 1865 (Clausius's naming of 'entropy' and the two laws of thermodynamics). Sadly, the field did not progress much over the next 100 years; as a result, many textbooks (most of which were first written in the 1950s and 1960s) relay the subject as it was written 100 years previously, that is, without any coherent logical and mathematical structure. Fortunately for us, during the past 20 years or so the foundational elements of thermodynamics (heat, temperature, entropy) have been re-examined and refined, and several excellent reviews have been posted in this thread.
 
  • #100
Thank you Andy Resnick for posting your recap of the early development of thermodynamics. It has been a great help to me. I enjoy getting to the bottom, or beginning, of scientific ideas. I point out the special nature of the beginning, because, even though the researchers may have struggled to find their scientific bearings and eventually produce new additions to scientific learning, it is the reasons why they began their search that interest me most. Then their trains of thought also become important. Some early ideas may actually be better than some latter day ideas. Anyway, I appreciate their contributions and also your presentation about them. This has been a very enjoyable thread.

James
 
  • #101
Andy Resnick,

I have one last question. I won't debate your response. You are the expert. I think it would be helpful to me to know: How do you choose to explain the meaning of thermodynamic entropy? In other words, if I were in your class and asked: What do you think is the latest, best explanation for thermodynamic entropy? How would you respond?

James
 
  • #102
I'm no expert-seriously.

I like to think of entropy simply as 'energy unavailable to perform useful work'. I don't know if that's the latest, greatest definition, but it seems to be the most broadly applicable.
 
  • #103
Dear Andy Resnick,

If you are really 'I'm no expert-seriously.', then can I rescind my promise to not debate your response? I won't extend this discussion any further unless you concur. You have been very generous in contributing your knowledge and time to this thread. Even more valuable, from my perspective, is that you are direct in your answers and honest in their quality. If you do not know, you are willing to let that be known. If you do know, you are very meticulous in showing the details about what you do know. You are a valuable resource. I do not want to be a bother to you. If you think this thread is finished, then it is finished for me also.

James
 
  • #104
Andy Resnick,

Adding to my previous message: I do not want to burden you or anyone else with questions that would probably be repetitive. So, I will just repeat that I appreciate the detail that you have provided in your messages. I have printed all of them off and have included them in my binder under the subject of thermodynamic entropy. I have no further questions. Thank you for your time.

James
 
  • #105
James A. Putnam said:
Dear Andy Resnick,

If you are really 'I'm no expert-seriously.', then can I rescind my promise to not debate your response? I won't extend this discussion any further unless you concur. You have been very generous in contributing your knowledge and time to this thread. Even more valuable, from my perspective, is that you are direct in your answers and honest in their quality. If you do not know, you are willing to let that be known. If you do know, you are very meticulous in showing the details about what you do know. You are a valuable resource. I do not want to be a bother to you. If you think this thread is finished, then it is finished for me also.

James

James,

Thanks for the kind words. It's not for me to declare a thread 'finished'- if you are still interested in continuing the discussion, then by all means- continue!

All I had meant was, I set myself the task of 'translating' for a modern audience some of the early works on thermodynamics. That task is complete. That does not mean there's nothing left to discuss :)

For example, I've started discussing the structure of the p-V surface with some mathematician colleagues here (CSU), as I feel certain issues raised on this thread merit additional consideration. I can't comment on those discussions yet as they are too preliminary for PF.

In any event, you seem eager to continue our discussion, so please continue!
 
  • #106
Andy Resnick,

Thank you, however your answer regarding thermodynamic entropy...

"I like to think of entropy simply as 'energy unavailable to perform useful work'."

...left me wondering?

Thermodynamic Entropy does not have units of energy. Whether the energy is available or unavailable for work, it is still energy and has units of energy. Entropy, by virtue of its units, is something different from energy. I am not questioning the fact that energy has transferred from a state of potential usefulness to a state of un-usefulness for the Carnot engine to which the exchange of energy applies. A lower temperature Carnot engine could make use of the lost energy. I know that you know this. It is just that an answer that verbally, as opposed to physically, appears to relate or change entropy into energy does not pass by unnoticed.

Thermodynamic Entropy is the transferred energy divided by the temperature of the temperature sink. Where did the temperature go? What did it mean? Thermodynamic Entropy appears to me to be a process and not a state. It occurs and then it is gone. The energy involved in the entropy process remains. However, the entropy process by which the energy was transferred from a state of usefulness to un-usefulness remains unexplained. What do you think?

James
 
  • #107
Ha! Be careful, you may actually learn something! :)

Yes, the units of entropy are energy/degree. I don't know how to measure that, so I don't really know what that means. I do understand energy, and I understand (after a fashion) temperature, because I can measure those things.

Honestly, biochemists are way ahead of physicists with some of this material. They work directly with the Gibbs free energy:

http://www.tainstruments.com/main.aspx?id=214&n=1&gclid=CKv-2Kv6uaECFRIeDQodRHb9BA&siteid=11

http://www.setaram.com/Microcalorimetry.htm

I was lucky enough to work with a biochemist for a while and learn some of this stuff. What entropy 'is' or 'is not' is a much less useful question than asking how the Gibbs free energy changes during a process, mostly because the entropy can't be measured. So, the 'free energy' is how much energy *is* available to do work (especially chemical reactions and whatnot), making the entropy (or T*dS) the energy *not* available.
 
Last edited by a moderator:
  • #108
As you mention free energy and biochemists: There has been an interesting paper by Jarshinsky (I suppose I spelled him wrong) in Phys Rev Lett about 10 years ago about what I vaguely remember as a non-equilibrium connection about the mean exponentiated work and the free energy. It has been shown to be true experimentally in experiments e.g. stretching protein molecules at different speeds.
 
  • #109
DrDu said:
As you mention free energy and biochemists: There has been an interesting paper by Jarshinsky (I suppose I spelled him wrong) in Phys Rev Lett about 10 years ago about what I vaguely remember as a non-equilibrium connection about the mean exponentiated work and the free energy. It has been shown to be true experimentally in experiments e.g. stretching protein molecules at different speeds.

Yes! The Jarzynski inequality:

http://en.wikipedia.org/wiki/Jarzynski_equality

I discovered this in the context of laser tweezer experiments on protein folding.
 
  • #110
I have a son who is a graduate student studying meteorology. He uses entropy frequently. It is very useful; now if only we knew what it was!

James
 
  • #112
I saw it and I know the Keldysh formalism. I once attended a seminar he gave on the subject.
 
  • #113
Andy Resnick said:
James,

<snip>

For example, I've started discussing the structure of the p-V surface with some mathematician colleagues here (CSU), as I feel certain issues raised on this thread merit additional consideration. I can't comment on those discussions yet as they are too preliminary for PF.

Here's an interesting reference:

S.G. Rajeev, Quantization of contact manifolds and thermodynamics, Annals of Physics, Volume 323, Issue 3, March 2008, Pages 768-782, ISSN 0003-4916, DOI: 10.1016/j.aop.2007.05.001.

(http://www.sciencedirect.com/science/article/B6WB1-4NNYJBK-1/2/56df1f8184b786a722d43733e3d92765)

Here's the gist (as much as I understand): given the conservation of energy

dU - T\,dS + p\,dV = 0

How many independent variables are there?

Answer- only 2. The others are given by constitutive relations. Additionally, the 'phase space' has an *odd* number of variables, as opposed to 'normal' mechanics, which has an *even* number. The phase space of mechanics has a symplectic structure, thermodynamics has a contact structure.
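A toy check of the "only 2 independent variables" point (my own illustration, not from the Rajeev paper; the constant c is arbitrary and just fixes the entropy zero): start from a fundamental relation U(S, V) for one mole of a monatomic ideal gas and recover T and p purely as derivatives.

```python
import math

# A sketch of "only two independent variables" (my own illustration, not
# from the paper): take a fundamental relation U(S, V) for one mole of a
# monatomic ideal gas, U = c * V**(-2/3) * exp(2*S/(3*R)), where c is an
# arbitrary positive constant fixing the entropy zero.  T and p are then
# *derived* quantities, T = (dU/dS)_V and p = -(dU/dV)_S, so the relation
# dU = T dS - p dV holds on the surface and everything reduces to two
# independent variables.  Check that the derived T and p satisfy the
# familiar constitutive relations U = (3/2) R T and p V = R T.

R, c = 8.314, 1.0

def U(S, V):
    return c * V**(-2.0 / 3.0) * math.exp(2.0 * S / (3.0 * R))

S0, V0, h = 50.0, 0.02, 1e-6
T = (U(S0 + h, V0) - U(S0 - h, V0)) / (2 * h)     # T = (dU/dS)_V
p = -(U(S0, V0 + h) - U(S0, V0 - h)) / (2 * h)    # p = -(dU/dV)_S

print(U(S0, V0), 1.5 * R * T)   # agree: U = (3/2) R T
print(p * V0, R * T)            # agree: p V = R T
```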

An aside: "entropy" is the conjugate variable to "temperature". So there's another definition of what entropy 'is'...

I guess I have some summer reading lined up...
 
  • #114
Andy Resnick,

I live in Colorado. Thank you for those links and the thoughts that help get me started.

James
 
