Development of the 2nd Law of Thermodynamics

In summary, the 2nd law of thermodynamics states that the maximum efficiency of a heat engine depends only on the temperatures of its hot and cold reservoirs. The 1st law of thermodynamics was demonstrated by Joule using a paddle wheel in a container of water. The 2nd law was then demonstrated by experiment as well.
  • #36
I really enjoyed the part of Carnot's paper that made an analogy with the paddle wheel. (I can wrap my engineer brain around that one!) For the heck of it, I did my own calculations showing the analogy. No integrals or statistical equations! Just some really basic undergraduate stuff.

The 2nd law says the maximum efficiency possible for a heat engine is

eff = 1 - T_L/T_H

where T_H is the high temperature of the heat going into the engine and T_L is the low temperature. And eff is defined by

W = eff * Q_in

where W is the useful work and Q_in is the heat energy going into the process, also called Q_H.

This is analogous to how a paddle wheel works. The maximum possible work rate (power) from a flow over a paddle wheel is set by the potential energy

W = mgh

where W is the work rate and m is the mass flow rate. That's the maximum possible; friction in the wheel makes the real work less.

But look at h. If we think of it as height above sea level, and not just the height of the wheel, we can frame this another way: compare the maximum potential work of the paddle wheel to the potential work the wheel would deliver if it extended all the way down to sea level. For this example, assume sea level is as low as one could possibly go (no holes in the ground!). Then we can say

W = mgh_H - mgh_L

where h_H is the height of the top of the wheel and h_L the height of the bottom, both measured above sea level.

If h_L is 0 (sea level), then the equation above gives the maximum amount of work possible based on the initial height (h_H) of the water going into the wheel.

I can rewrite the equation above through algebra

W = mgh_H * (1 - mgh_L/mgh_H)

or

W = mgh_H * (1 - h_L/h_H)

And if I define the best efficiency possible as (1 - h_L/h_H), then

W = eff * mgh_H

And that is the same form as the thermal efficiency! The potential energy of the flow going into the wheel, mgh_H, is analogous to the thermal energy going into the system, Q_H. The best efficiency based on height difference, (1 - h_L/h_H), is analogous to the best efficiency possible based on temperature difference, 1 - T_L/T_H.

And this makes sense too. The objects in the room around us have a tremendous amount of thermal energy in them, but they are all at about the same temperature. If there is no temperature difference, nowhere for the thermal energy in the room to "flow down" to, then no work can be extracted. And that's the analogy.
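For anyone who wants to plug in numbers, here is a minimal Python sketch of the analogy above. All temperatures, heights, and flow rates are made up purely for illustration.

```python
g = 9.81  # gravitational acceleration, m/s^2

# Heat engine: Carnot limit, eff = 1 - T_L/T_H
T_H, T_L = 600.0, 300.0          # reservoir temperatures, K (illustrative)
eff_thermal = 1 - T_L / T_H      # maximum possible efficiency
Q_in = 1000.0                    # heat input rate, W (illustrative)
W_thermal = eff_thermal * Q_in   # maximum work rate, W

# Paddle wheel: eff = 1 - h_L/h_H, heights measured above "sea level"
h_H, h_L = 10.0, 4.0             # top and bottom of the wheel, m (illustrative)
m_dot = 25.0                     # mass flow rate, kg/s (illustrative)
eff_wheel = 1 - h_L / h_H        # maximum possible efficiency
W_wheel = eff_wheel * (m_dot * g * h_H)  # maximum work rate, W

print(f"Carnot efficiency:       {eff_thermal:.2f} -> {W_thermal:.0f} W max")
print(f"Paddle-wheel efficiency: {eff_wheel:.2f} -> {W_wheel:.0f} W max")
```

In both cases the efficiency is one minus the ratio of the 'low' level to the 'high' level; only the names of the levels change.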
 
  • #37
That's really excellent- mind if I use it in my Physics I class next semester?
 
  • #38
Thanks for the compliment!

I think it would be cool if this were taught as a lesson in a physics class. This stuff is more interesting than the daily grind of being an engineer. But I suppose that is what I'm supposed to be doing right now! Gotta go.
 
  • #39
Andy Resnick,

Thank you for this discussion. I am certain that blis is wrong and you are correct. It is not often that such diverse views are openly debated about something so fundamental as Thermodynamic Entropy. The difference that you point out between Clausius' definition and alternative definitions that followed is, I think, very important to fundamental scientific learning. The difference between the obvious inclusion of time in Clausius' definition and the leap to other definitions that are static is critical to fundamental understanding. I think it demonstrates that we do not yet know what Thermodynamic Entropy is. I think that understanding the role of 'time' in Clausius' definition is important for explaining why Thermodynamic Entropy is a process that takes time. The question, as I see it, is why does Thermodynamic Entropy require time? I would appreciate reading more of your input even if what I have said does not conform to it.

James Putnam
 
  • #40
Well I don't think there are any real disagreements about the physics. From a fundamental POV, the fluctuations cause the transport phenomena, but you can describe these phenomena in an effective way.
 
  • #41
Iblis,

Sorry for missing the 'I' in your name.

"..From a fundamental POV, the fluctuations cause the transport phenomena, but you can describe these phenomena in an effective way."

How does a static theory produce fluctuations? Clausius did not require fluctuations. In the ideal situation that he relied upon there were no fluctuations. Is this incorrect?

James
 
  • #42
Andy Resnick said:
<snip>

Transport is handled in continuum mechanics by means of balance equations. Brenner's text "Macrotransport Processes" is an excellent starting point, and Slattery's "Interfacial transport phenomena" may even be better. But how do we model fluctuations in the continuum picture?

I'll first note that the fluctuations are occurring *about equilibrium values*. Fluctuations cannot be defined without equilibrium.

Answer: By means of the fluctuation-dissipation theorem. I'm not sure what fluctuations mean in a fully dynamic situation. But if you want to restrict yourself to near-equilibrium conditions, go for it.


I was going to edit this, but since there have already been comments, I'll just do it this way:

It makes no sense to speak of fluctuations in a fully dynamical theory. What you are asking for is to first remove the dynamics and then add them back in, in a manner that requires statistical notions. In that case you can start with the Langevin equation or the Smoluchowski equation. In the end, continuum mechanics is recovered.

Fluctuations 'cause' transport *only in a static theory*. In a dynamic theory, we have balance equations- have you heard of flux? To be sure, in the quantum mechanical view, fluctuations occur in a more fundamental sense- but that's when I would rely on the fluctuation-dissipation theorem. And again, continuum mechanics is recovered.

In the end, Iblis, I really don't understand what your objection is to anything that I have written.
 
  • #43
James A. Putnam said:
Andy Resnick,

<snip>
I think it demonstrates that we do not yet know what Thermodynamic Entropy is. I think that understanding the role of 'time' in Clausius' definition is important for explaining why Thermodynamic Entropy is a process that takes time. The question, as I see it, is why does Thermodynamic Entropy require time? I would appreciate reading more of your input even if what I have said does not conform to it.

James Putnam

I think I understand your question, and I would answer it this way:

Asking what entropy 'is' (or what temperature 'is'), is very similar to asking (in the context of mechanics) what mass 'is'. In mechanics, we can't answer very much beyond 'mass is how much stuff there is'. And that is generally accepted without further comment. Recall that the first and second laws of thermodynamics are assertions- they are not derived from anything.

Now, it is clear there is some connection between entropy and *temperature*. In thermodynamics, there is not (AFAIK) a 'problem of the direction of time'. Irreversible processes (and time-dependent processes) pose no real conceptual issue: see my results on linear friction posted earlier. Put another way, when Joule's weight drops and the water heats up, we are not surprised. But we would be *very* surprised if instead we heated the water and the weight rose, cooling the water.

There are many unsolved problems in Thermodynamics (limit of small size is an obvious one, wetting is another), but relating entropy change and time is not one of them.
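To put one number on that asymmetry: treating the water as a reservoir at roughly constant temperature, the entropy change is dS = Q/T, positive when the weight drops and negative (hence forbidden for an isolated system) when run in reverse. A minimal Python sketch with made-up, purely illustrative numbers:

```python
g = 9.81          # gravitational acceleration, m/s^2
m, h = 10.0, 2.0  # falling weight: 10 kg dropping 2 m (illustrative)
T_water = 298.0   # water temperature, K (assumed roughly constant)

W = m * g * h     # mechanical work dissipated into the water, J
dS = W / T_water  # entropy generated in the water, J/K

print(f"weight drops, water heats: dS = +{dS:.3f} J/K  (allowed)")
print(f"water cools, weight rises: dS = -{dS:.3f} J/K  (forbidden in isolation)")
```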
 
  • #44
Andy Resnick,

Absorbing energy under equilibrium conditions requires time. As thermodynamic entropy changes, so does something that relates to time. Why do you say that a process that requires time does not relate to time? Perhaps I do not understand your point. How does thermodynamic entropy change without relating it to a change in time?

James
 
  • #45
Andy Resnick,

I should have added that the following questions are important to answer before moving on to higher level theory that includes them.

"...Asking what entropy 'is' (or what temperature 'is'), is very similar to asking (in the context of mechanics) what mass 'is'. ..."

What is Thermodynamic Entropy? What is temperature? What is mass? I think each of these must be answered in order to advance theoretical physics. However, the most immediate one is: What is mass? The answer to this question could change almost everything.

James
 
  • #46
Iblis,

The truth is that I would like to read your answers first. The subject of this thread justifies responding only to: What is Thermodynamic Entropy? What is it that Clausius discovered? Did he discover microstates?

James
 
  • #47
James, I think one of the reasons entropy is less understood and has acquired a special mystique is the question of the system boundary.

If we think of energy, we are comfortable with notions of where it comes from and where it goes. We have energy conservation laws. We can collect it in a container, store it, transport it, release it at will.

But if we think of entropy where is it located?
When entropy increases where does it come from?
Is it possible to fill my beer glass or other container with entropy?
 
  • #48
James A. Putnam said:
Andy Resnick,

Absorbing energy under equilibrium conditions requires time. <snip>

James

This is one of the notions I hope to disabuse you of. Maybe you are thinking of steady-state conditions, but at equilibrium, there can be *no* absorption. That's part of the definition of equilibrium!
 
  • #49
James A. Putnam said:
Andy Resnick,

I should have added that the following questions are important to answer before moving on to higher level theory that includes them.

"...Asking what entropy 'is' (or what temperature 'is'), is very similar to asking (in the context of mechanics) what mass 'is'. ..."

What is Thermodynamic Entropy? What is temperature? What is mass? I think they are each required to be answered in order to advance theoretical physics. However, the most immediate one is: What is mass? The answer to this question could change almost everything.

James

I disagree we need any definition other than:

"mass" is how much stuff there is
"temperature" is how hot something is

I confess, I don't have such a short, clean definition of entropy (yet). Some people claim we need to understand where mass 'comes from' (i.e. the Higgs mechanism), and while I don't disagree with their overall goal, I also note that continuum mechanics will be changed not a whit by future theories about the origin of mass, and thermodynamics will be changed not a whit by "understanding the origin of entropy and temperature".

It's similar to the occasional post around here, claiming that to understand how light reflects or refracts, we first have to understand quantum electrodynamics. It's a foolish notion and unhelpful for students.
 
  • #50
Andy Resnick,

"...This is one of the notions I hope to disabuse you of. Maybe you are thinking of steady-state conditions, but at equilibrium, there can be *no* absorption. That's part of the definition of equilibrium! ..."

Clausius defined thermodynamic entropy by allowing for absorption under conditions of equilibrium. Is this an incorrect statement?

From your following message:

"...Some people claim we need to understand where mass 'comes from' (i.e. Higgs mechanism), and while I don't disagree with their overall goal, I also note that continuum mechanics will be changed not a whit by future theories about the origin of mass, and thermodynamics will be changed not a whit by "understanding the origin of entropy and temperature. ..."

Ok I get your point, though I think it has to do with what is debatable here and not with what is settled science. I disagree completely with it, but I respect your position. I will not press the issue here.

James
 
  • #51
Andy, I don't think we disagree on the physics itself. It is more a disagreement about what follows from what (in principle), and that can be a matter of taste.

In my book: Entropy is the number of bits of information you would need to specify in order to give an exact description of the physical state of an object, given its macroscopic description.
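A minimal sketch of what that definition implies numerically, assuming the standard identification of k_B ln 2 of thermodynamic entropy per bit:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_in_bits(S):
    """Convert a thermodynamic entropy S (in J/K) to bits,
    using k_B * ln(2) of entropy per bit."""
    return S / (k_B * math.log(2))

# Illustrative input: 1 J/K of entropy is an astronomical amount of information
print(f"{entropy_in_bits(1.0):.3e} bits")  # ~1.04e23 bits
```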

Also: The mass of an object is the total rest energy of the object.

And, of course, I agree with this:

http://insti.physics.sunysb.edu/~siegel/history.html
 
  • #52
Iblis,

"...Also: The mass of an object is the total rest energy of the object. ..."

What is the total rest energy of the object? In other words, what is the material composition of energy whether rest or otherwise?

James
 
  • #53
James A. Putnam said:
Iblis,

"...Also: The mass of an object is the total rest energy of the object. ..."

What is the total rest energy of the object? In other words, what is the material composition of energy whether rest or otherwise?

James

That depends on the interactions in the system. Particles get their rest energy from the interaction with the Higgs field as Andy pointed out. But bound states of particles will contain extra energy (e.g. the mass of the proton is mostly due to the interaction between the quarks and not the masses of the quarks).

Exotic example: take 6 square mirrors and make a cube out of them that encloses a perfect vacuum. Then the total energy of the cube is due to the rest energy (mass) of the 6 mirrors plus the Casimir energy of the enclosed vacuum. The mass of the empty cube is thus slightly more than the mass of the 6 mirrors.
 
  • #54
Count Iblis said:
That depends on the interactions in the system. Particles get their rest energy from the interaction with the Higgs field as Andy pointed out. But bound states of particles will contain extra energy (e.g. the mass of the proton is mostly due to the interaction between the quarks and not the masses of the quarks).

Exotic example: take 6 square mirrors and make a cube out of them that encloses a perfect vacuum. Then the total energy of the cube is due to the rest energy (mass) of the 6 mirrors plus the Casimir energy of the enclosed vacuum. The mass of the empty cube is thus slightly more than the mass of the 6 mirrors.

I am pretty sure you have this backwards. Bound states have lower masses than their constituent parts, because the lower potential energy represents a negative contribution to the total energy. That is why the most stable nucleus that can be formed by normal stellar fusion (nickel-56, which decays rapidly by successive beta-plus decays to form iron-56) also has just about the most negative mass defect of any isotope (nickel-62 and iron-58 have lower masses, but are not accessible by fusion pathways).

Similarly with the cube example: the mirrors form a cavity that restricts the wavelengths of the virtual particles that can exist inside the cube ... I guess that is what you mean by the Casimir energy, correct? Since the Casimir force in such a cavity is attractive, it should also make a negative contribution to the total energy, and so the mass of the empty cube would be slightly less than the mass of the component mirrors.
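As a rough numerical check of the 'bound states weigh less' point (quarks aside), here is a minimal Python sketch using helium-4 and rounded textbook masses; treat the exact digits as illustrative:

```python
# Masses in unified atomic mass units (u), rounded from standard tables
m_proton  = 1.007276
m_neutron = 1.008665
m_he4     = 4.001506   # helium-4 nucleus

u_to_MeV = 931.494     # energy equivalent of 1 u, in MeV

parts = 2 * m_proton + 2 * m_neutron
defect = parts - m_he4              # positive: the bound nucleus is lighter
binding_energy = defect * u_to_MeV  # ~28.3 MeV for helium-4

print(f"mass defect: {defect:.6f} u -> binding energy {binding_energy:.1f} MeV")
```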
 
  • #55
Yes, bound states of protons and neutrons forming a nucleus will have a lower mass than the sum of the masses of the neutrons and protons. However, in the case of the QCD forces between quarks you get the opposite effect; see here:

http://en.wikipedia.org/wiki/Proton#Quarks_and_the_mass_of_the_proton

So, you need to compare the energy of the system relative to some reference state. Now, you don't get free quarks when you pump more energy into a proton; you actually get further away from such a state. But perhaps some expert in QCD can explain that better than I can...

About the Casimir energy, I remember reading somewhere that for a cube this is positive, but I'm not 100% sure...
 
  • #56
Count Iblis said:
Andy, I don't think we disagree on the physics itself.

I wouldn't be too sure about that. Specifically:

Count Iblis said:
Also, strictly speaking there is no such thing as thermodynamics. Strictly speaking it is always thermostatics.

The entropy of a system can only be rigorously defined within the statistical framework. There is no rigorous definition of entropy within "thermodynamics".

Count Iblis said:
Thermodynamics without a statistical foundation does not make sense.

Count Iblis said:
So, basically my point is that the statistical foundations are simply hidden in the assumptions you make.

Count Iblis said:
From a fundamental POV, the fluctuations cause the transport phenomena, but you can describe these phenomena in an effective way.

You have a fatally flawed understanding of the subject, and are apparently not interested in correcting it.
 
  • #57
Count Iblis said:
Yes, bound states of protons and neutrons forming a nucleus will have a lower mass than the sum of the masses of the neutrons and protons. However, in case of the QCD forces between quarks you get the opposite effect, see here:

http://en.wikipedia.org/wiki/Proton#Quarks_and_the_mass_of_the_proton

So, you need to compare the energy of the system relative to some reference state. Now, you don't get free quarks when you pump more energy in a proton, you actually get further away from such a state. But perhaps some expert in QCD can explain that better than I can...

About the Casimir energy, I remember reading somewhere that for a cube this is positive, but I'm not 100% sure...

Yes, I agree that you are right about the quarks ... the strong force is special, and I don't really understand it either ... just that the harder you pull them apart, the more the binding force increases, until enough energy is put in for spontaneous quark-pair production.

I was just pointing out that in cases not involving quarks, things work the other way, and more strongly bound systems have lower masses.
 
  • #58
You have a fatally flawed understanding of the subject, and are apparently not interested in correcting it.


I think you fail to appreciate that in theoretical physics we have a different perspective than in engineering. I have learned the topic this way from books and lecture notes. I have taught this subject this way at Uni.

The view in theoretical physics will typically be that the general state of a system consisting of 10^23 degrees of freedom cannot be described using a handful of variables. Your state space is then simply not large enough to cover all the states the system can be in. This is the sort of perspective any decent set of lecture notes on theoretical physics will start with in the introduction. Thermodynamics will then be developed as a coarse-grained description of the system in some sense. We certainly do not start with vague, undefined concepts like heat, temperature, work, etc.

It may well be that this approach involves quite a bit of hand waving that you can then criticize. Certainly there are many open problems in non-equilibrium statistical mechanics. However, as I have pointed out many times now, Reif always invokes an ensemble even when he discusses non-equilibrium phenomena.


And entropy always counts the number of microstates compatible with the macrostate the system is in, i.e. the amount of information contained in the exact description of the system. Ignore that and you are vulnerable to a Maxwell's Demon type argument. So, I can lower the entropy of any system at the expense of filling the memory of a computer with random information that comes from the system.

And such thought experiments are perfectly feasible for non-equilibrium states too. That's not an issue at all.
 
  • #59
Count Iblis, I think that the phenomenological concepts like heat, temperature, and work that you mention are experimentally very well defined quantities, while e.g. microstates are an idealization that does not exist for a real macroscopic body. The problem is that you cannot, even theoretically, decouple a macroscopic body from its environment, which has infinite degrees of freedom. Thermodynamic concepts cannot be derived from a microscopic theory; you have to use these very concepts as input to do statistical mechanics. See e.g. the very interesting article by Hans Primas, "Emergence in exact sciences", which can be downloaded here: http://philsci-archive.pitt.edu/archive/00000953/00/EmergenceInExactSciences.pdf
 
  • #60
Dr. Du, thanks for the link to that article; I'll read it in my next break. But I will note that a theorist is not so interested in the experimental or practical reality (even at the level of the theory itself, i.e. the fact that in practice you do smuggle in vague intuitions from thermodynamics).

We know the structure of the laws of physics at the micro level, and that alone allows you to set up certain thought experiments which show that entropy must be a measure of the information contained in the system, or else you can violate the second law in principle.

Also note that the whole issue about information loss in black hole evaporation is only an issue because of the purely theoretical problems it would pose. So, the objections you and Andy have are not seen as relevant in the way we view this subject.

But that doesn't mean that you don't have a point regarding the way we do physics in practice. That's why I wrote that on the physics itself there are no disagreements. It is just one of these interpretational issues, like what thought experiments that purport to show something actually mean.

E.g. I can always imagine completely isolating any thermodynamic system by imposing the right boundary conditions far away. Then I can use a Maxwell's Demon that uses a quantum computer to cool the system down to its quantum mechanical ground state. The memory of the quantum computer will then be filled with the information that initially resided in the system.

This transformation from initial to final state has to be a unitary transformation. The minimum size of the memory needed is the entropy of the system (in the final state we're in the static case, where this is not controversial). So this means that the entropy of any arbitrary system has to be identified with the logarithm of the dimensionality of the Hilbert space that you need to describe it.


So, this can be the only theoretically correct definition of entropy (it is in fact the standard definition of entropy used by theoretical physicists). But its relevance may be limited to solving purely theoretical problems.

But such a purely theoretical way of looking at things is important. Special Relativity would not have been discovered in 1905 if everyone had objected to doing thought experiments that cannot even remotely be realized.
 
  • #61
I don't like discussing off-thread topics - it is rude to the OP.

However, I have said before, and I maintain this to be true, that there are absolutes you cannot determine with any statistical theory, which by its nature implies variation, or as you call it, fluctuation.

By absolute I mean that each and every time you or nature makes the calculation or does the experiment you will get precisely the same numerical answer.

For instance, if you calculate the number of regular polyhedral solids in crystallography, or if you calculate the minimum-energy configuration for packing equal spheres.
 
  • #62
Studiot,

Back on page one you said: "It would be very sad if the original question was lost in a fundamentalist wrangle over definitions that came after the time period wcg1989 is enquiring about."

I agree. I would enjoy reading posts that relate more closely to the actual 'Development of 2nd Law Of Thermodynamics'.

James
 
  • #63
I think it is worth summarising statements by various pioneers.

Carnot
Whenever a temperature difference exists, motive power can be produced.

Clausius
It is impossible for a self-acting machine, unaided by any external agency, to convey heat from a body at a low temperature to one at a higher temperature.

Thomson (Lord Kelvin)
We cannot transfer heat into work merely by cooling a body already below the temperature of the lowest surrounding objects.

Planck
It is impossible to construct a system which will operate in a cycle, extract heat from a reservoir, and do an equivalent amount of work on the surroundings.

Kelvin-Planck
It is impossible for a system to produce net work in a complete cycle if it exchanges heat only with bodies at a single fixed temperature.

These statements trace the development of the second law and ideas associated with it; it didn't all come at once.

I note again that only Carnot stated what can be done rather than what cannot.
 
  • #64
Studiot,

Where and how does the calculation of thermodynamic entropy enter into this timeline? I am not asking for myself. I think it would be helpful for someone other than myself to address the theoretical development and calculation of thermodynamic entropy.

James
 
  • #65
Count Iblis said:
I think you fail to appreciate that in theoretical physics we have a different perspective than in engineering. I have learned the topic this way from books and lecture notes. I have taught this subject this way at Uni.

<snip>

Our differences are far greater than this. I do not view Physics as an old and dusty body of facts and formulae, received wisdom jealously guarded.

Physics is vibrant- a dynamic interplay between the discovery of new facts and new ideas which place the new phenomena in context with what we already know to be true.

Any scientist- especially a physicist, who recognizes no boundaries in the quest to gain a better understanding of nature- reacts to new facts and ideas with curiosity, not disdain. Old ideas are rejected as better ones are introduced and constructed. Old experiments are re-evaluated based on what we know NOW, as contrasted with the more limited knowledge we had THEN. New experiments are devised to explore the limits of what we can currently explain.

Past ideas developed when we knew less may be used to guide our thinking now, but should *never* be used to hold back the development of new ideas which supersede them by virtue of enlarging that which can be explained.
 
  • #66
James A. Putnam said:
<snip>

I agree. I would enjoy reading posts that relate more closely to the actual 'Development of 2nd Law Of Thermodynamics'.

James

I did exactly this in posts #6, #11, and #22. Have you read them?
 
  • #67
Andy Resnick,

Yes I read them. I am printing them off and will read them again.

James
 
  • #68
Andy Resnick,

Ok, I read your messages again. You gave the mathematical definition of entropy and showed further mathematical analysis that included it. However, if you gave an explanation of what thermodynamic entropy is, then I missed seeing it. Saying that thermodynamic entropy is energy in transit divided by temperature is not, I think, an answer to: What is thermodynamic entropy? I asked in an earlier message: What did Clausius discover? Whatever it is, it does require time. His method of calculating thermodynamic entropy involves a process that requires the passage of time. By the way, you did not answer my previous statement and question: 'Clausius defined thermodynamic entropy by allowing for absorption under conditions of equilibrium. Is this an incorrect statement?'

James
 
  • #69
Andy, as the discussion here starts to turn away from the main topic of the post: I am very much looking forward to reading more of your interesting account of the history of thermodynamics.
 
  • #70
James A. Putnam said:
Andy Resnick,

<snip>I asked in an earlier message: What did Clausius discover? <snip>

James

DrDu said:
Andy, as the discussion here starts to turn away from the main topic of the post: I am very much looking forward to reading more of your interesting account of the history of thermodynamics.

I would love to pick the thread back up- I wasn't sure there was still interest.

Historically, we left off at posts #6 and #11. Post #22 is totally ahistorical, and even some of #11 was ahistorical. I believe Clausius was indeed next to develop thermodynamics, after Carnot.

As you saw, there were very few quantitative notions of thermodynamic variables- certainly no calculus- and the theories were continuum models. Concepts like pressure, volume, temperature, and specific heat were used very operationally; people had generated a lot of data using various gases (including air and steam) as working fluids, and had the notion of an ideal gas as the physical limit of a dilute gas.

I'll be able to post a summary of Clausius' papers tomorrow afternoon, and see where that gets us.
 
