Entropy Confusion: Feynman's Explanation and My Question

In summary, Feynman discusses an example of putting a hot stone at temperature T_1 into cold water at temperature T_2, causing a flow of heat \Delta Q between the two. The entropy change is given as \Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1}. The example works because no work is done and no chemical reactions occur; if either did, you would have to take it into account when calculating the entropy.
  • #1
gadje
I'm going over the notes from my thermodynamics course, with the aid of the Feynman Lectures, Vol. I (p. 44-12).

Feynman uses the example of putting a hot stone at temperature [tex]T_1[/tex] into cold water at temperature [tex]T_2[/tex] causing a flow of heat [tex]\Delta Q[/tex] between the two.

The entropy change is given to be [tex]\Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1}[/tex].

Does this mean that one only needs to take into account the initial temperatures and the total flow of heat from one to the other in order to calculate the change in entropy? I keep thinking that there should be some sort of [tex]\Delta T[/tex] term - why isn't this the case?

Cheers.
 
  • #2
This example works because the two temperatures are unambiguously defined, and they are unambiguous because an implicit assumption of *equilibrium* is made: before the stone is put in the water, both can be assigned single temperatures, and after the stone is placed in the water, the combined objects will again be at equilibrium - in fact, in thermal equilibrium *with each other*. Additionally, it appears in this example that the water is a thermal reservoir: it remains at T2 regardless of how much heat is added (or subtracted).

So truthfully, this is not a problem of thermodynamics; it's a problem of thermostatics.
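
(To make the reservoir point concrete: for a body so large that its temperature stays at T2 no matter how much heat Q it absorbs, the entropy change is exactly

[tex]\Delta S_{\text{water}} = \frac{Q}{T_2}[/tex]

with no ∆T term, since T never moves. That is why no ∆T appears in the formula.)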
 
  • #3
Ah - if the water is acting as a reservoir, then that makes sense to me.

But what if, say, you have two blocks at different temperatures T1 and T2 (< T1) and bring them into contact. What is the change in entropy once heat Q has flowed from the block initially at T1 to the block initially at T2 and they are in equilibrium at T_f?

Is it still [tex]\Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1} = (\frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_f}) + (\frac{\Delta Q}{T_f} - \frac{\Delta Q}{T_1}) [/tex] (i.e. the sum of the individual changes in entropy of the blocks) - or something different?
 
  • #4
Feynman's example works because it is constructed so that no work is done and no chemical reactions occur.
Consequently the only energy changes that occur are heat changes.

If either of those did occur, you would have to take it into account when calculating the entropy.
 
  • #5
Yes, I'm aware of that - but I'm still not sure if I've understood the concept in terms of solely heat flow. Is my reasoning in my previous post correct?
 
  • #6
Hm, I find this a very odd example... I'm not sure I get it.

Is it possible that Feynman is talking about an infinitesimal heat flow? Or that both objects (water and stone) stay at their temperature? Otherwise I would say the formula is wrong, if it were not coming from the great Feynman.

In general, the entropy change of a SYSTEM, if the process can be seen as REVERSIBLE (*), is equal to dQ/T, implying you have to use an integral if the temperature is variable (i.e. the system is not a reservoir). So in your formula (the way I would interpret it for it to make sense) it looks as if an infinitesimal amount of heat Q is transferred to system 2 (implying T2 < T1), so its entropy rises by Q/T2. Conservation of energy says that Q has left system 1, and so its entropy has changed infinitesimally (an infinitely small change) by -Q/T1. This would imply an infinitesimal change of entropy of the universe, being the sum of both entropy changes: Q/T2 - Q/T1. So I would not interpret the delta S as the entropy change of either system, as you do, but as the entropy change of the universe. Does that fit if you reread it in your book?
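
To spell out the integral version: for a body whose temperature changes as heat flows in (so, not a reservoir), and assuming a constant heat capacity C (my simplifying assumption), you get

[tex]\Delta S = \int \frac{dQ_{rev}}{T} = \int_{T_i}^{T_f} \frac{C\,dT}{T} = C \ln\frac{T_f}{T_i}[/tex]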

(*) of course, a glass warming up in a room is not reversible, but the key insight is that either system viewed separately IS reversible, meaning a glass can also cool down and a room can also lose heat. And for entropy changes of a system, you can ignore what the environment does; so to calculate the entropy change of a glass that is warming up in a room, you just look at the glass and ignore the room, reminding yourself that a glass can also cool down, so the process is reversible as is needed for the above-mentioned formula.

I hope I'm making some sense :) If you are Dutch, feel free to ask me to explain the same in Dutch.
 
  • #7
gadje said:
Ah - if the water is acting as a reservoir, then that makes sense to me.

But what if, say, you have two blocks at different temperatures T1 and T2 (< T1) and bring them into contact. What is the change in entropy once heat Q has flowed from the block initially at T1 to the block initially at T2 and they are in equilibrium at T_f?

Is it still [tex]\Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1} = (\frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_f}) + (\frac{\Delta Q}{T_f} - \frac{\Delta Q}{T_1}) [/tex] (i.e. the sum of the individual changes in entropy of the blocks) - or something different?

Good question - I think you have to be careful when deciding how much heat flow occurs. Working backward: say the final temperature is T_f, and the specific heats of the stone and water are C_r and C_w (we need to be careful and define whether these are at constant volume or pressure, but in any case...). The rock's heat change is dQ = mC_r(T_f - T1), which is negative since the rock cools, and the water gained that heat (-dQ), so its temperature rose: dQ/(mC_w) = T2 - T_f. There's also latent heat (which I am ignoring for now), which is important if there is a phase change (melting ice, for example).

Putting those together, we have T_f - T1 = g (T2 - T_f), where 'g' is the ratio of total heat capacities, (mC_w)/(mC_r).

That's my initial attempt, anyway...
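
If it helps, here's a small numerical sketch of that bookkeeping in Python (the masses, specific heats and temperatures are made-up values, not from the thread, and constant heat capacities are assumed):

[code]
import math

# Made-up example values: a hot stone dropped into cooler water
m_r, C_r = 1.0, 800.0     # stone: mass (kg), specific heat (J/(kg K))
m_w, C_w = 2.0, 4186.0    # water: mass (kg), specific heat (J/(kg K))
T1, T2 = 350.0, 280.0     # initial temperatures (K), with T1 > T2

# Energy balance m_r*C_r*(T1 - Tf) = m_w*C_w*(Tf - T2), solved for Tf
Tf = (m_r * C_r * T1 + m_w * C_w * T2) / (m_r * C_r + m_w * C_w)

# Entropy change of each body: integrate dS = m*C*dT/T from Ti to Tf
dS_stone = m_r * C_r * math.log(Tf / T1)  # negative: the stone cools
dS_water = m_w * C_w * math.log(Tf / T2)  # positive: the water warms

print(f"Tf = {Tf:.1f} K")
print(f"dS_stone = {dS_stone:.1f} J/K, dS_water = {dS_water:.1f} J/K")
print(f"dS_total = {dS_stone + dS_water:.1f} J/K (positive, as it must be)")
[/code]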
 
  • #8
The exact wording is this:

Another example of irreversibility is this: If we put together two objects that are at different temperatures, say T1 and T2, a certain amount of heat will flow from one to the other by itself. Suppose, for instance, we put a hot stone in cold water. Then when a certain heat ∆Q is transferred from T1 to T2, how much does the entropy of the hot stone change? It decreases by ∆Q/T1. How much does the water entropy change? It increases by ∆Q/T2. The heat will, of course, flow only from the higher temperature T1 to the lower temperature T2, so that ∆Q is positive if T1 is greater than T2. So the change in entropy of the whole world is positive, and it is the difference of the two fractions:

∆S = ∆Q/T2 - ∆Q/T1

With regard to what you said about reversibility - I thought that the change in entropy of a system only depends on its beginning and end states, and is independent of the method used to go between them, regardless of reversibility (so the change in entropy of an adiabatic free expansion is the same as that of an isothermal reversible expansion between the same initial and final volumes/pressures/temperatures). Which, I suppose, goes some way to answer my own question...
 
  • #9
First of all you must define your system. This is often the difficulty in thermodynamics.

Is the system open or closed or isolated?

This is important because some of the relations are only properly applied to isolated or closed systems, but not to open ones.

Now the first example may be isolated, open or closed depending upon where we draw the boundaries.

This underlies what Andy was getting at.
 
  • #10
gadje, as I thought, he is talking about an infinitesimal change in entropy (or assuming the temperatures of both the stone and the water never change, but that's a bit silly). Maybe it would be clearer if the deltas in his formula were written as d's? dS = dQ/T is the formula for an infinitesimal entropy change.

And you are right that entropy is a state variable independent of the path, BUT! Q is not a state variable and IS dependent on your path, so if you want to calculate the entropy change in this way, with the use of Q, you HAVE to look at how it happens. To make my point clear:

Imagine a gas in half of an isolated container (sealed off by a membrane). The other half is pure vacuum. You puncture the membrane and the gas freely expands into the other half. Q = 0 because the container is isolated. So dS = dQ/T = 0, but this formula is simply wrong here, because it only applies to processes that can be seen as reversible, and a free expansion of a gas is irreversible. dS = dQ/T is a VERY nasty formula and I heavily dislike it, but it's practical, so you'd better learn to deal with it :(

Luckily for gases you can derive (without too much trouble) some other formulas which contain only state variables, making your idea that "only the initial and final values matter" true. Sadly, Q is very dependent on the path.
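
(For the record, one such formula - for n moles of an ideal gas, assuming a constant C_V, which is my added assumption - is

[tex]\Delta S = n C_V \ln\frac{T_f}{T_i} + n R \ln\frac{V_f}{V_i}[/tex]

where everything on the right-hand side is a state variable.)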

As an extra: what if you want to calculate the entropy change of the gas filling half the container? Well, since you are right and S is a state variable, you can imagine any process between the same initial and final states and calculate the change in entropy that way. Wait, didn't I just contradict myself? Well no, because if you choose another path, Q will also change. An example:

Imagine you let the gas expand by doing work on a piston very slowly while staying at a constant temperature in a non-isolated container. It can be proven that this kind of expansion IS reversible. The work done by the gas on the piston as it fills the other half of the container is nRT ln(V_f/V_i) = nRT ln(2), with T the constant temperature of the gas. Now, since for an ideal gas E is a function of T only, and T is constant, E is constant. The first law of thermodynamics tells you dE = dQ - dW (with dW the work done by the gas; if you define it as the work done by the environment, it becomes a plus). So 0 = dQ - dW <=> dQ = dW <=> Q = W. You know W = nRT ln(2), and since THIS path is reversible, you can use the formula dS = dQ/T => (constant T) ΔS = Q/T = W/T = nRT ln(2)/T = nR ln(2)

So if a gas expands into vacuum to twice its volume, the entropy change is nR ln(2) :)
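
A quick sanity check of that number in Python (taking n = 1 mol, my choice for the example):

[code]
import math

n = 1.0      # moles (chosen for the example)
R = 8.314    # gas constant, J/(mol K)

# Reversible isothermal doubling: Q = W = n*R*T*ln(2), so dS = Q/T = n*R*ln(2)
dS = n * R * math.log(2)
print(f"dS = {dS:.3f} J/K")  # about 5.76 J/K per mole; note that T drops out
[/code]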

I hope this helps?
 
  • #11
Ah, indeed, indeed. That clears up the initial confusion I had, thank you. Thermodynamics is pesky.
 
  • #12
Correction: classical thermodynamics is pesky.
Statistical thermodynamics is quite beautiful, especially the part where things become understandable.
 
  • #13
Quite. The next bit of the course just touches on that, looking at entropy in terms of macro- and microstates. I like it.
 
  • #14
I would like to caution you that statistical mechanics has its own set of limitations; it's simply incorrect to claim it underlies thermodynamics.

I take the opposite view- thermodynamics is an amazingly fundamental theory that has a range of applicability far beyond nearly every other physical theory.
 
  • #15
What do you mean? Where does classical thermodynamics go beyond statistical thermodynamics?
 
  • #16
Thermodynamics can deal with irreversible processes; thermodynamics can discuss dissipative processes; thermodynamics is based on the continuum model and so can discuss a much wider variety of materials and processes than statistical mechanics.

Statistical mechanics, AFAIK, *requires* the existence of an equilibrium state (as well as time-reversible processes). While many special solutions of statistical mechanics exist (rarefied gases, perfect fluids and crystalline solids), it does not satisfy the simplest constitutive relations of a Newtonian fluid, nor does the Boltzmann H-theorem satisfy (in general) the Clausius-Duhem inequality. One would be foolish to design an engine using statistical mechanics.

Statistical theories are useful for modeling and emphasizing certain *aspects* of nature. They are not fundamental theories *about* nature.
 
  • #17
For my ease:
Classical Thermodynamics = CT
Statistical Thermodynamics = SM

What do you mean, SM needs time-reversible processes and equilibrium states? E, V, S are always well-defined, and in SM you can then define pressure and temperature as partial derivatives of these quantities, so they also always exist. It is a fact that when a real system goes to equilibrium, P and T are not well-defined during that process, but the partial derivatives stay well-defined, and as soon as the system has reached equilibrium, the experimental P and T and the partial-derivative definitions coincide. So SM can talk about, for example, the free expansion of a gas into vacuum, where experimentally P and T are not continuously well-defined. I see it as in algebra, when you take a path through the complex numbers to get a result in the real numbers. The partial derivatives play the role of the complex numbers: they have no direct meaning in the system you are concerned about while it is in a non-equilibrium state, but it's correct to use them, since the end result gives you the experimental P and T.

Am I wrong in that view, according to you?
 
  • #18
I know this isn't technically the right place to post this, but it's so closely linked to the original point of the thread...

I'm doing a problem sheet on entropy now, and the question is "What is the entropy change of the universe when a block of mass 10kg is dropped from a height of 100m into a lake (which I suppose is at some height H < 100)?"

My instinct is that you can treat GPE/KE as if it were heat, and height as if it were temperature (so g takes the role of specific heat capacity) and say that

∆S = ∆GPE/H - ∆GPE/100?
 
  • #19
mr. vodka said:
<snip>E, V, S are always well-defined, and in SM you can then define pressure and temperature as partial derivatives of these quantities, so they also always exist. It is a fact that when a real system goes to equilibrium, P and T are not well-defined during that process, but the partial derivatives stay well-defined, and as soon as the system has reached equilibrium, the experimental P and T and the partial-derivative definitions coincide.

<snip> The partial derivatives play the role of the complex numbers: they have no direct meaning in the system you are concerned about while it is in a non-equilibrium state, but it's correct to use them, since the end result gives you the experimental P and T.

Am I wrong in that view, according to you?

I believe you are wrong, and your statements above are inconsistent with each other. One example:

You claim that in nonequilibrium systems, one cannot define a P or T, but I (me, my person) have a well-measurable temperature in spite of existing far from equilibrium.
 
  • #20
If they appear inconsistent, then that is purely by my bad explaining, which of course would be my fault.

About your example: do you? I don't know a lot of biology, but in a strict sense I suppose your temperature is not exactly defined and fluctuates throughout your body. But that is not what I was really getting at. I was trying to say that there can be situations where very obviously you can't say what the temperature is, and I thought you were thinking of those situations when you said SM fails, because you presumed that when T is not defined, SM breaks down because it uses dT, for example. My point was that indeed, in a physical sense dT makes no sense if T is not well-defined, but SM does not have that problem, because it defines T as a partial derivative of a quantity that is always well-defined.

Does this clear things up? Still in disagreement? I'm not trying to get you convinced about my point so much as to see where you think SM breaks down.

EDIT: gadje, about your problem, I wouldn't really know how to solve that. I do know however that you can't just make up your own definition by replacing temperature with height and such (if you do, you'd have to show that it is compatible with the general definition). More about your problem: I find it interesting, but I think I'm missing something. Don't forget the formula you're thinking of, dS = dQ/T, is for gases...
 
  • #21
I think I've got it, actually. It's not what I have up there, upon thinking about it.
 
  • #22
What is your solution/method then?
 
  • #23
There's an earlier part of the question which says the lake is at 10 °C and has a high specific heat. Dropping the block into the lake introduces mgh of energy into it, changing the entropy of the universe by mgh/T = 10*g*100/(10+273).

I think.
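
Plugging in numbers (taking g ≈ 9.8 m/s², since the problem doesn't give a value):

[tex]\Delta S = \frac{mgh}{T} = \frac{10 \times 9.8 \times 100}{283}\ \mathrm{J/K} \approx 35\ \mathrm{J/K}[/tex]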
 
  • #24
I agree with Mr. Vodka.

You can also consider Maxwell's Demon thought experiment and the resolution of the paradox by Landauer to see that Classical Thermodynamics is incomplete. Of course that doesn't mean that statistical mechanics as it is used in practice doesn't have limitations due to the assumptions made there, but it is more fundamental.


This is a bit similar to discussing whether quantum mechanics is more fundamental than classical mechanics. One could easily argue along the same lines that Andy does here to argue that Classical Mechanics can be applied to certain areas where quantum mechanics as we know it cannot, because of issues such as the measurement problem.
 
  • #25
gadje - that is indeed the entropy increase of the environment (EDIT: I first had something else following this), but I wonder if the entropy of the block decreased. Should you see it as if there is a heat flow from the block to the water? Maybe not, not sure...

Count Iblis - I wonder, does SM presume you are working with a reservoir as an environment? In the sense that T_env and P_env are constant. Anyway, the part of SM I have worked with always presumed this to derive free-energy laws and such, but this does seem like one of the restrictions, unless I'm just wrong that it's an assumption in SM.
 
  • #26
I am confused as to what is meant by 'classical' thermodynamics, as opposed to thermodynamics. Is there quantum thermodynamics, as a distinct field?

I would be surprised if someone here claimed there is any process in the universe that does not obey the Clausius-Duhem inequality. By contrast, I can give a long list of things not covered by statistical mechanics. Again, statistical mechanics is in no way more fundamental than thermodynamics. Simplified models do not elucidate the range of the theory.

As for my original objection (body temperature): the fact that my core body temperature fluctuates is not germane. I'm not sure what you mean by dT in the context of statistical mechanics.

My central point is that statistical mechanics is not equipped to handle nonequilibrium systems. Onsager's relations are a linearization, and do not hold in general.

By contrast, thermodynamics covers *everything physically allowed*.
 
  • #27
mr. vodka said:
<snip>
Does this clear things up? Still in disagreement? I'm not trying to get you convinced about my point so much as to see where you think SM breaks down.
<snip>

I haven't given anyone a chance to respond, but I feel like I have been having this argument for over a month and not getting anywhere.

So, here's a simple challenge: solve this, and I will reconsider my earlier claims.

Using SM, solve for either Q(t) or T(t) in the OP's question. If SM is truly a more fundamental theory than thermodynamics, this should be trivial.
 
  • #28
In many textbooks, thermodynamics and statistical mechanics are treated in a unified way. There is an introduction in which the fundamental postulate of equal prior probabilities is introduced. Then both thermodynamics and statistical mechanics are developed.

This is done in the book by F. Reif. He distinguishes this from classical thermodynamics; he points out that classical thermodynamics has more postulates than "statistical thermodynamics". Classical thermodynamics is basically the early 19th-century theory about heat, temperature, work, etc. developed by Carnot. Statistical thermodynamics is the modern version of this, developed by Boltzmann, Gibbs, etc.
 
  • #29
Count Iblis said:
In many textbooks, thermodynamics and statistical mechanics are treated in a unified way. There is an introduction in which the fundamental postulate of equal prior probabilities is introduced. Then both thermodynamics and statistical mechanics are developed.

This is done in the book by F. Reif. He distinguishes this from classical thermodynamics; he points out that classical thermodynamics has more postulates than "statistical thermodynamics". Classical thermodynamics is basically the early 19th-century theory about heat, temperature, work, etc. developed by Carnot. Statistical thermodynamics is the modern version of this, developed by Boltzmann, Gibbs, etc.

Ok .. from the point of view of an objective observer, one thing that would help clear this up is to get clear definitions of what folks mean by the following terms, what distinctions (if any) are drawn between them, and what fundamental limitations they may have.

Statistical Mechanics:

Statistical Thermodynamics:

Classical Thermodynamics (or just Thermodynamics, as per Andy's usage):


This would help me to better understand what y'all are discussing here.
 
  • #30
Count Iblis said:
In many textbooks, thermodynamics and statistical mechanics are treated in a unified way. There is an introduction in which the fundamental postulate of equal prior probabilities is introduced. Then both thermodynamics and statistical mechanics are developed.

This is done in the book by F. Reif. He distinguishes this from classical thermodynamics; he points out that classical thermodynamics has more postulates than "statistical thermodynamics". Classical thermodynamics is basically the early 19th-century theory about heat, temperature, work, etc. developed by Carnot. Statistical thermodynamics is the modern version of this, developed by Boltzmann, Gibbs, etc.

I'm not letting you off the hook that easily, since you accused me of

"...Andy does here to argue that Classical Mechanics can be applied to certain areas where quantum mechechanics as we know it cannot"

Statistical mechanics is *not* a modern version of thermodynamics. Anyone that seriously tries to argue that knows little about either.

Hiding behind a bad textbook is poor form.
 
  • #31
SpectraCat said:
Ok .. from the point of view of an objective observer, one thing that would help clear this up is to get clear definitions of what folks mean by the following terms, what distinctions (if any) are drawn between them, and what fundamental limitations they may have.

Statistical Mechanics:

Statistical Thermodynamics:

Classical Thermodynamics (or just Thermodynamics, as per Andy's usage):

This would help me to better understand what y'all are discussing here.

Fair enough- here's mine:

Statistical Mechanics: (classical or quantum) mechanics of a large number of discrete particles.

Classical Thermodynamics: theory of heat and mass transfer

Statistical Thermodynamics: A misguided amalgamation of the two.

Distinctions:
Statistical mechanics is a discretized approach, thermodynamics is a continuum approach.

Statistical mechanics postulates 'states' and defines 'equilibrium' and is based on distribution functions and the existence of a partition function. Thermodynamics postulates 'temperature' and 'heat', and has been largely axiomatized (along with continuum mechanics). Thermodynamics also postulates the conservation of energy and second law.

Statistical mechanics is a linear theory, thermodynamics is a nonlinear theory. Whether this is good or bad depends on your perspective.

Limitations: statistical mechanics cannot calculate time-dependent behavior except in a very limited sense (involving equilibrium concepts). Thermodynamics requires constitutive relations that do not have a basis in physical theory. There's more, I'm sure...
 
  • #32
I have to say that I think the whole SM v CT issue is rather like the big-endian v little-endian argument in Gulliver's Travels.
They are both aspects of the same egg, each useful in its own way. In cases where they overlap they predict the same thing, unlike the wave v corpuscular theory of light.

Isn't that glorious corroboration?

I can't let this pass however.

Statistical mechanics is a linear theory [...] Limitations: statistical mechanics cannot calculate time-dependent behavior except in a very limited sense (involving equilibrium concepts).

The whole of chemical reaction dynamics, and therefore chemical engineering, is built on statistical mechanics. Only the simplest reactions have linear dynamics.
 
  • #33
Hm, I've come to the realization I probably know too little to have enough insight for decent arguments.

But Andy Resnick, let me ask you one thing. SM has a rigid definition of entropy that in essence allows you to calculate the entropy of any situation (in practice, one might lack the knowledge/skill to actually do it). But more importantly, and this is basically my question: how is entropy defined in classical thermodynamics? I was taught that dS = dQ/T for a reversible process, but this definition is incredibly limited, so I assume there is a more general one out there that everybody has neglected to teach me. Could you tell me?

EDIT: re-reading my post, it may sound as though I am sniping, but that wasn't meant that way, it was/is an honest question
 
  • #34
Studiot said:
<snip>

I can't let this pass however.

[...]

The whole of chemical reaction dynamics, and therefore chemical engineering, is built on statistical mechanics. Only the simplest reactions have linear dynamics.

You should have, because you are completely wrong. Find me *any* discussion of statistical mechanics in this:

https://www.amazon.com/dp/0071422943/?tag=pfamazon01-20

or this:

https://www.amazon.com/dp/0131406345/?tag=pfamazon01-20

or this:

https://www.amazon.com/dp/0130113867/?tag=pfamazon01-20

Attitudes like the one you express here are a reason why Physicists are taken less and less seriously by other technical folks.
 
  • #35
Andy Resnick said:
I'm not letting you off the hook that easily, since you accused me of

"...Andy does here to argue that Classical Mechanics can be applied to certain areas where quantum mechechanics as we know it cannot"

Statistical mechanics is *not* a modern version of thermodynamics. Anyone that seriously tries to argue that knows little about either.

Hiding behind a bad textbook is poor form.

Statistical mechanics as used in practice (like evaluating partition functions or doing Monte Carlo simulations) may not be all that useful to engineers who use thermodynamics. But that doesn't change the fact that the foundations of thermodynamics lie firmly within the realm of what we in theoretical physics call "statistical mechanics".

Note that statistical mechanics can also be the study of chaos theory, far-from-equilibrium phenomena, etc.
 
