#1
Apr13-10, 11:03 AM

P: 23

I'm going over the notes from my thermodynamics course, with the aid of Feynman I (p. 44-12).
Feynman uses the example of putting a hot stone at temperature [tex]T_1[/tex] into cold water at temperature [tex]T_2[/tex], causing a flow of heat [tex]\Delta Q[/tex] between the two. The entropy change is given to be [tex]\Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1}[/tex]. Does this mean that one only needs to take into account the initial temperatures and the total flow of heat from one to the other in order to calculate the change in entropy? I keep thinking that there should be some sort of [tex]\Delta T[/tex] term. Why isn't this the case? Cheers.


#2
Apr13-10, 11:41 AM

Sci Advisor
P: 5,513

This example works because the two temperatures are unambiguously defined, and they are unambiguous because an implicit assumption of *equilibrium* is made: before the stone is put in the water, both can be assigned single temperatures, and after the stone is placed in the water, the combined objects will again be at equilibrium and, in fact, in thermal equilibrium *with each other*. Additionally, it appears in this example that the water is a thermal reservoir: it remains at T2 regardless of how much heat is added (or subtracted).
So truthfully, this is not a problem of thermodynamics; it's a problem of thermostatics. 
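To make the reservoir formula concrete, here is a minimal numeric sketch of it. The function name and the numbers are mine, purely for illustration; the physics is just dS = dQ/T applied to each body at its (fixed) temperature:

```python
def reservoir_entropy_change(dQ, T1, T2):
    """Total entropy change when heat dQ flows from a hot body at T1
    into a reservoir at T2: the reservoir gains dQ/T2, the hot body
    loses dQ/T1 (both temperatures assumed constant)."""
    return dQ / T2 - dQ / T1

# Example: 1000 J flows from a stone at 350 K into water held at 300 K.
dS = reservoir_entropy_change(1000.0, 350.0, 300.0)
print(dS > 0)  # True: net entropy increases, as the second law requires
```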


#3
Apr13-10, 11:54 AM

P: 23

Ah, if the water is acting as a reservoir, then that makes sense to me.
But what if, say, you have two blocks at different temperatures T1 and T2 (< T1) and bring them into contact? What is the change in entropy once heat Q has flowed from the block initially at T1 to the block initially at T2 and they are in equilibrium at T_f? Is it still [tex]\Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1} = \left(\frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_f}\right) + \left(\frac{\Delta Q}{T_f} - \frac{\Delta Q}{T_1}\right)[/tex] (i.e. the sum of the individual changes in entropy of the blocks), or something different?
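For two finite blocks the temperatures change as the heat flows, so (as later replies in this thread note) dQ/T has to be integrated rather than evaluated at the initial temperatures; for a constant heat capacity C each block contributes C ln(T_f/T_i). A minimal sketch under that constant-capacity assumption (function name and numbers are mine, for illustration):

```python
import math

def blocks_entropy_change(T1, T2, C1=1.0, C2=1.0):
    """Entropy change when two blocks with constant heat capacities
    C1, C2 and initial temperatures T1, T2 reach equilibrium.
    Integrating dS = C dT / T for each block gives C * ln(T_f / T_i)."""
    Tf = (C1 * T1 + C2 * T2) / (C1 + C2)  # energy-balance equilibrium temperature
    dS = C1 * math.log(Tf / T1) + C2 * math.log(Tf / T2)
    return Tf, dS

Tf, dS = blocks_entropy_change(400.0, 200.0)
print(Tf)      # 300.0 for equal heat capacities
print(dS > 0)  # True: the hot block's entropy loss is outweighed by the cold block's gain
```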


#4
Apr13-10, 03:14 PM

P: 5,462

Entropy Confusion
Feynman's example works because it is constructed so that no work is done and no chemical reactions occur.
Consequently the only energy changes that occur are heat changes. If either of the above occurred then you would have to take these into account when calculating the entropy. 


#5
Apr13-10, 03:25 PM

P: 23

Yes, I'm aware of that, but I'm still not sure if I've understood the concept in terms of solely heat flow. Is my reasoning in my previous post correct?



#6
Apr13-10, 05:41 PM

P: 1,412

Hm, I find this a very odd example... I'm not sure I get it.
Is it possible that Feynman is talking about an infinitesimal heat flow? Or that both objects (water and stone) stay at their temperature? Otherwise I would say the formula is wrong, were it not coming from the great Feynman. In general, the entropy change of a SYSTEM, if the process can be seen as REVERSIBLE (*), is equal to dQ/T, implying you have to use an integral if the temperature is variable (i.e. the system is not a reservoir).

So in your formula (the way I would interpret it for it to make sense) it looks as if an infinitesimal amount of heat Q is transferred to system 2 (implying T_2 < T_1), so its entropy rises by Q/T_2. Conservation of energy says that Q has left system 1, and so its entropy has changed infinitesimally by -Q/T_1. This would imply an infinitesimal change of entropy of the universe, being the sum of both entropy changes, of Q/T_2 - Q/T_1. So I would not interpret the delta S as the entropy change of either system, as you do, but as the universal entropy change. Is this possible if you reread it in your book?

(*) Of course, a glass warming up in a room is not reversible, but the key insight is that either system viewed separately IS reversible, meaning a glass can also cool down and a room can also lose heat. And for entropy changes of a system, you can ignore what the environment does. So to calculate the entropy change of a glass that is warming up in a room, you just look at the glass and ignore the room, reminding yourself that a glass can also cool down, so it is reversible, as is needed for the above-mentioned formula. I hope I'm making some sense :) If you are Dutch, feel free to ask me to explain the same in Dutch.


#7
Apr13-10, 05:58 PM

Sci Advisor
P: 5,513

Putting those together, we have T_f - T1 = g(T2 - T_f), where 'g' is the ratio of heat capacities (mC). That's my initial attempt, anyway....
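That relation can be solved for T_f directly. A tiny sketch (function name and numbers are mine, for illustration), with g the heat-capacity ratio as defined above:

```python
def final_temperature(T1, T2, g):
    """Solve T_f - T1 = g * (T2 - T_f) for T_f:
    T_f = (T1 + g*T2) / (1 + g)."""
    return (T1 + g * T2) / (1 + g)

# Equal heat capacities (g = 1): T_f is the simple average.
print(final_temperature(400.0, 200.0, 1.0))  # 300.0
```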


#8
Apr13-10, 06:06 PM

P: 23

The exact wording is this:



#9
Apr13-10, 06:09 PM

P: 5,462

First of all you must define your system. This is often the difficulty in thermodynamics.
Is the system open or closed or isolated? This is important because some of the relations are only properly applied to isolated or closed systems, but not to open ones. Now the first example may be isolated, open or closed depending upon where we draw the boundaries. This underlies what Andy was getting at. 


#10
Apr13-10, 06:33 PM

P: 1,412

gadje, as I thought, he is talking about an infinitesimal change in entropy (or assuming the temperatures of both the stone and the water never change, but that's a bit silly). Maybe it would be clearer if the deltas in his formula were written as "d"s: dS = dQ/T is the formula for infinitesimal entropy change.
And you are right that entropy is a state variable, independent of the path, BUT!!! Q is not a state variable and IS dependent on your path, and if you want to calculate the entropy change in this way, with the use of Q, you HAVE to look at how it happens.

To make my point clear: imagine a gas in half an isolated container (shielded by a membrane). The other half is pure vacuum. You puncture the membrane and the gas freely expands into the other half. Q = 0 because the container is isolated. So dS = dQ/T = 0, but this formula is simply wrong here, because it holds only for cases that can be seen as reversible, and a free expansion of a gas is irreversible. dS = dQ/T is a VERY nasty formula and I heavily dislike it, but it's practical, so you better learn to deal with it :( Luckily for gases you can derive (without too much trouble) some other formulas which don't contain variables that are not state variables, making your idea that "only the initial and final values matter" true. Sadly, Q is very dependent on the path.

As an extra: what if you want to calculate the entropy change of the gas filling half the container? Well, since you are right and S is a state variable, you can imagine any process between the same initial and final states and measure the change in entropy that way. Wait, didn't I just contradict myself? Well, no, because if you choose another path, Q will also change. An example: imagine you let the gas expand by pushing a piston very slowly while staying at a constant temperature in a non-isolated container. It can be proven that this kind of expansion IS reversible. The work done by the gas on the piston as it fills the other half of the container is nRT ln(V_f/V_i) = nRT ln(2), with T the constant temperature of the gas. Now, since for an ideal gas E is a function of T and T is constant, E is constant. The first law of thermodynamics tells you dE = dQ - dW (with dW the work done by the gas; if you define it as the work done by the environment, it becomes a plus).
So 0 = dQ - dW <=> dQ = dW <=> Q = W. You know W = nRT ln(2), and since THIS path is reversible, you can use the formula dS = dQ/T => (constant T) S = Q/T = W/T = nRT ln(2)/T = nR ln(2). So if a gas expands into vacuum to twice its volume, the entropy change is nR ln(2) :) I hope this helps?
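That closing result is easy to check numerically. A minimal sketch assuming an ideal gas and the reversible isothermal path described above (function name is mine, for illustration):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def isothermal_entropy_change(n, Vi, Vf):
    """Entropy change of n moles of ideal gas expanding from Vi to Vf,
    computed along the reversible isothermal path: dS = dQ/T = nR dV/V,
    which integrates to nR ln(Vf/Vi)."""
    return n * R * math.log(Vf / Vi)

# Doubling the volume. Free expansion into vacuum has the same initial
# and final states, so (S being a state variable) the same entropy change.
dS = isothermal_entropy_change(1.0, 1.0, 2.0)
print(round(dS, 3))  # n*R*ln(2) = 5.763 J/K for one mole
```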


#11
Apr13-10, 06:48 PM

P: 23

Ah, indeed, indeed. That clears up the initial confusion I had, thank you.
Thermodynamics is pesky. 


#12
Apr13-10, 07:04 PM

P: 1,412

Correction: classical thermodynamics is pesky.
Statistical thermodynamics is quite beautiful, especially the part where things become understandable.


#13
Apr13-10, 07:08 PM

P: 23

Quite. The next bit of the course just touches on that, looking at entropy in terms of macro and microstates. I like it.



#14
Apr14-10, 07:37 AM

Sci Advisor
P: 5,513

I would like to caution you that statistical mechanics has its own set of limitations; it's simply incorrect to claim it underlies thermodynamics.
I take the opposite view: thermodynamics is an amazingly fundamental theory that has a range of applicability far beyond nearly every other physical theory.


#15
Apr14-10, 09:10 AM

P: 1,412

What do you mean? Where does classical thermodynamics go beyond statistical thermodynamics?



#16
Apr14-10, 09:34 AM

Sci Advisor
P: 5,513

Thermodynamics can deal with irreversible processes; thermodynamics can discuss dissipative processes; thermodynamics is based on the continuum model and so can discuss a much wider variety of materials and processes than statistical mechanics.
Statistical mechanics, AFAIK, *requires* the existence of an equilibrium state (as well as time-reversible processes). While many special solutions of statistical mechanics exist (rarefied gases, perfect fluids and crystalline solids), it does not satisfy the most simple constitutive relations of a Newtonian fluid, nor does the Boltzmann H-theorem satisfy (in general) the Clausius-Duhem inequality. One would be foolish to design an engine using statistical mechanics. Statistical theories are useful for modeling and emphasizing certain *aspects* of nature. They are not fundamental theories *about* nature.


#17
Apr14-10, 09:50 AM

P: 1,412

For my ease: Classical Thermodynamics = CM; Statistical Thermodynamics = SM.

What do you mean, SM needs time-reversible processes and equilibrium states? E, V, S are always well-defined, and in SM you can then define pressure and temperature as partial derivatives of these quantities, so they also always exist. A fact is that when a real system goes to equilibrium, P and T are not well-defined during that process, but the partial derivatives stay well-defined, and as soon as the system has reached equilibrium, the experimental P and T and the partial-derivative definitions coincide. So SM can talk about, for example, the free expansion of a gas into vacuum, where experimentally P and T are not continuously well-defined.

I see it as when in algebra you take a path through the complex numbers to get a result in the real numbers. The partial derivatives play the role of these complex numbers, in the way that they have no meaning for the system you are concerned about while it is in a non-equilibrium state, but it's correct to use them, since the end result gives you the experimental P and T. Am I wrong in that view, according to you?
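For concreteness, the partial-derivative definitions being referred to are presumably the standard statistical-mechanics relations (stated here as an assumption about what is meant, with E the energy, V the volume, and S the entropy):

[tex]\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V}, \qquad \frac{P}{T} = \left(\frac{\partial S}{\partial V}\right)_{E}[/tex]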


#18
Apr14-10, 11:38 AM

P: 23

I know this isn't technically the right place to post this, but it's so closely linked to the original point of the thread...
I'm doing a problem sheet on entropy now, and the question is "What is the entropy change of the universe when a block of mass 10 kg is dropped from a height of 100 m into a lake?" (which I suppose is at some height H < 100). My instinct is that you can treat GPE/KE as if it were heat, and height as if it were temperature (so g takes the role of specific heat capacity), and say that ∆S = ∆GPE/H - ∆GPE/100?
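For comparison, here is what the reservoir formula from earlier in the thread gives if the lake is treated as a heat reservoir at some temperature T, with all the potential energy mgh ending up as heat in it. The lake temperature is NOT given in the problem; the 300 K below is purely an assumed value for illustration:

```python
# Sketch: block dropped into a lake modeled as a heat reservoir.
m, g, h = 10.0, 9.81, 100.0   # mass (kg), gravity (m/s^2), drop height (m)
T = 300.0                     # ASSUMED lake temperature (K); not in the problem

Q = m * g * h                 # potential energy dissipated as heat, J
dS = Q / T                    # reservoir entropy gain via dS = dQ/T, J/K
print(round(dS, 2))           # 32.7 J/K for these assumed numbers
```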

