Entropy is a measure of energy available for work?
#19
Dec 20, 2012, 08:32 PM

P: 789

For complex systems contained inside a thermally insulating boundary, entropy may move around inside the system, driven by temperature differences inside the system, equalizing them when possible, but it can never be transferred across the boundary. During these internal subprocesses, entropy may also be created. In an isothermal process, the boundary is thermally open, and entropy may move across the boundary, again driven by temperature differences between the system and the environment, in such a way as to equalize internal temperatures at the constant temperature of the environment, when possible. Entropy transfer goes hand in hand with energy transfer via dU = T dS.

If a process is converting energy to work, and you want to know how much of that energy is converted to work, then in order to keep the bookkeeping straight, you cannot bring in energy or entropy from somewhere else to accomplish that work. To ensure this, the process has to be adiabatic, i.e. inside a thermally insulating boundary which prevents entropy and energy coming in from somewhere else.

For finite temperature differences, transfer of entropy across a thermally open boundary causes creation of entropy at the boundary, which is transferred to the lower-temperature system. The transfer of entropy is of order ∆T, while the creation of entropy at the boundary is of order ∆T^2, so in the limit of small ∆T, entropy may be transferred without creation at the boundary. If ∆T is identically zero, there will be no transfer of entropy, since only temperature differences drive entropy transfer.
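To make the scaling claim concrete, here is a minimal numeric sketch (my own, not from the post). It assumes a simple conduction law in which the heat flow rate across the boundary is proportional to ∆T, in arbitrary units; the function name `entropy_rates` and the constant `k` are mine:

```python
def entropy_rates(T_cold, dT, k=1.0):
    """Entropy transferred vs. created at a boundary between T_cold and T_cold + dT.

    Assumes heat flows at rate Q = k * dT (simple conduction), so the
    transfer rate is O(dT) while the creation rate is O(dT^2).
    """
    T_hot = T_cold + dT
    Q = k * dT                         # heat flow rate, proportional to dT
    transferred = Q / T_hot            # entropy rate leaving the hot side
    created = Q / T_cold - Q / T_hot   # extra entropy appearing at the boundary
    return transferred, created

tr1, cr1 = entropy_rates(300.0, 1.0)
tr2, cr2 = entropy_rates(300.0, 0.5)
# Halving dT roughly halves the transfer but quarters the creation.
```

Running this, tr2/tr1 comes out close to 1/2 while cr2/cr1 comes out close to 1/4, matching the ∆T and ∆T^2 orders claimed above.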


#20
Dec 20, 2012, 10:53 PM

Sci Advisor
HW Helper
P: 6,679

And I wouldn't say it moves, because entropy is not a conserved quantity such as energy or momentum. We can speak of energy or momentum transfer because the loss of energy/momentum must result in the gain of energy/momentum by some other body, so it behaves as if it moves. Entropy does not behave like that. So I would suggest that the concept of entropy moving is not a particularly helpful one.

In a reversible isothermal process the total entropy change is 0. In a real isothermal process, the entropy of the system + surroundings inevitably increases. It is not entropy that moves. It is energy. And the faster the energy moves, the greater the increase in entropy. So I might suggest that entropy increase is related more to the speed of energy transfer (heat flow) than to the fact that a body remains at the same temperature.

AM


#21
Dec 21, 2012, 12:32 AM

P: 789




#22
Dec 21, 2012, 06:16 AM

Sci Advisor
HW Helper
P: 6,679

But I think that, overall, the concept of entropy "moving" just adds to confusion about an already very difficult concept. It gives the impression that entropy is a physical quantity. Entropy is a statistical quantity. It is a bit like temperature in that respect. We would not say that temperature moves between bodies or is lost or created. We do say that Q (= mass x heat capacity x temperature change) moves or is lost or created (heat flow), but we relate it to U and W, which together are always conserved. You can't do that with entropy.

AM


#23
Dec 21, 2012, 08:50 AM

P: 789

1. Volume is conserved in a mechanically isolated system, and volume changes are driven by pressure differences alone.
2. Particle number is conserved in a materially isolated system, and particle number changes are driven by chemical potential differences alone.
3. Similarly, entropy is conserved in a thermally isolated system, and entropy flows are driven by temperature differences alone.

If X is intensive, then differences in X drive changes in an extensive Y, where X and Y are conjugate variables. T and S, P and V, µ and N are conjugate variables. The product of X and dY has units of energy, and the fundamental law states that the sum of all those products is the change in a conserved, extensive internal energy: dU = T dS - P dV + µ dN. (No need to worry here about the sign convention for P dV.)

If you relax the above constraints on the systems, conservation becomes more problematic. Volume is still always conserved in a mechanically isolated system. For a homogeneous system in which there may be chemical reactions, particle number is not conserved in a materially isolated system (although conservation of "component particles" is). Similarly, in an irreversible process, entropy is not conserved in a thermally isolated system; it is created.

I think the simple fact that entropy is extensive implies that the concept of entropy flow, entropy creation at particular locations, entropy density, etc. is viable. I don't think it adds confusion; I think it brings insight and clarity to the concept of "classical entropy". If it's true, I fully expect statistical mechanics to verify this, rather than muddying the waters.
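As an illustration of the conjugate-pair bookkeeping (my own sketch, not part of the post), one can take U(S, V, N) for a monatomic ideal gas in reduced units (k_B and all constant prefactors set to 1; this functional form is an assumption of the sketch) and check numerically that the partial derivatives of U reproduce T, -P and µ, with P V = N T falling out:

```python
import math

def U(S, V, N):
    # Monatomic ideal gas in reduced units: U = N^(5/3) * V^(-2/3) * exp(2S/(3N)).
    return N**(5/3) * V**(-2/3) * math.exp(2.0 * S / (3.0 * N))

def partial(f, args, i, h=1e-6):
    # Central finite difference of f with respect to argument i.
    up = list(args); up[i] += h
    dn = list(args); dn[i] -= h
    return (f(*up) - f(*dn)) / (2.0 * h)

S, V, N = 2.0, 1.5, 1.0
T  =  partial(U, (S, V, N), 0)   # T  =  dU/dS
P  = -partial(U, (S, V, N), 1)   # P  = -dU/dV
mu =  partial(U, (S, V, N), 2)   # mu =  dU/dN
```

In these units T comes out equal to 2U/(3N) and P V equals N T, i.e. the ideal gas law, confirming that dU = T dS - P dV + µ dN is just the sum of the conjugate-pair products.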


#24
Dec 21, 2012, 12:41 PM

P: 741

The OP said that the expansion is quasistatic. This means that the expansion is so slow that the ideal gas is always in a state infinitesimally close to thermal equilibrium, so variations of temperature and pressure within the ideal gas are negligible. If there is a significant amount of entropy created, it is not from temperature differences in the gas. Similarly, there can't be any coherent sound energy. The gas can't "squeak". A coherent sound wave would imply that the macroscopic pressure and the macroscopic temperature were inhomogeneous.

The only thing that we can be sure of from his description is that the expansion is adiabatic. Otherwise, the work won't come entirely from the internal energy of the gas. Therefore, I presume that the container of ideal gas consists of a piston and cylinder made of some thermal insulator with zero heat capacity. The pressure outside the container doesn't really matter. It could be zero.

In the OP's scenario, there could be friction between the piston surface and the cylinder surface that contains the gas. Because of heat conduction from surface to gas, the "heat energy" created by the friction goes right back into the ideal gas. Because the expansion is adiabatic, neither entropy nor heat energy can be conducted out of the container. Therefore, work done against the friction can't carry internal energy out of the ideal gas.

In the simplest case that I can imagine, the friction is a combination of sliding friction and static friction. Simple formulas for sliding friction and static friction are taught in introductory physics classes. Although real friction is more complicated, I think this approximation is good enough as an illustration. I intend to show a calculation for this situation, where all entropy created comes from sliding friction. In this example, there will be no aerodynamic friction or temperature differences in the ideal gas. The expansion will be adiabatic and quasistatic, but not reversible.
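Anticipating that calculation, here is a rough numeric sketch of the kind of bookkeeping involved (my own construction, not the poster's promised calculation). It assumes reduced units with N k_B = 1, so P = T/V and U = (3/2)T for a monatomic gas; the friction constant f/A and the step count are arbitrary. At each step the friction heat (f/A) dV conducts back into the gas, and the entropy created is just that heat divided by T:

```python
def expand(V1=1.0, V2=2.0, T0=1.0, f_over_A=0.05, steps=100_000):
    """Quasistatic adiabatic expansion with sliding friction on the piston.

    Reduced units: N*kB = 1, monatomic gas (U = 1.5*T, P = T/V).
    Friction dissipates (f/A)*dV per step, which conducts back into the gas.
    """
    V, T, S_created = V1, T0, 0.0
    dV = (V2 - V1) / steps
    for _ in range(steps):
        P = T / V
        dU = -(P - f_over_A) * dV        # gas loses P*dV of work, regains friction heat
        T += dU / 1.5
        S_created += f_over_A * dV / T   # dS = (friction heat) / T
        V += dV
    return T, S_created

T_fric, dS = expand()                    # with friction: entropy is created
T_rev, dS_rev = expand(f_over_A=0.0)     # frictionless: isentropic, T*V^(2/3) constant
```

With f/A = 0 the run reproduces the reversible adiabat T proportional to V^(-2/3) and creates no entropy; with friction on, dS > 0 and the gas ends up warmer, even though every step is quasistatic and adiabatic.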


#25
Dec 21, 2012, 01:41 PM

P: 789




#26
Dec 21, 2012, 10:13 PM

Sci Advisor
HW Helper
P: 6,679

AM 


#27
Dec 22, 2012, 05:15 AM

P: 789

There is no entropy creation inside either body, yet the drop in entropy of the hot body is less than the increase of entropy of the cold body as a result of the entropy transfer, so the process is irreversible. Entropy is created in or at the boundary. If the process were not quasistatic, there would be temperature gradients inside the bodies as well (rather than just at the boundary) and temperature gradients create entropy, so entropy would be created in the bodies themselves, not just at the boundary. 
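A quick numeric check of that asymmetry (my sketch; two bodies with equal, constant heat capacity C are assumed, in arbitrary units):

```python
import math

def equilibrate(T_hot, T_cold, C=1.0):
    """Entropy changes when two equal-capacity bodies reach a common temperature."""
    T_f = 0.5 * (T_hot + T_cold)          # final temperature for equal capacities
    dS_hot  = C * math.log(T_f / T_hot)   # negative: hot body gives up entropy
    dS_cold = C * math.log(T_f / T_cold)  # positive, and larger in magnitude
    return dS_hot, dS_cold

dS_h, dS_c = equilibrate(400.0, 200.0)
```

Here dS_h is about -0.288 C while dS_c is about +0.405 C: no entropy was created inside either body, yet the total rises, because the drop at high temperature is smaller than the gain at low temperature. The difference is the entropy created at the boundary.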


#28
Dec 22, 2012, 04:24 PM

Sci Advisor
HW Helper
P: 6,679

AM 


#29
Dec 22, 2012, 05:14 PM

P: 741

I assumed that the word "quasistatic" used by the OP meant the same as near-equilibrium. A later post by the OP confirmed that he was talking about a process which was the sum of a series of small processes where the state of the ideal gas was near equilibrium. Entropy can be created even if a system is "near equilibrium" at each infinitesimal step.

Maybe you are right in that there is a temperature gradient where the entropy is created. At the point of contact between two surfaces, the magnitude of the temperature gradient can be very large. However, the inverse of the temperature gradient (a length scale) may not be completely macroscopic. On the length scale determined by the dimensions of the container of gas, the inverse gradient is microscopic. On the length scale determined by the dimensions of a molecule, the inverse gradient is rather large. Frictional forces do create entropy.

Most of the container of gas in an adiabatic expansion can be near thermal equilibrium. The important thing about a system that is "near equilibrium" is that the intensive quantities will be uniform over most of the system. There may be a large temperature rise in a 1 cubic micron volume at the point of contact. However, the temperature of the ideal gas over most of the volume will be at a single value, T. Similarly, the pressure at the point of contact may be huge, resulting in a large stress there. However, one can approximate the pressure over most of the system by a single value, P.

The following article addresses the issue of how friction is treated in a thermodynamic analysis. Note that there was at least one investigator, Rymuza, who examined sliding friction in a near-equilibrium process.
http://www.mdpi.com/1099-4300/12/5/1021

“On the Thermodynamics of Friction and Wear―A Review

Abstract: An extensive survey of the papers pertaining to the thermodynamic approach to tribosystems, particularly using the concept of entropy as a natural time base, is presented with a summary of the important contributions of leading researchers. …

Friction is an energy transformation process. Using a near-equilibrium analysis, one can demonstrate (see Section 4) how sliding energy is dissipated. Rymuza [31] considers friction as a process that transforms the external mechanical energy to the energy of internal processes. Rymuza proposes that the traditional ‘laws’ of friction are incapable of reflecting its energetic nature and suggests a new parameter called ‘coefficient of friction losses’ (CFL), so as to reflect both the dissipative nature of the friction process and simultaneously provide a useful formulation for application in engineering practice. …

As discussed in Sections 4 and 5, the temperature, and particularly the temperature gradient within the mating bodies, plays an important role in assessment of entropy generation in a tribosystem. Both theoretical and experimental methods have been developed for determination of temperature rise at the contact surface. Blok [69] is credited to be the first researcher who proposed a model for determination of the temperature rise at the surfaces of contacting bodies under boundary lubricated conditions.”

The authors point out that the generation of entropy is caused by the large temperature gradient at the point of contact. However, the temperature gradient is caused by friction. The inhomogeneity in temperature is confined to “the point of contact”. I find it reasonable to assume that this is a “quasistatic” case. Maybe if you find a formal definition of quasistatic, then I would be forced to agree with you. If quasistatic were formally defined by dS = 0, then the OP and I are making a mistake.
However, I didn’t know that quasistatic was defined so precisely. Scientists established early on that friction can generate caloric (another word for entropy). That famous cannon investigated by Count Rumford was probably an excellent example of a quasistatic system. The cannon bore was placed in water, so that the system was isothermal. The temperature of the cannon and the water it was immersed in was 100 degrees centigrade, and did not fluctuate enough for scientists to measure. Except at the point of contact, the cannon remained at the boiling point of water. Obviously, the temperature of the iron must have been much higher in the region near the point of contact. This contact temperature could not, and probably cannot, be measured by ordinary thermometry. However, it is enough to know that friction creates entropy.

http://en.wikipedia.org/wiki/Entropy

“Carnot based his views of heat partially on the early 18th century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partially on the contemporary views of Count Rumford who showed (1789) that heat could be created by friction as when cannon bores are machined. … In the 1850s and 1860s, German physicist Rudolf Clausius objected to this supposition, i.e. that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction.”

The following link provides a free copy of the article; however, you have to request it. This looks like a great article. I have never seen anyone handle the problem of friction in thermodynamics so thoroughly. I may have to read it carefully before I get back into the thread.


#30
Dec 22, 2012, 05:42 PM

P: 789

I sound like I know exactly what I am talking about, but I'm still trying to piece this together, so I am looking for cases that challenge the viewpoint that Darwin123 put forth: that the entropy of classical thermodynamics can be treated sort of like a fluid that is both transported and created at particular places, but never destroyed. Take entropy of mixing: there are no temperature or pressure gradients, only a chemical potential gradient, yet entropy is created. I guess entropy transport is driven by temperature differences, but entropy creation is not limited to a temperature gradient.
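The mixing case is easy to check directly. For ideal gases at the same temperature and pressure, the standard result is ΔS_mix = -R Σ nᵢ ln(nᵢ/n_tot), which is positive even though there are no gradients of T or of total P anywhere. A small sketch (mine; results are in units of the gas constant R, and the function name is my own):

```python
import math

def mixing_entropy(n_moles):
    """Entropy of mixing for ideal gases at equal T and total P, in units of R."""
    n_tot = sum(n_moles)
    return -sum(n * math.log(n / n_tot) for n in n_moles)

dS = mixing_entropy([1.0, 1.0])   # equimolar binary mixture
```

For the equimolar binary case this returns 2 ln 2, about 1.386 in units of R, created entirely by the chemical-potential difference, exactly the situation described above; a single pure gas gives zero.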


#31
Dec 22, 2012, 06:14 PM

P: 5,462

My history books have 'caloric' in use before Clausius was a gleam in his father's eye, let alone entropy being a gleam in Clausius' eye. Originally the term calor, or caloric, or calorie was used to represent both temperature and heat, since these were not properly distinguished. Black enunciated the difference in "Lectures on the Elements of Chemistry" (published in 1803, four years posthumously).

I can agree with Rap's definition of a quasistatic process as a process that is plottable as a continuous curve on an indicator diagram. By itself this does not make it reversible or irreversible, since reversible changes can always be plotted and irreversible changes can sometimes be plotted.

It is also worth noting the difference in the definitions of entropy between Caratheodory and previous workers such as Clausius and Kelvin.


#32
Dec 22, 2012, 06:41 PM

P: 5,462




#33
Dec 22, 2012, 09:29 PM

P: 741

Clausius starts to use the word "heat" to refer to some type of energy. However, he also makes it clear that entropy can carry energy. Entropy is an extensive property with a density: every bit of entropy is located at a spatial point, and entropy that is created also appears at a spatial point. This is why entropy can move. Clausius argues that heat is a form of motion rather than a fluid. This is based explicitly on the fact that friction creates entropy. However, the equations that he writes are consistent with entropy flowing.

I think it is useful to think of entropy as a fluid analog, with temperature as a pressure analog. Temperature is the pressure that the entropy is under. Or, if you like electrodynamics, entropy is analogous to electric charge and temperature is analogous to electric potential. Entropy flows from high to low temperature the way positive electric charge flows from high to low electric potential. The temperature is a monotonically increasing function of entropy density: if the density of entropy is high, then the temperature is high.

The motion of entropy is entirely consistent with the creation of entropy. Motion is a consequence of the fact that entropy has a local density; it has nothing to do with whether or not entropy is conserved. Fluids don't have to be conserved in order to flow. Chemical reactions can change the concentration of fluids even while they are flowing. In the case of friction, entropy is created in a region where there is a nonzero gradient of some thermodynamic quantity. The temperature at the point of contact is very high, so entropy flows to a region of lower temperature.

In the case of a mixture of dissimilar gases, it is incorrect to say that "the" pressure is constant throughout the process. The sum of the partial pressures may be constant, but the partial pressures are each changing while the mixing is going on. In fact, all the partial pressures are decreasing. The gradient of each partial pressure is a nonzero vector, so there are gradients that are creating entropy. Each partial pressure is a thermodynamic quantity, and one can express the equation of state as a function of the partial pressures. The important thing to notice in the case of "isobaric mixing" is that only the sum of the pressures is constant.

The two biggest things that make entropy different from electric charge are that entropy can be created, and that entropy has only one sign. Electric charge is conserved, but entropy can be created. Electric charge can be positive or negative, while the third law of thermodynamics shows that there is a minimum to the absolute entropy of a system. However, both electric charge and entropy have local densities, and that is what allows them to move.
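The fluid-with-sources picture can be tested on the simplest transport problem. In steady 1-D heat conduction the entropy flux is q/T and the local production rate is σ = κ(dT/dx)²/T²; integrating σ along the rod should exactly account for the excess of entropy flowing out over entropy flowing in. A small numeric sketch (my own; κ = L = 1 and a linear temperature profile are assumed):

```python
def entropy_balance(T_hot=400.0, T_cold=300.0, n=100_000):
    """Steady conduction rod: compare total entropy production with boundary fluxes.

    Units: kappa = L = 1, so the heat flux is q = T_hot - T_cold and the
    temperature profile is linear in x.
    """
    q = T_hot - T_cold
    dx = 1.0 / n
    produced = 0.0
    for i in range(n):
        x = (i + 0.5) * dx                     # midpoint rule
        T = T_hot + (T_cold - T_hot) * x
        produced += (q / T) ** 2 * dx          # sigma = kappa*(dT/dx)^2 / T^2
    flux_in, flux_out = q / T_hot, q / T_cold  # entropy flux J = q / T
    return produced, flux_out - flux_in

produced, net_outflow = entropy_balance()
```

The two numbers agree (analytically both equal q²/(κ T_hot T_cold)): entropy enters at q/T_hot, leaves at the larger q/T_cold, and the difference is created continuously inside the rod, exactly like a fluid with a distributed source.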


#34
Dec 22, 2012, 10:39 PM

Sci Advisor
HW Helper
P: 6,679

A quasistatic process is one in which all parts of the system are:

a) in thermal equilibrium internally;
b) arbitrarily close to thermal equilibrium with all components with which they are in thermal contact; and
c) in, or arbitrarily close to, dynamic equilibrium at all times with each other.

So, for example, a Carnot engine operates using quasistatic processes. Whether the Carnot engine processes are reversible depends on whether the work produced during the process is turned into heat flow. If it is, the work cannot be used to run the Carnot engine in reverse to return the system and surroundings to their initial states. This is a subtle distinction between quasistatic and reversible that is not always made clear.

AM


#35
Dec 22, 2012, 11:36 PM

P: 789

Regarding definitions of entropy, I think Lieb and Yngvason have taken the next step beyond Caratheodory. Caratheodory's definition of classical entropy is restricted to quasistatic transformations from equilibrium state 1 to equilibrium state 2, while Lieb and Yngvason's definition of classical entropy removes the quasistatic constraint. Their papers are rather hairy, but Thess gives a more user-friendly description (see http://www.amazon.com/TheEntropyPr.../dp/3642133487). I only get the general idea of Lieb-Yngvason and Caratheodory; I still haven't mastered either. My hunch is that Lieb and Yngvason are on to something.


#36
Dec 23, 2012, 02:54 AM

P: 5,462

@Darwin
My point was that caloric was a word used in the English-speaking world, and the concept had already been dispelled by the time of Clausius; Clausius was a mid-19th-century worker, Black a late-18th-century one.

@Rap
I am not suggesting Caratheodory as the most recent authority. I raised his definition because it seems to me the most pertinent to this discussion of lines on indicator diagrams and mathematical continuity. Workers prior to Caratheodory all specified what we now call the second law in terms of cyclic processes, i.e. closed loops on the indicator diagram composed of several lines: from State A to State B and back again. Caratheodory was the first to offer a definition that could be applied to a single line, i.e. from State A to State B. My English translation reads: "In the neighbourhood of any equilibrium state of a system there are states that are inaccessible by an adiathermal process."

