# Entropy is a measure of energy available for work ????

by Rap
Tags: available, energy, entropy, measure, work
P: 786
 Quote by Darwin123 Entropy can change only two ways. It can move or it can be created. In an adiabatic process, entropy can't move. The temperature changes instead. In an isothermal process, the entropy moves in such a way as to keep the temperature constant.
Note, that is only for a simple system (homogeneous).

For complex systems contained inside a thermally insulating boundary, entropy may move around inside the system, driven by temperature differences inside the system, equalizing them when possible, but can never be transferred across the boundary. During these internal sub-processes, entropy may also be created. In an isothermal process, the boundary is thermally open, and entropy may move across the boundary, again driven by temperature differences between the system and the environment, in such a way as to equalize internal temperatures at the constant temperature of the environment, when possible.

Entropy transfer goes hand in hand with energy transfer via dQ = T dS. If a process is converting energy to work, and you want to know how much of that energy is converted to work, then in order to keep the bookkeeping straight, you cannot bring in energy or entropy from somewhere else to accomplish that work. To ensure this, the process has to be adiabatic, i.e. inside a thermally insulating boundary which prevents entropy and energy coming in from somewhere else.

For finite temperature differences, transfer of entropy across a thermally open boundary causes creation of entropy at the boundary, which is transferred to the lower-temperature system. The transfer of entropy is of order ∆T, while the creation of entropy at the boundary is of order ∆T^2, so in the limit of small ∆T entropy may be transferred without creation at the boundary. If ∆T is identically zero, there will be no transfer of entropy, since only temperature differences drive entropy transfer.
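This scaling can be checked with a quick numerical sketch (not from the thread: a Fourier-type boundary law dQ = k ∆T dt is assumed, with made-up values for the conductance k and the mean temperature T):

```python
# Entropy transfer across a boundary scales as dT; entropy creation at
# the boundary scales as dT^2.  k, dt, and T are assumed example values.
k, dt, T = 1.0, 1.0, 300.0   # conductance (W/K), time step (s), mean temp (K)

for dT in (10.0, 1.0, 0.1):
    Th, Tc = T + dT / 2, T - dT / 2
    dQ = k * dT * dt                 # heat crossing the boundary
    dS_transfer = dQ / Th            # entropy leaving the hot body
    dS_created = dQ / Tc - dQ / Th   # extra entropy deposited in the cold body
    print(f"dT={dT:5.1f}  transferred={dS_transfer:.2e}  created={dS_created:.2e}")
```

Each tenfold reduction in ∆T cuts the transferred entropy by about a factor of 10, but the created entropy by about a factor of 100.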
HW Helper
P: 6,347
 Quote by Darwin123 Entropy can change only two ways. It can move or it can be created. In an adiabatic process, entropy can't move. The temperature changes instead. In an isothermal process, the entropy moves in such a way as to keep the temperature constant.
I think you have to qualify this statement. You have to be speaking about a reversible process. Entropy can and certainly does change in an adiabatic irreversible expansion.

And I wouldn't say it moves, because entropy is not a conserved quantity such as energy or momentum. We can speak of energy or momentum transfer because the loss of energy/momentum must result in the gain of energy/momentum by some other body, so it behaves as if it moves. Entropy does not behave like that. So I would suggest that the concept of entropy moving is not a particularly helpful one.

In a reversible isothermal process total entropy change is 0. In a real isothermal process, the entropy of the system + surroundings inevitably increases. It is not entropy that moves. It is energy. And the faster the energy moves, the greater the increase in entropy. So I might suggest that entropy increase is related more to the speed of energy transfer (heat flow) than to the fact that a body remains at the same temperature.

AM
P: 786
 Quote by Andrew Mason I think you have to qualify this statement. You have to be speaking about a reversible process. Entropy can and certainly does change in an adiabatic irreversible expansion. And I wouldn't say it moves, because entropy is not a conserved quantity such as energy or momentum. We can speak of energy or momentum transfer because the loss of energy/momentum must result in the gain of energy/momentum by some other body, so it behaves as if it moves. Entropy does not behave like that. So I would suggest that the concept of entropy moving is not a particularly helpful one. In a reversible isothermal process total entropy change is 0. In a real isothermal process, the entropy of the system + surroundings inevitably increases. It is not entropy that moves. It is energy. And the faster the energy moves, the greater the increase in entropy. So I might suggest that entropy increase is related more to the speed of energy transfer (heat flow) than to the fact that a body remains at the same temperature. AM
When entropy is conserved, i.e. when a process is reversible, the concept of entropy moving is just as valid as the concept of conserved energy moving, and is therefore very helpful. If you can identify the location at which entropy is created (e.g. at the thermally open boundary between two systems with a finite temperature difference), and identify into which system the created entropy is deposited, then I see no conceptual problem with Darwin123's statement resulting from the non-conservation of entropy. Not yet being totally certain of this concept, I would be interested in any specific process you can think of where this concept is inappropriate.
HW Helper
P: 6,347
 Quote by Rap Note, that is only for a simple system (homogeneous). Entropy transfer goes hand in hand with energy transfer via dQ = T dS. If a process is converting energy to work, and you want to know how much of that energy is converted to work, then in order to keep the bookkeeping straight, you cannot bring in energy or entropy from somewhere else to accomplish that work. To ensure this, the process has to be adiabatic, i.e. inside a thermally insulating boundary which prevents entropy and energy coming in from somewhere else.
Is that really correct? It seems to me that a thermally insulating boundary does not prevent entropy or energy "coming in" from somewhere else: it only prevents heat flow across the boundary.

 Quote by Rap When entropy is conserved, i.e. when a process is reversible, the concept of entropy moving is just as valid as the concept of conserved energy moving, and is therefore very helpful. If you can identify the location at which entropy is created (e.g. at the thermally open boundary between two systems with a finite temperature difference), and identify into which system the created entropy is deposited, then I see no conceptual problem with Darwin123's statement resulting from the non-conservation of entropy. Not yet being totally certain of this concept, I would be interested in any specific process you can think of where this concept is inappropriate.
It is not necessarily inappropriate and it may be a helpful concept when analysing reversible processes in which entropy does not change.

But I think that, overall, the concept of entropy "moving" just adds to confusion about an already very difficult concept. It gives the impression that entropy is a physical quantity. Entropy is a statistical quantity. It is a bit like temperature in that respect. We would not say that temperature moves between bodies or is lost or created.

We do say that Q (= mass x specific heat capacity x temperature change) moves or is lost or created (heat flow), but we relate it to U and W, which together are always conserved. You can't do that with entropy.

AM
P: 786
 Quote by Andrew Mason Is that really correct? It seems to me that a thermally insulating boundary does not prevent entropy or energy "coming in" from somewhere else: it only prevents heat flow across the boundary. It is not necessarily inappropriate and it may be a helpful concept when analysing reversible processes in which entropy does not change. AM
I think it is correct: dQ=TdS means the flow of heat energy into a boundary is the temperature of the "in" side of the boundary times the entropy flow into the boundary. The flow of heat energy out of a boundary is the temperature of the "out" side of the boundary times the entropy flow out of the boundary. No heat flow (dQ=0) means no entropy flow (dS=0) and vice versa. Energy into the boundary equals energy out, so if the temperatures are different, entropy in does not equal entropy out: entropy is being created at the boundary and deposited in the low-temperature system.
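As a numerical illustration of this bookkeeping (Q and the two temperatures are assumed example values):

```python
# A heat current Q crosses the boundary: energy in equals energy out,
# but the entropy current out exceeds the entropy current in.
Q = 100.0               # J of heat crossing the boundary
Th, Tc = 400.0, 300.0   # K: "in" side and "out" side temperatures

S_in = Q / Th                 # entropy flowing into the boundary
S_out = Q / Tc                # entropy flowing out of the boundary
S_created = S_out - S_in      # entropy created at the boundary
print(f"{S_in:.4f} {S_out:.4f} {S_created:.4f}")   # 0.2500 0.3333 0.0833
```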

 Quote by Andrew Mason But I think that, overall, the concept of entropy "moving" just adds to confusion about an already very difficult concept. It gives the impression that entropy is a physical quantity. Entropy is a statistical quantity. It is a bit like temperature in that respect. We would not say that temperature moves between bodies or is lost or created. We do say that Q (= mass x specific heat capacity x temperature change) moves or is lost or created (heat flow), but we relate it to U and W, which together are always conserved. You can't do that with entropy. AM
Temperature is also statistical, but it is an intensive quantity, not extensive. The concepts of "conserved" and "created" and "destroyed" do not apply to intensive quantities like temperature, pressure, chemical potential. They do apply to extensive quantities. For a simple system undergoing a reversible process,

1. Volume is conserved in a mechanically isolated system and volume changes are driven by pressure differences alone.

2. Particle number is conserved in a materially isolated system and particle number changes are driven by chemical potential differences alone.

3. Similarly, entropy is conserved in a thermally isolated system and entropy flows are driven by temperature differences alone.

If X is intensive, then differences in X drive changes in an extensive Y, where X and Y are conjugate variables. T and S, P and V, µ and N are conjugate variables. The product of X and dY has units of energy, and the fundamental law states that the sum of all those products is the change in a conserved, extensive internal energy: dU=TdS-PdV+µdN. (Don't worry about the sign of PdV.)
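A minimal sanity check of this sum-of-conjugate-products form of dU (all numbers are made-up illustrative values):

```python
# dU = T dS - P dV + mu dN as a sum over (intensive, extensive-change)
# conjugate pairs.  The values below are arbitrary illustrations.
pairs = [
    (300.0, 0.02),      # (T in K, dS in J/K)
    (-1.0e5, 1.0e-5),   # (-P in Pa, dV in m^3): note the sign convention
    (-50.0, 0.001),     # (mu in J, dN)
]
dU = sum(X * dY for X, dY in pairs)
print(f"dU = {dU:.2f} J")   # 6 - 1 - 0.05 = 4.95 J
```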

If you relax the above constraints on the systems, conservation becomes more problematic. Volume is still always conserved in a mechanically isolated system. For a homogeneous system in which there may be chemical reactions, particle number is not conserved in a materially isolated system (although conservation of "component particles" is). Similarly, in an irreversible process, entropy is not conserved in a thermally isolated system; it is created.

I think the simple fact that entropy is extensive implies that the concept of entropy flow, entropy creation at particular locations, entropy density, etc., is viable. I don't think it adds confusion; I think it brings insight and clarity to the concept of "classical entropy". If it's true, I fully expect statistical mechanics to verify this, rather than muddying the waters.
P: 741
 Quote by Rap I think it is correct: dQ=TdS means the flow of heat energy into a boundary is the temperature of the "in" side of the boundary times the entropy flow into the boundary. The flow of heat energy out of a boundary is temperature of the "out" side of the boundary times the entropy flow out of the boundary. No heat flow (dQ=0) means no entropy flow (dS=0) and vice versa. Energy into the boundary equals energy out, so if the temperatures are different, entropy in does not equal entropy out - entropy is being created at the boundary and deposited in the low temperature system. Temperature is also statistical, but it is an intensive quantity, not extensive. The concepts of "conserved" and "created" and "destroyed" do not apply to intensive quantities like temperature, pressure, chemical potential. They do apply to extensive quantities. For a simple system undergoing a reversible process, 1. Volume is conserved in a mechanically isolated system and volume changes are driven by pressure differences alone. 2. Particle number is conserved in a materially isolated system and particle number changes are driven by chemical potential differences alone. 3. Similarly, entropy is conserved in a thermally isolated system, entropy flows are driven by temperature differences alone.
This I disagree with, at least in the OP's scenario.

The OP said that the expansion is quasistatic. This means that the expansion is so slow that the ideal gas is in a state infinitesimally close to thermal equilibrium. This means that variations of temperature and pressure within the ideal gas are negligible. If there is a significant amount of entropy created, it is not from temperature differences in the gas.

Similarly, there can't be any coherent sound energy. The gas can't "squeak". A coherent sound wave would imply that the macroscopic pressure and the macroscopic temperature were inhomogeneous.

The only thing that we can be sure of from his description is that the expansion is adiabatic. Otherwise, work won't entirely come from the internal energy of the gas. Therefore, I presume that the container of ideal gas consists of a piston and cylinder made of some thermal insulator with zero heat capacity. The pressure outside the container doesn't really matter. It could be zero.

In the OP's scenario, there could be friction between the piston surface and the cylinder surface that contains the gas. Because of heat conduction from surface to gas, the "heat energy" created by the friction goes right back into the ideal gas. Because the expansion is adiabatic, neither entropy nor heat energy can be conducted out of the container. Therefore, work done by the friction can't affect the internal energy of the ideal gas.

In the simplest case that I can imagine, the friction is a combination of sliding friction and static friction. Simple formulas for sliding friction and static friction are taught in introductory physics classes. Although real friction is more complicated, I think this approximation is good enough as an illustration.

I intend to show a calculation for this situation, where all entropy created comes from sliding friction. In this example, there will be no aerodynamic friction or temperature differences in the ideal gas. The expansion will be adiabatic and quasistatic, but not reversible.
P: 786
 Quote by Darwin123 This I disagree with, at least in the OP's scenario. The OP said that the expansion is quasistatic. This means that the expansion is so slow, the ideal gas is in a state infinitesimally close to thermal equilibrium. This means that variations of temperature and pressure within the ideal gas are negligible. If there is a significant amount of entropy created, it is not from the temperature differences in the gas. Similarly, there can't be any coherent sound energy. The gas can't "squeak". A coherent sound wave would imply that the macroscopic pressure and the macroscopic temperature were inhomogeneous. The only thing that we can be sure of from his description is that the expansion is adiabatic. Otherwise, work won't entirely come from the internal energy of the gas. Therefore, I presume that the container of ideal gas consists of a piston and cylinder made of some thermal insulator with zero heat capacity. The pressure outside the container doesn't really matter. It could be zero. In the OP's scenario, there could be friction between the piston surface and the cylinder surface that contains the gas. Because of heat conduction from surface to gas, the "heat energy" created by the friction goes right back into the ideal gas. Because the expansion is adiabatic, neither entropy nor heat energy can be conducted out of the container. Therefore, work done by the friction can't affect the internal energy of the ideal gas. In the simplest case that I can imagine, the friction is a combination of sliding friction and static friction. Simple formulas for sliding friction and static friction are taught in introductory physics classes. Although real friction is more complicated, I think this approximation is good enough as an illustration. I intend to show a calculation for this situation, where all entropy created comes from sliding friction. In this example, there will be no aerodynamic friction or temperature differences in the ideal gas. The expansion will be adiabatic and quasistatic, but not reversible.
I guess I don't know which point you disagree with in the quote, but I will try to make the same calculation: a quasistatic expansion of a gas in a thermally and materially isolated system, but mechanically open, with sliding friction. Specifically, a cylinder with one circular end (the piston) as a movable boundary. I'm guessing the force of friction can be considered constant and opposing the motion of the piston, so the heat energy created by friction is K F x, where K is a constant, F is the friction force, and x is the distance moved. If the force of the gas on the piston is ever less than F, the piston stops. I will try to do that calculation.
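Here is a minimal sketch of that calculation, assuming a monatomic ideal gas, a constant friction force F whose heat all returns to the gas (i.e. K = 1 in the notation above), and made-up values for the amount of gas and the geometry. Per step dx, the first law for the gas reads dU = -P A dx + F dx, and the entropy created is F dx / T:

```python
# Quasistatic adiabatic expansion against sliding friction.
# n, A, F, x, and dx are assumed example values.
R = 8.314                   # gas constant, J/(mol K)
n, A, F = 1.0, 0.01, 20.0   # moles, piston area (m^2), friction force (N)
x, dx = 0.1, 1e-4           # piston position and step size (m)
T = 300.0                   # initial temperature (K)
S_created = 0.0

while True:
    P = n * R * T / (A * x)        # ideal-gas pressure, with V = A*x
    if P * A <= F:                 # gas can no longer overcome friction: stop
        break
    dU = (-P * A + F) * dx         # gas does P*A*dx of work; F*dx returns as heat
    T += dU / (1.5 * n * R)        # U = (3/2) n R T for a monatomic ideal gas
    S_created += F * dx / T        # friction heat enters the gas at temperature T
    x += dx

print(f"final T = {T:.1f} K, entropy created = {S_created:.3f} J/K")
```

The expansion is adiabatic (no entropy crosses the insulating boundary) and quasistatic, yet S_created > 0, so it is not reversible.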
HW Helper
P: 6,347
 Quote by Rap I think it is correct: dQ=TdS means the flow of heat energy into a boundary is the temperature of the "in" side of the boundary times the entropy flow into the boundary.
The dQ is the reversible heat flow, not the actual heat flow.

 The flow of heat energy out of a boundary is temperature of the "out" side of the boundary times the entropy flow out of the boundary. No heat flow (dQ=0) means no entropy flow (dS=0) and vice versa.
If the reversible heat flow, dQrev = 0, dS = 0. But for an irreversible adiabatic process ΔS>0.

 3. Similarly, entropy is conserved in a thermally isolated system, entropy flows are driven by temperature differences alone.
Entropy is conserved only if all processes within the thermally isolated system are reversible.

AM
P: 786
 Quote by Andrew Mason The dQ is the reversible heat flow, not the actual heat flow. If the reversible heat flow, dQrev = 0, dS = 0. But for an irreversible adiabatic process ΔS>0.
Yes, and looking at what I said, I failed to specify that the process needed to be quasistatic. Then there is no entropy created in the hot body as a result of energy flowing from the hot body into the boundary. Likewise, there is no entropy created in the cold body as a result of energy flowing out of the boundary into the cold body. In other words, neither body's equilibrium is disturbed by the energy flow across the boundary. Their state parameters may change, but the state parameters stay homogeneous in both bodies.

There is no entropy creation inside either body, yet the drop in entropy of the hot body is less than the increase of entropy of the cold body as a result of the entropy transfer, so the process is irreversible. Entropy is created in or at the boundary.

If the process were not quasistatic, there would be temperature gradients inside the bodies as well (rather than just at the boundary) and temperature gradients create entropy, so entropy would be created in the bodies themselves, not just at the boundary.

 Quote by Andrew Mason Entropy is conserved only if all processes within the thermally isolated system are reversible.
Right - I stated before I listed those 3 points "For a simple system undergoing a reversible process,"
HW Helper
P: 6,347
 Quote by Rap Yes, and looking at what I said, I failed to specify that the process needed to be quasistatic. Then there is no entropy created in the hot body as a result of energy flowing from the hot body into the boundary. Likewise, there is no entropy created in the cold body as a result of energy flowing out of the boundary into the cold body. In other words, both bodies' equilibria are not disturbed by the energy flow across the boundary. Their state parameters may change, but the state parameters stay homogeneous in both bodies. There is no entropy creation inside either body, yet the drop in entropy of the hot body is less than the increase of entropy of the cold body as a result of the entropy transfer, so the process is irreversible. Entropy is created in or at the boundary.
I am just trying to understand what you are saying here. It appears that you are saying that entropy is created as the result of a quasi-static heat flow. I don't follow this. The only way to make heat flow quasi-static is to have an infinitesimal temperature difference. If that is the case, ΔS = 0, so I don't see entropy being created anywhere. If there is entropy "created" at a boundary, it is not a quasi-static process.

 If the process were not quasistatic, there would be temperature gradients inside the bodies as well (rather than just at the boundary) and temperature gradients create entropy, so entropy would be created in the bodies themselves, not just at the boundary.
Temperature gradients are not the only sources of increased entropy. Heat does not have to flow to create entropy, e.g. the mixing of two different gases.

AM
P: 741
 Quote by Andrew Mason I am just trying to understand what you are saying here. It appears that you are saying that entropy is created as the result of a quasi-static heat flow. I don't follow this. The only way to make heat flow quasi-static is to have an infinitesimal temperature difference. If that is the case, ΔS = 0, so I don't see entropy being created anywhere. If there is entropy "created" at a boundary, it is not a quasi-static process.
Help us out here. Maybe you can find us a formal definition of the word "quasistatic".

I assumed that the word “quasistatic” used by the OP was the same as near-equilibrium. A later post by the OP confirmed that he was talking about a process which was the sum of a series of small processes where the state of the ideal gas was near equilibrium.

Entropy can be created even if a system is “near equilibrium” at each infinitesimal step. Maybe you are right in that there is a temperature gradient where the entropy is created. At the point of contact between two surfaces, the temperature gradient magnitude can be very large. However, the length scale of the temperature variation (the inverse of the gradient) may not be completely macroscopic. On the length scale determined by the dimensions of the container of gas, this length is microscopic. On the length scale determined by the dimensions of a molecule, it is rather large.

Frictional forces do create entropy. Most of the container of gas in an adiabatic expansion can be near thermal equilibrium. The important thing about a system that is “near equilibrium” is that the intensive quantities will be uniform over most of the system. There may be a large temperature rise in a 1 cubic micron volume at the point of contact. However, the temperature of the ideal gas over most of the volume will be at a single value, T. Similarly, the pressure at the point of contact may be huge, resulting in a large stress there. However, one can approximate the pressure over most of the system by a single value, P.

The following article addresses the issue of how friction is treated in a thermodynamic analysis. Note that there was at least one investigator, Rymuza, who examined sliding friction in a near equilibrium process.

http://www.mdpi.com/1099-4300/12/5/1021
“On the Thermodynamics of Friction and Wear―A Review
Abstract: An extensive survey of the papers pertaining to the thermodynamic approach to tribosystems, particularly using the concept of entropy as a natural time base, is presented with a summary of the important contributions of leading researchers.

Friction is an energy transformation process. Using a near-equilibrium analysis, one can demonstrate (see Section 4) how sliding energy is dissipated. Rymuza [31] considers friction as a process that transforms the external mechanical energy to the energy of internal processes. Rymuza proposes that the traditional ‘laws’ of friction are incapable of reflecting its energetic nature and suggests a new parameter called ‘coefficient of friction losses’ (CFL), so as to reflect both the dissipative nature of the friction process and simultaneously provide a useful formulation for application in engineering practice.

As discussed in Sections 4 and 5, the temperature, and particularly temperature gradient within the mating bodies plays an important role in assessment of entropy generation in a tribosystem. Both theoretical and experimental methods have been developed for determination of temperature rise at the contact surface. Blok [69] is credited to be the first researcher who proposed a model for determination of the temperature rise at the surfaces of contacting bodies under boundary lubricated condition.”

The authors point out that the generation of entropy is caused by the large temperature gradient at the point of contact. However, the temperature gradient is caused by friction. The inhomogeneity in temperature is confined to “the point of contact”. I find it reasonable to assume that this is a “quasistatic” case.

Maybe if you find a formal definition of quasistatic, then I would be forced to agree with you. If quasistatic were formally defined by dS=0, then the OP and I are making a mistake. However, I didn’t know that quasistatic was defined so precisely.

Scientists established early that friction can generate caloric (another word for entropy). That famous cannon investigated by Count Rumford was probably an excellent example of a quasistatic system. The cannon bore was placed in water, so that the system was isothermal. The temperature of the cannon and the water it was immersed in was 100 degrees centigrade, and did not fluctuate enough for scientists to measure. Except at the point of contact, that cannon remained at the boiling point of water. Obviously, the temperature of the iron must have been much higher in the region near the point of contact. This contact temperature could not and probably cannot be measured by ordinary thermometry. However, it is enough to know that friction creates entropy.

http://en.wikipedia.org/wiki/Entropy
“Carnot based his views of heat partially on the early 18th century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partially on the contemporary views of Count Rumford who showed (1789) that heat could be created by friction as when cannon bores are machined.

In the 1850s and 1860s, German physicist Rudolf Clausius objected to this supposition, i.e. that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction.”

The following link provides a free copy of the article. However, you have to request it.

This looks like a great article. I have never seen anyone handle the problem of friction in thermodynamics so thoroughly. I may have to read it carefully before I get back into the thread.
P: 786
 Quote by Andrew Mason I am just trying to understand what you are saying here. It appears that you are saying that entropy is created as the result of a quasi-static heat flow. I don't follow this. The only way to make heat flow quasi-static is to have an infinitesimal temperature difference. If that is the case, ΔS = 0, so I don't see entropy being created anywhere. If there is entropy "created" at a boundary, it is not a quasi-static process. Temperature gradients are not the only sources of increased entropy. Heat does not have to flow to create entropy, e.g. the mixing of two different gases. AM
All reversible processes are quasistatic but not vice versa. Reversible processes do not create entropy, quasistatic processes may or may not. Quasistatic processes are described by a continuum of equilibrium states. (A curve on a PV or TS diagram). The only way to make heat flow REVERSIBLE is to have an infinitesimal temperature difference, because entropy is not created. I'm calling the process quasistatic because it is so slow that the two systems are always in practical equilibrium, but irreversible because entropy is being created.

I sound like I know exactly what I am talking about, but I'm still trying to piece this together, so I am looking for cases that challenge this viewpoint that Darwin123 put forth, that the entropy of classical thermodynamics can be treated sort of like a fluid that is both transported and created at particular places, but never destroyed. Like entropy of mixing - there are no temperature or pressure gradients, only a chemical potential gradient, yet entropy is created. I guess entropy transport is driven by temperature differences, but entropy creation is not limited to a temperature gradient.
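The mixing case can be made quantitative with the standard ideal-gas result ∆S_mix = -nR Σ xᵢ ln xᵢ (a 50/50 mixture of two gases at the same T and P is assumed):

```python
import math

# Two different ideal gases at the same temperature and pressure mix;
# no temperature gradient exists, yet entropy is created.
R = 8.314
n1, n2 = 1.0, 1.0            # moles of each gas (assumed values)
n = n1 + n2
x1, x2 = n1 / n, n2 / n      # mole fractions
dS_mix = -n * R * (x1 * math.log(x1) + x2 * math.log(x2))
print(f"entropy of mixing = {dS_mix:.3f} J/K")   # 2 R ln 2, about 11.526 J/K
```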
 P: 5,462
My history books have 'caloric' in use before Clausius was a gleam in his father's eye, let alone entropy being a gleam in Clausius' eye.

Originally, the term calor or caloric or calorie was used to represent both temperature and heat, since these were not properly distinguished. Black enunciated the difference in "Lectures on the Elements of Chemistry" (published in 1803, four years posthumously).

I can agree with Rap's definition of a quasistatic process as a process that is plottable as a continuous curve on an indicator diagram. By itself this does not make it reversible or irreversible, since reversible changes can always be plotted and irreversible changes only sometimes plotted.

It is also worth noting the difference in the definitions of entropy between Caratheodory and previous workers such as Clausius and Kelvin.
P: 5,462
 From a post deleted ? or lost? by Darwin123 There most definitely is a pressure gradient when two gases mix. In fact, there are two pressure gradients
Rap mentioned fluids not gases mixing. It is possible to devise fluid mixing that occurs under a concentration gradient, with neither pressure nor temperature involved.
P: 741
 Quote by Studiot My history books have 'caloric' in use before Clausius was a gleam in his father's eye let alone entropy being a gleam in Clausius' eyes. Originally there term calor or caloric or calorie was used to represent both temperature and heat since these were not properly distinguished.
I read English translations of Carnot's and Clausius's essays. Carnot does not use the word caloric to designate energy. Caloric is a fluid that carries energy, which is also heat. He never confuses temperature with caloric. Temperature appears analogous to pressure. His equations distinguish between temperature and caloric.

Clausius starts to use the word "heat" to refer to some type of energy. However, he also makes it clear that entropy can carry energy. Entropy is an extensive property with a local density. This means that it is localized: every bit of entropy is located at a spatial point. Entropy that is created also comes into existence at a spatial point. This is why entropy can move.

Clausius argues that heat is a form of motion rather than a fluid. This is based explicitly on the fact that friction creates entropy. However, the equations that he writes are consistent with entropy flowing.

I think it is useful to think of entropy as an analog of a fluid, with temperature as an analog of pressure. Temperature is the pressure that the entropy is under. Or if you like electrodynamics, entropy is analogous to electric charge and temperature is analogous to electric potential. Entropy flows from high to low temperature the way positive electric charge flows from high to low electric potential. The temperature is a monotonically increasing function of entropy density. If the density of entropy is high, then the temperature is high.

The motion of entropy is entirely consistent with the creation of entropy. Motion is a consequence of the fact that entropy is an extensive property with a local density. The motion of entropy has nothing to do with whether or not it is conserved. Fluids don't have to be conserved in order to flow. Chemical reactions can change the concentration of fluids even while they are flowing.

In the case of friction, entropy is created in a region where there is a nonzero gradient of some thermodynamic quantity. However, the temperature at the point of contact is very high. Therefore, entropy flows to a region of lower temperature.

In the case of a mixture of dissimilar gases, it is incorrect to say that "the" pressure is constant throughout the process. The sum of the partial pressures may be constant. However, the partial pressures are each changing while the mixing is going on. In fact, all the partial pressures are decreasing. The gradient of each partial pressure is a nonzero vector. Therefore, there are gradients that are creating entropy.

Each partial pressure is a thermodynamic quantity. The equation of state explicitly includes each partial pressure; one can express the equation of state as a function of the partial pressures. The important thing to notice in the case of "isobaric mixing" is that only the sum of the partial pressures is constant.
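As a small sketch of that point (assuming an ideal-gas mixture, with made-up numbers): each component contributes its own partial pressure to the equation of state, and "isobaric" here only pins down their sum, per Dalton's law.

```python
# Sketch (assumed ideal-gas mixture, illustrative numbers): each partial
# pressure enters the equation of state; only their sum is the "isobaric" total.
R = 8.314  # molar gas constant, J/(mol K)

def partial_pressure(n_i, T, V):
    """Ideal-gas partial pressure of a component with n_i moles in shared volume V."""
    return n_i * R * T / V

T, V = 300.0, 0.05                 # kelvin, cubic metres (made-up)
n = {"A": 1.0, "B": 0.5}           # moles of each component (made-up)
p = {k: partial_pressure(v, T, V) for k, v in n.items()}
p_total = sum(p.values())          # Dalton's law: total = sum of partials
print(p, p_total)
```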

The two biggest things that make entropy different from electric charge are that entropy can be created, and that entropy has only one sign. Electric charge is conserved, but entropy can be created. Electric charge can be positive or negative, while the third law of thermodynamics shows that there is a minimum to the absolute entropy of a system. However, both electric charge and entropy are extensive properties with local densities, and that is what allows them to move.
HW Helper
P: 6,347
 Quote by Darwin123 Help us out, here. Maybe you can find us a formal definition of the word "quasistatic". I assumed that the word “quasistatic” used by the OP was the same as near-equilibrium. A later post by the OP confirmed that he was talking about a process which was the sum of a series of small processes where the state of the ideal gas was near equilibrium.
Here is how I would define a quasistatic process: it is one that moves at an arbitrarily slow rate so that all components of the system and the surroundings are
a) in internal thermal equilibrium;
b) arbitrarily close to thermal equilibrium with all components with which they are in thermal contact; and
c) in, or arbitrarily close to, dynamic equilibrium with each other at all times.

So, for example, a Carnot engine operates using quasistatic processes. Whether those processes are also reversible depends on what happens to the work produced: if the work is degraded into heat flow, it cannot be used to run the Carnot engine backwards and return the system and surroundings to their initial states. This is a subtle distinction between quasistatic and reversible that is not always made clear.
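The entropy bookkeeping for the reversible case can be sketched in a few lines (illustrative reservoir temperatures and heat input, not from the thread): in an ideal Carnot cycle the entropy drawn from the hot reservoir equals the entropy dumped into the cold one, so nothing is created.

```python
# Sketch (made-up numbers): entropy bookkeeping for one cycle of an ideal,
# reversible Carnot engine between two reservoirs.
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures, kelvin (assumed)
Q_hot = 1000.0                 # heat drawn from the hot reservoir, joules (assumed)

S_from_hot = Q_hot / T_hot         # entropy taken in at T_hot
Q_cold = T_cold * S_from_hot       # reversible: the same entropy leaves at T_cold
W = Q_hot - Q_cold                 # work output
efficiency = W / Q_hot

print(efficiency)                    # 0.4, i.e. 1 - T_cold/T_hot
print(Q_cold / T_cold - S_from_hot)  # 0.0: no entropy created in the cycle
```

If instead some of the work W were dissipated back into heat, extra entropy would appear and the cycle could no longer be undone, which is the quasistatic-but-irreversible case described above.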

AM
P: 786
 Quote by Studiot My history books have 'caloric' in use before Clausius was a gleam in his father's eye, let alone entropy being a gleam in Clausius' eyes. Originally the term calor or caloric or calorie was used to represent both temperature and heat, since these were not properly distinguished. Black enunciated the difference in "Lectures on the Elements of Chemistry" (published in 1803, four years posthumously). I can agree with Rap's definition of a quasistatic process as a process that is plottable as a continuous curve on an indicator diagram. By itself this does not make it reversible or irreversible, since reversible changes can always be plotted and irreversible changes sometimes plotted. It is also worth noting the difference in the definitions of entropy between Caratheodory and previous workers such as Clausius and Kelvin.
Yes, I didn't clarify that. I always draw quasistatic processes on an indicator diagram as a solid line, and non-quasistatic processes in which some state function remains constant (e.g. Joule expansion) as a dotted line. Reversible is a subset of quasistatic, so yes, a reversible process can always be plotted with a solid line. A non-quasistatic process with no state function held constant is just two unconnectable points (I can't think of an example right now).

Regarding definitions of entropy, I think Lieb and Yngvason have taken the next step beyond Caratheodory. Caratheodory's definition of classical entropy is restricted to quasistatic transformations from equilibrium state 1 to equilibrium state 2, while Lieb and Yngvason's definition of classical entropy removes the quasistatic constraint. Their papers are rather hairy, but Thess gives a more user-friendly description - see http://www.amazon.com/The-Entropy-Pr.../dp/3642133487. I only get the general idea of Lieb-Yngvason and Caratheodory, and I still haven't mastered either, but my hunch is that Lieb and Yngvason are on to something.

 Quote by Studiot Rap mentioned fluids not gases mixing. It is possible to devise fluid mixing that occurs under a concentration gradient, with neither pressure nor temperature involved.
Yes, the "entropy of mixing" problem, where you have A particles on one side, B particles on the other, separated by a partition, both sides at the same temperature and pressure but obviously not at the same chemical potentials. Then you quickly remove the partition. When equilibrium is finally reached, the two types are fully mixed, the total entropy is larger than the sum of the two original entropies, and the chemical potentials are uniform. It is usually done for two non-reacting ideal gases, so that the final entropy can actually be calculated. Unlike Joule expansion, you could assume LTE (local thermodynamic equilibrium), where any small volume element is a thermodynamic system in equilibrium at the universally constant T and P, and actually solve the problem for two ideal gases as it develops in time. It would be interesting to calculate the entropy density as a function of position as it develops in time. I've been thinking of doing this, just to get a better grasp of entropy creation.
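The final-state entropy the post refers to has a standard closed form for ideal gases. A minimal sketch (assuming non-reacting ideal gases at equal T and P, which is the setup described):

```python
# Sketch (assumed non-reacting ideal gases, equal T and P on both sides):
# the total entropy increase once the partition is removed and mixing completes.
import math

R = 8.314  # molar gas constant, J/(mol K)

def mixing_entropy(n_A, n_B):
    """Entropy of mixing when n_A mol of A and n_B mol of B mix at equal T and P."""
    n = n_A + n_B
    x_A, x_B = n_A / n, n_B / n  # final mole fractions
    return -n * R * (x_A * math.log(x_A) + x_B * math.log(x_B))

print(mixing_entropy(1.0, 1.0))  # 2 R ln 2, about 11.5 J/K for a 50/50 mole mix
```

This gives only the end-state answer; the time development of the entropy density under LTE, which the post proposes to compute, would require solving the coupled diffusion problem.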
 P: 5,462
@Darwin My point was that "caloric" was a word used in the English-speaking world, and the concept had already been dispelled by the time of Clausius. Clausius was a mid-19th-century worker, Black a late-18th-century one.

@Rap I am not suggesting Caratheodory as the most recent authority. I raised his definition because it seems to me the most pertinent to this discussion of lines on indicator diagrams and mathematical continuity. Workers prior to Caratheodory all specified what we now call the second law in terms of cyclic processes, i.e. closed loops on the indicator diagram composed of several lines - that is, from state A to state B and back again. Caratheodory was the first to offer a definition that could be applied to a single line, i.e. from state A to state B. My English translation has "In the neighbourhood of any equilibrium state of a system there are states that are inaccessible by an adiathermal process."
