Entropy is a measure of energy available for work?

SUMMARY

The discussion centers on the concept of entropy as a measure of energy unavailable for work, specifically addressing the statement "Entropy is a measure of energy available for work." Participants clarify that this statement is incorrect without the crucial word "not." They explore examples involving gas expansion, emphasizing that expanding into a vacuum does no work, while expansion against an external pressure does work until equilibrium is reached. The conversation highlights the need for precise definitions and the mathematical relationships governing entropy and work, particularly in the context of Carnot cycles and reversible processes.

PREREQUISITES
  • Understanding of thermodynamics, particularly the laws of thermodynamics.
  • Familiarity with the concept of entropy and its mathematical definitions.
  • Knowledge of Carnot cycles and their implications for energy conversion.
  • Basic grasp of ideal gas laws and their applications in thermodynamic processes.
NEXT STEPS
  • Study the mathematical derivation of entropy in thermodynamic systems.
  • Learn about the Carnot cycle and its efficiency in converting heat to work.
  • Explore the implications of adiabatic and reversible processes in thermodynamics.
  • Investigate the relationship between temperature, heat flow, and work in thermodynamic systems.
USEFUL FOR

Students of thermodynamics, physicists, and engineers involved in energy systems and heat engines will benefit from this discussion, particularly those seeking to clarify the principles of entropy and its role in energy conversion.

  • #31


My history books have 'caloric' in use before Clausius was a gleam in his father's eye let alone entropy being a gleam in Clausius' eyes.

Originally the term calor, caloric, or calorie was used to represent both temperature and heat, since these were not properly distinguished.

Black enunciated the difference in "Lectures on the Elements of Chemistry" (published posthumously in 1803, four years after his death).

I can agree with Rap's definition of a quasistatic process as a process that is plottable as a continuous curve on an indicator diagram.
By itself this does not make it reversible or irreversible since reversible changes can always be plotted and irreversible changes sometimes plotted.

It is also worth noting the difference in the definitions of entropy between Caratheodory and previous workers such as Clausius and Kelvin.
 
  • #32


From a post (deleted? or lost?) by Darwin123:
There most definitely is a pressure gradient when two gases mix. In fact, there are two pressure gradients.

Rap mentioned fluids not gases mixing. It is possible to devise fluid mixing that occurs under a concentration gradient, with neither pressure nor temperature involved.
 
Last edited:
  • #33


Studiot said:
My history books have 'caloric' in use before Clausius was a gleam in his father's eye let alone entropy being a gleam in Clausius' eyes.

Originally the term calor, caloric, or calorie was used to represent both temperature and heat, since these were not properly distinguished.

I read English translations of Carnot's and Clausius's essays. Carnot does not use the word caloric to designate energy. For him, caloric is a fluid that carries energy, which is also heat. He never confuses temperature with caloric; temperature appears analogous to pressure. His equations distinguish between temperature and caloric.

Clausius starts to use the word "heat" to refer to a type of energy. However, he also makes it clear that entropy can carry energy. Entropy has a local density: every bit of entropy is located at a spatial point, and entropy that is created also appears at a spatial point. This is why entropy can move.

Clausius argues that heat is a form of motion rather than a fluid. This is based explicitly on the fact that friction creates entropy. However, the equations that he writes are consistent with entropy flowing.

I think it is useful to think of entropy as a fluid analog with temperature a pressure analog. Temperature is the pressure that the entropy is under. Or if you like electrodynamics, entropy is analogous to electric charge. Temperature is analogous to electrical potential. Entropy flows from a high to low temperature the way positive electric charge flows from high to low electric potential. The temperature is a monotonically increasing function of entropy density. If the density of entropy is high, then the temperature is high.

The motion of entropy is entirely consistent with the creation of entropy. Motion is possible because entropy has a local density; it has nothing to do with whether or not entropy is conserved. Fluids don't have to be conserved in order to flow: chemical reactions can change the concentration of a dissolved species even while it is flowing.

In the case of friction, entropy is created in a region where there is a nonzero gradient of some thermodynamic quantity. However, the temperature at the point of contact is very high. Therefore, entropy flows to a region of lower temperature.

In the case of a mixture of dissimilar gases, it is incorrect to say that "the" pressure is constant throughout the process. The sum of the partial pressures may be constant, but the individual partial pressures are each changing while the mixing is going on; in fact, all of them are decreasing. The gradient of each partial pressure is a nonzero vector. Therefore, there are gradients that are creating entropy.

Each partial pressure is a thermodynamic quantity. The equation of state explicitly includes each partial pressure, and one can express the equation of state as a function of the partial pressures. The important thing to notice in the case of "isobaric mixing" is that only the sum of the partial pressures is constant.

The two biggest things that make entropy different from electric charge are that entropy can be created and that entropy has only one sign. Electric charge is conserved, but entropy can be created. Electric charge can be positive or negative, whereas the third law of thermodynamics shows that there is a minimum to the absolute entropy of a system. However, both electric charge and entropy have local densities, and that is what implies they can move.
 
  • #34


Darwin123 said:
Help us out, here. Maybe you can find us a formal definition of the word "quasistatic".

I assumed that the word “quasistatic” used by the OP meant the same as near-equilibrium. A later post by the OP confirmed that he was talking about a process which was the sum of a series of small processes in which the state of the ideal gas was near equilibrium.
Here is how I would define a quasistatic process: it is one that moves at an arbitrarily slow rate so that all components of the system and the surroundings are
a) in internal thermal equilibrium;
b) arbitrarily close to thermal equilibrium with all components with which they are in thermal contact; and
c) in, or arbitrarily close to, dynamic equilibrium with each other at all times.

So, for example, a Carnot engine operates using quasistatic processes. Whether the Carnot engine's processes are reversible depends on whether the work produced during the process is turned into heat flow. If it is, the work cannot be used to run the Carnot engine in reverse and return the system and surroundings to their initial state. This is a subtle distinction between quasistatic and reversible that is not always made clear.
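AM's point about reversible Carnot operation can be checked numerically. The Python sketch below uses illustrative reservoir temperatures and heat flows (none of these numbers come from the thread) and verifies that a reversible engine produces work Qh(1 - Tc/Th) while creating no net entropy:

```python
# A reversible Carnot engine between two illustrative reservoirs.
# All numbers are made up for the sketch; only the relations matter.
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures (K)
Q_hot = 1000.0                 # heat absorbed from the hot reservoir (J)

# Quasistatic AND reversible: no net entropy is created, so the
# entropy given up by the hot reservoir equals the entropy received
# by the cold one: Q_cold / T_cold = Q_hot / T_hot.
Q_cold = Q_hot * T_cold / T_hot
W = Q_hot - Q_cold             # work output
efficiency = W / Q_hot         # equals 1 - T_cold / T_hot

delta_S = -Q_hot / T_hot + Q_cold / T_cold
print(W, efficiency, delta_S)  # 400.0 0.4 0.0
```

If the engine were run irreversibly (some work degraded to heat), Q_cold would be larger, delta_S would come out positive, and the work output would drop below the reversible maximum.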

AM
 
Last edited:
  • #35


Studiot said:
My history books have 'caloric' in use before Clausius was a gleam in his father's eye let alone entropy being a gleam in Clausius' eyes.

Originally the term calor, caloric, or calorie was used to represent both temperature and heat, since these were not properly distinguished.

Black enunciated the difference in "Lectures on the Elements of Chemistry" (published in 1803 four years posthumously).

I can agree with Rap's definition of a quasistatic process as a process that is plottable as a continuous curve on an indicator diagram.
By itself this does not make it reversible or irreversible since reversible changes can always be plotted and irreversible changes sometimes plotted.

It is also worth noting the difference in the definitions of entropy between Caratheodory and previous workers such as Clausius and Kelvin.

Yes, I didn't clarify that. I always draw quasistatic processes on an indicator diagram as a solid line, and non-quasistatic processes with some state function remaining constant as a dotted line (e.g. Joule expansion). Reversible is a subset of quasistatic, so yes, reversible processes can always be plotted with a solid line. A non-quasistatic process with no state function constant is just two unconnectable points. (Can't think of an example right now.)

Regarding definitions of entropy, I think Lieb and Yngvason have taken the next step beyond Caratheodory. Caratheodory's definition of classical entropy is restricted to quasistatic transformations from equilibrium state 1 to equilibrium state 2, while Lieb and Yngvason's definition removes the quasistatic constraint. Their papers are rather hairy, but Thess gives a more user-friendly description - see https://www.amazon.com/dp/3642133487/?tag=pfamazon01-20 - but I only get the general idea of Lieb-Yngvason and Caratheodory; I still haven't mastered either. My hunch is that Lieb and Yngvason are on to something.

Studiot said:
Rap mentioned fluids not gases mixing. It is possible to devise fluid mixing that occurs under a concentration gradient, with neither pressure nor temperature involved.

Yes, the "entropy of mixing" problem, where you have A particles on one side, B particles on the other, separated by a partition, both sides at the same temperature and pressure, but obviously not at the same chemical potentials. Then you quickly remove the partition. When equilibrium finally occurs, the two types are fully mixed and the total entropy is larger than the sum of the two original entropies and the chemical potentials are uniform. Usually done for two non-reacting ideal gases, so that the final entropy can actually be calculated. Unlike Joule expansion, you could assume LTE, where any small volume element is a thermodynamic system in equilibrium with a universally constant T and P, and actually solve the problem for two ideal gases as it develops in time. It would be interesting to calculate the entropy density as a function of position as it develops in time. I've been thinking of doing this, just to get a better grasp of entropy creation.
 
Last edited by a moderator:
  • #36


@Darwin

My point was that caloric was a word used in the English-speaking world, and the concept had already been dispelled by the time of Clausius.
Clausius was a mid-19th-century worker; Black was a late-18th-century one.

@Rap

I am not suggesting Caratheodory as the most recent authority. I raised his definition as it seems to me the most pertinent to this discussion on lines on indicator diagrams and mathematical continuity.
Workers prior to Caratheodory all specified what we now call the second law in terms of cyclic processes, ie closed loops on the indicator diagram composed of several lines: that is, from state A to state B and back again. Caratheodory was the first to offer a definition that could be applied to a single line, ie from state A to state B.

My English translation has

"In the neighbourhood of any equilibrium state of a system there are states that are inaccessible by an adiathermal process"
 
  • #37
Rap said:
Yes, I didn't clarify that. I always draw quasistatic processes on an indicator diagram as a solid line, and non-quasistatic processes with some state function remaining constant as a dotted line (e.g. Joule expansion). Reversible is a subset of quasistatic, so yes, reversible processes can always be plotted with a solid line. A non-quasistatic process with no state function constant is just two unconnectable points. (Can't think of an example right now.)

Regarding definitions of entropy, I think Lieb and Yngvason have taken the next step beyond Caratheodory. Caratheodory's definition of classical entropy is restricted to quasistatic transformations from equilibrium state 1 to equilibrium state 2, while Lieb and Yngvason's definition removes the quasistatic constraint. Their papers are rather hairy, but Thess gives a more user-friendly description - see https://www.amazon.com/dp/3642133487/?tag=pfamazon01-20 - but I only get the general idea of Lieb-Yngvason and Caratheodory; I still haven't mastered either. My hunch is that Lieb and Yngvason are on to something.



Yes, the "entropy of mixing" problem, where you have A particles on one side, B particles on the other, separated by a partition, both sides at the same temperature and pressure, but obviously not at the same chemical potentials. Then you quickly remove the partition. When equilibrium finally occurs, the two types are fully mixed and the total entropy is larger than the sum of the two original entropies and the chemical potentials are uniform. Usually done for two non-reacting ideal gases, so that the final entropy can actually be calculated. Unlike Joule expansion, you could assume LTE, where any small volume element is a thermodynamic system in equilibrium with a universally constant T and P, and actually solve the problem for two ideal gases as it develops in time. It would be interesting to calculate the entropy density as a function of position as it develops in time. I've been thinking of doing this, just to get a better grasp of entropy creation.

It should be noted that the process of mixing is driven by the gradient of the partial pressure. The gradient of partial pressure is sometimes considered the first contribution to entropy production.

Making an unsupported claim is against the rules of the forum, and I neglected to provide references and links in my previous posts. Therefore, here are a few references and links that show how gradients of partial pressure create entropy.


http://www.landfood.ubc.ca/soil200/components/air.htm
“Diffusion - moving force is gradient of partial pressure of any constituent member of air to migrate from a zone of higher to lower pressure, even while air as a whole may remain stationary. In other words, through diffusion each gas moves in a direction determined by its own partial pressure.”



http://www.wiley-vch.de/books/sample/352731024X_c01.pdf
“Momentum transfer arguments lead to the conclusion that for diffusion in a gas mixture the gradient of partial pressure should be regarded as the fundamental ‘driving force’, since that formulation remains valid even under non-isothermal conditions.”

http://www.chem.ntnu.no/nonequilibrium-thermodynamics/pub/getfilee173.pdf
“The small magnitude of the gradient of partial pressure along the airways suggests that the first contribution of entropy production can be neglected in the simplified model.”


http://books.google.com/books?id=rY...page&q="gradient of partial pressure"&f=false
“The rate of diffusion of a gas is proportional to the gradient of partial pressure…”

http://books.google.com/books?id=1w...page&q="gradient of partial pressure"&f=false
“As the permeate pressure is decreased, there occurred an increase in the gradient of partial pressure or chemical potential across a dried, nonswollen layer. As a result, when the permeate components become easily vaporized, the flux or total amount of product is increased.”


http://www.decompression.org/maiken/Bubble_Decompression_Strategies.htm
“The gradient of partial pressure across the bubble surface. G = T - Pb, where T is tissue tension and Pb is the pressure inside the bubble.”


No link here. However, Sears and Zemansky is a classic textbook on thermodynamics. The following pages describe the process by which different gases mix.
“Heat and Thermodynamics, an Intermediate Textbook - 6th edition” by Mark Zemansky and Richard Dittman (McGraw-Hill, 1979), pp. 361-363.
 
Last edited by a moderator:
  • #38


It should be noted that the process of mixing is driven by the gradient of the partial pressure. The gradient of partial pressure is sometimes considered the first contribution to entropy production.

Not under the conditions I outlined or Rap agreed with in the extract you quoted from his post.

Here is another situation for you to consider:

Consider a salt or sugar solution a few degrees above freezing point in an open vessel under constant pressure P.

Now add a cube of ice at freezing point, of just sufficient mass that the result in the vessel will be a more dilute solution at exactly freezing point.

What is the entropy of mixing?
 
  • #39


Darwin123 said:
It should be noted that the process of mixing is driven by the gradient of the partial pressure. The gradient of partial pressure is sometimes considered the first contribution to entropy production.

I don't think it's partial pressure differences that drive particle transport; it's chemical potential differences. If you had a boundary that passed particles slowly but did not pass heat or work, you could have the partial pressures the same on each side but different temperatures, and therefore different densities (higher density on the cold side), and there would be a net flow of particles to equalize density. You can see this because on the hot side there would be fewer particles passing, but with greater momentum per particle, while on the cold side more would be passing, but with less momentum per particle. There would be net particle transfer from the cold side to the hot side. The chemical potentials on the two sides would not be the same: the chemical potential on the cold side would be higher than on the hot side.

Chemical potential gradients will also cause entropy production.
 
Last edited:
  • #40


Rap said:
I don't think it's partial pressure differences that drive particle transport; it's chemical potential differences. If you had a boundary that passed particles slowly but did not pass heat or work, you could have the partial pressures the same on each side but different temperatures, and therefore different densities (higher density on the cold side), and there would be a net flow of particles to equalize density. You can see this because on the hot side there would be fewer particles passing, but with greater momentum per particle, while on the cold side more would be passing, but with less momentum per particle. There would be net particle transfer from the cold side to the hot side. The chemical potentials on the two sides would not be the same: the chemical potential on the cold side would be higher than on the hot side.

Chemical potential gradients will also cause entropy production.
Mixing is driven simply by the randomness of molecular motion. The only thing that can prevent it is a barrier.

Consider two identical volumes and quantities of ideal, chemically inert gas molecules, say He and Ne, both at the same temperature and pressure but separated by a wall. If you remove the wall, the gas molecules will mix. You can wait forever and the molecules will not un-mix, so the process is not reversible. Entropy will increase.

AM
 
  • #41


Rap said:
I don't think its partial pressure differences that drive particle transport, its chemical potential differences.

Here is a link that says that they are basically the same thing in the special case of an ideal gas, or ideal solute.

http://books.google.com/books?id=1wA...ure"&f=false
“As the permeate pressure is decreased, there occurred an increase in the gradient of partial pressure or chemical potential across a dried, nonswollen layer. As a result, when the permeate components become easily vaporized, the flux or total amount of product is increased.”

I understand that chemical potential and partial pressure aren't precisely the same thing for other than ideal gases and solutes. I am not trying to promote my own definition of partial pressure.

However...

The specific scenarios discussed in this thread involve ideal gases. So until we get away from ideal gases, the chemical potential of a component and the partial pressure of that component can be discussed interchangeably. Furthermore, the chemical potential of a component generally increases with its partial pressure even for components that are not ideal gases.
 
Last edited:
  • #42


Would it be correct to say that TS is the energy not available for doing useful work?

When you write down any thermodynamic potential, you have the TS term, but the other terms (like PV, μN, HM, etc.) are the "work terms", which tell you how much useful energy you can get out of your system (depending on the constraints).

Can anyone comment on this?
 
  • #43


GravitatisVis said:
Would it be correct to say that TS is the energy not available for doing useful work?

When you write down any thermodynamic potential, you have the TS term, but the other terms (like PV, μN, HM, etc.) are the "work terms", which tell you how much useful energy you can get out of your system (depending on the constraints).

Can anyone comment on this?
If you mean: T = Tc and S = ΔS, this would be correct.

When a thermodynamic process occurs that results in a heat flow Qh from temperature Th, heat flow Qc to temperature Tc and work output W = Qh-Qc, the energy that is "unavailable to do work" (which is also known as the "lost work") is the difference between the maximum work that can be produced (ie. in a reversible process) for that heat flow Qh from Th to Tc and the work actually produced. The "lost work" is TcΔS.

For a reversible process, ΔS = -Qh/Th + Qc/Tc = 0 so Qc/Qh = Tc/Th. For an irreversible process, -Qh/Th + Qc/Tc = ΔS > 0. So the reversible work is Qh-Qc = Qh - Qh(Tc/Th) = Qh(1-Tc/Th).

For the irreversible process involving the same Qh, the actual work is Qh-Qc'. The difference between the reversible work and the irreversible work (ie. the lost work) is:

WL = (Qh - Qc) - (Qh - Qc') = Qh(1 - Tc/Th) - (Qh - Qc')

= -Qh(Tc/Th) + Qc' = Tc(-Qh/Th + Qc'/Tc) = TcΔS
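The lost-work identity derived above can be checked numerically. A Python sketch with illustrative numbers (none of them come from the thread):

```python
# Numeric check of the lost-work identity W_lost = Tc * delta_S,
# with illustrative temperatures and heat flows.
T_h, T_c = 600.0, 300.0
Q_h = 1200.0                       # heat drawn from the hot reservoir (J)

Q_c_rev = Q_h * T_c / T_h          # reversible case: total delta_S = 0
W_rev = Q_h - Q_c_rev              # maximum work, Q_h * (1 - T_c/T_h)

Q_c_irrev = 800.0                  # an irreversible engine rejects more heat
W_actual = Q_h - Q_c_irrev
delta_S = -Q_h / T_h + Q_c_irrev / T_c   # entropy created (> 0)

W_lost = W_rev - W_actual
print(W_lost)                      # 200.0
print(T_c * delta_S)               # about 200.0 -- equals W_lost
```

The two printed values agree (up to floating-point rounding), confirming that the work made unavailable by the irreversibility is exactly the cold-reservoir temperature times the entropy created.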


AM
 
  • #44


Darwin123 said:
Here is a link that says that they are basically the same thing in the special case of an ideal gas, or ideal solute.

http://books.google.com/books?id=1wA...ure"&f=false
“As the permeate pressure is decreased, there occurred an increase in the gradient of partial pressure or chemical potential across a dried, nonswollen layer. As a result, when the permeate components become easily vaporized, the flux or total amount of product is increased.”

I understand that chemical potential and partial pressure aren't precisely the same thing for other than ideal gases and solutes. I am not trying to promote my own definition of partial pressure.

However...

The specific scenarios discussed in this thread involve ideal gases. So until we get away from ideal gases, the chemical potential of a component and the partial pressure of that component can be discussed interchangeably. Furthermore, the chemical potential of a component generally increases with its partial pressure even for components that are not ideal gases.

The partial pressure and chemical potential are "equivalent" for ideal gases only if the temperatures are the same. The chemical potential for i-type particles in an ideal gas is μ_i = μ_{i0} + kT ln(P_i T^(-5/2)), so you can see that, at constant temperature, the chemical potential increases with the logarithm of the partial pressure.
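Rap's formula makes the isothermal relationship explicit: at fixed T the constant μ_{i0} and the T^(-5/2) factor drop out of a difference, leaving Δμ = kT ln(P2/P1). A minimal Python sketch with illustrative pressures (not from the thread):

```python
import math

# At a fixed temperature, the constant mu_i0 and the T^(-5/2) factor
# cancel out of a chemical-potential difference, leaving
#   delta_mu = k * T * ln(P2 / P1).
# The pressures below are illustrative.
k = 1.380649e-23        # Boltzmann constant (J/K)
T = 300.0               # common temperature (K)
P1, P2 = 1.0e5, 2.0e5   # partial pressures at two points (Pa)

delta_mu = k * T * math.log(P2 / P1)
print(delta_mu > 0)     # True: higher partial pressure, higher mu
```

This is why, under isothermal conditions, a nonzero partial-pressure gradient and a nonzero chemical-potential gradient always point the same way, which is the point Darwin123 makes in the next post.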
 
  • #45


Rap said:
The partial pressure and chemical potential are "equivalent" for ideal gases only if the temperatures are the same. The chemical potential for i-type particles in an ideal gas is μ_i = μ_{i0} + kT ln(P_i T^(-5/2)), so you can see that, at constant temperature, the chemical potential increases with the logarithm of the partial pressure.

According to your formula, the chemical potential increases with partial pressure under isothermal conditions. Therefore, if the gradient of partial pressure is a nonzero vector, then the gradient of chemical potential is a nonzero vector pointing in the same direction; if the gradient of partial pressure is the zero vector, then the gradient of chemical potential is the zero vector.

Of course, this leaves the problem of systems that are not isothermal.

Entropy is created in the regions of space where the gradient of chemical potential is not zero. That is what you said. Someone else said that entropy is only created in places where the temperature has a nonzero gradient.

According to your formula, the two statements are not equivalent. In a region where both the temperature gradient and the partial-pressure gradient are nonzero, it may be possible to find points where the gradient of chemical potential is zero. For instance, the two contributions can point in opposite directions with equal magnitudes.

My current question is:
1) What types of gradient cause the generation of entropy? Whatever the answer, entropy still has a local density, so it is created at specific points regardless of which gradient generates it. It has to be created in the region where that gradient is nonzero.

Entropy acts like an indestructible fluid, with temperature playing the role of the pressure of the entropy. It can be created at a particular spot by a gradient, although I am not sure precisely what type of gradient that is. I am trying to find out; I haven't found any references where the author spells it out.
 
Last edited:
  • #46


So I am open to suggestions.

You receive (proper) suggestions in a discussion by addressing others' (proper) remarks to yourself.
 
  • #47


Darwin123 said:
Entropy is created in the regions of space where the gradient of chemical potential is not zero. That is what you said. Someone else said that entropy is only created in places where the temperature has a nonzero gradient.

Well, I don't remember saying that, but if I did, then I've changed my mind. I'm sure I said a temperature gradient creates entropy, but as to what else, I too am trying to figure that out.

Darwin123 said:
According to your formula, the two statements are not equivalent. In a region where both the temperature gradient and the partial-pressure gradient are nonzero, it may be possible to find points where the gradient of chemical potential is zero. For instance, the two contributions can point in opposite directions with equal magnitudes...

That sounds possible...

Darwin123 said:
Entropy acts like an indestructible fluid, with temperature playing the role of the pressure of the entropy. It can be created at a particular spot by a gradient, although I am not sure precisely what type of gradient that is. I am trying to find out; I haven't found any references where the author spells it out.

Agree with everything, except that because entropy can be created but volume cannot, I only think of temperature being to entropy what pressure is to volume in the case where entropy is not created. When entropy is created, the analogy starts to puzzle me.

Regarding gradients producing entropy, I think there are other possibilities, like the friction of a piston doing work. The entropy is created at the points of contact between piston and cylinder and is added to the system. Energy added is T dS if you can assume the whole thing happens at a constant temperature.

Studiot said:
You receive (proper) suggestions in a discussion by addressing others' (proper) remarks to yourself.

Did I miss something? If I did, I apologize. What did I miss?
 
  • #48


Rap said:
Well, I don't remember saying that, but if I did, then I've changed my mind. I'm sure I said a temperature gradient creates entropy, but as to what else, I too am trying to figure that out.



That sounds possible...



Agree with everything, except that because entropy can be created but volume cannot, I only think of temperature being to entropy what pressure is to volume in the case where entropy is not created. When entropy is created, the analogy starts to puzzle me.

Regarding gradients producing entropy, I think there are other possibilities, like the friction of a piston doing work. The entropy is created at the points of contact between piston and cylinder and is added to the system. Energy added is T dS if you can assume the whole thing happens at a constant temperature.



Did I miss something? If I did, I apologize. What did I miss?
I think that he was replying to my remark, not yours.
I apologize. I tried to correct the situation. In any case, you don't have to worry about it.
 
