Entropy is a measure of energy available for work?

In summary, the statement "entropy is a measure of the energy unavailable for work" is often misunderstood and can be interpreted in different ways. It is important to have a clear understanding of this concept and its implications, as it is a fundamental concept in thermodynamics.
  • #36


@Darwin

My point was that caloric was a word used in the English-speaking world, and the concept had already been dispelled by the time of Clausius.
Clausius was a mid-19th-century worker, Black a late-18th-century one.

@Rap

I am not suggesting Caratheodory as the most recent authority. I raised his definition because it seems to me the most pertinent to this discussion of lines on indicator diagrams and mathematical continuity.
Workers prior to Caratheodory all specified what we now call the second law in terms of cyclic processes, i.e. closed loops on the indicator diagram composed of several lines: from State A to State B and back again. Caratheodory was the first to offer a definition that could be applied to a single line, i.e. from State A to State B.

My English translation has

"In the neighbourhood of any equilibrium state of a system there are states that are inaccessible by an adiathermal process"
 
  • #37
Rap said:
Yes, I didn't clarify that. I always draw quasistatic lines on an indicator diagram as a solid line; non-quasistatic, but with some state function remaining constant, as a dotted line (e.g. Joule expansion). Reversible is a subset of quasistatic, so yes, reversible can always be plotted with a solid line. Non-quasistatic, with no state function constant, it's just two unconnectable points. (Can't think of an example right now.)

Regarding definitions of entropy, I think Lieb and Yngvason have taken the next step beyond Caratheodory. Caratheodory's definition of classical entropy is restricted to quasistatic transformations from equilibrium state 1 to equilibrium state 2, while Lieb and Yngvason's definition removes the quasistatic constraint. Their papers are rather hairy, but Thess gives a more user-friendly description - see https://www.amazon.com/dp/3642133487/?tag=pfamazon01-20 - but I only get the general idea of Lieb-Yngvason and Caratheodory; I still haven't mastered either. My hunch is that Lieb and Yngvason are on to something.



Yes, the "entropy of mixing" problem, where you have A particles on one side, B particles on the other, separated by a partition, both sides at the same temperature and pressure, but obviously not at the same chemical potentials. Then you quickly remove the partition. When equilibrium finally occurs, the two types are fully mixed and the total entropy is larger than the sum of the two original entropies and the chemical potentials are uniform. Usually done for two non-reacting ideal gases, so that the final entropy can actually be calculated. Unlike Joule expansion, you could assume LTE, where any small volume element is a thermodynamic system in equilibrium with a universally constant T and P, and actually solve the problem for two ideal gases as it develops in time. It would be interesting to calculate the entropy density as a function of position as it develops in time. I've been thinking of doing this, just to get a better grasp of entropy creation.
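For the two-ideal-gas case described above, the final mixing entropy can actually be computed in closed form, ΔS = -R Σ nᵢ ln xᵢ. Here is a minimal sketch of that calculation (the function name and mole numbers are illustrative, not from any post in this thread):

```python
import math

R = 8.314  # J/(mol K), molar gas constant

def mixing_entropy(n_moles):
    """Entropy of mixing for non-reacting ideal gases at the same T and P.

    n_moles: list of mole numbers of the distinguishable species.
    Returns Delta_S = -R * sum(n_i * ln(x_i)) in J/K, where x_i is the
    mole fraction of species i in the final mixture.
    """
    n_total = sum(n_moles)
    return -R * sum(n * math.log(n / n_total) for n in n_moles)

# One mole of A and one mole of B gives Delta_S = 2 R ln 2 ≈ 11.53 J/K
dS = mixing_entropy([1.0, 1.0])
```

Note that ΔS depends only on the mole fractions, not on which gases A and B are, provided they are distinguishable; for identical gases on both sides there is no entropy change (the Gibbs paradox).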

It should be noted that the process of mixing is driven by the gradient of the partial pressure. The gradient of partial pressure is sometimes considered the first contribution to entropy production.

Making an unsupported claim is against the rules of the forum, and I neglected to provide references in my previous posts. Therefore, here are a few references and links that show how gradients of partial pressure create entropy.


http://www.landfood.ubc.ca/soil200/components/air.htm
“Diffusion - moving force is gradient of partial pressure of any constituent member of air to migrate from a zone of higher to lower pressure, even while air as a whole may remain stationary. In other words, through diffusion each gas moves in a direction determined by its own partial pressure.”



http://www.wiley-vch.de/books/sample/352731024X_c01.pdf
“Momentum transfer arguments lead to the conclusion that for diffusion in a gas mixture the gradient of partial pressure should be regarded as the fundamental ‘driving force’, since that formulation remains valid even under non-isothermal conditions.”

http://www.chem.ntnu.no/nonequilibrium-thermodynamics/pub/getfilee173.pdf
“The small magnitude of the gradient of partial pressure along the airways suggests that the first contribution of entropy production can be neglected in the simplified model.”


http://books.google.com/books?id=rY...page&q="gradient of partial pressure"&f=false
“The rate of diffusion of a gas is proportional to the gradient of partial pressure…”

http://books.google.com/books?id=1w...page&q="gradient of partial pressure"&f=false
“As the permeate pressure is decreased, there occurred an increase in the gradient of partial pressure or chemical potential across a dried, nonswollen layer. As a result, when the permeate components become easily vaporized, the flux or total amount of product is increased.”


http://www.decompression.org/maiken/Bubble_Decompression_Strategies.htm
“The gradient of partial pressure across the bubble surface. G = T - Pb, where T is tissue tension and Pb is the pressure inside the bubble.”


No link here. However, Sears and Zemansky is a classic textbook on thermodynamics. The following pages describe the process by which different gases mix.
“Heat and Thermodynamics, an Intermediate Textbook – 6th edition” by Mark Zemansky and Richard Dittman (McGraw-Hill, 1979), pp. 361-363.
 
Last edited by a moderator:
  • #38


It should be noted that the process of mixing is driven by the gradient of the partial pressure. The gradient of partial pressure is sometimes considered the first contribution to entropy production.

Not under the conditions I outlined or Rap agreed with in the extract you quoted from his post.

Here is another situation for you to consider:-

Consider a salt or sugar solution a few degrees above freezing point in an open vessel under constant pressure P.

Now add a cube of ice at freezing point, of just sufficient mass that the result in the vessel will be a more dilute solution at exactly freezing point.

What is the entropy of mixing?
 
  • #39


Darwin123 said:
It should be noted that the process of mixing is driven by the gradient of the partial pressure. The gradient of partial pressure is sometimes considered the first contribution to entropy production.

I don't think it's partial pressure differences that drive particle transport; it's chemical potential differences. If you had a boundary that passed particles slowly but did not pass heat or work, you could have the partial pressures the same on each side, but different temperatures, and therefore different densities (higher density on the cold side), and there would be a net flow of particles to equalize density. You can see this because on the hot side there would be fewer particles passing, but with greater momentum per particle, while on the cold side more would be passing but with less momentum per particle. There would be particle transfer from the cold side to the hot side. The chemical potential on either side would not be the same; the chemical potential on the cold side would be higher than on the hot side.

Chemical potential gradients will also cause entropy production.
 
Last edited:
  • #40


Rap said:
I don't think its partial pressure differences that drive particle transport, its chemical potential differences. If you had a boundary that passed particles slowly but did not pass heat or work, you could have the partial pressures the same on each side, but different temperatures, and therefore different densities (higher density on the cold side), and there would be a net flow of particles to equalize density. You can see this because on the hot side, there would be fewer particles passing, but with greater momentum per particle, while on the cold side, more would be passing but with less momentum per particle. There would be particle transfer from the cold side to the hot side. The chemical potential on either side, would not be the same, the chemical potential on the cold side would be higher than on the hot side.

Chemical potential gradients will also cause entropy production.
Mixing is driven simply by the randomness of molecular motion. The only thing that can prevent it is a barrier.

Consider two identical volumes and quantities of ideal chemically inert gas molecules, say He and Ne, and both are at the same temperature and pressure but are separated by a wall. If you remove the wall the gas molecules will mix. And you can wait forever and the molecules will not un-mix so it is not reversible. Entropy will increase.

AM
 
  • #41


Rap said:
I don't think it's partial pressure differences that drive particle transport; it's chemical potential differences.

Here is a link that says that they are basically the same thing in the special case of an ideal gas, or ideal solute.

http://books.google.com/books?id=1wA...ure"&f=false
“As the permeate pressure is decreased, there occurred an increase in the gradient of partial pressure or chemical potential across a dried, nonswollen layer. As a result, when the permeate components become easily vaporized, the flux or total amount of product is increased.”

I understand that chemical potential and partial pressure aren't precisely the same thing for other than ideal gases and solutes. I am not trying to promote my own definition of partial pressure.

However...

The specific scenarios discussed in this thread involve ideal gases. So until we get away from ideal gases, chemical potential of a component and the partial pressure of a component can be discussed interchangeably. Furthermore, the chemical potential of a component generally increases with the partial pressure of a component even for components that are not ideal gases.
 
Last edited:
  • #42


Would it be correct to say that TS is the energy not available for doing useful work?

When you write down any thermodynamic potential, you have the TS term, but the other terms (like PV, [itex]\mu[/itex]N, HM, etc) are the "work terms", which tell you how much useful energy you can get out of your system (depending on the constraints).

Can anyone comment on this?
 
  • #43


GravitatisVis said:
Would it be correct to say that TS is the energy not available for doing useful work?

When you write down any thermodynamic potential, you have the TS term, but the other terms (like PV, [itex]\mu[/itex]N, HM, etc) are the "work terms", which tell you how much useful energy you can get out of your system (depending on the constraints).

Can anyone comment on this?
If you mean: T = Tc and S = ΔS, this would be correct.

When a thermodynamic process occurs that results in a heat flow Qh from temperature Th, heat flow Qc to temperature Tc and work output W = Qh-Qc, the energy that is "unavailable to do work" (which is also known as the "lost work") is the difference between the maximum work that can be produced (ie. in a reversible process) for that heat flow Qh from Th to Tc and the work actually produced. The "lost work" is TcΔS.

For a reversible process, ΔS = -Qh/Th + Qc/Tc = 0 so Qc/Qh = Tc/Th. For an irreversible process, -Qh/Th + Qc/Tc = ΔS > 0. So the reversible work is Qh-Qc = Qh - Qh(Tc/Th) = Qh(1-Tc/Th).

For the irreversible process involving the same Qh, the actual work is Qh-Qc'. The difference between the reversible work and the irreversible work (ie. the lost work) is:

[tex]W_L = (Q_h - Q_c) - (Q_h - Q_c') = Q_h(1-T_c/T_h) - (Q_h - Q_c')[/tex]

[tex]= -Q_h(T_c/T_h) + Q_c' = T_c(-Q_h/T_h + Q_c'/T_c )= T_c(ΔS)[/tex]
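A quick numerical check of the lost-work identity derived above, W_L = T_c ΔS. The reservoir temperatures and heat flows below are arbitrary illustrative values:

```python
# Check W_L = Tc * Delta_S for a concrete irreversible process.
Th, Tc = 400.0, 300.0     # K, hot and cold reservoir temperatures
Qh = 1000.0               # J, heat drawn from the hot reservoir
Qc_irr = 800.0            # J, heat rejected in the irreversible process
                          #    (a reversible engine would reject only 750 J)

W_rev = Qh * (1 - Tc / Th)        # maximum (reversible) work = 250 J
W_irr = Qh - Qc_irr               # actual work = 200 J
dS = -Qh / Th + Qc_irr / Tc       # total entropy change, > 0

W_lost = W_rev - W_irr            # lost work = 50 J
assert abs(W_lost - Tc * dS) < 1e-9   # the identity W_L = Tc * Delta_S
```

The assertion holds term by term with the algebra above: Q_c' = 800 J instead of the reversible 750 J costs exactly T_c ΔS = 300 × (1/6) = 50 J of work.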


AM
 
  • #44


Darwin123 said:
Here is a link that says that they are basically the same thing in the special case of an ideal gas, or ideal solute.

http://books.google.com/books?id=1wA...ure"&f=false
“As the permeate pressure is decreased, there occurred an increase in the gradient of partial pressure or chemical potential across a dried, nonswollen layer. As a result, when the permeate components become easily vaporized, the flux or total amount of product is increased.”

I understand that chemical potential and partial pressure aren't precisely the same thing for other than ideal gases and solutes. I am not trying to promote my own definition of partial pressure.

However...

The specific scenarios discussed in this thread involve ideal gases. So until we get away from ideal gases, chemical potential of a component and the partial pressure of a component can be discussed interchangeably. Furthermore, the chemical potential of a component generally increases with the partial pressure of a component even for components that are not ideal gases.

The partial pressure and chemical potential are "equivalent" for ideal gases only if the temperatures are the same. The chemical potential for i-type particles in an ideal gas is [tex]\mu_i=\mu_{i0}+kT\ln(P_iT^{-5/2})[/tex] so you can see that the chemical potential increases linearly with the logarithm of the partial pressure when the temperature is constant.
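The isothermal behavior of that formula is easy to verify numerically: at fixed T, the constant μ_i0 and the T-dependence drop out of any difference, leaving Δμ = kT ln(P₂/P₁). A minimal sketch (μ_i0 set to zero and the values chosen purely for illustration):

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant

def mu(P, T, mu0=0.0):
    """Ideal-gas chemical potential per particle, following the form
    quoted above: mu = mu0 + k_B * T * ln(P * T**-2.5).
    mu0 is a species-dependent constant, set to 0 for illustration."""
    return mu0 + k_B * T * math.log(P * T**-2.5)

T = 300.0            # K, fixed temperature
P1, P2 = 1.0e5, 2.0e5  # Pa, two partial pressures

# At fixed T the chemical-potential difference depends only on the
# pressure ratio, so the gradients of mu and of ln(P) are parallel.
d_mu = mu(P2, T) - mu(P1, T)
assert abs(d_mu - k_B * T * math.log(P2 / P1)) < 1e-25
```

This is the sense in which, under isothermal conditions, a nonzero partial-pressure gradient implies a nonzero chemical-potential gradient in the same direction.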
 
  • #45


Rap said:
The partial pressure and chemical potential are "equivalent" for ideal gases only if the temperatures are the same. The chemical potential for i-type particles in an ideal gas is [tex]\mu_i=\mu_{i0}+kT\ln(P_iT^{-5/2})[/tex] so you can see that the chemical potential increases linearly with the logarithm of the partial pressure when the temperature is constant.

According to your formula, the chemical potential increases with partial pressure under isothermal conditions. Therefore, if the gradient of partial pressure is a nonzero vector then the gradient of chemical potential is a nonzero vector pointing in the same direction. If the gradient of partial pressure is a zero vector, then the gradient of chemical potential is a zero vector.

Of course, this leaves the problem of systems that are not isothermal.

Entropy is created in the regions of space where the gradient of chemical potential is not zero. That is what you said. Someone else said that entropy is only created in places where the temperature has a nonzero gradient.

According to your formula, the two statements are not equivalent. In a region where the gradients of both temperature and partial pressure are nonzero, it may be possible to find points where the gradient of chemical potential is zero. For instance, the temperature gradient and the partial pressure gradient can point in opposite directions with contributions of equal magnitude.

My current question is:
1) What types of gradient cause the generation of entropy? Whatever the answer, entropy has a local density, so it is created at specific points regardless of which gradient generates it. It has to be created in the region where that gradient is nonzero.

Entropy acts like an indestructible fluid, where temperature is like the pressure of the entropy. It can be created in a particular spot by a gradient, although I am not sure precisely what type of gradient that is. I am trying to find out. I haven't found any references where the author spells it out.
 
Last edited:
  • #46


So I am open to suggestions.

You receive (proper) suggestions in a discussion by addressing the (proper) remarks others have directed at you.
 
  • #47


Darwin123 said:
Entropy is created in the regions of space where the gradient of chemical potential is not zero. That is what you said. Someone else said that entropy is only created in places where the temperature has a nonzero gradient.

Well, I don't remember saying that, but if I did, then I've changed my mind. I'm sure I said a temperature gradient creates entropy, but as to what else, I too am trying to figure that out.

Darwin123 said:
According to your formula, the two statements are not equivalent. In a region where the gradients of both temperature and partial pressure are nonzero, it may be possible to find points where the gradient of chemical potential is zero. For instance, the temperature gradient and the partial pressure gradient can point in opposite directions with contributions of equal magnitude...

That sounds possible...

Darwin123 said:
Entropy acts like an indestructible fluid, where temperature is like the pressure of the entropy. It can be created in a particular spot by a gradient, although I am not sure precisely what type of gradient that is. I am trying to find out. I haven't found any references where the author spells it out.

Agree with everything, except that because entropy can be created, but volume cannot, I only think of temperature being to entropy like pressure is to volume in the case where entropy is not created. When entropy is created, the analogy starts to puzzle me.

Regarding gradients producing entropy, I think there are other possibilities, like the friction of a piston doing work. The entropy is created at the points of contact between piston and cylinder and is added to the system. Energy added is T dS if you can assume the whole thing happens at a constant temperature.

Studiot said:
You receive (proper) suggestions in a discussion by addressing others' (proper) remarks to yourself.

Did I miss something? If I did, I apologize. What did I miss?
 
  • #48


Rap said:
Well, I don't remember saying that, but if I did, then I've changed my mind. I'm sure I said a temperature gradient creates entropy, but as to what else, I too am trying to figure that out.



That sounds possible...



Agree with everything, except that because entropy can be created, but volume cannot, I only think of temperature being to entropy like pressure is to volume in the case where entropy is not created. When entropy is created, the analogy starts to puzzle me.

Regarding gradients producing entropy, I think there are other possibilities, like the friction of a piston doing work. The entropy is created at the points of contact between piston and cylinder and is added to the system. Energy added is T dS if you can assume the whole thing happens at a constant temperature.



Did I miss something? If I did, I apologize. What did I miss?
I think that he was replying to my remark, not yours.
I apologize. I tried to correct the situation. In any case, you don't have to worry about it.
 

What is entropy?

Entropy is a measure of the amount of energy in a system that is not available for work. It is a measure of the disorder or randomness in a system.

How is entropy related to energy available for work?

Entropy is inversely related to the amount of energy available for work. As entropy increases, the amount of energy available for work decreases.

What factors affect entropy?

The factors that affect entropy include temperature, pressure, and the number of particles in a system. Changes in these factors can cause changes in the amount of energy available for work.

Why is entropy important in thermodynamics?

Entropy is important in thermodynamics because it helps us understand how energy is transferred and transformed in a system. It also helps us predict the direction of chemical reactions and the efficiency of energy conversion processes.

How is entropy measured?

Entropy is measured in units of joules per kelvin (J/K). It can also be calculated using the equation S = k ln W, where S is entropy, k is the Boltzmann constant, and W is the number of microstates available to a system.
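The statistical formula S = k ln W quoted above can be illustrated with a short sketch (the microstate counts are arbitrary illustrative numbers):

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant

def boltzmann_entropy(W):
    """Statistical entropy S = k_B * ln(W), where W is the number of
    microstates accessible to the system."""
    return k_B * math.log(W)

# Doubling the number of accessible microstates adds exactly
# k_B * ln 2 of entropy, independent of the starting count.
S1 = boltzmann_entropy(10**6)
S2 = boltzmann_entropy(2 * 10**6)
assert abs((S2 - S1) - k_B * math.log(2)) < 1e-30
```

Because the formula is logarithmic, multiplying W by any factor adds the same fixed increment of entropy; this is what makes S additive for independent systems, since their microstate counts multiply.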
