
Does Gradient of Fugacity Create Entropy?

by Darwin123
Tags: entropy, fugacity, gradient
Darwin123
#1
Dec25-12, 12:54 PM
P: 741
There have been some discussions here as to what type of processes create entropy rather than just move it around. It is established that a gradient of temperature can create entropy. However, the issue moved to partial pressure, and then even away from that.

The previous discussion seemed to revolve around nonequilibrium thermodynamics.

For clarity, I thought it would be useful to discuss the entropy creation one gradient at a time. This post presents problems with regard to the gradient of fugacity.

1) Can a nonzero gradient of fugacity create entropy?

2) Are there any conditions where a nonzero gradient of fugacity can't create entropy?

3) What does the adjective "quasistatic" mean with respect to the gradient of fugacity?

Here is a link where fugacity is defined.
http://en.wikipedia.org/wiki/Fugacity
"Fugacity
In chemical thermodynamics, the fugacity (f) of a real gas is an effective pressure which replaces the true mechanical pressure in accurate chemical equilibrium calculations. It is equal to the pressure of an ideal gas which has the same chemical potential as the real gas. For example, nitrogen gas (N2) at 0 °C and a pressure of 100 atm has a fugacity of 97.03 atm.[1] This means that the chemical potential of real nitrogen at a pressure of 100 atm is less than if nitrogen were an ideal gas; the value of the chemical potential is that which nitrogen as an ideal gas would have at a pressure of 97.03 atm."

I think that the concept of fugacity is also used with respect to phases that aren't gases. I don't swear to it, but I think the formal definition is quite general with respect to phase. However, the definition in this Wiki article is general enough to start the discussion.
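The relation behind the quoted nitrogen example can be sketched numerically. This is just an illustrative check of the standard relation μ_real − μ_ideal = RT ln(f/P), using the numbers from the quote:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mu_shift(fugacity, pressure, T):
    """Difference between the real gas's chemical potential and that of an
    ideal gas at the same pressure: mu_real - mu_ideal = R*T*ln(f/P)."""
    return R * T * math.log(fugacity / pressure)

# Nitrogen numbers from the quoted definition: f = 97.03 atm at P = 100 atm, 0 °C
dmu = mu_shift(97.03, 100.0, 273.15)
print(f"mu_real - mu_ideal = {dmu:.1f} J/mol")  # negative, as the quote says
```

About −68 J/mol: small, but it is exactly the quantity that drives a system toward uniform fugacity.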
Andrew Mason
#2
Dec27-12, 02:33 PM
Sci Advisor
HW Helper
P: 6,654
Quote Quote by Darwin123 View Post
1) Can a nonzero gradient of fugacity create entropy?
I don't think it is the fugacity that creates entropy. You can have two different gases with the same fugacity (e.g. ideal gases, whose fugacity equals their pressure), and when you put them together entropy will increase. You could just as easily say that a density gradient creates entropy. But it seems to me that the density gradient, like the fugacity gradient, is simply a consequence of molecules mixing, i.e. kinetic theory.
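The mixing increase described here is the standard ideal-gas entropy of mixing; a quick sketch (one mole of each gas at the same T and P, hence the same fugacity):

```python
import math

R = 8.314  # J/(mol K)

def entropy_of_mixing(moles):
    """Ideal-gas entropy of mixing for distinguishable species at common T, P:
    dS = -R * sum(n_i * ln(x_i))."""
    n_tot = sum(moles)
    return -R * sum(n * math.log(n / n_tot) for n in moles)

# One mole each of two distinct ideal gases at the same T and P:
dS = entropy_of_mixing([1.0, 1.0])
print(f"dS_mix = {dS:.2f} J/K")  # 2*R*ln(2)
```

The entropy rises even though no fugacity differed between the two sides beforehand, which is the point being made.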

AM
Studiot
#3
Dec27-12, 05:38 PM
P: 5,462
I think that the concept of fugacity is also used with respect to phases that aren't gases. I don't swear to it, but I think the formal definition is quite general with respect to phase.
Yes, but this is really ChesterMiller's stomping ground so if he would like to chip in that would be great.

A common criterion for equilibrium in Chemical Engineering Thermodynamics is:-

Multiple phases at the same T and P are in equilibrium when the fugacity of each species is uniform throughout the system.

By implication, a fugacity gradient implies a non-equilibrium system and thus the possibility of reaching equilibrium by maximising entropy.

Chestermiller
#4
Dec30-12, 11:14 AM
Sci Advisor
HW Helper
Thanks ∞
PF Gold
Chestermiller's Avatar
P: 5,045

Quote Quote by Studiot View Post
Yes, but this is really ChesterMiller's stomping ground so if he would like to chip in that would be great.

A common criterion for equilibrium in Chemical Engineering Thermodynamics is:-

Multiple phases at the same T and P are in equilibrium when the fugacity of each species is uniform throughout the system.

By implication, a fugacity gradient implies a non-equilibrium system and thus the possibility of reaching equilibrium by maximising entropy.
Thanks, Studiot, but I don't regard myself as a thermo maven. Still, I have some comments about this subject.

I agree with everything you said. The chemical potential of a species is defined as the partial molar free energy of the species, and the chemical potential of each species is related to its fugacity through the definition of fugacity.

On the other hand, in the example that Andrew Mason cited, if there are two separated gases in a system that are put together initially, the fugacity of each gas species within the other gas is zero initially, and thus there is a substantial initial variation in fugacity of each species within the system. As the diffusive process proceeds, the variation in the fugacity of each of the various species decreases, and the fugacity of each species becomes uniform at the final equilibrium state.
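That relaxation can be caricatured with a one-dimensional diffusion sketch (illustrative units; ideal mixture assumed): as the composition profile flattens, the mixing entropy rises monotonically toward the uniform limit, just as the species fugacities become uniform.

```python
import math

R = 8.314                     # J/(mol K)
D, dx, dt, n = 1.0, 1.0, 0.2, 20   # D*dt/dx**2 = 0.2 <= 0.5, explicit-scheme stable
x = [1.0 if i < n // 2 else 0.0 for i in range(n)]  # species A on the left only

def mix_entropy(profile):
    """Ideal mixing entropy summed over cells; x*ln(x) -> 0 at x = 0 or 1."""
    def s(f):
        return 0.0 if f in (0.0, 1.0) else -(f*math.log(f) + (1-f)*math.log(1-f))
    return R * sum(s(f) for f in profile)

S0 = mix_entropy(x)
for _ in range(500):          # explicit diffusion steps, no-flux ends
    x = [x[i] + D*dt/dx**2 * (x[max(i-1, 0)] - 2*x[i] + x[min(i+1, n-1)])
         for i in range(n)]
S1 = mix_entropy(x)
print(f"S: {S0:.2f} -> {S1:.2f} J/K (uniform limit {n*R*math.log(2):.2f})")
```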

I need to get off line now, but will be back later to discuss entropy change for a system in terms of the Clausius inequality, which is a simple concept that makes sense to me and captures the essence of the Second Law.
Chestermiller
#5
Dec30-12, 07:14 PM
Sci Advisor
HW Helper
Thanks ∞
PF Gold
Chestermiller's Avatar
P: 5,045
The OP seems to be focusing on two separate issues, and this tends to create a great deal of confusion:

1. How is the parameter fugacity applied in practice?
2. What causes entropy to be generated?

In what I discuss now, I only want to focus on the second of these. The form of the second law and the definition of entropy that works best for me personally is based on the Clausius inequality. I tend to regard the Clausius inequality as a physical law that was arrived at based on a vast body of empirical observation. To date, there has never been an experiment performed that is inconsistent with the Clausius inequality (to my knowledge).

Throughout most of the thermodynamics literature, the mathematical form presented for the Clausius inequality is very imprecise and ambiguous (dS > dQ/T). This has caused a great deal of confusion for students over the years. Therefore, in what I'm going to discuss here, I'm going to state it and apply it in a much more mathematically precise way.

Let C be a closed system (no mass entering or leaving), and let A be the surface area (boundary) enclosing C and separating this closed system from its surroundings. Let V represent the volume contained within the closed surface A. Let C start out at equilibrium state α, and let us consider the infinite variety of different time-dependent processes (reversible or irreversible) that can take C to its final equilibrium state β. During these processes, the surface area of the boundary A and the volume of the system V can change with time as a result of work we do at the boundary of the system. We can also apply heat fluxes q at the boundary of the system, which can vary with position over the boundary and with time.

Suppose that, for each of the various processes, we calculate the value of the following integral over time and over the surface A of the system:

[tex]I = \int\int \frac{\mathbf{q}\cdot \mathbf{n}}{T} dAdt[/tex]

where q is the heat flux vector at the surface, n is an inwardly directed normal to the surface, T is the temperature at the surface, and t is time. The parameters q, n, and T are assumed to vary with position along the surface.

Based on an overwhelming body of observational evidence, the Clausius inequality requires that, over the infinite variety of possible time-dependent processes for moving from equilibrium state α to equilibrium state β, the value of the integral I is not arbitrary, but instead exhibits an upper bound. This upper bound depends only on the two equilibrium states α and β. The upper bound is called the change in entropy ΔS from state α to state β. Over the range of possible processes for moving between the two equilibrium states, the upper bound to the integral is achieved when the process is reversible. If the process is irreversible, the integral I will be less than the upper bound ΔS. Thus, for all possible processes,

[tex]\Delta{S} \geq \int\int \frac{\mathbf{q}\cdot \mathbf{n}}{T} dAdt[/tex]

where the equal sign applies to all reversible processes, and the greater than sign applies to all irreversible processes.

Irreversible processes are characterized by spatial variations (gradients) in pressure, and/or temperature, and/or species concentrations within the system during the transition from state α to state β.

So, in terms of the Clausius inequality, the presence of nonuniformities in pressure, temperature, and concentration during irreversible paths from state α to state β causes the integral I involving the surface heat fluxes to be smaller than it would have been if the process were reversible (and smaller than the change in entropy between the two states). In this sense, internal entropy generation compensates for this deficit, so that the change in entropy between the two states is the same. The presence of nonuniformities in pressure, temperature, and concentration during irreversible paths can thus be regarded as generators of entropy.
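A minimal numerical illustration of the inequality: two identical blocks equilibrating inside an adiabatic enclosure, so the boundary integral I is zero while ΔS is strictly positive (the heat capacity value is illustrative):

```python
import math

C = 100.0              # J/K, assumed heat capacity of each block
T1, T2 = 400.0, 300.0  # initial temperatures
Tf = (T1 + T2) / 2     # final temperature from the energy balance

# Entropy change evaluated along reversible paths between the end states:
dS = C * math.log(Tf / T1) + C * math.log(Tf / T2)
I = 0.0                # adiabatic outer boundary: no heat flux crosses it
print(f"dS = {dS:.3f} J/K >= I = {I}  (strict inequality: irreversible)")
```

Here the entire entropy change is generated internally by the temperature nonuniformity; none of it enters through the boundary.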
Rap
#6
Jan5-13, 11:30 PM
P: 789
I approach it in a less general but maybe more understandable way. Suppose you have a thermally isolated cylinder with a boundary of some sort separating it into two systems. The boundary holds negligible extensive parameters (energy, entropy, volume, particles). On the left you have system 1, with intensive parameters [itex]T_1, P_1, \mu_1[/itex] (temperature, pressure and chemical potential) and on the right you have system 2, with [itex]T_2, P_2, \mu_2[/itex]. The fundamental law states that [tex]dU_1=T_1 dS_1-P_1 dV_1+\mu_1 dN_1[/tex][tex]dU_2=T_2 dS_2-P_2 dV_2+\mu_2 dN_2[/tex]along with conservation laws:[tex]dU_1+dU_2=0\,\mathrm{Conservation\,of\,energy}[/tex][tex]dV_1+dV_2=0\,\mathrm{Conservation\,of\,volume}[/tex][tex]dN_1+dN_2=0\,\mathrm{Conservation\,of\,particle\,number}[/tex][tex]dS_1+dS_2=dS_c\,\mathrm{Non-conservation\,of\,entropy}, dS_c>0[/tex]Now consider "flows" going from left to right. Define:
[tex]dU=-dU_1=dU_2[/tex][tex]dV=-dV_1=dV_2[/tex][tex]dN=-dN_1=dN_2[/tex][tex]dS=-dS_1=dS_2-dS_c[/tex]So now the fundamental laws become:[tex]-dU=-T_1 dS+P_1 dV-\mu_1 dN[/tex][tex]dU=T_2 (dS+dS_c)-P_2 dV+\mu_2 dN[/tex] Adding, and defining [itex]\Delta X=X_2-X_1[/itex] for the intensive variable X, gives
[tex]0=T_2 dS_c+\Delta T dS-\Delta P dV+\Delta \mu dN[/tex] So you can see how the differences in the intensive variables relate to the created entropy [itex]dS_c[/itex].

For example, for the case of a thermally open ([itex]dS\ne 0[/itex]) but mechanically closed ([itex]dV= 0[/itex]) and materially closed ([itex]dN= 0[/itex]) boundary, the created entropy is [itex]dS_c=-\Delta T dS/T_2[/itex]. We can characterize the effect of the boundary with Fourier's law somewhat restated: [itex]dS=-K\Delta T \Delta t[/itex], where [itex]\Delta t[/itex] is the time interval considered for the process, and K is an effective thermal conductivity, which we can adjust experimentally, and so we can adjust [itex]dS[/itex]. You can see that the ratio of created to passed entropy ([itex]dS_c/dS[/itex]) is proportional to [itex]\Delta T[/itex], so it can be made arbitrarily small by making [itex]\Delta T[/itex] arbitrarily small. You can still have a non-zero [itex]dS[/itex], however, by making the product [itex]\Delta T \Delta t[/itex] non-zero - i.e. making [itex]\Delta t[/itex] approach infinity. In this case you will have a quasistatic, reversible process, with no creation of entropy.

For the case of a mechanically open ([itex]dV\ne 0[/itex]) but thermally closed ([itex]dS= 0[/itex]) and materially closed ([itex]dN= 0[/itex]) boundary, the created entropy is [itex]dS_c=\Delta P dV/T_2[/itex]. If we keep the analogy with the thermal case, this entropy is created by friction. We can characterize the effect of the boundary with an analogue of Fourier's law: [itex]dV=-\gamma\Delta P \Delta t[/itex], where [itex]\gamma[/itex] is an effective coefficient of friction, which we can adjust experimentally, and so we can adjust [itex]dV[/itex]. If we don't include friction, then we have to deal with the fact that the work done on the left system (1) is [itex]-P_1 dV[/itex] while the work done by the right system is [itex]P_2 dV[/itex]. We could make a mechanical connection to the boundary to counteract the force, and do work on the environment, but that would violate the assumption of an isolated system. We could have a "spring loaded" boundary that absorbed the difference as potential energy, but that would violate the assumption that the boundary contains no energy.

For the case of a materially open ([itex]dN\ne 0[/itex]) but thermally closed ([itex]dS= 0[/itex]) and mechanically closed ([itex]dV= 0[/itex]) system, the created entropy is [itex]dS_c=\Delta \mu dN/T_2[/itex]. We can characterize the effect of the boundary with Fick's law somewhat restated: [itex]dN=-D\Delta \mu \Delta t[/itex], where D is an effective coefficient of diffusion, which we can adjust experimentally, and so we can adjust [itex]dN[/itex]. The problem with this is that I don't know how a materially open but thermally closed boundary can be made. A semi-permeable membrane allows particles to pass through, but if you had a case where the chemical potential on each side was the same, the same number of particles would be transferred forward and backward, yet the hot particles would bring more thermal energy and the cold ones less.

Anyway, the result is that differences in any intensive variable (temperature, pressure, chemical potential) will produce entropy under the above assumptions. All of the above processes are irreversible as long as the [itex]\Delta X[/itex]'s are neither zero nor infinitesimally small, which produces an entropy creation [itex]dS_c[/itex] which is neither zero nor infinitesimally small. In the limit of zero [itex]\Delta X[/itex]'s, but non-zero [itex]\Delta X \Delta t[/itex]'s, the processes will become quasistatic and reversible, with no creation of entropy.
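A small numerical check of the thermally open case above (K, T_2, and the ΔT values are illustrative, not from any measurement):

```python
K, dt, T2 = 0.01, 1.0, 300.0    # illustrative boundary conductivity, time, temp

for dT in (10.0, 1.0, 0.1):     # dT = T2 - T1
    dS = -K * dT * dt           # entropy passed through the boundary
    dSc = -dT * dS / T2         # created entropy, from the derivation above
    assert dSc > 0              # = K*dT**2*dt/T2: positive for either sign of dT
    print(f"dT = {dT:5.1f}: |dSc/dS| = {abs(dSc / dS):.2e}")  # = dT/T2
```

The ratio of created to passed entropy shrinks linearly with ΔT, which is the quasistatic limit described in the last paragraph.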
Studiot
#7
Jan6-13, 03:44 AM
P: 5,462
Darwin
For clarity, I thought it would be useful to discuss the entropy creation one gradient at a time. This post presents problems with regards to the gradient of fugacity.
Good morning, Rap.

Your post was a significant piece of work and I'm sorry I haven't had time to study the detail.

However since Darwin appears to be no longer with us for reasons unknown to me, I would point out that the purpose of this thread was to separate the agents of entropy creation and discuss them individually.

This thread was stated to be about fugacity and you don't seem to have mentioned that property?
Rap
#8
Jan6-13, 10:01 AM
P: 789
Quote Quote by Studiot View Post
Good morning, Rap.

Your post was a significant piece of work and I'm sorry I haven't had time to study the detail.

However since Darwin appears to be no longer with us for reasons unknown to me, I would point out that the purpose of this thread was to separate the agents of entropy creation and discuss them individually.

This thread was stated to be about fugacity and you don't seem to have mentioned that property?
LOL - I know. I had some notes that were half finished and I kind of took it from there. Regarding the fugacity, the transport coefficients (K and D, anyway) are expressed in terms of a mean free path: [itex]\lambda=\frac{kT}{\sqrt{2}\sigma P}[/itex] where [itex]\sigma[/itex] is the collisional cross section (area). But [itex]P_{ideal}=nkT[/itex] so that's [itex]\lambda=\frac{1}{\sqrt{2}\sigma n }\,\frac{P_{ideal}}{P}[/itex]. I guess [itex]P_{ideal}[/itex] is the fugacity. Maybe the above development can be used to analyse a fugacity difference between the two systems?
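For concreteness, plugging rough numbers into that mean-free-path formula for N2 near 0 °C and 1 atm (the cross-section value ~0.43 nm² is an assumption, not from the thread):

```python
import math

k_B = 1.380649e-23                         # Boltzmann constant, J/K
T, P, sigma = 273.15, 101325.0, 0.43e-18   # K, Pa, m^2 (sigma is a rough guess)

lam = k_B * T / (math.sqrt(2) * sigma * P)  # mean free path, as in the post
print(f"lambda ~ {lam*1e9:.0f} nm")
```

This lands in the usual tens-of-nanometres range quoted for air at ambient conditions.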
Chestermiller
#9
Jan6-13, 11:28 AM
Sci Advisor
HW Helper
Thanks ∞
PF Gold
Chestermiller's Avatar
P: 5,045
Hi Rap,

I read over what you wrote, but was not really able to follow it. I guess what seems simple to you seems complicated to me, and what seems simple to me, in some way, seems more complicated to you.

I have something additional that I would like to run past you. In the classic Chemical Engineering textbook, Transport Phenomena by Bird, Stewart, and Lightfoot, there is a discussion of entropy generation on page 372 of the 2nd Edition (the most recent edition). It basically shows how to calculate the transient change in entropy within a system in which an irreversible process is occurring. It also spells out explicitly and in detail the local rate of change of entropy with respect to time at each location within the system. The development is very compelling. By using this formulation, you don't need to consider a reversible path in determining the change in entropy from equilibrium state A to equilibrium state B. But, instead, you would need to solve the transient differential equations of heat transfer and fluid flow, and from this solution you could calculate the entropy change. I think this is the kind of development that most students of thermo crave, and it can greatly contribute to one's understanding of entropy generation. Please, if you have a chance, look it over and let us know if it makes sense to you. The development is only one page long.

Chet
Rap
#10
Jan6-13, 12:36 PM
P: 789
Quote Quote by Chestermiller View Post
Hi Rap,

I read over what you wrote, but was not really able to follow it. I guess what seems simple to you seems complicated to me, and what seems simple to me, in some way, seems more complicated to you.

I have something additional that I would like to run past you. In the classic Chemical Engineering textbook, Transport Phenomena by Bird, Stewart, and Lightfoot, there is a discussion of entropy generation on page 372 of the 2nd Edition (the most recent edition). It basically shows how to calculate the transient change in entropy within a system in which an irreversible process is occurring. It also spells out explicitly and in detail the local rate of change of entropy with respect to time at each location within the system. The development is very compelling. By using this formulation, you don't need to consider a reversible path in determining the change in entropy from equilibrium state A to equilibrium state B. But, instead, you would need to solve the transient differential equations of heat transfer and fluid flow, and from this solution you could calculate the entropy change. I think this is the kind of development that most students of thermo crave, and it can greatly contribute to one's understanding of entropy generation. Please, if you have a chance, look it over and let us know if it makes sense to you. The development is only one page long.

Chet
I will check it out. I think asking about gradients automatically means you need to go to the Boltzmann transport equations which are a step up from simple systems in equilibrium. Maybe my development looks mathematically complicated, but I thought it was simple because it was a simple case - a cylinder with a boundary, both sides always effectively in equilibrium, and then playing with the boundary properties and seeing what happens to entropy. I think the mathematics are kind of tedious, but not complicated.
Chestermiller
#11
Jan6-13, 11:08 PM
Sci Advisor
HW Helper
Thanks ∞
PF Gold
Chestermiller's Avatar
P: 5,045
Quote Quote by Rap View Post
I will check it out. I think asking about gradients automatically means you need to go to the Boltzmann transport equations which are a step up from simple systems in equilibrium. Maybe my development looks mathematically complicated, but I thought it was simple because it was a simple case - a cylinder with a boundary, both sides always effectively in equilibrium, and then playing with the boundary properties and seeing what happens to entropy. I think the mathematics are kind of tedious, but not complicated.
Thanks Rap. As an engineer, I definitely don't think in terms of the Boltzmann transport equations. We typically analyze problems using continuum mechanics: differential force balance, differential equation of mechanical energy, differential equation of thermal energy. The analysis in BSL allows you to change the Clausius inequality into an equality (applicable to any arbitrary irreversible path) by including the entropy generation terms in the relationship. For the situation of a closed system described in my previous post, the equation for a so-called control volume becomes:

[tex]\Delta{S}= \int\int \frac{\mathbf q \cdot \mathbf n}{T}dAdt+\int\int (-\frac{\mathbf q \cdot \nabla T}{T^2}+\frac{(\mathbf{\sigma} +p \mathbf I) : \nabla \mathbf v}{T})dVdt[/tex]
This equation applies to a Newtonian fluid, either gas or liquid. In this equation, σ is the stress tensor, p is the thermodynamic pressure, I is the identity tensor (aka the unit tensor or the metric tensor) and v is the velocity vector.

The volume integral in the above equation is positive definite. The first term represents the irreversible entropy generation from temperature gradients within the system; the second term represents the irreversible entropy generation from viscous heating. If one solves the fluid dynamic and energy equations for an arbitrary irreversible process and substitutes the results of the calculations into the above equation, one will correctly obtain the change in entropy for the system between the initial and final equilibrium states. If one is also willing to assume that the entropy of a macroscopic system can be defined not only at equilibrium but also instantaneously during an irreversible process, then the above equation can also be expressed as:

[tex]\frac{dS}{dt}= \int \frac{\mathbf q \cdot \mathbf n}{T}dA+\int (-\frac{\mathbf q \cdot \nabla T}{T^2}+\frac{(\mathbf{\sigma} +p \mathbf I) : \nabla \mathbf v}{T})dV[/tex]

This analysis quantitatively elucidates the specific contributions of entropy generation from temperature gradients and from viscous heating to the change in entropy of a system undergoing an irreversible process.
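As a sanity check on the conduction term, here is a 1D steady slab (no flow, so the viscous term drops out; k, L, and the temperatures are illustrative). The volume integral of −q·∇T/T² should equal the deficit in the boundary term, q(1/T_cold − 1/T_hot) per unit area:

```python
k, L = 1.0, 0.1            # thermal conductivity W/(m K), slab thickness m
Th, Tc = 400.0, 300.0      # hot and cold face temperatures
q = k * (Th - Tc) / L      # steady conductive flux, W/m^2 (T is linear in x)

# midpoint-rule integral of k*(dT/dx)^2 / T^2 across the slab
n = 10000
gen = sum(k * ((Th - Tc) / L)**2 / (Th + (Tc - Th) * (i + 0.5) / n)**2 * (L / n)
          for i in range(n))
deficit = q * (1 / Tc - 1 / Th)
print(f"generated {gen:.6f} vs boundary deficit {deficit:.6f} W/(m^2 K)")
```

In steady state the slab's entropy is constant, so everything generated inside must leave through the cold face faster than it enters the hot face, which is exactly what the two numbers express.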
Rap
#12
Jan7-13, 01:04 AM
P: 789
The equations on 372-373 are for the first case - a thermally open system that does no work (dV=0) and exchanges no particles (dN=0). Applying it to the cylinder and boundary situation, the equations describe the situation inside the boundary, since the two systems (1 and 2) on either side of the boundary are essentially in equilibrium, while the boundary is where the disequilibrium occurs. By Fourier's law, we can say there is a constant temperature gradient inside the boundary equal to [itex]\Delta T/\Delta x[/itex] where [itex]\Delta x[/itex] is the thickness of the boundary. The boundary passes entropy [itex]dS (=-dS_1)[/itex] and creates entropy [itex]dS_c (=dS_1+dS_2)[/itex] which is passed to the right (2) system.

The equations can be cast for the case of the cylinder/boundary scenario by making the following identifications, where [itex]\Delta t[/itex] is the time interval, A is the cross sectional area of the cylinder (and boundary). A flux in some extensive quantity Z is [itex]\frac{dZ}{A \Delta t}[/itex][tex]\mathbf{v}=0\,\mathrm{(no\,velocities)}[/tex][tex]D/Dt=0\,\mathrm{(steady\,state,\,no\,velocities)}[/tex][tex]g_s=\frac{dS_c}{A \Delta x \Delta t}[/tex][tex]\mathbf{s}=\frac{-dS_1}{A \Delta t}\,\mathrm{at\,the\,left\,side\,of\,the\,boundary}[/tex][tex]\mathbf{s}=\frac{dS_2}{A \Delta t}\,\mathrm{at\,the\,right\,side\,of\,the\,boundary}[/tex][tex]\mathbf{q}=\frac{-T_1 dS_1}{A \Delta t} = \frac{T_2 dS_2}{A \Delta t}\,\mathrm{(same\,on\,both\,sides\,of\,the\,boundary)}[/tex][tex]\nabla f=\frac{\Delta f}{\Delta x}\,\mathrm{for\,any\,function\,f}[/tex]If you number the indented equations in the cylinder/boundary scenario 1 through 13, then

11D.1-2 becomes [itex]dS_2+dS_1=dS_c[/itex] which is equation 6.

11D.1-3 becomes, at the right side of the boundary, [itex]0=\frac{-1}{T_2}\frac{T_2 dS_2+T_1dS_1}{\Delta x}[/itex] which is equation 13, for dV and dN equal to zero.

11D.1-4 becomes, at the right side of the boundary, [itex]dS_c = \frac{-1}{T_2^2} (T_2 dS_2)\Delta T[/itex] which is the conclusion ([itex]dS_c=−\Delta T dS/T_2[/itex]) found in the paragraph for a thermally open boundary.

The equations given are more general in that velocities, time rates of change, and three dimensions are considered. There is another cylinder/boundary case - the shock wave, where the velocities on either side of a stationary boundary are not equal. I wonder how that would work out. It looks like the general equations might handle it. Still haven't figured out fugacity gradient for the cylinder/boundary situation. However, the above equations are general, not specific to an ideal gas. I tend to think that a fugacity gradient does not create entropy per se, only if it somehow creates a gradient in T, P, or N in a situation where an ideal gas would not.
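A quick consistency check of that mapping (the temperatures and dS_1 are illustrative): the paragraph result dS_c = −ΔT dS/T_2 is exact for the cylinder bookkeeping, while the 11D.1-4 form evaluated at the right side of the boundary matches it to first order in ΔT/T_2:

```python
T1, T2 = 310.0, 300.0
dS1 = -0.02                  # entropy leaving system 1 (illustrative)
dS2 = -T1 * dS1 / T2         # energy conservation: T1*dS1 + T2*dS2 = 0
dSc = dS1 + dS2              # equation 6: dS1 + dS2 = dSc
dS = -dS1                    # left-to-right flow convention
dT = T2 - T1

exact = -dT * dS / T2                # paragraph result, exact here
approx = -(T2 * dS2) * dT / T2**2    # 11D.1-4 evaluated with T ~ T2
assert abs(dSc - exact) < 1e-15
assert abs(approx - dSc) / dSc < 0.05   # agrees to first order in dT/T2
print(f"dSc = {dSc:.4e} (exact), {approx:.4e} (11D.1-4 form)")
```

The residual difference between the two forms is of order ΔT/T_2, consistent with the boundary being treated as a single discretization cell.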
Rap
#13
Jan7-13, 01:22 AM
P: 789
Quote Quote by Chestermiller View Post
Thanks Rap. As an engineer, I definitely don't think in terms of the Boltzmann transport equations. We typically analyze problems using continuum mechanics: differential force balance, differential equation of mechanical energy, differential equation of thermal energy. The analysis in BSL allows you to change the Clausius inequality into an equality (applicable to any arbitrary irreversible path) by including the entropy generation terms in the relationship.
As a physicist, I call them Boltzmann transport equations, but I'm sure they are the same thing as the continuum differential equations you mention. I have not even considered viscosity in the cylinder/boundary scenario; I can't think of how it would fit right now. Isn't your equation missing entropy generation due to chemical potential gradients? Chemical potential is equivalent to density if the temperature is constant. Or is the control volume assumed to contain a fixed number of particles?
Chestermiller
#14
Jan8-13, 11:05 PM
Sci Advisor
HW Helper
Thanks ∞
PF Gold
Chestermiller's Avatar
P: 5,045
Hi Rap,

I don't have time to address your questions right now, but in the next couple of days I will provide detailed explanations. Please stay tuned.

Chet
Chestermiller
#15
Jan9-13, 11:58 AM
Sci Advisor
HW Helper
Thanks ∞
PF Gold
Chestermiller's Avatar
P: 5,045
The analysis in BSL uses the material time derivative D/Dt to follow an infinitesimal parcel of fluid mass (closed system) traveling with the local fluid velocity through a macroscopic system that is experiencing an irreversible process. The parcel is subjected to pressure and viscous stresses as well as a variable temperature gradient and conductive heat flux on its boundary. The parameters on the boundary are assumed to vary with time and boundary location. The analysis temporarily delays inclusion of mass transfer (diffusion) effects until a later chapter, but we can learn much from the development including just momentum and heat transport.

The key part of the BSL development is the analysis in part (b). Part (b) starts out with the following two equations:

[tex]\rho\frac{D\widehat{U}}{Dt}=-\nabla\cdot\vec{q}-p\nabla\cdot\vec{v}-\vec{\tau}:\nabla \vec{v}[/tex]

[tex]\frac{D\widehat{U}}{Dt}=T\frac{D\widehat{S}}{Dt}-p\frac{D\widehat{V}}{Dt}[/tex]

where ρ is the density, [itex]\widehat{U}[/itex] is the internal energy per unit mass, [itex]\widehat{S}[/itex] is the entropy per unit mass, [itex]\widehat{V}[/itex] is the volume per unit mass (=1/ρ), p is the (thermodynamic) pressure, T is the temperature, [itex]\vec{q}[/itex] is the conductive heat flux vector, [itex]\vec{v}[/itex] is the velocity vector, and [itex]\vec{\tau}[/itex] is the viscous portion of the compressive stress tensor.

The first equation is referred to as the thermal energy balance equation. The second equation is the fundamental differential equation arising from the second law.

In order to combine the above two equations to arrive at a relationship for the material derivative of the specific entropy in terms of entropy generation parameters, we need to also employ the following two equations:

[tex]\widehat{V}=1/\rho[/tex]

[tex]\frac{1}{\rho}\frac{D\rho}{Dt}=-(\nabla\cdot\vec{v})[/tex]

The first of these states that the volume per unit mass (specific volume) is equal to the reciprocal of the mass per unit volume (density). The second of these is the differential mass balance equation (the continuity equation) expressed in terms of the material time derivative of density.

If we combine the above four equations, we obtain the desired equation for the material time derivative of the entropy per unit mass:

[tex]\rho\frac{D\widehat{S}}{Dt}=-\frac{(\nabla\cdot\vec{q})}{T}-\frac{(\vec{\tau}:\nabla \vec{v})}{T}=-\nabla\cdot(\frac{\vec{q}}{T})-\frac{(\vec{q}\cdot\nabla T)}{T^2}-\frac{(\vec{\tau}:\nabla \vec{v})}{T}[/tex]

The first term on the right hand side is what we would employ to calculate the entropy change for a reversible process. The second term, which is quadratic in the temperature gradient, provides the additional entropy generation ascribable to irreversible heat conduction. The third term, which is quadratic in the rate of deformation tensor (and proportional to the viscosity), provides the additional entropy generation ascribable to irreversible viscous heating (BSL provide an explicit expression for this quantity in cartesian coordinates in Eqn. 3.3-3 of the second edition of Transport Phenomena (p. 82)).
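The last step above is just the product-rule identity −(∇·q)/T = −∇·(q/T) − q·∇T/T²; a quick 1D finite-difference check with arbitrary smooth test profiles (the profiles themselves are illustrative):

```python
import math

def q(x): return math.sin(x)        # arbitrary smooth flux profile
def T(x): return 2.0 + math.cos(x)  # arbitrary smooth, positive temperature

def d(f, x, h=1e-6):                # central finite difference
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 0.7
lhs = -d(q, x0) / T(x0)
rhs = -d(lambda x: q(x) / T(x), x0) - q(x0) * d(T, x0) / T(x0)**2
print(f"lhs = {lhs:.9f}, rhs = {rhs:.9f}")
```

The identity is what splits the entropy balance into a pure transport term (the divergence of q/T) and a strictly non-negative generation term.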

The equation for entropy I presented in a previous posting in this thread simply uses the divergence theorem in conjunction with the continuity equation to re-express the BSL results for the case of a macroscopic control volume (closed system) that is moving and deforming with the system.

Rap: In your most recent posting, you raised the question as to how discontinuities in properties might be handled using this type of general development for the case of irreversible gas expansions or compressions within a cylinder. Suppose there were a gas contained within a cylinder, and the pressure at the face of the piston were forced to suddenly drop to a lower constant value (say by using a flush mounted pressure transducer in the piston face to control the movement of the piston manually). What would happen within the cylinder? Well, whatever pressure disturbance occurs at the piston face cannot propagate throughout the cylinder instantaneously. The disturbance can only travel at a velocity on the order of the speed of sound. So, consequently, there will be a pressure non-uniformity that develops within the cylinder, and this pressure non-uniformity would be accompanied by a temperature non-uniformity. This is what we can expect the differential gas dynamics equations to tell us. At any instant of time, it will be possible to identify two separate pressure regions within the cylinder. One pressure region will be toward the piston, and the pressure throughout this region will be the same as at the piston face. The region away from the piston will not have had time to receive the disturbance yet, and its pressure will be at the original pressure of the cylinder. There will be a pressure and temperature discontinuity at the boundary between the two regions. This boundary will move away from the piston face at a velocity on the order of the speed of sound. In a more precise description, the zone between the two regions will not be a sharp discontinuity, but, instead, will display a very narrow width. Within this narrow zone, the gas pressure, temperature, and velocity will show very sharp gradients. The bulk of the entropy generation will occur within this zone. The velocity gradients will translate into lots of viscous heating.
If one were to accurately solve the gas dynamics equations numerically for the cylinder and were to plug the results into the equations that I presented in my earlier posting, one could precisely quantify the entropy generation and the entropy change between the initial and final equilibrium states for the irreversible path considered, and this entropy change would agree with that obtained by considering a reversible path between the same initial and final states.
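The end result of that scenario can be checked with equilibrium thermodynamics alone, without resolving the wave. A sketch for a sudden pressure drop at the piston face (monatomic ideal gas; the numbers are illustrative assumptions, not from the thread):

```python
import math

R = 8.314
Cv = 1.5 * R                 # monatomic ideal gas, per mole (an assumption)
Ti, Pi = 300.0, 2.0e5        # initial state of 1 mol
Pext = 1.0e5                 # piston face suddenly held at half the pressure
Vi = R * Ti / Pi

# Adiabatic, irreversible: Cv*(Tf - Ti) = -Pext*(Vf - Vi), with Vf = R*Tf/Pext
Tf = (Cv * Ti + Pext * Vi) / (Cv + R)
Vf = R * Tf / Pext
dS = Cv * math.log(Tf / Ti) + R * math.log(Vf / Vi)  # reversible-path evaluation
print(f"Tf = {Tf:.1f} K, dS = {dS:+.3f} J/K (positive: entropy generated)")
```

A reversible adiabatic expansion to the same final pressure would end colder and give dS = 0; the difference is the entropy generated in the sharp-gradient zone described above.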
Rap
#16
Jan11-13, 12:12 AM
P: 789
Quote Quote by Chestermiller View Post
Rap: In your most recent posting, you raised the question of how discontinuities in properties might be handled using this type of general development for the case of irreversible gas expansions or compressions within a cylinder. Suppose there were a gas contained within a cylinder, and the pressure at the face of the piston were forced to suddenly drop to a lower constant value (say, by using a flush-mounted pressure transducer in the piston face to control the movement of the piston manually). What would happen within the cylinder? Whatever pressure disturbance occurs at the piston face cannot propagate throughout the cylinder instantaneously. The disturbance can only travel at a velocity on the order of the speed of sound. Consequently, a pressure non-uniformity will develop within the cylinder, and this pressure non-uniformity will be accompanied by a temperature non-uniformity. This is what we can expect the differential gas dynamics equations to tell us. At any instant of time, it will be possible to identify two separate pressure regions within the cylinder. One pressure region will be toward the piston, and the pressure throughout this region will be the same as at the piston face. The region away from the piston will not have had time to receive the disturbance yet, and its pressure will be at the original pressure of the cylinder. There will be a pressure and temperature discontinuity at the boundary between the two regions. This boundary will move away from the piston face at a velocity on the order of the speed of sound. In a more precise description, the zone between the two regions will not be a sharp discontinuity, but will instead have a very narrow width. Within this narrow zone, the gas pressure, temperature, and velocity will show very sharp gradients. The bulk of the entropy generation will occur within this zone. The velocity gradients will translate into lots of viscous heating.
If one were to accurately solve the gas dynamics equations numerically for the cylinder and plug the results into the equations that I presented in my earlier posting, one could precisely quantify the entropy generation and the entropy change between the initial and final equilibrium states for the irreversible path considered, and this entropy change would agree with that obtained by considering a reversible path between the same initial and final states.
Thanks for posting those equations. I read them using Google Books, but once you have viewed the pages there, you cannot view them again.

I'm still thinking about those equations, but I cannot see the statements you made above. If I understand your scenario, you have a cylinder with a gas at equilibrium, and you move the piston to the right with a velocity much larger than the speed of sound. This is Joule expansion, which I think of as having a cylinder with a boundary in the middle: no particles go through, no motion. On the left is the equilibrium gas; on the right, a vacuum. Then you remove the boundary instantaneously. I think the two scenarios are effectively the same.

Anyway, looking at it intuitively from a particle point of view, I see the really hot particles in the region where the boundary was heading immediately into the vacuum with few collisions, the warm particles doing the same, but more slowly and with more collisions, etc., and the region around the boundary cooling as a result of losing these hot particles. Mathematically, since the Maxwell-Boltzmann distribution admits particles with arbitrarily high energy, propagation to the right will be effectively instantaneous. Also, the effect of the boundary removal moves in the other direction, to the left, into the gas, at the speed of sound, dropping its temperature; this slower rate is due to the high density and consequent collisions. I see a non-Maxwellian distribution occurring to the right, and if you define a pseudo-temperature as the average kinetic energy per particle divided by 3k/2, then the pseudo-temperature profile will go down to the right of the dividing point, then go up towards the new right face of the cylinder. A short time after the boundary removal, the pseudo-temperature at the right face will be extremely high (corresponding to the very-high-energy tail of the formerly Maxwell-Boltzmann distribution) but will then drop rapidly as the whole volume equilibrates. I don't think there will be a narrow transition region during the equilibration process. The transition region will be much more spread out.
This is not to say that the mathematical equations above will not predict a narrow transition zone — only that if they do, then Joule expansion is outside their ability to calculate accurately.

During the main part of the expansion, I think the region to the right of the dividing line will be seriously out of equilibrium: not describable by a Maxwell-Boltzmann distribution, and maybe not really having a well-defined temperature or entropy. I don't think it can be described using the above equations. On an S-T diagram, I see the gas as a point; then the boundary is removed, the point disappears, and it reappears at the new equilibrium point at a later time, with the same T (and N) and larger S. Maybe the final approach to equilibrium can be roughly estimated, always plotted on that constant-T line.
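Rap's S-T picture — a point that disappears and reappears at the same T with larger S — can be made quantitative for the ideal-gas case. A minimal sketch, assuming one mole of ideal gas doubling its volume into vacuum (the numbers are illustrative, not from the thread):

```python
# Joule (free) expansion of an ideal gas: no work, no heat, so the internal
# energy and hence the temperature are unchanged. The entropy jump between
# the initial and final equilibrium points is dS = n*R*ln(V2/V1).
import math

R = 8.314            # J/(mol K)
n = 1.0              # mol (assumed)
V1, V2 = 1.0, 2.0    # volume doubles into vacuum (arbitrary units)

dS = n * R * math.log(V2 / V1)
print(f"dS = {dS:.3f} J/K at constant T")  # ≈ 5.763 J/K
```

All of this dS is generated entropy, since nothing crosses the system boundary during the expansion.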
DrDu
#17
Jan11-13, 01:42 AM
Sci Advisor
P: 3,555
What creates entropy is a gradient of ##\mu/T##, where μ is the chemical potential.
Now ##\mu=\mu_0+RT \ln f/f_0##. So a gradient of f will generically lead to entropy production.
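DrDu's two statements can be combined in one line. Taking the gradient of ##\mu/T## with ##\mu=\mu_0+RT \ln f/f_0##, and assuming for this sketch that the temperature is uniform (so that ##\mu_0/T## is constant in space):

$$\nabla\!\left(\frac{\mu}{T}\right)=R\,\nabla \ln\frac{f}{f_0}=\frac{R}{f}\,\nabla f \qquad (T \text{ uniform}),$$

so under that assumption a nonzero ##\nabla f## implies a nonzero ##\nabla(\mu/T)## and hence entropy production. When T also varies, an additional ##\nabla\left(\mu_0/T\right)## term appears, since ##\mu_0## depends on temperature.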
Rap
#18
Jan11-13, 01:49 AM
P: 789
Quote Quote by DrDu View Post
What creates entropy is a gradient of ##\mu/T##, where μ is the chemical potential.
Now ##\mu=\mu_0+RT \ln f/f_0##. So a gradient of f will generically lead to entropy production.
Are you saying that a gradient in ##\mu## will not create entropy, but a gradient in ##\mu/T## will? If that is the case, I would be interested in the reasoning.

