Show that entropy is a state function

  • #1

Summary:

I'm trying to show that entropy is a state function based on an analysis of the Carnot cycle and without using advanced mathematics. I'm not satisfied with the presentation in the textbook "University Physics" by Young and Freedman and would like some feedback.

Main Question or Discussion Point

In a (reversible) Carnot cycle the entropy increase of the system during the isothermal expansion at temperature ##T_H## is the same as its decrease during the isothermal compression at ##T_C##. We can conclude that the entropy change of the system is zero after a complete Carnot cycle.
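Concretely, with the sign convention that ##Q_C < 0## (heat is rejected at ##T_C##), this is the statement that
$$\Delta S_{\text{cycle}} = \frac{Q_H}{T_H} + \frac{Q_C}{T_C} = 0$$
for the system over one complete reversible cycle.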
The textbook then states that any reversible cyclic process can be constructed from Carnot cycles (towards the end of chapter 20 in the 14th global edition of Young and Freedman).

The conclusion is that any reversible cyclic process has zero entropy change for the system.

As I see it, this does not show that entropy is a state function.
One would also have to show that the entropy change of the system is zero in any irreversible cyclic process.

Is it okay to simply generalise the argument in the textbook and say that any reversible or irreversible cyclic process can be approximated by (reversible) Carnot processes and therefore the entropy of the system is always unchanged after one complete cycle, no matter whether the process is reversible or irreversible?
 

Answers and Replies

  • #2
vanhees71
Science Advisor
Insights Author
Gold Member
2019 Award
If the process is not reversible, you are dealing with an open system, but entropy (and the other thermodynamic variables) are only state functions for closed systems!
 
  • #3
Lord Jestocost
Gold Member
Summary: I'm trying to show that entropy is a state function based on an analysis of the Carnot cycle and without using advanced mathematics. I'm not satisfied with the presentation in the textbook "University Physics" by Young and Freedman and would like some feedback.
Maybe the chapter "B. The Entropy" in Richard Becker's book "The Theory of Heat" might be of help:
https://books.google.de/books?id=wSvvCAAAQBAJ&pg=PA21&hl=de&source=gbs_toc_r&cad=3#v=onepage&q&f=false
 
  • #4
Maybe the chapter "B. The Entropy" in Richard Becker's book "The Theory of Heat" might be of help:
https://books.google.de/books?id=wSvvCAAAQBAJ&pg=PA21&hl=de&source=gbs_toc_r&cad=3#v=onepage&q&f=false
Thanks for the reference.
I'm trying to work only with Young and Freedman at the moment since that is what we use in our courses.
Unfortunately I'm missing some step that would make the presentation logically consistent, at least for me.
I'll discuss it more in my reply to vanHees.
 
  • #5
If the process is not reversible, you are dealing with an open system, but entropy (and the other thermodynamic variables) are only state functions for closed systems!
In that case I have a serious didactic problem later on in the same chapter.
Young and Freedman proceed to calculate entropy changes for irreversible processes such as free expansion and irreversible heat transfer.
The argument they offer is that, for such calculations, any irreversible process can be replaced by a reversible process between the same initial and final state, because entropy is a state function.
Now, if entropy is not a state function for irreversible processes, how do I know that I will get the correct value for its change after making the replacement?
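To make that concrete with the free-expansion case: for ##n## moles of an ideal gas expanding freely from ##V_1## to ##V_2## (the temperature does not change), one replaces the process by a reversible isothermal expansion between the same end states and computes
$$\Delta S = \int \frac{dQ_{\text{rev}}}{T} = \frac{nRT\ln(V_2/V_1)}{T} = nR\ln\frac{V_2}{V_1}.$$
My worry is precisely why this number must also be the entropy change of the actual free expansion.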
Do you see my problem?
 
  • #6
If an irreversible process is cyclic, this means that it starts and ends in the same state. It is possible to devise an infinite number of alternate reversible processes that also start and end in this same state. The change in entropy for all these reversible processes is zero. Therefore the change in entropy for the irreversible process is zero.
 
  • #7
Andy Resnick
Science Advisor
Education Advisor
Insights Author
Summary: I'm trying to show that entropy is a state function based on an analysis of the Carnot cycle and without using advanced mathematics. I'm not satisfied with the presentation in the textbook "University Physics" by Young and Freedman and would like some feedback.
Thinking about this question sent me down a rabbit hole... not sure if I have a good response, but here goes:

First, it's important to note the difference between the entropy of a system S and the *changes* to the system entropy ΔS that occur during a process.

Regarding 'S', since S is a property of a system, the entropy value associated with a particular *equilibrium* state can be expressed in terms of other state variables such as P, V, and T. Then, the entropy can be simply considered as another variable specifying the state and the state can be specified in terms of, for example, S and V or T and S, etc. Thus, S is a state variable. [There is an unstated assumption that S is uniquely determined in any particular equilibrium state, not sure if we need to 'prove' that.]

I got hung up (and, honestly, am still uncertain) on both the difference between S and ΔS and the requirement for the state to be in equilibrium. For example:

Consider the copy of Young and Freedman on your desk: put the book into a box. Your system is the book. What is S? I honestly don't know how to calculate it, and I believe it's actually an open question:

Xiong W, Faes L, Ivanov PC. Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations. Phys Rev E. 2017;95(6-1):062114. doi:10.1103/PhysRevE.95.062114 , available at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6117159/

So, in frustration, I open the container and rip up the book. Or, maybe I open the box and burn the book. Maybe I fill the container with sulfuric acid and destroy the book. The thing is, I can 'easily' calculate ΔS for each of those three irreversible processes!

For an irreversible process, as long as the start and end states are equilibrium states ('thermostatics'), I can calculate ΔS in terms of any contrived reversible process that connects the two states. It's also possible to compute ΔS if the two states are *stationary* rather than equilibrium ('thermokinetics'), but a full-on nonequilibrium thermodynamic calculation is (AFAIK) not possible at this time.

Not sure if this helps... clarification is welcome!
 
  • #8
If an irreversible process is cyclic, this means that it starts and ends in the same state. It is possible to devise an infinite number of alternate reversible processes that also start and end in this same state. The change in entropy for all these reversible processes is zero. Therefore the change in entropy for the irreversible process is zero.
Yes, that is very helpful.
So essentially I have to correct the textbook a little.
Any cyclic process, both reversible and irreversible, can be approximated by a series of Carnot processes.

An interesting example is a process that's almost identical to the Carnot cycle, but with the reversible isothermal expansion replaced by a free expansion.
For calculating the entropy change of the system, it can be replaced by a single Carnot cycle.
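Sketching that for an ideal gas: the free expansion from ##V_1## to ##V_2## leaves the temperature at ##T_H## (no heat exchanged, no work done), so it ends in the same state as the reversible isothermal expansion of the ordinary Carnot cycle. Over the whole modified cycle the system therefore returns to its initial state, and
$$\Delta S_{\text{system}} = 0, \qquad \text{whereas} \qquad \oint \frac{dQ}{T} = \frac{Q_C}{T_C} = -nR\ln\frac{V_2}{V_1} < 0,$$
since no heat is absorbed during the free-expansion leg.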
 
  • #9
vanhees71
Science Advisor
Insights Author
Gold Member
2019 Award
If an irreversible process is cyclic, this means that it starts and ends in the same state. It is possible to devise an infinite number of alternate reversible processes that also start and end in this same state. The change in entropy for all these reversible processes is zero. Therefore the change in entropy for the irreversible process is zero.
Have you now proven that there are no irreversible processes?

Of course not. The reason is that, in general, when you consider time-dependent situations of many-body systems, they are usually not described by local thermal equilibrium at every instant of time. That's only the case if the changes of the macroscopically relevant observables are slow compared to the thermalization time of the system. Only then does the entropy stay constant over time. In other words, for general non-equilibrium processes the entropy is not constant but increasing.
 
  • #10
Yes, that is very helpful.
So essentially I have to correct the textbook a little.
Any cyclic process, both reversible and irreversible, can be approximated by a series of Carnot processes.
Absolutely not. To determine the entropy change for an irreversible process, we need to devise an alternative reversible process between the same two thermodynamic end states, and calculate the integral of dq/T for that reversible process. The alternative reversible process does not need to bear any resemblance whatsoever to the actual irreversible process, as long as it passes through the same two end states. And there are an infinite number of reversible processes that will do this.
 
  • #11
Absolutely not. To determine the entropy change for an irreversible process, we need to devise an alternative reversible process between the same two thermodynamic end states, and calculate the integral of dq/T for that reversible process. The alternative reversible process does not need to bear any resemblance whatsoever to the actual irreversible process, as long as it passes through the same two end states. And there are an infinite number of reversible processes that will do this.
Then I wonder whether I have gained anything. I want to show that S is a state function.
In your recipe you use the fact that S is a state function.
The problem I have with our textbook is that it shows that S is a state function only for reversible processes, but then it uses this finding even for irreversible processes.
I'm missing one step in the argument, so to speak.
 
  • #12
I really don't know how to answer this question. I guess we just accepted it when we learned it. My main strength lies more in applying the fundamentals to solve specific problems.
 
  • #13
vanhees71
Science Advisor
Insights Author
Gold Member
2019 Award
A general change of state is described by off-equilibrium many-body theory, which can be addressed on various levels: At the most fundamental level you can use, e.g., non-equilibrium quantum-field-theory methods (relativistic or non-relativistic), e.g., the Schwinger-Keldysh real-time-contour formalism, which leads to the Kadanoff-Baym equations in the 2PI formalism (known under various names: Luttinger-Ward, Kadanoff-Baym, or Jackiw-Tomboulis). This is a very sophisticated approach, and the solution of the equations is usually pretty difficult. It has been done mostly for toy models like ##\phi^4## theory or the linear ##\sigma## model, often in lower space-time dimensions.

The next level of description is Boltzmann-type (quantum) transport equations, which are found formally by using a gradient expansion or, equivalently, a formal expansion in powers of ##\hbar##, leading to a Markovian description, usually taking into account only ##2 \rightarrow 2## scattering processes, but sometimes also higher-order inelastic processes like ##2 \leftrightarrow 3## scattering, etc.

Through the assumption of "molecular chaos", used to truncate the usual "BBGKY hierarchy", one introduces a "thermodynamic arrow of time", which is by construction identical with the "causal arrow of time" underlying all of physics. The principle of detailed balance, guaranteed by the unitarity of the S-matrix, ensures that the (coarse-grained) entropy does not decrease ("H-theorem") and that in the long-time limit thermal equilibrium is reached.

This final equilibrium state is uniquely determined by the temperature, the chemical potential(s) of conserved quantities, and external parameters (like volume, presence of external fields, etc.), and for this equilibrium state the entropy is maximal and a "state function". This means that the equilibrium state reached under the applied constraints is the state of "minimal information": you cannot tell from knowledge of this state how the system got into it, i.e., you have no information about the "history" of the system before it reached this equilibrium state.
 
  • #14
Lord Jestocost
Gold Member
The problem I have with our textbook is that it shows that S is a state function only for reversible processes,
Once you have proven that the entropy of a system is a state function, which depends only on the values of some state variables of the system, it doesn't matter how you bring the system into the state where it has those values of the state variables.
 
  • #15
vanhees71
Science Advisor
Insights Author
Gold Member
2019 Award
Yes, you only have to wait until everything is in thermal equilibrium. On its way to thermal equilibrium entropy is never decreasing (H-theorem).
 
  • #16
I think I'm getting there.
What the book really shows is that every reversible cyclic process has entropy change zero.
That obviously means that the entropy change between two states can be calculated using any reversible path between these states and the result will be the same. So far everything is clear.
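Spelled out: if ##A## and ##B## are two reversible paths from state 1 to state 2, then running ##A## forwards and ##B## in reverse is a reversible cycle, so
$$0 = \oint \frac{dQ_{\text{rev}}}{T} = \int_{1\,(A)}^{2} \frac{dQ_{\text{rev}}}{T} - \int_{1\,(B)}^{2} \frac{dQ_{\text{rev}}}{T},$$
i.e. the two path integrals are equal and their common value depends only on the end states.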

The step I need is that the only explanation for the above is that the system's entropy is a state function.
So it doesn't really matter that only reversible paths are considered.
Does that sound right?

With this step I can conclude that an irreversible process between the same two states will also have the same entropy change.
 
  • #17
I think I'm getting there.
What the book really shows is that every reversible cyclic process has entropy change zero.
That obviously means that the entropy change between two states can be calculated using any reversible path between these states and the result will be the same. So far everything is clear.

The step I need is that the only explanation for the above is that the system's entropy is a state function.
So it doesn't really matter that only reversible paths are considered.
Does that sound right?

With this step I can conclude that an irreversible process between the same two states will also have the same entropy change.
How are you defining the entropy change of a process?
 
  • #18
How are you defining the entropy change of a process?
The whole discussion in the book starts with the Carnot cycle, so the entropy change during a reversible isotherm is Q/T, whereas for the adiabatic processes it's zero.
 
  • #19
The whole discussion in the book starts with the Carnot cycle, so the entropy change during a reversible isotherm is Q/T, whereas for the adiabatic processes it's zero.
How do you define it for an arbitrary process?
 
  • #20
Stephen Tashi
Science Advisor
With this step I can conclude that an irreversible process between the same two states will also have the same entropy change.
A technicality about "irreversible process":
Is a "process" defined as physical phenomena that can be represented as a path on a P-V diagram? If so, then is "free expansion" a process? If a gas is confined to one side of a cylinder by a partition and the partition is removed then as the gas expands without being in equilibrium, does the gas have a defined volume and pressure?
 
  • #21
Stephen Tashi
Science Advisor
6,928
1,184
A quotation from "Extended Irreversible Thermodynamics" by Jou, Casas-Vazquez and Lebon

Using "CIT" to mean "classical irreversible thermodynamics":

The fundamental hypothesis underlying CIT is that of local equilibrium. It postulates that the local and instantaneous relations between the thermal and mechanical properties of a physical system are the same as for a uniform system at equilibrium. It assumes that the system under study can be mentally split into a series of cells sufficiently large to allow them to be treated as macroscopic thermodynamic systems, but sufficiently small that equilibrium is very close to being realized in each cell.
 
  • #22
How do you define it for an arbitrary process?
As I see it, dS is always dQ/T for a reversible process. For an irreversible process it's not defined or at least can't be calculated directly.
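In symbols, what I have to work with is
$$\Delta S = \int_{1}^{2} \frac{dQ_{\text{rev}}}{T}$$
along some reversible path from state 1 to state 2; for the actual irreversible path the Clausius inequality only gives ##\int_1^2 dQ/T < \Delta S##, so integrating dQ/T along the irreversible path itself does not give the entropy change.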
 
  • #23
A technicality about "irreversible process":
Is a "process" defined as physical phenomena that can be represented as a path on a P-V diagram? If so, then is "free expansion" a process? If a gas is confined to one side of a cylinder by a partition and the partition is removed then as the gas expands without being in equilibrium, does the gas have a defined volume and pressure?
Yes, I see that problem with processes where p and V are not defined, for example because they happen too quickly. They have two end points on the pV diagram but nothing in between, so to speak. That should mean they can't be approximated by reversible processes.
 
  • #24
As I see it, dS is always dQ/T for a reversible process. For an irreversible process it's not defined or at least can't be calculated directly.
It can be calculated directly for an irreversible process, but it isn't as simple a calculation as one might think. It involves solving the complicated partial differential equations in space and time within the system that include the transport processes of viscous fluid dynamics and heat conduction. This enables one to calculate the local rate of entropy generation within the system and to integrate that over the volume of the system. For the basics of how this is done, see Chapter 11, Transport Phenomena, Bird, Stewart, and Lightfoot, Problem 11D.1.

For pure irreversible heat conduction problems, say involving steady state conduction in a rod or even transient heat conduction in a rod, the calculation is much more straightforward to apply in practice, and I can show how it is done, if anyone is interested.
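As a minimal sketch of the simplest steady-state case (not the full transport-equation treatment): for a rod carrying a constant heat flow ##\dot{Q}## from a reservoir at ##T_H## to a reservoir at ##T_C##, with an insulated lateral surface, the entropy of the rod itself is constant in time and the total rate of entropy generation is
$$\dot{S}_{\text{gen}} = \frac{\dot{Q}}{T_C} - \frac{\dot{Q}}{T_H} > 0.$$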
 
  • #25
hilbert2
Science Advisor
Insights Author
Gold Member
Every time you predict the direction of a constant-pressure chemical reaction or other process with Gibbs free energy arguments, you assume that entropy is a state function. Otherwise the system could have more than one value of S and G in the same apparent macroscopic state.
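For reference, the criterion being invoked is that at constant ##T## and ##P## (with only pressure-volume work) a spontaneous process satisfies
$$\Delta G = \Delta H - T\,\Delta S \le 0,$$
which presupposes that ##S## (and hence ##G##) has a single well-defined value in each equilibrium state.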
 
