Show that entropy is a state function

AI Thread Summary
In a reversible Carnot cycle, the entropy increase during isothermal expansion equals the entropy decrease during isothermal compression, so the system's net entropy change over one complete cycle is zero. The discussion highlights that while the textbook claims any reversible cyclic process has zero entropy change, this alone does not establish that entropy is a state function, especially where irreversible processes are concerned. It is argued that the entropy change for an irreversible process can still be calculated by evaluating a reversible path connecting the same initial and final states. The conversation reveals a need for clarity in the textbook's presentation of the state-function nature of entropy, particularly in the context of irreversible processes. Ultimately, the conclusion is that while the entropy change over a reversible cycle is zero, the corresponding statement for irreversible processes requires careful consideration.
  • #51
Chestermiller said:
If the two end states are thermodynamic equilibrium states, each one must be characterized by a unique distribution of quantum mechanical states (i.e., a unique entropy). So how could the entropy change be different for the fast process between the same two states?
You almost had me there.
In your argument you assume that entropy is a state function.
If the "entropy change" we calculate, for example using a reversible path between two states, is not the change of a state function, but something more like work or heat, one could get a different value for a path that can't be plotted in a pV-diagram, even though that would be a bit strange, I must admit.
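Philip's worry can be made concrete with a quick numerical check: for an ideal gas, ##\int dq_{rev}/T## evaluated along two quite different reversible paths between the same end states comes out the same. A minimal sketch, assuming one mole of a monatomic ideal gas with illustrative state values (not from the thread):

```python
import math

# Compare Delta S = integral of dq_rev/T along two different reversible
# paths between the same end states (T1, V1) -> (T2, V2).
# One mole of a monatomic ideal gas; the numbers are illustrative.
R = 8.314          # gas constant, J/(mol K)
Cv = 1.5 * R       # molar heat capacity at constant volume
T1, V1 = 300.0, 0.010   # initial state (K, m^3)
T2, V2 = 400.0, 0.030   # final state

# Path A: isothermal expansion at T1 (V1 -> V2), then isochoric heating to T2.
# dq/T integrates to R ln(V2/V1) on the isotherm and Cv ln(T2/T1) on the isochore.
dS_A = R * math.log(V2 / V1) + Cv * math.log(T2 / T1)

# Path B: reversible adiabat from (T1, V1) to (Tad, V2) (dq = 0, so no entropy
# change there), then isochoric heating from Tad up to T2.
Tad = T1 * (V1 / V2) ** (R / Cv)
dS_B = Cv * math.log(T2 / Tad)

print(dS_A, dS_B)   # the two path integrals agree
```

That the two integrals agree is of course exactly what "state function" means, so this is an illustration of the claim under discussion, not a proof of it.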
 
  • #52
Philip Koeck said:
You almost had me there.
In your argument you assume that entropy is a state function.
If the "entropy change" we calculate, for example using a reversible path between two states, is not the change of a state function, but something more like work or heat, one could get a different value for a path that can't be plotted in a pV-diagram, even though that would be a bit strange, I must admit.
What is really needed here is the establishment of the equivalence between the quantum mechanical entropy change (clearly a state function) and the thermal entropy change ##\int{dq/T}## for a reversible process.
 
  • #53
Chestermiller said:
What is really needed here is the establishment of the equivalence between the quantum mechanical entropy change (clearly a state function) and the thermal entropy change ##\int{dq/T}## for a reversible process.
If I understand correctly you are talking about showing that the entropy given by Boltzmann's formula is the same (apart from a factor and an offset, maybe) as the thermodynamic entropy we're dealing with when we calculate entropy changes.
That would definitely settle my question.
Does anybody know if it's possible to show this equivalence in a general way?
 
  • #54
Philip Koeck said:
If I understand correctly you are talking about showing that the entropy given by Boltzmann's formula is the same (apart from a factor and an offset, maybe) as the thermodynamic entropy we're dealing with when we calculate entropy changes.
That would definitely settle my question.
Does anybody know if it's possible to show this equivalence in a general way?
It is usually done for the specific case of ideal gases in elementary books on Statistical Thermodynamics, but my background in ST is not broad enough to know whether more general representations are around.
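For the ideal-gas case mentioned above, the check is straightforward: differences of the Sackur-Tetrode entropy, obtained by counting quantum states, reproduce the thermodynamic ##\Delta S = nR\ln(V_2/V_1)## for a reversible isothermal expansion. A sketch for one mole of helium (the gas choice and state values are my own assumptions):

```python
import math

# Sackur-Tetrode entropy of a monatomic ideal gas (from state counting)
# vs. the thermodynamic Delta S = n R ln(V2/V1) for a reversible
# isothermal expansion. Helium; a numerical sketch, not a general proof.
kB = 1.380649e-23    # Boltzmann constant, J/K
h  = 6.62607015e-34  # Planck constant, J s
NA = 6.02214076e23   # Avogadro constant, 1/mol
m  = 4.002602e-3 / NA  # mass of one He atom, kg

def sackur_tetrode(N, V, T):
    """S = N kB [ ln( (V/N) (2 pi m kB T / h^2)^(3/2) ) + 5/2 ]"""
    thermal = (2 * math.pi * m * kB * T / h**2) ** 1.5
    return N * kB * (math.log((V / N) * thermal) + 2.5)

N = NA                  # one mole
T = 300.0               # isothermal, K
V1, V2 = 0.010, 0.030   # m^3

dS_stat = sackur_tetrode(N, V2, T) - sackur_tetrode(N, V1, T)
dS_thermo = N * kB * math.log(V2 / V1)   # = R ln(V2/V1)
print(dS_stat, dS_thermo)
```

The offset and the factor of ##k_B## drop out of the difference, which is why the two expressions match exactly here; showing the equivalence beyond ideal gases is the harder question raised in the thread.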
 
  • #56
Philip Koeck said:
If I understand correctly you are talking about showing that the entropy given by Boltzmann's formula is the same (apart from a factor and an offset, maybe) as the thermodynamic entropy we're dealing with when we calculate entropy changes.
That would definitely settle my question.
Does anybody know if it's possible to show this equivalence in a general way?
How many different entropies are there? Whether they are mutually compatible seems to depend on which entropy is being discussed.

From Boltzmann's Approach to Statistical Mechanics by Sheldon Goldstein
The Second Law is concerned with the thermodynamic entropy, and this is given by Boltzmann's entropy (1), not by the Gibbs entropy (2). In fact, the Gibbs entropy is not even an entity of the right sort: it is a function of a probability distribution, i.e., of an ensemble of systems, and not a function on phase space, a function of the actual state X of an individual system, the behavior of which the Second Law (and macro-physics in general) is supposed to describe.

A stackexchange debate about the topic: https://physics.stackexchange.com/questions/316812/gibbs-vs-boltzmann-entropies
 
  • #57
Chestermiller said:
Consider the irreversible process of steady state heat conduction occurring along a rod in contact with a reservoir of temperature ##T_H## at x = 0 and a second reservoir of temperature ##T_C## at x = L. The temperature profile along the rod is linear, and given by $$T=T_H-(T_H-T_C)\frac{x}{L}$$and the heat flux q along the rod is constant, and given by: $$q=-k\frac{dT}{dx}=k\frac{(T_H-T_C)}{L}$$ where k is the thermal conductivity of the rod metal.

The temperature profile has been determined by satisfying the differential heat balance equation, given by:
$$k\frac{d^2T}{dx^2}=0$$
The differential entropy balance equation on the rod can be obtained by dividing the differential heat balance equation by the absolute temperature, to yield:
$$\frac{k}{T}\frac{d^2T}{dx^2}=\frac{d}{dx}\left(\frac{k}{T}\frac{dT}{dx}\right)+\frac{k}{T^2}\left(\frac{dT}{dx}\right)^2=0$$or, equivalently $$-\frac{d}{dx}\left(\frac{q}{T}\right)+\frac{q^2}{kT^2}=0$$The second term in this equation represents physically the local rate of entropy generation per unit volume within the rod. If we integrate this equation between x = 0 and x = L, we obtain: $$\frac{q_H}{T_H}-\frac{q_C}{T_C}+\frac{q^2}{kT_HT_C}L=0\tag{1}$$where ##q_H=q_C=q##. The first term on the LHS represents the rate of entropy entering the rod per unit area at x = 0, and the second term represents the rate of entropy exiting the rod per unit area at x = L. The coefficient of L in the third term is positive definite, and represents the average rate of entropy generation per unit volume within the rod. Since the rod is operating at steady state, we know that the entropy of the rod is not changing with time. So ##L\frac{dS}{dt}=0##, where dS/dt is the average rate of change of entropy per unit volume of the rod. So we can write:
$$0=L\frac{dS}{dt}\gt \frac{q_H}{T_H}-\frac{q_C}{T_C}=-\frac{q^2}{kT_HT_C}L\tag{2}$$Eqn. 2 is equivalent to the Clausius inequality applied to the irreversible heat conduction process in the rod.

Note that, in this very simple case, we have been able to precisely determine the rate of entropy generation within the system. Note also that, Eqn. 1 tells us that the rate of entropy exiting the rod at x = L is just equal to the rate of entropy entering the rod at x = 0, plus the total rate of entropy generation within the rod.
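The closed-form balance in Eqn. 1 is easy to verify numerically: integrating the local generation term ##q^2/kT^2## over the linear temperature profile reproduces ##q^2L/(kT_HT_C)##, and the three terms of Eqn. 1 sum to zero. A sketch with illustrative parameter values (my own, not from the post):

```python
import math

# Numerical check of the steady-state entropy balance (Eqn. 1) for heat
# conduction along a rod with a linear temperature profile.
# Parameter values below are illustrative assumptions.
TH, TC = 400.0, 300.0   # reservoir temperatures, K
L = 0.5                 # rod length, m
k = 50.0                # thermal conductivity, W/(m K)
q = k * (TH - TC) / L   # constant heat flux, W/m^2

# Entropy generation per unit area: integral of q^2/(k T(x)^2) dx over [0, L],
# evaluated with the midpoint rule on the linear profile T(x).
n = 100_000
dx = L / n
sigma = sum(q**2 / (k * (TH - (TH - TC) * (i + 0.5) * dx / L)**2) * dx
            for i in range(n))

# Closed form from the post: q^2 L / (k TH TC)
sigma_exact = q**2 * L / (k * TH * TC)

# Eqn. 1: q/TH - q/TC + q^2 L/(k TH TC) = 0
balance = q / TH - q / TC + sigma_exact
print(sigma, sigma_exact, balance)
```

With these numbers the entropy flux out at x = L (q/TC) exceeds the flux in at x = 0 (q/TH) by exactly the generation term, as Eqn. 1 requires.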

Very nice derivation.

One can rewrite $$-\frac{d}{dx}\left(\frac{q}{T}\right)+\frac{q^2}{kT^2}=0$$ as $$-\frac{dS}{dx}+\frac{S^2}{k}=0$$ with the substitution ##S \equiv q/T##.

But, I am not getting any insights from that equation. Any thoughts or opinions?

Thank you,
 
  • #58
The parameter q is not the amount of heat (and, even if it were, q/T would not be the entropy). The parameter q is the heat flux (the rate of heat flow per unit cross sectional area).
 