Internal energy and thermodynamic entropy


Discussion Overview

The discussion revolves around the concepts of internal energy and thermodynamic entropy, particularly in the context of ideal gases. Participants explore the definitions and relationships between these concepts, including the implications of treating internal energy as a state function and the role of particle number in determining entropy.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose that internal energy is primarily concerned with the kinetic energy of particles, while others argue that this is only true for ideal gases and not for substances like steam.
  • There is a contention regarding the definition of entropy, with some asserting that S=E/T is a valid expression, while others emphasize that entropy is defined through heat flow and temperature as dS = dQ/T.
  • One participant suggests that if S is a function of N (the number of particles), then entropy would remain constant unless particles are added or removed, which is challenged by others who argue that this interpretation is incorrect.
  • Another viewpoint is presented that integrates the relationship between entropy and energy, suggesting that entropy could depend on the logarithm of energy, particularly in high-energy limits.
  • Participants discuss the implications of treating specific heat as constant in ideal gases and question whether entropy depends solely on the number of particles, with responses indicating that other variables must also be considered.
  • References to the Boltzmann equation and Gibbs equation are made, highlighting that entropy is influenced by factors beyond just the number of particles, including volume and molecular configurations.

Areas of Agreement / Disagreement

Participants express multiple competing views regarding the definitions and relationships between internal energy, temperature, and entropy. There is no consensus on whether entropy solely depends on the number of particles or if other factors must be considered.

Contextual Notes

Participants note limitations in their discussions, such as the dependence of definitions on specific conditions (e.g., ideal gases vs. real substances) and the unresolved nature of certain mathematical relationships.

SW VandeCarr
As a practical matter, the internal energy of a system is treated as a state function of the system and is concerned only with the kinetic energy of particles, not the potential energy. So for thermodynamic entropy, S = E/T (S = entropy, T = absolute temperature), I'm considering E to be kinetic energy.

Now thermodynamic temperature has been defined as the average kinetic energy per particle. If so, we can say T = E/N where E is the internal (kinetic) energy of a system. Then S = E/T = E/(E/N) = N. That is, S becomes a dimensionless number equal to (or a direct function of) the number of particles in the system. The Boltzmann equation S = k \log W considers only the number of particles in a system, since W = N!/\prod_i N_i!.

If S=f(N), then the entropy of a system of N particles would seem to be constant. That would mean it doesn't matter if you add or remove energy or dilute or concentrate the particles. S is always constant unless you add or remove particles. Is this true?
 
SW VandeCarr said:
As a practical matter, the internal energy of a system is treated as a state function of the system and is concerned only with the kinetic energy of particles, not the potential energy.
This is true only of an ideal gas. It is not true of steam, for example.

So for thermodynamic entropy, S = E/T (S = entropy, T = absolute temperature), I'm considering E to be kinetic energy.
This is not the definition of entropy. dS = dQ/T where dQ = dU + dW = dU + PdV. In your equation, E appears to be the same as U, the internal energy.

Now thermodynamic temperature has been defined as the average kinetic energy per unit particle.
This is not how temperature is defined. T is directly related to average kinetic energy of a molecule as follows:

KE_{avg} = 3kT/2

where k is the Boltzmann constant, with units of joules per kelvin (J/K).
If so, we can say T = E/N where E is the internal (kinetic) energy of a system. Then S = E/T = E/(E/N) = N. That is, S becomes a dimensionless number equal to (or a direct function of) the number of particles in the system. The Boltzmann equation S = k \log W considers only the number of particles in a system, since W = N!/\prod_i N_i!.

If S=f(N), then the entropy of a system of N particles would seem to be constant. That would mean it doesn't matter if you add or remove energy or dilute or concentrate the particles. S is always constant unless you add or remove particles. Is this true?
No. If you take into account what I have said above, you will avoid this confusion. S has dimensions of energy/temperature. You are equating Q with U. All you are saying is that if U is proportional to T (which it is in an ideal gas), then dU/dT will be constant. That is true, but it has nothing to do with entropy. Entropy change is heat flow divided by temperature: dS = dQ/T.

AM
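A quick numeric sketch of the distinction, assuming one mole of a monatomic ideal gas heated at constant volume from 300 K to 600 K (the values are illustrative); the reversible integral of dQ/T differs from the naive ratio Q/T:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23       # number of particles (one mole, illustrative)
Ti, Tf = 300.0, 600.0

# Heat added at constant volume for a monatomic ideal gas: Q = (3/2) N k (Tf - Ti)
Q = 1.5 * N * k * (Tf - Ti)

# Entropy change from the definition dS = dQ/T, integrated over the reversible path:
# dQ = (3/2) N k dT, so dS = (3/2) N k dT / T  ->  Delta S = (3/2) N k ln(Tf/Ti)
dS = 1.5 * N * k * math.log(Tf / Ti)

# The naive ratio Q/T (evaluated at the final temperature) gives a different number
naive = Q / Tf

print(dS, naive)
```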
 
So you're saying:

dS=dE/T

which is fine when entropy doesn't depend on volume (zero pressure, so the PdV term always vanishes).

Integrating both sides you get:

S=\int dE/T

But T is a function of just E and not V (try to guess why), so it ought to be:

S=\int dE/T(E)

But now you say that T(E)=aE, where 'a' is a proportionality constant.

Then the integral becomes:

S=\int \frac{dE}{aE}=\frac{\log E}{a}

So your entropy depends on the log of energy. And notice that this is the result for how the entropy of a solid varies with energy in the high energy limit!

But it's not quite the exact result. The reason is that T is proportional to E only at high energies, whereas in the integral you assumed T is proportional to E even at low energies. But still the integral gives an accurate result when you have a lot of energy, since the integral is dominated by this high energy region (since this region is large), and a few inaccuracies at the beginning of the integral don't matter.
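The integral above is easy to check numerically; a minimal sketch with an arbitrary illustrative constant a, taking T(E) = aE throughout:

```python
import math

a = 2.0             # illustrative proportionality constant in T(E) = a*E
Ei, Ef = 1.0, 10.0  # arbitrary integration limits for the energy
n = 100_000
dE = (Ef - Ei) / n

# Midpoint-rule integration of dS = dE / T(E) with T(E) = a*E
S_numeric = sum(dE / (a * (Ei + (i + 0.5) * dE)) for i in range(n))

# Closed form: the integral of dE/(a*E) from Ei to Ef is ln(Ef/Ei)/a
S_closed = math.log(Ef / Ei) / a

print(S_numeric, S_closed)
```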
 
Andrew Mason said:
This is true only of an ideal gas. It is not true of steam, for example.

Thank you for your response. Yes, I'm really asking my question in terms of an ideal gas. I want to avoid the specific interactions which occur between particles. I want to consider the case where the specific heat of a gas (J/(kmol·K)) is constant (or nearly so) over some range of temperature.

This is not how temperature is defined. T is directly related to average kinetic energy of a molecule as follows:

KE_{avg} = 3kT/2

This doesn't appear to change the situation of interest since you are only multiplying the variable T by a constant.

You are equating Q with U. All you are saying that if U is proportional to T (which it is in an ideal gas) then dU/dT will be constant. That is true, but that has nothing to do with entropy.

The Boltzmann equation for entropy contains only one variable: N. The Gibbs equation contains only one variable: p. In the first case, N is the number of particles in the system. In the second case, p is the probability of a particular position-momentum phase-state configuration of N particles at equilibrium.

My question is: Given that dU/dT is constant in an ideal gas, can we say that U=k(mv^2/2) in an ideal gas? In other words, in an ideal gas, does entropy only depend on the number of particles: N?
 
SW VandeCarr said:
This doesn't appear to change the situation of interest since you are only multiplying the variable T by a constant.
You seemed to be interested in making the point that entropy is dimensionless. Temperature has units of kelvins, which is not the same as energy.

The Boltzmann equation for entropy contains only one variable: N. The Gibbs equation contains only one variable: p. In the first case N is the number of particles in the system.
This is not correct. W is the number of microstates that the collection of molecules can have; it is usually written using the symbol \Omega to avoid confusion with N, the number of molecules. The number of microstates depends upon the volume and the molecular speeds as well as the number of molecules. It would also depend on the orientations of those molecules if they are not monatomic.
In the second case p is the probability of a particular position/momentum phase state configuration of N particles at equilibrium.
Which, again, depends upon the number of available microstates.
My question is: Given that dU/dT is constant in an ideal gas, can we say that U=k(mv^2/2) in an ideal gas?
No. U = Nmv_{rms}^2/2 = 3NkT/2. And this is true only for a monatomic ideal gas since single atoms have only 3 degrees of freedom.

In other words, in an ideal gas, does entropy only depend on the number of particles: N?
For the above reasons the answer is no.

AM
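This can be made concrete with the Sackur-Tetrode equation, the standard closed form for the entropy of a monatomic ideal gas. The numbers below (roughly one mole of helium at 300 K) are illustrative; the point is that S changes with V even when N and U are held fixed:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s

def sackur_tetrode(N, V, U, m):
    """Entropy (J/K) of N atoms of mass m (kg) in volume V (m^3) with internal energy U (J)."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5) + 2.5)

m_He = 6.646e-27         # mass of a helium atom, kg
N = 6.022e23             # one mole of atoms
U = 1.5 * N * k * 300.0  # internal energy at 300 K

S1 = sackur_tetrode(N, 0.0224, U, m_He)  # roughly the molar volume at STP
S2 = sackur_tetrode(N, 0.0448, U, m_He)  # double the volume, same N and same U

print(S1, S2)  # S2 exceeds S1 by N k ln 2, even though N and U are unchanged
```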
 
SW VandeCarr said:
In other words, in an ideal gas, does entropy only depend on the number of particles: N?

See: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entropgas.html

for the entropy of an ideal gas. It contains lots of variables besides N. Notice how it varies with energy as a log. T=(2/(3Nk))E for a monatomic ideal gas, so you would plug in a=2/(3Nk)
in the equation:

S=\frac{\log E}{a}

For an ideal gas you need to include volume. For a solid volume is not really important. As long as energy is proportional to temperature, you'll get an entropy dependence on energy as a log.
 
RedX said:
See: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entropgas.html

for the entropy of an ideal gas. It contains lots of variables besides N. Notice how it varies with energy as a log. T=(2/(3Nk))E for a monatomic ideal gas, so you would plug in a=2/(3Nk)
in the equation:

S=\frac{\log E}{a}

For an ideal gas you need to include volume. For a solid volume is not really important. As long as energy is proportional to temperature, you'll get an entropy dependence on energy as a log.

Thanks for the link. I would note that in your expressions, N is the only variable in the relation between T and E. So S=\frac{\log E}{a} should be equivalent to Boltzmann's S=k\log W and Gibbs' S=-k\sum p_{i}\log p_{i}.
 
SW VandeCarr said:
Thanks for the link. I would note that in your expressions, N is the only variable in the relation between T and E. So S=\frac{\log E}{a} should be equivalent to Boltzmann's S=k\log W and Gibbs' S=-k\sum p_{i}\log p_{i}.
That is not what RedX said. He said that S=\frac{\log E}{a} does not take into account changes in volume, whereas the correct equations do.

\Delta S = \int dQ/T = \int dU/T + \int PdV/T

For a monatomic ideal gas dU = nC_{v}dT = 3NkdT/2, and with no change in volume (PdV = 0) over a reversible path between states i and f:

\Delta S_{i-f} = \int_{T_i}^{T_f} dU/T = 3Nk/2\int_{T_i}^{T_f} dT/T = \frac{3Nk}{2}\ln\left(\frac{T_f}{T_i}\right)

Since T = 2U/(3Nk), we have:

\Delta S = \frac{3Nk}{2}\ln\left(\frac{U_f}{U_i}\right)

Caution: this is only true if there is no change in volume over a reversible path between the two states i and f.

AM
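Since U = 3NkT/2 implies U_f/U_i = T_f/T_i, the two logarithmic forms above must agree; a quick numeric check (the particle number and temperatures are illustrative):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 6.022e23      # one mole of particles (illustrative)
Ti, Tf = 250.0, 500.0

# For a monatomic ideal gas, U = 3NkT/2, so Uf/Ui = Tf/Ti
Ui, Uf = 1.5 * N * k * Ti, 1.5 * N * k * Tf

# Delta S at constant volume, computed from temperatures and from energies
dS_from_T = 1.5 * N * k * math.log(Tf / Ti)
dS_from_U = 1.5 * N * k * math.log(Uf / Ui)

print(dS_from_T, dS_from_U)
```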
 
