
Internal energy and thermodynamic entropy

by SW VandeCarr
Tags: energy, entropy, internal, thermodynamic
SW VandeCarr
#1
Feb17-11, 07:32 AM
P: 2,499
As a practical matter, the internal energy of a system is treated as a state function of the system and is concerned only with the kinetic energy of particles, not the potential energy. So for thermodynamic entropy, S = E/T (S = entropy, T = absolute temperature), I'm considering E to be kinetic energy.

Now thermodynamic temperature has been defined as the average kinetic energy per particle. If so, we can say T = E/N where E is the internal (kinetic) energy of a system. Then S = E/T = E/(E/N) = N. That is, S becomes a dimensionless number equal to (or a direct function of) the number of particles in the system. The Boltzmann equation S = k log W considers only the number of particles in a system since [tex]W=N!/\Pi N_{i}![/tex].

If S=f(N), then the entropy of a system of N particles would seem to be constant. That would mean it doesn't matter if you add or remove energy or dilute or concentrate the particles. S is always constant unless you add or remove particles. Is this true?
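(For concreteness, the multiplicity formula above can be sketched numerically — the function name below is my own, not a library call:)

```python
from math import factorial, log

def multiplicity(occupations):
    """W = N! / prod(N_i!) for a list of occupation numbers N_i."""
    n = sum(occupations)
    w = factorial(n)
    for ni in occupations:
        w //= factorial(ni)
    return w

# 4 particles split evenly over two states: W = 4!/(2!*2!) = 6
w = multiplicity([2, 2])
s_over_k = log(w)  # S/k = ln W
```

Note that W (and hence S = k ln W) changes when energy is redistributed among the states, even at fixed N: moving all 4 particles into one state, multiplicity([4, 0]), gives W = 1.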
Andrew Mason
#2
Feb17-11, 11:55 AM
Sci Advisor
HW Helper
P: 6,671
Quote Quote by SW VandeCarr View Post
As a practical matter, the internal energy of a system is treated as a state function of the system and is concerned only with the kinetic energy of particles, not the potential energy.
This is true only of an ideal gas. It is not true of steam, for example.

So for thermodynamic entropy, S = E/T (S = entropy, T = absolute temperature), I'm considering E to be kinetic energy.
This is not the definition of entropy. dS = dQ/T where dQ = dU + dW = dU + PdV. In your equation E appears to be the same as U, the internal energy.

Now thermodynamic temperature has been defined as the average kinetic energy per particle.
This is not how temperature is defined. T is directly related to average kinetic energy of a molecule as follows:

[tex]KE_{avg} = 3kT/2[/tex]

where k is the Boltzmann constant, having units of joules per kelvin (J/K)


If so, we can say T = E/N where E is the internal (kinetic) energy of a system. Then S = E/T = E/(E/N) = N. That is, S becomes a dimensionless number equal to (or a direct function of) the number of particles in the system. The Boltzmann equation S = k log W considers only the number of particles in a system since [tex]W=N!/\Pi N_{i}![/tex].

If S=f(N), then the entropy of a system of N particles would seem to be constant. That would mean it doesn't matter if you add or remove energy or dilute or concentrate the particles. S is always constant unless you add or remove particles. Is this true?
No. If you take into account what I have said above, you will avoid this confusion. S has dimensions of energy/temperature. You are equating Q with U. All you are saying is that if U is proportional to T (which it is in an ideal gas) then dU/dT will be constant. That is true, but that has nothing to do with entropy. Entropy change is heat flow divided by temperature: dS = dQ/T.

AM
RedX
#3
Feb17-11, 03:35 PM
P: 969
So you're saying:

dS=dE/T

which is fine when entropy doesn't depend on volume (i.e., when the PdV term is always zero).

Integrating both sides you get:

[tex]S=\int dE/T [/tex]

But T is a function of just E and not V (try to guess why), so it ought to be:

[tex]S=\int dE/T(E) [/tex]

But now you say that T(E)=aE, where 'a' is a proportionality constant.

Then the integral becomes:

[tex]S=\int \frac{dE}{aE}=\frac{\log E}{a} [/tex]

So your entropy depends on the log of energy. And notice that this is the result for how the entropy of a solid varies with energy in the high energy limit!

But it's not quite the exact result. The reason is that T is proportional to E only at high energies, whereas in the integral you assumed T is proportional to E even at low energies. But still the integral gives an accurate result when you have a lot of energy, since the integral is dominated by this high energy region (since this region is large), and a few inaccuracies at the beginning of the integral don't matter.
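The integral above can be checked numerically. A minimal sketch, assuming T(E) = aE holds over the whole range, with an arbitrary lower cutoff E0 (my choice) to keep the integral finite:

```python
from math import log

def entropy_from_integral(e_final, a=2.0, e0=1.0, steps=100000):
    """Numerically integrate S = int_{E0}^{E} dE' / (a * E') by the midpoint rule."""
    de = (e_final - e0) / steps
    s = 0.0
    for i in range(steps):
        e = e0 + (i + 0.5) * de
        s += de / (a * e)
    return s

numeric = entropy_from_integral(100.0)
analytic = (log(100.0) - log(1.0)) / 2.0  # (log E - log E0) / a
```

The two agree to several decimal places, matching the (log E)/a dependence up to the constant set by the cutoff.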

SW VandeCarr
#4
Feb18-11, 05:20 AM
P: 2,499

Quote Quote by Andrew Mason View Post
This is true only of an ideal gas. It is not true of steam, for example.
Thank you for your response. Yes, I'm really asking my question in terms of an ideal gas. I want to avoid the specific interactions which occur between particles. I want to consider the case where the specific heat of a gas (J/(kmol·K)) is constant (or nearly so) over some range of temperature.

This is not how temperature is defined. T is directly related to average kinetic energy of a molecule as follows:

[tex]KE_{avg} = 3kT/2[/tex]
This doesn't appear to change the situation of interest since you are only multiplying the variable T by a constant.

You are equating Q with U. All you are saying that if U is proportional to T (which it is in an ideal gas) then dU/dT will be constant. That is true, but that has nothing to do with entropy.
The Boltzmann equation for entropy contains only one variable: N. The Gibbs equation contains only one variable: p. In the first case N is the number of particles in the system; in the second case p is the probability of a particular position-momentum phase-state configuration of N particles at equilibrium.

My question is: Given that dU/dT is constant in an ideal gas, can we say that [tex]U=k(mv^2/2)[/tex] in an ideal gas? In other words, in an ideal gas, does entropy only depend on the number of particles: N?
Andrew Mason
#5
Feb18-11, 07:37 AM
Sci Advisor
HW Helper
P: 6,671
Quote Quote by SW VandeCarr View Post
This doesn't appear to change the situation of interest since you are only multiplying the variable T by a constant.
You seemed to be interested in making the point that entropy is dimensionless. Temperature has units of kelvins, which are not the same as units of energy.

The Boltzmann equation for entropy contains only one variable: N. The Gibbs equation contains only one variable: p. In the first case N is the number of particles in the system.
This is not correct. N is the number of microstates that the collection of molecules can have. It is usually written using the symbol [itex]\Omega[/itex] to avoid confusion with N being the number of molecules. The number of microstates depends upon volume and the molecular speed as well as the number of molecules. It would also depend on the orientation of those molecules if they are not monatomic.
In the second case p is the probability of a particular position/momentum phase state configuration of N particles at equilibrium.
Which, again, depends upon the number of available microstates.
My question is: Given that dU/dT is constant in an ideal gas, can we say that [tex]U=k(mv^2/2)[/tex] in an ideal gas?
No. [itex]U = Nmv_{rms}^2/2 = 3NkT/2[/itex]. And this is true only for a monatomic ideal gas since single atoms have only 3 degrees of freedom.

In other words, in an ideal gas, does entropy only depend on the number of particles: N?
For the above reasons the answer is no.

AM
RedX
#6
Feb18-11, 09:31 AM
P: 969
Quote Quote by SW VandeCarr View Post
In other words, in an ideal gas, does entropy only depend on the number of particles: N?
See: http://hyperphysics.phy-astr.gsu.edu...entropgas.html

for the entropy of an ideal gas. It contains lots of variables besides N. Notice how it varies with energy as a log. T = (2/(3Nk))E for a monatomic ideal gas, so you would plug in a = 2/(3Nk)
in the equation:

[tex]S=\frac{\log E}{a} [/tex]

For an ideal gas you need to include volume. For a solid volume is not really important. As long as energy is proportional to temperature, you'll get an entropy dependence on energy as a log.
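The volume dependence can be illustrated with the standard ideal-gas result for an isothermal change, ΔS = Nk ln(V2/V1). A minimal sketch (the particle count is chosen arbitrarily):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def delta_s_isothermal(n_particles, v1, v2):
    """Entropy change of an ideal gas at fixed T: dS = N k ln(V2/V1)."""
    return n_particles * K_B * log(v2 / v1)

# Doubling the volume of ~1 mol of particles increases S,
# even though N and (at fixed T) the internal energy are unchanged.
ds = delta_s_isothermal(6.022e23, 1.0, 2.0)
```

So two systems with identical N and identical energy can have different entropies if their volumes differ, which is exactly why S cannot be a function of N alone.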
SW VandeCarr
#7
Feb18-11, 01:55 PM
P: 2,499
Quote Quote by RedX View Post
See: http://hyperphysics.phy-astr.gsu.edu...entropgas.html

for the entropy of an ideal gas. It contains lots of variables besides N. Notice how it varies with energy as a log. T = (2/(3Nk))E for a monatomic ideal gas, so you would plug in a = 2/(3Nk)
in the equation:

[tex]S=\frac{\log E}{a} [/tex]

For an ideal gas you need to include volume. For a solid volume is not really important. As long as energy is proportional to temperature, you'll get an entropy dependence on energy as a log.
Thanks for the link. I would note that in your expressions, N is the only variable in the relation between T and E. So [tex]S=\frac{\log E}{a}[/tex] should be equivalent to Boltzmann's [tex]S=k \log W[/tex] and Gibbs' [tex]S=-k\sum p_{i} \log p_{i}[/tex].
Andrew Mason
#8
Feb18-11, 03:48 PM
Sci Advisor
HW Helper
P: 6,671
Quote Quote by SW VandeCarr View Post
Thanks for the link. I would note that in your expressions, N is the only variable in the relation between T and E. So [tex]S=\frac{\log E}{a}[/tex] should be equivalent to Boltzmann's [tex]S=k \log W[/tex] and Gibbs' [tex]S=-k\sum p_{i} \log p_{i}[/tex].
That is not what RedX said. He said that [tex]S=\frac{\log E}{a}[/tex] does not take into account changes in volume, whereas the correct equations do.

[tex]\Delta S = \int dQ/T = \int dU/T + \int PdV/T[/tex]

For a monatomic ideal gas dU = nCvdT = 3NkdT/2, and with no change in volume (PdV = 0) over a reversible path between states i and f:

[tex]\Delta S_{i-f} = \int_{T_i}^{T_f} dU/T = 3Nk/2\int_{T_i}^{T_f} dT/T = \frac{3Nk}{2}\ln\left(\frac{T_f}{T_i}\right)[/tex]

Since T = 2U/3Nk, we have:

[tex]\Delta S = \frac{3Nk}{2}\ln\left(\frac{U_f}{U_i}\right)[/tex]

Caution: this is only true if there is no change in volume over a reversible path between the two states i and f.
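The constant-volume result can be checked by numerically integrating dS = dU/T with dU = (3Nk/2) dT. A sketch with Nk set to 1 for simplicity:

```python
from math import log

def delta_s_numeric(t_i, t_f, nk=1.0, steps=100000):
    """Integrate dS = (3/2) N k dT / T at constant volume (midpoint rule)."""
    dt = (t_f - t_i) / steps
    s = 0.0
    for i in range(steps):
        t = t_i + (i + 0.5) * dt
        s += 1.5 * nk * dt / t
    return s

numeric = delta_s_numeric(300.0, 600.0)
analytic = 1.5 * log(600.0 / 300.0)  # (3Nk/2) ln(Tf/Ti) with Nk = 1
```

The numeric integral matches (3Nk/2) ln(Tf/Ti), confirming the closed form above for this constant-volume path.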

AM

