Internal energy and thermodynamic entropy

In summary, the conversation discusses the practical treatment of internal energy as a state function of a system, whether thermodynamic entropy can be defined in terms of kinetic energy alone, the role of temperature in determining entropy, and how the entropy of an ideal gas relates to the number of particles in the system.
  • #1
SW VandeCarr
As a practical matter, the internal energy of a system is treated as a state function of the system and is concerned only with the kinetic energy of particles, not the potential energy. So for thermodynamic entropy, S=E/T (S = entropy, T = absolute temperature), I'm considering E to be kinetic energy.

Now thermodynamic temperature has been defined as the average kinetic energy per particle. If so, we can say T=E/N where E is the internal (kinetic) energy of a system. Then S=E/T=E/(E/N)=N. That is, S becomes a dimensionless number equal to (or a direct function of) the number of particles in the system. The Boltzmann equation S=k log W considers only the number of particles in a system since [tex]W=N!/\Pi N_{i}![/tex].

If S=f(N), then the entropy of a system of N particles would seem to be constant. That would mean it doesn't matter if you add or remove energy or dilute or concentrate the particles. S is always constant unless you add or remove particles. Is this true?
 
  • #2
SW VandeCarr said:
As a practical matter, the internal energy of a system is treated as a state function of the system and is concerned only with the kinetic energy of particles, not the potential energy.
This is true only of an ideal gas. It is not true of steam, for example.

So for thermodynamic entropy, S=E/T, (S=entropy, T= absolute temperature) I'm considering E to be kinetic energy.
This is not the definition of entropy. dS = dQ/T where dQ = dU + dW = dU + PdV. In your equation E appears to be the same as U, the internal energy.

Now thermodynamic temperature has been defined as the average kinetic energy per unit particle.
This is not how temperature is defined. T is directly related to average kinetic energy of a molecule as follows:

[tex]KE_{avg} = 3kT/2[/tex]

where k is the Boltzmann constant, with units of joules per kelvin (J/K).
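For concreteness, here is a minimal numeric sketch of that relation in Python (the temperature value is just an assumed example):

[code]
# Average translational kinetic energy per molecule, KE_avg = 3kT/2.
k = 1.380649e-23  # Boltzmann constant, J/K

def avg_kinetic_energy(T):
    """Average translational KE of one molecule at absolute temperature T (K)."""
    return 1.5 * k * T

print(avg_kinetic_energy(300.0))  # ~6.2e-21 J at room temperature
[/code]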
If so, we can say T=E/N where E is the internal (kinetic) energy of a system. Then S=E/T=E/(E/N)=N. That is, S becomes a dimensionless number equal to (or a direct function of) the number of particles in the system. The Boltzmann equation S=k log W considers only the number of particles in a system since [tex]W=N!/\Pi N_{i}![/tex].

If S=f(N), then the entropy of a system of N particles would seem to be constant. That would mean it doesn't matter if you add or remove energy or dilute or concentrate the particles. S is always constant unless you add or remove particles. Is this true?
No. If you take into account what I have said above, you will avoid this confusion. S has dimensions of energy/temperature. You are equating Q with U. All you are saying is that if U is proportional to T (which it is in an ideal gas), then dU/dT will be constant. That is true, but that has nothing to do with entropy. Entropy change is heat flow divided by temperature: ΔS = Q/T for reversible heat flow at constant temperature.
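As a minimal numeric sketch of this point (all values assumed), entropy change carries units of J/K:

[code]
# Entropy change for reversible heat flow at constant temperature.
Q = 1000.0   # heat absorbed, J (assumed example)
T = 300.0    # absolute temperature, K (assumed example)
delta_S = Q / T
print(delta_S)  # ~3.33 J/K -- entropy has dimensions of energy/temperature
[/code]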

AM
 
  • #3
So you're saying:

dS=dE/T

which is fine when the volume is held fixed, so the PdV term vanishes.

Integrating both sides you get:

[tex]S=\int dE/T [/tex]

But T is a function of just E and not V (try to guess why), so it ought to be:

[tex]S=\int dE/T(E) [/tex]

But now you say that T(E)=aE, where 'a' is a proportionality constant.

Then the integral becomes:

[tex]S=\int \frac{dE}{aE}=\frac{\ln E}{a}[/tex]

So your entropy depends on the log of energy. And notice that this is the result for how the entropy of a solid varies with energy in the high energy limit!

But it's not quite the exact result. The reason is that T is proportional to E only at high energies, whereas the integral assumes T is proportional to E even at low energies. Still, the integral gives an accurate result when you have a lot of energy: it is dominated by the (large) high-energy region, so a few inaccuracies at the low-energy end don't matter.
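To make this concrete, here is a small Python sketch (all values assumed) that integrates dS = dE/T(E) numerically with T(E) = aE and checks it against the closed form ln(E)/a:

[code]
import math

# Numerically integrate dS = dE / (a*E) from E0 to E1 (midpoint rule)
# and compare with the closed form (ln E1 - ln E0) / a.
a = 2.0               # assumed proportionality constant in T(E) = a*E
E0, E1 = 1.0, 10.0    # assumed energy range
n = 100000
dE = (E1 - E0) / n
S_numeric = sum(dE / (a * (E0 + (i + 0.5) * dE)) for i in range(n))
S_closed = (math.log(E1) - math.log(E0)) / a
print(S_numeric, S_closed)  # both ~1.151
[/code]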
 
  • #4
Andrew Mason said:
This is true only of an ideal gas. It is not true of steam, for example.

Thank you for your response. Yes, I'm really asking my question in terms of an ideal gas. I want to avoid the specific interactions which occur between particles. I want to consider the case where the specific heat of a gas (J/(kmol·K)) is constant (or nearly so) over some range of temperature.

This is not how temperature is defined. T is directly related to average kinetic energy of a molecule as follows:

[tex]KE_{avg} = 3kT/2[/tex]

This doesn't appear to change the situation of interest since you are only multiplying the variable T by a constant.

You are equating Q with U. All you are saying that if U is proportional to T (which it is in an ideal gas) then dU/dT will be constant. That is true, but that has nothing to do with entropy.

The Boltzmann equation for entropy contains only one variable: N. The Gibbs equation contains only one variable: p. In the first case N is the number of particles in the system; in the second case p is the probability of a particular position-momentum phase-state configuration of N particles at equilibrium.

My question is: Given that dU/dT is constant in an ideal gas, can we say that [tex]U=k(mv^2/2)[/tex] in an ideal gas? In other words, in an ideal gas, does entropy only depend on the number of particles: N?
 
  • #5
SW VandeCarr said:
This doesn't appear to change the situation of interest since you are only multiplying the variable T by a constant.
You seemed to be interested in making the point that entropy is dimensionless. Temperature has units of kelvins, which is not the same as energy.

The Boltzmann equation for entropy contains only one variable: N. The Gibbs equation contains only one variable: p. In the first case N is the number of particles in the system.
This is not correct. The quantity in Boltzmann's equation is the number of microstates that the collection of molecules can have, usually written with the symbol [itex]\Omega[/itex] (or W) to avoid confusion with N, the number of molecules. The number of microstates depends upon the volume and the molecular speeds as well as the number of molecules. It would also depend on the orientation of those molecules if they are not monatomic.
In the second case p is the probability of a particular position/momentum phase state configuration of N particles at equilibrium.
Which, again, depends upon the number of available microstates.
My question is: Given that dU/dT is constant in an ideal gas, can we say that [tex]U=k(mv^2/2)[/tex] in an ideal gas?
No. [itex]U = Nmv_{rms}^2/2 = 3NkT/2[/itex]. And this is true only for a monatomic ideal gas, since single atoms have only three (translational) degrees of freedom.
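A short numeric sketch of these relations (the particle count, temperature, and the argon-like atomic mass are assumed example values):

[code]
# Monatomic ideal gas: U = N*m*v_rms^2/2 = 3NkT/2.
k = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23       # one mole of atoms (assumed)
m = 6.63e-26       # mass of an argon atom, kg (assumed example)
T = 300.0          # absolute temperature, K (assumed)

U = 1.5 * N * k * T              # internal energy, J
v_rms = (3 * k * T / m) ** 0.5   # rms speed consistent with U = N*m*v_rms^2/2
print(U, v_rms)                  # ~3.7e3 J and ~433 m/s
[/code]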

In other words, in an ideal gas, does entropy only depend on the number of particles: N?
For the above reasons the answer is no.

AM
 
  • #6
SW VandeCarr said:
In other words, in an ideal gas, does entropy only depend on the number of particles: N?

See: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entropgas.html

for the entropy of an ideal gas. It contains lots of variables besides N. Notice how it varies with energy as a log. T=(2/(3Nk))E for a monatomic ideal gas, so you would plug in a=2/(3Nk)
in the equation:

[tex]S=\frac{\ln E}{a}[/tex]

For an ideal gas you need to include volume. For a solid volume is not really important. As long as energy is proportional to temperature, you'll get an entropy dependence on energy as a log.
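As a sketch of what the linked page gives, here is the Sackur–Tetrode entropy of a monatomic ideal gas in Python; the particle count, volume, mass, and temperature below are assumed example values:

[code]
import math

# Sackur-Tetrode entropy S(N, V, U) of a monatomic ideal gas:
# S = N*k*( ln( (V/N) * (4*pi*m*U / (3*N*h^2))^(3/2) ) + 5/2 )
k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s

def entropy(N, V, U, m):
    """Entropy depends on volume and energy as well as N -- not on N alone."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5) + 2.5)

N, m = 6.022e23, 6.63e-26           # one mole of argon-like atoms (assumed)
V, U = 0.0224, 1.5 * N * k * 300.0  # ~molar volume and U at 300 K (assumed)
print(entropy(N, 2 * V, U, m) - entropy(N, V, U, m))  # N*k*ln(2) ~ 5.76 J/K
[/code]

Doubling the volume at fixed N and U raises S by Nk ln 2, which is exactly the volume dependence that S = (ln E)/a misses.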
 
  • #7
RedX said:
See: http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entropgas.html

for the entropy of an ideal gas. It contains lots of variables besides N. Notice how it varies with energy as a log. T=(2/(3Nk))E for a monatomic ideal gas, so you would plug in a=2/(3Nk)
in the equation:

[tex]S=\frac{\ln E}{a}[/tex]

For an ideal gas you need to include volume. For a solid volume is not really important. As long as energy is proportional to temperature, you'll get an entropy dependence on energy as a log.

Thanks for the link. I would note that in your expressions, N is the only variable in the relation between T and E. So [tex]S=\frac{\ln E}{a}[/tex] should be equivalent to Boltzmann's [tex]S=k\ln W[/tex] and Gibbs' [tex]S=-k\sum p_{i}\ln p_{i}[/tex].
 
  • #8
SW VandeCarr said:
Thanks for the link. I would note that in your expressions, N is the only variable in the relation between T and E. So [tex]S=\frac{\ln E}{a}[/tex] should be equivalent to Boltzmann's [tex]S=k\ln W[/tex] and Gibbs' [tex]S=-k\sum p_{i}\ln p_{i}[/tex].
That is not what RedX said. He said that [tex]S=\frac{\ln E}{a}[/tex] does not take into account changes in volume, whereas the correct equations do.

[tex]\Delta S = \int dQ/T = \int dU/T + \int PdV/T[/tex]

For a monatomic ideal gas dU = nCvdT = 3NkdT/2, and with no change in volume (PdV = 0) over a reversible path between states i and f:

[tex]\Delta S_{i-f} = \int_{T_i}^{T_f} dU/T = 3Nk/2\int_{T_i}^{T_f} dT/T = \frac{3Nk}{2}\ln\left(\frac{T_f}{T_i}\right)[/tex]

Since T = 2U/(3Nk), we have:

[tex]\Delta S = \frac{3Nk}{2}\ln\left(\frac{U_f}{U_i}\right)[/tex]

Caution: this is only true if there is no change in volume over a reversible path between the two states i and f.
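A quick numeric check of this result in Python (N and the temperatures are assumed example values):

[code]
import math

# Constant-volume entropy change of a monatomic ideal gas,
# delta_S = (3Nk/2) * ln(T_f / T_i) = (3Nk/2) * ln(U_f / U_i).
k = 1.380649e-23         # Boltzmann constant, J/K
N = 6.022e23             # one mole (assumed)
T_i, T_f = 300.0, 600.0  # assumed initial and final temperatures, K

delta_S = 1.5 * N * k * math.log(T_f / T_i)
print(delta_S)  # ~8.64 J/K; since U is proportional to T, ln(U_f/U_i) gives the same
[/code]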

AM
 

What is internal energy?

Internal energy is the total energy of a system due to the motion and interactions of its particles. It includes both kinetic energy (energy of motion) and potential energy (energy due to position or configuration).

How is internal energy related to temperature?

The internal energy of a system is directly related to its temperature: as the temperature increases, so does the internal energy. For a monatomic ideal gas the relation is U = 3NkT/2, so doubling the absolute temperature doubles the internal energy.

What is thermodynamic entropy?

Thermodynamic entropy is a measure of the disorder or randomness of a system: it quantifies the number of microscopic arrangements available to the system at a given energy.
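A toy illustration of this counting picture via Boltzmann's S = k ln W (the microstate count below is a hypothetical, deliberately small number):

[code]
import math

# Boltzmann entropy S = k * ln(W) for an assumed microstate count W.
# Real systems have astronomically larger W; this is purely illustrative.
k = 1.380649e-23   # Boltzmann constant, J/K
W = 10 ** 23       # hypothetical number of accessible microstates
S = k * math.log(W)
print(S)  # ~7.3e-22 J/K -- more possible arrangements means higher entropy
[/code]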

How is thermodynamic entropy related to the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of an isolated system tends to increase over time. This means that in any process, the total entropy of the universe will either stay the same or increase, which is why entropy increase is sometimes said to define the "arrow of time."

How can thermodynamic entropy be decreased?

Strictly, the entropy of an isolated system cannot decrease. It is possible, however, to decrease the entropy of a subsystem by removing heat from it or doing work on it; this creates local pockets of order, but the total entropy of the subsystem plus its surroundings still increases.
