IttyBittyBit
After a bit of calculation, I came up with the following quantity for the bit-entropy of a thermodynamic system.
We have the following assumptions:
1. System at thermal equilibrium.
2. Ideal gas.
3. Monatomic gas (i.e. no internal degrees of freedom for particles).
4. All particles have equal mass.
5. Units are such that k_B (the Boltzmann constant) is normalized to 1.
Using just information-theoretic arguments (no assumptions from thermodynamics!) I calculated the raw entropy of such a system to be:
S = (Q/T)[(log(T/T0) - 1) + (log(V/V0) - 1) + log(Q/T)]
(T = temperature, Q = thermal energy, V = volume of the system; T0, V0 = unknown normalizing constants.)
This can be simplified to:
S = (Q/T)[log(Q/T0) + log(V/V0) - 2]
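The simplification can be double-checked symbolically, e.g. with sympy (just a sketch of the algebra; the symbols are declared positive so the logarithms combine as expected):

```python
import sympy as sp

# All quantities positive, so log(a/b) = log(a) - log(b) holds
Q, T, V, T0, V0 = sp.symbols('Q T V T0 V0', positive=True)

S_full = (Q/T)*((sp.log(T/T0) - 1) + (sp.log(V/V0) - 1) + sp.log(Q/T))
S_simplified = (Q/T)*(sp.log(Q/T0) + sp.log(V/V0) - 2)

# The difference reduces to zero: log(T/T0) + log(Q/T) = log(Q/T0)
assert sp.simplify(sp.expand_log(S_full - S_simplified)) == 0
```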
Further, I suspect it works for any system size, even systems that wouldn't be called 'ensembles' in the thermodynamic sense, such as a single particle, in which case Q = 0. (In general, we take Q = total energy minus the kinetic energy of the system's center of mass.)
In addition, we find that dS is proportional to dQ/T (i.e. the Clausius relation for entropy) in the limit where Q >> T (which always holds for thermodynamic ensembles), with volume held constant.
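This Clausius-like behavior can be probed numerically. The sketch below assumes the standard monatomic relation Q = (3/2) N T (with k_B = 1) so that T tracks Q as heat is added at constant V; the values of N, T0, and V/V0 are arbitrary placeholders, not from the original derivation:

```python
import math

N = 1e3          # particle count (assumed; large N makes Q >> T)
T0 = 1.0         # arbitrary normalizing constant (assumed)
V_over_V0 = 1e5  # arbitrary fixed volume ratio (assumed)

def S(Q):
    """Simplified entropy formula, with T eliminated via Q = (3/2) N T."""
    T = 2*Q/(3*N)
    return (Q/T)*(math.log(Q/T0) + math.log(V_over_V0) - 2)

Q, dQ = 1e4, 1e-3
T = 2*Q/(3*N)
dS = S(Q + dQ) - S(Q)

# Finite-difference dS should closely match dQ/T at constant volume
print(dS, dQ/T)
```

Under this assumed relation Q/T is constant, so dS = (3N/2) dlog(Q) = dQ/T holds essentially exactly, not only in the Q >> T limit.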
Yet another interesting feature is that the entropy does not vanish in the limit T = 0 (because then Q = 0 too). So it appears the third law of thermodynamics need not hold from a purely information-theoretic standpoint.
Is my formula correct?