Boltzmann constant in the definition of entropy

AI Thread Summary
The Boltzmann constant (kB) serves as the conversion factor that relates the logarithm of the multiplicity to thermodynamic units, specifically joules per kelvin. This relationship arises from statistical mechanics, where entropy can be defined with an arbitrary multiplicative constant and temperature is defined as the derivative of energy with respect to entropy. At equilibrium, two systems in contact share the same temperature, independent of the chosen constant. The ideal gas law then fixes that constant, connecting the Boltzmann constant to the behavior of gases at the molecular level and confirming its role in converting thermal energy to temperature. Ultimately, the Boltzmann constant ensures that the statistical definition of entropy agrees with macroscopic thermodynamic quantities.
Curl
I don't remember why the Boltzmann constant is the perfect number that lets us convert the log of the multiplicity to a unit that works with other SI units. I understand that this constant was "given" units of J/K, but why does it work exactly to convert a rather odd made-up number (the log of the multiplicity) to a unit which macroscopically has meaning (one joule of heat per kelvin)? Why doesn't there have to be a factor of, say, 2, in there as well?
 
Hi,

A good and simple explanation, in my opinion, comes from statistical mechanics. First, you can define an entropy with an arbitrary multiplicative constant that you call "A" or whatever. Then you can define a temperature from that as the derivative of the energy with respect to S. You will find that two systems in contact with each other share the same temperature at equilibrium.
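As a minimal sketch of that argument (my notation, assuming the statistical definition S = A \ln \Omega, with \Omega the multiplicity): the temperature defined above satisfies

\frac{1}{T} = \frac{\partial S}{\partial U} = A \, \frac{\partial \ln \Omega}{\partial U}.

For two systems exchanging energy at fixed total U_1 + U_2, maximizing \ln(\Omega_1 \Omega_2) gives \frac{\partial \ln \Omega_1}{\partial U_1} = \frac{\partial \ln \Omega_2}{\partial U_2}, i.e. T_1 = T_2, and the common factor A cancels from this equilibrium condition.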

This result is independent of the constant "A" that you put in your definition of the statistical entropy, as the sketch above makes explicit. Now, suppose you are interested in ideal gases: in the microcanonical ensemble you can derive an equation of state relating the pressure P, the volume V, the number of particles N, and the temperature T. This result does depend on the constant "A" in your definition of the entropy, so "A" can be fixed by comparison with the ideal gas law, which has been well known and established since the 19th century.
If you do that correctly, you find that, speaking in terms of molecules rather than moles, "A" equals kB, the Boltzmann constant; see the sketch below.
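For example (a sketch using the standard textbook result that the multiplicity of a monatomic ideal gas scales as \Omega \propto V^N U^{3N/2}): with S = A \ln \Omega,

\frac{P}{T} = \left( \frac{\partial S}{\partial V} \right)_{U,N} = \frac{A N}{V} \quad \Longrightarrow \quad P V = N A T.

Comparing with the empirical law P V = N k_B T (equivalently P V = n R T, with k_B = R / N_A) forces A = k_B exactly; a stray factor of 2 in A would show up as a measurable factor of 2 in the gas law.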
 
The Boltzmann constant simply converts units of thermal energy to temperature. For a non-relativistic ideal gas made up of N particles at absolute temperature T, the mean total kinetic energy is

U = \frac{f}{2} N k T,

where f is the number of momentum degrees of freedom per molecule entering the Hamiltonian quadratically, e.g., it's 3 for a monatomic gas, 5 for a diatomic one, and 6 for a more general polyatomic molecule.

Of course, at higher temperatures vibrational degrees of freedom are also excited, and this changes the factor again.
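As a quick worked example (my numbers, not from the post above): for one mole of a monatomic gas, f = 3 and N k = N_A k = R \approx 8.314 \; \mathrm{J/(mol \cdot K)}, so at T = 300 \; \mathrm{K}

U = \frac{3}{2} R T \approx \frac{3}{2} \times 8.314 \; \mathrm{J/(mol \cdot K)} \times 300 \; \mathrm{K} \approx 3.7 \; \mathrm{kJ/mol}.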

In natural units, one usually sets k=1 and gives temperatures in energy units (e.g., MeV or GeV in relativistic heavy-ion collisions, to give the temperature of the created hot and dense fireball undergoing a phase transition from a deconfined quark-gluon plasma (QGP) phase to a hadron-gas phase at a temperature of around T_c=160 \; \mathrm{MeV}).
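For concreteness (a standard unit conversion I'm adding, using k \approx 8.617 \times 10^{-5} \; \mathrm{eV/K}): a temperature quoted in energy units converts back to kelvin via T = E/k, e.g.

T_c = 160 \; \mathrm{MeV} \approx \frac{1.6 \times 10^{8} \; \mathrm{eV}}{8.617 \times 10^{-5} \; \mathrm{eV/K}} \approx 1.9 \times 10^{12} \; \mathrm{K}.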
 
Oh right, temperature was defined from entropy (which was defined to include k), so it works out that way.

Dumb question I guess, sorry.
 