Boltzmann Constant in definition of entropy


Discussion Overview

The discussion revolves around the role of the Boltzmann constant in the definition of entropy, particularly its function in converting the logarithm of multiplicity into SI units that have macroscopic meaning. Participants explore the implications of this constant in statistical mechanics and thermodynamics.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • Some participants question why the Boltzmann constant is necessary for converting the log of multiplicity into a meaningful unit and why it does not require an additional factor.
  • One participant suggests that the definition of entropy can include an arbitrary constant, which does not affect the temperature at equilibrium for two systems in contact.
  • Another participant explains that the Boltzmann constant relates thermal energy to temperature: for a non-relativistic ideal gas, the mean kinetic energy per particle is proportional to the temperature.
  • A later reply notes that temperature was defined from entropy, which includes the Boltzmann constant, implying a relationship between the two concepts.

Areas of Agreement / Disagreement

Participants express varying levels of understanding regarding the necessity and implications of the Boltzmann constant in entropy definitions. There is no consensus on the foundational reasons for its role, and multiple viewpoints are presented without resolution.

Contextual Notes

Some discussions touch on the dependence of results on the choice of constants in entropy definitions and the implications for different systems, such as ideal gases and relativistic contexts.

Curl
I don't remember why the Boltzmann constant is the perfect number that lets us convert the log of the multiplicity to a unit that works with other SI units. I understand that this constant was "given" units of J/K, but why does it work exactly to convert a rather odd made-up number (the log of the multiplicity) to a unit which macroscopically has meaning (one joule of heat per kelvin)? Why doesn't there have to be a factor of, say, 2 in there as well?
 
Hi,

A good and simple explanation, in my view, comes from statistical mechanics. First, you can define an entropy with an arbitrary multiplicative constant, call it "A" or whatever. Then you can define a temperature from that as the derivative of the energy with respect to S. You will find that two systems in contact with each other share the same temperature at equilibrium.

This result is independent of the constant "A" that you put in your definition of statistical entropy. Now, if you are interested in ideal gases, in the microcanonical ensemble you can find an equation of state relating the pressure P, the volume V, the number of particles N, and the temperature. This result does depend on the constant "A" in your definition of the entropy, so "A" can be fixed by comparison with the ideal gas law, which has been well known and established since the 19th century.
If you do that correctly, you find that, speaking in terms of molecules rather than moles, "A" equals kB, the Boltzmann constant.
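That last matching step can be made concrete numerically: in molar form the ideal gas law uses the gas constant R, so per molecule the constant comes out as kB = R / NA. A minimal sketch (using the 2019 SI values of R and Avogadro's number):

```python
# Fixing the constant "A" by matching the per-molecule ideal gas law
# P V = N A T against the molar form P V = n R T (with n = N / N_A):
# this forces A = R / N_A, i.e. the Boltzmann constant.

R = 8.314462618      # molar gas constant, J/(mol K)
N_A = 6.02214076e23  # Avogadro's number, 1/mol

k_B = R / N_A        # J/K per molecule
print(k_B)           # ~1.380649e-23 J/K
```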
 
The Boltzmann constant simply converts units of thermal energy to temperature. For a non-relativistic ideal gas made up of [tex]N[/tex] particles at absolute temperature [tex]T[/tex], the mean kinetic energy per particle is

[tex]U=\frac{f}{2} k T[/tex],

where [tex]f[/tex] is the number of momentum degrees of freedom entering the Hamiltonian quadratically; e.g., for a monatomic gas it's 3, for a diatomic one it's 5, and for a more general polyatomic one it's 6.

Of course, at higher temperatures vibrational degrees of freedom are also excited, and this changes the factor again.
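As a quick numeric check of the formula above, here is a sketch using f = 3 for a monatomic gas at room temperature (the values of f and T are just illustrative choices):

```python
# Mean kinetic energy per particle, U = (f/2) k T,
# for a monatomic ideal gas (f = 3) at T = 300 K.

k = 1.380649e-23  # Boltzmann constant, J/K
f = 3             # quadratic momentum degrees of freedom (monatomic)
T = 300.0         # temperature, K

U = 0.5 * f * k * T  # joules per particle
print(U)             # ~6.21e-21 J
```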

In natural units, one usually sets [tex]k=1[/tex] and gives temperatures in energy units (e.g., MeV or GeV in relativistic heavy-ion collisions, to quote the temperature of the created hot and dense fireball undergoing a phase transition from a deconfined QGP phase to a hadron-gas phase at around [tex]T_c=160 \; \mathrm{MeV}[/tex]).
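In those natural units, converting a temperature quoted in MeV back to kelvin is just a division by k. A sketch, reusing the 160 MeV figure quoted above:

```python
# Convert a temperature quoted in energy units (MeV, with k = 1)
# back to kelvin: T[K] = E / k, with E in joules.

k = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19  # 1 eV in joules

T_c_MeV = 160.0
T_c_K = T_c_MeV * 1e6 * eV / k
print(T_c_K)  # ~1.86e12 K
```

The huge result (about two trillion kelvin) is why nobody quotes heavy-ion temperatures in SI units.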
 
Oh right, temperature was defined from entropy (which was defined to include k), so it works out that way.

Dumb question I guess, sorry.
 
