gsingh2011
Why is there a logarithm in the entropy formula? Why is it S = k ln(N), where k is the Boltzmann constant and N is the number of microstates? Why isn't it S = N?
The discussion centers on why the logarithm appears in the entropy formula S = k ln(N), where k is the Boltzmann constant and N is the number of microstates, rather than a direct proportionality such as S = N. Participants examine what the logarithm achieves, chiefly that it makes entropy an extensive (additive) quantity when independent systems are combined.
Several competing views are offered on whether the logarithm is the only function with the required property and on the broader implications of entropy's extensiveness; no consensus is reached.
The thread also touches on the definitions of extensive and intensive properties and on the mathematical characterization of functions satisfying f(xy) = f(x) + f(y). These points remain unresolved.
CompuChip said: The reason that I know of is that we require entropy to be an extensive property.
Suppose that we have two systems, with N1 and N2 microstates, respectively, and we join them. From basic combinatorics it follows that the new system has N = N1·N2 microstates.
However, to be an extensive quantity, the entropy should scale as
S = S1 + S2.
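CompuChip's extensivity argument can be checked numerically. Below is a minimal sketch; the microstate counts N1 and N2 are arbitrary illustrative values, not from the thread:

```python
import math

k = 1.380649e-23  # Boltzmann constant in J/K (exact SI value since 2019)

def entropy(N):
    """Boltzmann entropy S = k ln(N) for a system with N microstates."""
    return k * math.log(N)

# Two independent subsystems: the joint system has N1 * N2 microstates.
N1, N2 = 10**6, 10**9

S_joint = entropy(N1 * N2)          # entropy of the combined system
S_sum = entropy(N1) + entropy(N2)   # sum of the subsystem entropies

print(math.isclose(S_joint, S_sum))  # True: S = k ln(N) is extensive
# By contrast, S = k*N fails: k*(N1*N2) is nowhere near k*N1 + k*N2.
```

The logarithm turns the multiplicative combination of microstate counts into the additive combination demanded of an extensive quantity, which a linear choice like S = N cannot do.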
jhjensen said: But is ln the only function for which f(xy) = f(x) + f(y)?
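Not quite: every function of the form f(x) = c·ln(x) (equivalently, a logarithm to any base) satisfies f(xy) = f(x) + f(y), and under mild regularity assumptions such as continuity these are the only solutions of this functional equation on the positive reals. A quick numerical check of the family, with sample points chosen arbitrarily for illustration:

```python
import math

def is_additive_over_products(f, samples):
    """Check f(x*y) == f(x) + f(y) on a list of sample pairs."""
    return all(math.isclose(f(x * y), f(x) + f(y)) for x, y in samples)

pairs = [(2.0, 3.0), (0.5, 8.0), (7.0, 7.0)]

print(is_additive_over_products(math.log, pairs))                   # True
print(is_additive_over_products(math.log2, pairs))                  # True
print(is_additive_over_products(lambda x: 5 * math.log(x), pairs))  # True
print(is_additive_over_products(lambda x: x, pairs))                # False
```

Choosing a different base (or a different constant c) only rescales the entropy; fixing the constant to k is what ties S to conventional thermodynamic units.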