The logarithm in the entropy formula

AI Thread Summary
The discussion centers on the role of the logarithm in the entropy formula, specifically S = k ln(N), where k is the Boltzmann constant and N represents the number of microstates. The logarithm is essential because it ensures that entropy behaves as an extensive property. When combining two systems with N1 and N2 microstates, the total number of microstates becomes N = N1N2, and for entropy to remain extensive, it must satisfy S = S1 + S2. This requirement is rooted in the historical definition of entropy as an extensive quantity, which is crucial for thermodynamic relationships, such as the contribution of entropy to energy changes in a system. The discussion also touches on whether the logarithmic function is unique in satisfying the property f(xy) = f(x) + f(y), with references to entropy changes in various thermodynamic processes.
gsingh2011
Why is there a logarithm in the entropy formula? Why is it S = k ln(N), where k is the Boltzmann constant and N is the number of microstates? Why isn't it S = N?
 
The reason that I know of is that we require entropy to be an extensive property.
Suppose that we have two systems, with N1 and N2 microstates, respectively, and we join them. From basic statistics it follows that the new system has N = N1N2 microstates.

However, to be an extensive quantity, the entropy should scale as
S = S1 + S2.
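A quick numerical sanity check of this additivity (a minimal sketch; the microstate counts below are arbitrary made-up values):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def S(N):
    """Entropy of a system with N microstates: S = k ln(N)."""
    return k * math.log(N)

# Arbitrary (made-up) microstate counts for two subsystems
N1, N2 = 10**20, 10**22

# Joining independent systems multiplies the microstate counts: N = N1 * N2
S_combined = S(N1 * N2)
S_sum = S(N1) + S(N2)

print(S_combined, S_sum)               # both ~1.34e-21 J/K
print(math.isclose(S_combined, S_sum)) # True: the log turns products into sums
```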
 
CompuChip said:
The reason that I know of is that we require entropy to be an extensive property.
Suppose that we have two systems, with N1 and N2 microstates, respectively, and we join them. From basic statistics it follows that the new system has N = N1N2 microstates.

However, to be an extensive quantity, the entropy should scale as
S = S1 + S2.

Why do we want entropy to be an extensive quantity? Multiplying the microstates to calculate the entropy seems just as easy/useful as adding the entropies.
 
Because entropy was defined as an extensive quantity long before people knew about statistical mechanics.
 
Well, again there is a lot I'm omitting, but one good reason is that entropy contributes to the energy of the system as
dE = T dS - p dV + μ dN
and we definitely want that to be extensive, don't we?
(Note, by the way, that the quantities occur in extensive-intensive pairs: two systems with entropy S and temperature T have total entropy 2S but temperature T, two systems with pressure p and volume V have total volume 2V but pressure p, etc.)
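To make that extensive/intensive pairing concrete, here is a small illustrative sketch (my own example, not from the thread) using the Sackur-Tetrode entropy of a monatomic ideal gas: doubling the extensive variables N, V, and U doubles S, while T and p stay the same.

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
m = 6.6e-27         # atomic mass, kg (roughly helium; illustrative)

def sackur_tetrode(N, V, U):
    """Entropy S(N, V, U) of a monatomic ideal gas (Sackur-Tetrode equation)."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

def temperature(N, U):
    """T from the equipartition result U = (3/2) N k T."""
    return 2 * U / (3 * N * k)

def pressure(N, V, T):
    """Ideal-gas pressure p = N k T / V."""
    return N * k * T / V

N, V, U = 1e23, 1e-3, 100.0  # made-up reference state

for scale in (1, 2):         # double every extensive variable
    n, v, u = scale * N, scale * V, scale * U
    T = temperature(n, u)
    print(f"scale={scale}: S={sackur_tetrode(n, v, u):.4e} J/K, "
          f"T={T:.2f} K, p={pressure(n, v, T):.2f} Pa")
# S doubles with the system size; T and p are unchanged.
```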
 
But is ln the only function for which f(xy) = f(x)+f(y)?
 
ΔS for n moles of a gas in an isothermal expansion is
ΔS = ∫ from V1 to V2 of nR dV/V = nR ln(V2/V1).
Given that, in statistical mechanics, the entropy change from a state with multiplicity W1 to one with multiplicity W2 is k ln(W2/W1), it should follow that
ΔS = ∫ from W1 to W2 of k dW/W = k ln(W2/W1).
And since W2 counts all the possible states in phase space and W1 is a single state,
S = k ln W.
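A quick numerical check of the first line (a minimal sketch with made-up volumes): numerically integrating nR/V from V1 to V2 reproduces nR ln(V2/V1).

```python
import math

R = 8.314462618        # gas constant, J/(mol*K)
n = 2.0                # moles (arbitrary)
V1, V2 = 0.010, 0.025  # m^3, arbitrary initial and final volumes

# Closed form: delta S = n R ln(V2/V1)
dS_exact = n * R * math.log(V2 / V1)

# Numerical integration of n R dV / V with a simple midpoint rule
steps = 100_000
dV = (V2 - V1) / steps
dS_numeric = sum(n * R * dV / (V1 + (i + 0.5) * dV) for i in range(steps))

print(dS_exact, dS_numeric)                               # both ~15.24 J/K
print(math.isclose(dS_exact, dS_numeric, rel_tol=1e-6))   # True
```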
 
jhjensen said:
But is ln the only function for which f(xy) = f(x)+f(y)?

I answered that mathematically in your other thread
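(For reference, a minimal sketch of the standard argument, assuming f is differentiable on (0, ∞): differentiate f(xy) = f(x) + f(y) with respect to y and set y = 1 to get x f'(x) = f'(1) = c, so f'(x) = c/x and f(x) = c ln(x) + C. Putting x = y = 1 in the original equation gives f(1) = 2 f(1), so f(1) = 0 and C = 0, leaving f(x) = c ln(x); choosing c = k recovers S = k ln(N).)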
 
