Boltzmann Equation / Statistical Mechanics


Homework Help Overview

The discussion revolves around the Boltzmann equation and concepts in statistical mechanics, specifically focusing on the relationship between entropy and multiplicity. The original poster attempts to demonstrate that the function relating entropy to multiplicity is the natural logarithm of multiplicity.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants explore the definition of multiplicity and its relation to probabilities, with attempts to derive the logarithmic relationship. Questions arise regarding the completeness of the proof and the reasoning behind using the logarithm in the context of entropy.

Discussion Status

The discussion has seen participants providing insights into the properties of entropy and multiplicity, with some suggesting that the logarithmic function is essential due to its mathematical properties. There is acknowledgment of the original poster's progress, though some uncertainty remains about the proof's robustness.

Contextual Notes

Participants are working within the framework of statistical mechanics and are considering the implications of combining systems with defined entropies and multiplicities. The conversation hints at the need for clarity in definitions and the assumptions underlying the relationships discussed.

spaphy

Homework Statement



If we assume entropy is a function of the multiplicity \(\Omega\), i.e. \(S = k\,f(\Omega)\), show that the function \(f(\Omega)\) is \(\ln\Omega\).

Homework Equations


The Attempt at a Solution



\(\Omega\) can be written as \(N!/\prod_i n_i!\). By using Stirling's approximation, this becomes \(\Omega \approx (N/e)^N / \big((n_1/e)^{n_1}(n_2/e)^{n_2}\cdots(n_i/e)^{n_i}\big)\). We know that the probability is \(p_i = n_i/N\), and since \(\sum_i n_i = N\) the factors of \(e\) cancel, so this reduces to \(\Omega = 1/\big(p_1^{n_1}p_2^{n_2}\cdots p_i^{n_i}\big)\). To make this user friendly, take the log: \(\ln\Omega = -\sum_i n_i \ln p_i = -N\sum_i p_i \ln p_i\).

I just started down the road of trying to use the definition of multiplicity and probabilities, and I did get to \(\ln\Omega\), but it doesn't seem like I'm really doing a solid proof, and I'm not sure what's missing or how to tie it together.
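The attempt above can be sanity-checked numerically. A minimal sketch (not from the thread; the function names and the occupation numbers are made up for illustration) comparing the exact \(\ln\Omega\), computed with log-gamma, against the Stirling-approximated form \(-\sum_i n_i \ln p_i\):

```python
import math

def log_multiplicity(counts):
    """Exact ln(Omega) = ln N! - sum_i ln n_i!, via the log-gamma function."""
    n_total = sum(counts)
    return math.lgamma(n_total + 1) - sum(math.lgamma(n + 1) for n in counts)

def stirling_form(counts):
    """Stirling-approximated form: -sum_i n_i ln p_i, with p_i = n_i / N."""
    n_total = sum(counts)
    return -sum(n * math.log(n / n_total) for n in counts if n > 0)

counts = [300, 500, 200]           # occupation numbers n_i, so N = 1000
exact = log_multiplicity(counts)
approx = stirling_form(counts)
# Stirling's error is O(ln N), tiny compared to ln(Omega) itself at N = 1000
```

The two values differ only by the dropped \(\tfrac{1}{2}\ln(2\pi n)\) terms of Stirling's formula, which is why the approximation becomes exact in the thermodynamic limit.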
 
Consider two systems, with entropies S1 and S2, multiplicities W1 and W2. What can you say about the entropy of the combined system? What can you say about the multiplicity of the combined system?
 
The total entropy is \(S = S_1 + S_2\) and the multiplicity is \(W = W_1 W_2\). Is the log just out of convenience then?
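A toy illustration of why the multiplicities multiply (the microstate labels here are hypothetical, not from the thread): every microstate of the combined system pairs one microstate of each subsystem, so the counts multiply, and the log turns that product into the sum entropy needs.

```python
import math
from itertools import product

states_1 = ["a", "b", "c"]   # W1 = 3 microstates of system 1
states_2 = ["x", "y"]        # W2 = 2 microstates of system 2

# Each combined microstate is one choice from each subsystem
combined = list(product(states_1, states_2))
w_combined = len(combined)   # W = W1 * W2

# ln W = ln W1 + ln W2: the log makes the combined "f" additive
additive = math.isclose(
    math.log(w_combined),
    math.log(len(states_1)) + math.log(len(states_2)),
)
```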
 
No -- it's the only function (up to a multiplicative constant, which is absorbed into \(k\)) that fits the requirement that \(f(W_1) + f(W_2) = f(W_1 W_2)\).
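For completeness, a standard way to make that claim precise, under the added assumption (not stated in the thread) that \(f\) is differentiable: differentiate the functional equation with respect to \(W_1\), holding \(W_2\) fixed,

```latex
\frac{\partial}{\partial W_1}\, f(W_1 W_2) = W_2\, f'(W_1 W_2) = f'(W_1).
```

Setting \(W_1 = 1\) and writing \(c \equiv f'(1)\),

```latex
W_2\, f'(W_2) = c
\;\Longrightarrow\;
f'(W) = \frac{c}{W}
\;\Longrightarrow\;
f(W) = c \ln W + C.
```

Setting \(W_1 = W_2 = 1\) in the original equation gives \(f(1) = 0\), so \(C = 0\), and the remaining constant \(c\) is absorbed into the prefactor \(k\) in \(S = k\,f(\Omega)\).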
 
Makes sense...thank you. It's been a long week. Nice to finally know where that log came from.
 