Show how the Boltzmann entropy is derived from the Gibbs entropy for equilibrium

  • #1
zi-lao-lan

Homework Statement


Show how the Boltzmann entropy is derived from the Gibbs entropy for systems in equilibrium.


Homework Equations



Gibbs entropy: [tex]S = -\int \rho(p,q)\,\ln \rho(p,q)\; dp\, dq[/tex]
where [tex]\rho(p,q)[/tex] is the probability distribution.

Boltzmann entropy: [tex]S = \ln\Omega[/tex]
where [tex]\Omega[/tex] is the number of microstates in a given macrostate.


The Attempt at a Solution



1. Well, when the system is in equilibrium (i.e. when the Boltzmann entropy can be used), all microstates have equal probability. So each microstate has a probability of [tex]1/\Omega[/tex], and the probability distribution [tex]\rho[/tex] has a constant value regardless of what p and q are.

2. I tried putting [tex]\rho = 1/\Omega[/tex] and substituting it into the Gibbs equation:

[tex]S = -\int \frac{1}{\Omega}\,\ln\frac{1}{\Omega}\; d\Omega[/tex]

using [tex]d\Omega[/tex] since we want to sum over all the microstates, and there are [tex]\Omega[/tex] of them. But I can see that this won't give me the Boltzmann entropy.

Any ideas?
 
  • #2
Your problem is in setting [tex]dp\, dq = d\Omega[/tex], because [tex]\Omega[/tex] is a fixed number, not an integration variable! It is clearer written this way:

[tex]- \int \rho \ln \rho dp dq = - \int \frac{1}{\Omega} \ln \frac{1}{\Omega} dp dq = \left( - \frac{1}{\Omega} \ln \frac{1}{\Omega} \right) \int dp dq = \left( \frac{1}{\Omega} \ln \Omega \right) \Omega = \ln \Omega[/tex]
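To see the algebra numerically, here is a quick sanity check (a sketch of my own, with kB = 1, not part of the thread) using the discrete form of the Gibbs entropy: when every one of the Ω microstates is assigned probability 1/Ω, the sum -Σ pᵢ ln pᵢ collapses to ln Ω, just as the integral does above.

```python
import math

# Sanity check (illustrative, k_B = 1): the discrete Gibbs entropy of a
# uniform distribution over Omega microstates reduces to ln(Omega).
Omega = 1000
p = [1.0 / Omega] * Omega                        # equal probabilities
S_gibbs = -sum(pi * math.log(pi) for pi in p)    # -sum p_i ln p_i
S_boltzmann = math.log(Omega)                    # ln(Omega)
print(S_gibbs, S_boltzmann)
```

Both numbers come out to roughly 6.908, i.e. ln 1000.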
 
  • #3
Thanks for the reply :) But I'm still not sure how you get to the last step.

That means that the integral of dp dq = -Ω, but I can't see why that is.

Is it something to do with normalising it?
 
  • #4
ln(1/Ω) = ln 1 - ln Ω = -ln Ω lol
 
  • #5
Well, in a discrete system, omega is the number of microstates, but we're working with a continuous system here: then omega is the phase-space volume, by which I mean the "volume" in (p,q)-space formed by all the available (p,q) points. I think it's just defined that way, actually.
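That continuous picture can also be checked numerically. The sketch below (my own illustration, with kB = 1 and made-up box dimensions) puts a flat density ρ = 1/V on a rectangular region of (p,q)-space with volume V and evaluates -∫ρ ln ρ dp dq on a grid; the result comes out to ln V, consistent with the manipulation in post #2.

```python
import numpy as np

# Illustrative check (k_B = 1): flat rho = 1/V on a 2D box in (p, q).
# The Gibbs integral -int rho ln(rho) dp dq should equal ln(V).
p_max, q_max = 3.0, 5.0                    # assumed box dimensions
V = p_max * q_max                          # phase-space "volume" (an area here)
n = 400
dp, dq = p_max / n, q_max / n
rho = np.full((n, n), 1.0 / V)             # constant density over the box
S = np.sum(-rho * np.log(rho)) * dp * dq   # Riemann-sum approximation
print(S, np.log(V))
```

Because ρ is constant, the Riemann sum is exact here: both numbers equal ln 15 ≈ 2.708.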
 

1. What is the difference between Boltzmann entropy and Gibbs entropy?

Boltzmann entropy is a measure of the disorder or randomness of a system in which every accessible microstate is equally likely, while Gibbs entropy is defined for an arbitrary probability distribution over microstates. Boltzmann entropy therefore applies to systems in equilibrium, while Gibbs entropy can be used for any system.

2. How is the Boltzmann entropy derived from the Gibbs entropy?

The Boltzmann entropy (S) is obtained from the Gibbs entropy by assuming that all accessible microstates are equally likely, so that each has probability 1/W. Substituting this uniform distribution into the Gibbs formula gives S = kB ln(W), where kB is the Boltzmann constant and W is the number of microstates in the system.
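As a concrete illustration of S = kB ln(W) with SI units restored (the microstate count below is entirely made up):

```python
import math

# Hypothetical numbers only: S = k_B ln(W) in SI units.
k_B = 1.380649e-23      # Boltzmann constant in J/K (exact SI value)
W = 10**23              # assumed microstate count, for illustration
S = k_B * math.log(W)   # entropy in J/K
print(S)
```

This prints about 7.3e-22 J/K.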

3. What is the significance of equilibrium in deriving Boltzmann entropy from Gibbs entropy?

In order for the derivation to be valid, the system must be in thermal equilibrium. This means that the temperature, pressure, and other thermodynamic variables remain constant over time, and that every accessible microstate is equally probable. This assumption of equal a priori probabilities is what lets us calculate the entropy of the system simply by counting microstates.

4. Can the Boltzmann entropy be calculated for non-equilibrium systems?

No, the Boltzmann entropy applies only to systems in equilibrium. For non-equilibrium systems, the more general Gibbs entropy, or information-theoretic measures such as the Shannon entropy or Kolmogorov-Sinai entropy, may be used to measure the level of disorder.

5. How does the Boltzmann entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. The Boltzmann entropy is a measure of the disorder of a system, and so, as the second law states, it will increase or remain constant in an isolated system. This is why entropy increase is often referred to as the "arrow of time" in thermodynamics.
