Confusion about the entropy of mixing

In summary: Gibbs' paradox in statistical mechanics is an apparent discrepancy between thermodynamics and the statistical calculation of entropy when two identical boxes of the same ideal gas are combined. Thermodynamics says the combined entropy is simply the sum of the entropies of the two boxes, while a naive statistical-mechanics count of microstates suggests a significant increase. The reply below resolves the discrepancy: for identical particles the entropy does not change, and statistical mechanics reproduces this once quantum theory, with its indistinguishability of identical particles, is taken into account.
  • #1
sha1000
TL;DR Summary
I'm seeking clarity on a seeming discrepancy between thermodynamics and statistical mechanics concerning the calculation of entropy when two identical boxes are combined. While thermodynamics suggests the combined entropy remains the same, a statistical mechanics viewpoint indicates a significant increase in entropy due to the increase in potential microstates. Can anyone help resolve this?
Hello everyone,

I am seeking some clarification on a question related to thermodynamics and statistical mechanics. My understanding is that when we combine two identical boxes containing the same ideal gas by removing the wall between them, the resulting system's entropy stays the same: the total entropy of the new system is simply the sum of the entropies of the original two boxes (i.e., Stot = S1 + S2 = 2S1, since the boxes are identical).

However, from the standpoint of statistical mechanics, the calculation does not appear to be this straightforward. Suppose we have N1 particles in a volume V1, which gives an entropy S1. If we duplicate this system with a partition in place, we simply double the entropy. But if we remove the partition, we are left with 2N1 particles in a volume 2V1, and when I count the microstates of this new system, the entropy seems to increase significantly because the doubled number of particles now has access to the doubled volume.
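For concreteness, the naive counting I have in mind treats the particles as distinguishable, so the number of spatial configurations scales as ##W \propto V^N## (the momentum part of phase space is unchanged at fixed temperature, so only the volume factors matter), and removing the partition gives
$$\Delta S = k\ln\frac{(2V_1)^{2N_1}}{V_1^{N_1}\,V_1^{N_1}} = 2N_1 k\ln 2,$$
an apparent increase of ##2N_1 k\ln 2## even though, macroscopically, nothing has changed.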

Could anyone shed some light on this apparent discrepancy between these two views?

Thank you in advance for your help!
 
  • #4
sha1000 said:
Could anyone shed some light on this apparent discrepancy between these two views?
If the gases in the two parts of the box consist of identical particles (atoms/molecules), the entropy doesn't change; if they are not identical, there is mixing entropy. Statistical mechanics gets this right once quantum theory is taken into account: identical particles are indistinguishable, so the number of microstates must be divided by N!, and the spurious ##2N_1 k\ln 2## increase from the naive counting disappears.

Quantum statistical mechanics is in any case simpler than classical statistical mechanics. If you know enough quantum theory, it is easier to learn statistical physics starting from quantum many-body theory and, for equilibrium, the maximum-entropy principle, and then take the classical limit to recover the results of classical statistics. The same holds for off-equilibrium statistical mechanics, where the Boltzmann(-Uehling-Uhlenbeck) equation can be derived from the Kadanoff-Baym equations of quantum many-body theory.
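To make this quantitative, here is a sketch using the Sackur-Tetrode entropy of a monatomic ideal gas, which already contains the 1/N! from indistinguishability (standard textbook material, not specific to this thread):
$$S(N,V,T) = Nk\left[\ln\!\left(\frac{V}{N}\left(\frac{2\pi m k T}{h^2}\right)^{3/2}\right) + \frac{5}{2}\right].$$
Because the entropy depends on the volume only through the ratio V/N, doubling both N and V gives
$$S(2N_1, 2V_1, T) = 2N_1 k\left[\ln\!\left(\frac{2V_1}{2N_1}\left(\frac{2\pi m k T}{h^2}\right)^{3/2}\right) + \frac{5}{2}\right] = 2\,S(N_1, V_1, T),$$
so removing the partition changes nothing. Without the 1/N!, the logarithm would contain V instead of V/N, and one would recover the spurious ##2N_1 k\ln 2## of the opening post.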
 

1. What is entropy of mixing?

Entropy of mixing is the increase in entropy when two or more distinct substances are mixed, for example by removing a partition between two different gases at the same temperature and pressure. It reflects the number of additional ways the components can be arranged once they share the same volume.

2. Why is there confusion about the entropy of mixing?

Much of the confusion comes from Gibbs' paradox, discussed in the thread above: mixing two different gases increases the entropy, while removing a partition between two samples of the same gas does not, even though the two operations look superficially identical. Entropy of mixing is also often conflated with entropy in general, which is a measure of the number of microstates accessible to a system.

3. How is entropy of mixing calculated?

For an ideal mixture, the entropy of mixing is calculated from the mole fractions of the components: ΔSmix = −nR∑(xi ln xi), where n is the total amount of substance, R is the gas constant, and xi is the mole fraction of component i. Each term is positive (since xi < 1), reflecting the larger number of ways the components can be arranged once they share the same volume.
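As a worked example (assuming an ideal equimolar binary mixture, so ##x_1 = x_2 = \tfrac12##):
$$\Delta S_\text{mix} = -R\left(\tfrac12\ln\tfrac12 + \tfrac12\ln\tfrac12\right) = R\ln 2 \approx 5.76\ \mathrm{J\,K^{-1}\,mol^{-1}}$$
per mole of mixture. The same formula gives zero if the two "components" are in fact the same substance, since there is then only one mole fraction, ##x = 1##.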

4. Does entropy of mixing always increase?

Not necessarily. For an ideal mixture of distinct components, the formula above always gives a positive entropy of mixing. It is exactly zero when the "components" are in fact the same substance (the situation in the thread above), and in strongly non-ideal mixtures the entropy change on mixing can even be negative, for example when the components form ordered structures or strong associations in solution.

5. How does entropy of mixing relate to thermodynamic stability?

Entropy of mixing is closely related to thermodynamic stability through the Gibbs free energy of mixing, ΔGmix = ΔHmix − TΔSmix. A positive entropy of mixing lowers ΔGmix and therefore favours the mixed, single-phase state; this is why substances with a mildly unfavourable (positive) enthalpy of mixing can still mix at sufficiently high temperature, and why phase separation becomes more likely when the entropy gain on mixing is small.
