Resolution to Gibbs' entropy paradox?

In summary, Gibbs' Paradox (that the entropy of a classical ideal gas, calculated by phase-space volume, is not extensive) can be resolved without assuming that particles are indistinguishable: if every particle is distinguishable in principle, then counting which of all existing particles happen to be in the box supplies the missing 1/N! factor. The discussion below connects this counting argument to the relationship between information and entropy.
  • #1
maline
It seems to me that Gibbs' Paradox (that the entropy of a classical ideal gas, calculated by phase-space volume, is not extensive) can be resolved without assuming that particles are indistinguishable.
Suppose instead the opposite: that particles are distinguishable, meaning that each one can in principle be identified. Imagine a minuscule serial number stamped on every molecule. Now this should apply not only to the system under consideration, but to the universe as a whole: the serial numbers run from one to A, where A is the total number of molecules (of a particular type) in the universe.
This immediately implies that specifying the position and momentum of each of the N particles in our box does not fully determine the microstate of the system! We must also specify which, out of the A molecules in existence, are in fact the N ones in the box. The total number of microstates should include all such possibilities.
This means multiplying the phase-space volume by "A choose N", that is, A!/(N!(A-N)!). Since A >> N, the factor A!/(A-N)! tends to A^N. Thus we are left with the desired factor of 1/N!, giving the standard (extensive) entropy, plus an additional contribution of N log(A) that is also extensive and (I think) has no observable effects.
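A quick numerical sketch of this limit (my own illustration; the values of A and N below are arbitrary) checks that log C(A, N) approaches N log A - log N! when A >> N, so the binomial factor supplies the 1/N! correction together with the extensive N log A term:

```python
# Numerical check (illustrative only): for A >> N, log of "A choose N"
# approaches N*log(A) - log(N!).
from math import lgamma, log

def log_binomial(A, N):
    """log of 'A choose N', computed via log-gamma to avoid huge integers."""
    return lgamma(A + 1) - lgamma(N + 1) - lgamma(A - N + 1)

A = 10**9      # molecules of this type "in the universe" (illustrative)
N = 10**3      # molecules in the box (illustrative)

exact = log_binomial(A, N)
approx = N * log(A) - lgamma(N + 1)   # N log A - log N!

print(f"log C(A, N)      = {exact:.6f}")
print(f"N log A - log N! = {approx:.6f}")
print(f"relative error   = {abs(exact - approx) / exact:.1e}")
```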
 
  • #2
maline said:
It seems to me that Gibbs' Paradox (that the entropy of a classical ideal gas, calculated by phase-space volume, is not extensive) can be resolved without assuming that particles are indistinguishable.

You seem to have stumbled upon Jaynes' solution, which results in a connection between information and entropy:

http://www.santafe.edu/media/workingpapers/07-08-029.pdf
 
  • #3
Where in the book is this?
 
  • #4
maline said:
Where in the book is this?

Chapter 5.
 
  • #5
I don't see my argument there. All I see is the assertion by Gibbs that identical particles must be treated as indistinguishable. There is a completely different conception of entropy there, due to Jaynes, in which the entropy is a function of a probability distribution that we assign, rather than of the physical system per se. I am working with the original phase-space volume concept, and simply pointing out that if particles are distinguishable, then there are more possibilities for "different" microstates.
 
  • #6
maline said:
I don't see my argument there.

Then keep reading, and that includes searching the literature; there's a lot of it. There's a clear connection between information content and distinguishable particles.

Edit: how about this: a thought experiment. Consider a box partly full of spheres; when you look inside you see that a partition separates one half, containing green spheres, from the other half, which has red spheres. However, when I look in the box, because I have special glasses that only transmit luminance values, I see a partition separating equal numbers of grey spheres.

Now we remove the partition and shake up the box. When we look inside and compare the final state to the initial state, do we report the same or different increases of entropy? (and explain your answer)
 
  • #8
Andy Resnick said:
how about this: a thought experiment. Consider a box partly full of spheres; when you look inside you see that a partition separates one half, containing green spheres, from the other half, which has red spheres. However, when I look in the box, because I have special glasses that only transmit luminance values, I see a partition separating equal numbers of grey spheres.

Now we remove the partition and shake up the box. When we look inside and compare the final state to the initial state, do we report the same or different increases of entropy? (and explain your answer)
Yes, this is a standard statement of the paradox. The answer is well-known: there is an increase in entropy, but the guy with the glasses will not know this. But what exactly is his error? According to the standard account, the problem is that he wrongly thinks the balls are indistinguishable, and therefore undercounts the microstates in the final situation. I am suggesting the opposite: if the balls are in principle distinguishable (and macroscopic balls certainly are), then his glasses cause him to overcount the microstates initially, by neglecting the color constraint that limits which balls can be on each side.
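A minimal counting sketch of this point (my own construction; N is arbitrary): the color-aware observer allows only one initial configuration, while the grey-glasses observer counts C(2N, N) configurations both before and after the partition is removed, so he overcounts the initial microstates and reports no entropy change.

```python
# Toy count for the two-color box (illustrative sketch, N chosen arbitrarily).
# A "microstate" here is just the set of balls on the left side; the balls are
# distinguishable in principle (imagine serial numbers).
from math import comb, log

N = 50          # balls of each color
k = 1.0         # Boltzmann constant in arbitrary units

W_final = comb(2 * N, N)   # after shaking: any N of the 2N balls on the left

# Observer who sees the colors: initially only one allowed left-hand set
# (the N green balls), so the color constraint is respected.
dS_color = k * (log(W_final) - log(1))

# Observer with the grey glasses: neglects the color constraint and counts
# comb(2N, N) configurations even before the partition is removed --
# an overcount of the initial microstates, as argued above.
dS_grey = k * (log(W_final) - log(comb(2 * N, N)))

print(f"entropy increase seen with colors: {dS_color:.2f}  (~ 2N ln 2 = {2 * N * log(2):.2f})")
print(f"entropy increase seen in grey:     {dS_grey:.2f}")
```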
 
  • #9
HPt said:
I published this solution in 2010 in the Journal of Statistical Physics http://www.springerlink.com/openurl.asp?genre=article&id=doi:10.1007/s10955-010-0077-7 and, more concisely, in 2014 in the European Journal of Physics http://dx.doi.org/10.1088/0143-0807/35/1/015023
Unfortunately, the solution is still widely unknown.
This is great! It restores my faith in the existence of common sense in physics. I especially love this quote:

"Secondly, the resolution above implies that the mere concept of distinguishable identical particles is at odds with thermodynamics [12]. In the opinion of the present author, such a connection between these otherwise unrelated subjects would be rather surprising, to say the least."
 
  • #10
maline said:
Yes, this is a standard statement of the paradox. The answer is well-known: there is an increase in entropy, but the guy with the glasses will not know this. But what exactly is his error? According to the standard account, the problem is that he wrongly thinks the balls are indistinguishable, and therefore undercounts the microstates in the final situation. I am suggesting the opposite: if the balls are in principle distinguishable (and macroscopic balls certainly are), then his glasses cause him to overcount the microstates initially, by neglecting the color constraint that limits which balls can be on each side.

I would caution you against using 'error', 'wrongly', etc., because you are making some assumptions. The basic point, the essential starting point, is only that we initially report different values. Our values become equal once you provide me with additional *information*.

There's a deep principle here, because it goes to the heart of 'hidden variables' and how we can decide whether we have complete information about a system. Surely you would allow that perhaps you did not notice that there were actually *4* colors instead of 2, so your calculation is 'wrong' as well. This has led some to wonder whether entropy is not an objective, external feature of the universe but instead an anthropomorphic, subjective one.

Since there are no hidden variables, we instead postulate that the difference in reported values is quantified by information: 1 bit of information has entropy k ln(2).

http://www3.imperial.ac.uk/pls/portallive/docs/1/55905.PDF
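As a quick arithmetic check of that last statement (my own numbers, not part of the post): one bit corresponds to k ln 2, roughly 9.6e-24 J/K, and one color bit per ball accounts for the 2Nk ln 2 gap between the two observers' reported entropy changes.

```python
# Arithmetic for "1 bit of information has entropy k ln(2)" (illustrative check).
from math import log

k_B = 1.380649e-23       # Boltzmann constant, J/K

print(f"entropy of one bit: {k_B * log(2):.3e} J/K")

# One color bit per ball: 2N hidden bits account for the 2N k ln 2 gap
# between the two observers' reported entropy changes.
N = 50
print(f"entropy of {2 * N} color bits: {2 * N * k_B * log(2):.3e} J/K  (= 2N k ln 2)")
```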
 

1. What is Gibbs' entropy paradox?

Gibbs' entropy paradox arises in classical statistical mechanics: if the entropy of an ideal gas is computed from the phase-space volume of distinguishable particles, the result is not extensive. Removing a partition between two equal samples of the same gas then appears to produce an entropy of mixing, even though thermodynamically nothing has changed.

2. How was Gibbs' entropy paradox resolved?

The standard resolution divides the number of microstates by N!, on the grounds that identical particles are indistinguishable; this makes the entropy extensive and removes the spurious mixing term. The thread above discusses an alternative route to the same 1/N! factor that keeps the particles distinguishable: one must also count which of all existing particles of that type happen to be in the system. Jaynes' information-based treatment, linked in post #2, makes the connection between entropy and the observer's information explicit.
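A minimal sketch (illustrative units and values, not from the thread) of the extensivity point: with the bare phase-space count, doubling V and N fails to double the entropy by 2N ln 2, while subtracting ln N! leaves only a negligible O(ln N) residue.

```python
# Extensivity check with and without the 1/N! counting factor (toy units).
from math import lgamma, log

def S_no_factor(V, N):
    """Entropy / k from a bare phase-space volume ~ V**N (constants dropped)."""
    return N * log(V) + 1.5 * N

def S_with_factor(V, N):
    """Same, minus ln N! from the corrected counting."""
    return S_no_factor(V, N) - lgamma(N + 1)

V, N = 1.0, 10_000
for S in (S_no_factor, S_with_factor):
    # An extensive entropy should give (close to) zero here; the corrected
    # version leaves only an O(ln N) residue.
    print(f"{S.__name__}: S(2V, 2N) - 2 S(V, N) = {S(2 * V, 2 * N) - 2 * S(V, N):.2f}")
```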

3. Why is Gibbs' entropy paradox important?

The paradox forces a careful statement of what is being counted when entropy is computed from microstates, and it exposes the link between thermodynamic entropy and the information an observer has about a system, a link developed by Jaynes and closely related to Shannon's information entropy.

4. Can Gibbs' entropy paradox be observed in real-life systems?

The paradox itself is a bookkeeping inconsistency rather than a physical effect, so nothing anomalous is observed in the laboratory. Its physical counterpart is the entropy of mixing: mixing two different gases increases the entropy, while "mixing" two samples of the same gas does not, exactly as the corrected counting predicts.

5. How does the resolution of Gibbs' entropy paradox impact our understanding of the universe?

It clarifies how extensivity arises in statistical mechanics, why the ability to distinguish particles matters for entropy, and how thermodynamic entropy relates to information. These themes run throughout modern statistical mechanics and information theory.
