What is the Mixing Paradox and its implications for entropy change?

AI Thread Summary
The discussion centers on the paradox of entropy change when mixing gases in a partitioned box. Initially, it seems that removing the partition between two identical gases should not result in an entropy change, but this contradicts the mathematical predictions. Participants argue that the increase in entropy is valid because the number of ways to arrange molecules increases when the gases mix, despite them being indistinguishable. The conversation also touches on the idea that entropy is not an inherent property of a system but rather relates to the information available about the system. Ultimately, the consensus is that mixing entropy can be neglected if the distinctions between molecules are not relevant to the analysis.
JesseC
Just learned about it recently, and I'm curious to understand more. The 'mixing' form of the paradox goes something like this:

Imagine a box partitioned in the middle. The two compartments have the same volume. In each there is a gas: on one side labelled A, on the other B. They are held at exactly the same temperature and pressure.

Now suppose we get rid of the partition and allow the two to mix. Thus each gas can now occupy twice its original volume. Mathematically we can calculate the expected change of entropy to be

\Delta S = (n_A + n_B) R \ln 2

or something a bit like that anyway, the details of the mathematics are not particularly important I don't think.
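For concreteness, the standard calculation behind that expression (a sketch, assuming ideal gases and an isothermal free expansion of each gas from V to 2V) goes like:

\Delta S_A = n_A R \ln\frac{2V}{V} = n_A R \ln 2, \qquad \Delta S_B = n_B R \ln 2

\Delta S = \Delta S_A + \Delta S_B = (n_A + n_B) R \ln 2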

But then suppose we had the same gas in each compartment, A = B. Mathematically the entropy change would then become

\Delta S = 2 n_A R \ln 2

However, when thought about carefully, it is impossible for an entropy change to occur in this situation. If each side were to expand into the volume of the other and we then replaced the partition, we'd have exactly the picture we started with, so no entropy change can have occurred.

What bothers me is that if the maths predicts something incorrectly, it suggests something is going wrong with the details, or the definitions, or the theory. Getting around the problem by saying the particles are indistinguishable is fine, but it doesn't make the mathematical answer correct! Has this been resolved?
 
I don't know if this is answering your question, but I'll just post it.

The entropy really does increase. The number of ways you can arrange the individual molecules goes up, because you have all the ways that you had with the partition, plus all the ways you can arrange it by swapping any number of molecule pairs between the two sides (while keeping the gas density equal on both sides). To make the point more clear, imagine a chamber with N molecules, but it is partitioned into N sub-chambers, each with one molecule. Then if you remove all the partitions, there is still the same density of gas overall, but now there are a huge number of ways to arrange the N molecules, where before there was only 1 way.
 
johng23 said:
The entropy really does increase. The number of ways you can arrange the individual molecules goes up...

Consensus physics disagrees. JesseC specified that there is only one gas present; therefore, the molecules are indistinguishable. One can't tell if one has swapped a pair of molecules in this case.
 
Andy, I read this article some years ago while I was thinking about the entropy change on mixing substances made of larger molecules than the O2 or N2 usually considered in textbooks. Specifically, I had in mind a comment by Roald Hoffmann, who pointed out that in the human body there are most probably no two identical molecules of hemoglobin, because of the huge number of ways the different isotopes of hydrogen, carbon, nitrogen and oxygen can be distributed in the molecule. Nevertheless, hemoglobin is considered a pure substance on a macroscopic level.
So, if we would start a heavy discussion and would end up on the street fighting with our knives, would the entropy of our bloods increase on mixing?
 
DrDu,

The article by Jaynes suggested by Andy Resnick answers all these questions.
Specifically, as long as you do not "probe" properties that could differentiate between molecules, you could just as well neglect the mixing entropy, or pretend there is no mixing entropy.

To push it even further, you could apply the same reasoning when mixing oxygen and nitrogen. As long as you don't "identify" differences between these two gases, you can just pretend there is no mixing entropy.

From Jaynes, I learned that entropy should be related to the properties we make relevant in an experiment (voluntarily or involuntarily, of course!). Making something relevant relates either to the preparation of the system or to its observation. For example, if oxygen and nitrogen are separated by a selective membrane, then you had better think about the differences between O2 and N2. Similarly, if you can measure the O2/N2 ratio, you also had better think about mixing entropy. Gravitation, centrifugal forces and other specific interactions most often make it necessary to take the O2-N2 mixing entropy into account.

When we use the words oxygen and nitrogen, by hypothesis we can assume we know these gases behave differently in various situations. Therefore, implicitly, we are forced to consider mixing entropy.

As long as you talk about hemoglobin without pretending you can differentiate between different kinds of these molecules, you can drop the mixing entropy in all your analysis. However, if you make distinctions between species, based on molecular weight for example, and if you are going to analyse these distinctions (say, by measuring concentrations of the species), then you cannot analyse correctly what is going on without the mixing terms.

This is actually rather trivial: you can forget about mixing entropy if you don't need it!
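As a small illustration of the term in question, here is a minimal sketch of my own using the ideal molar entropy of mixing, -R * sum of x_i ln x_i; the term simply vanishes the moment you stop distinguishing the species:

import math

R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_entropy(mole_fractions):
    # Ideal molar entropy of mixing: -R * sum over species of x ln x
    return -R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Treating air as two distinguishable species (roughly 79% N2, 21% O2):
print(ideal_mixing_entropy([0.79, 0.21]))  # about 4.3 J/(mol K)

# Lumping them into one species ("just gas"): the mixing term vanishes.
print(ideal_mixing_entropy([1.0]))         # 0.0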

From Jaynes, I really learned what I missed for many years!
Entropy is not a property of a system: it really is a property of what we know about a system!
Exactly as for the quantum mechanical wave function!
The proof of this statement is in all our experiments and the analyses we make of them, as Jaynes explained with his Whifnium tale.
 
DrDu said:
So, if we would start a heavy discussion and would end up on the street fighting with our knives, would the entropy of our bloods increase on mixing?

There's a good joke in there somewhere, but I can't seem to bring it out... :)

More seriously, the answer clearly depends on whether we can distinguish between your clean blood and my dirty blood. More fundamentally, the amount of work we can extract by becoming blood brothers depends on how many thermodynamic degrees of freedom we have to control during the mixing.

The critical point in Jaynes' paper is that the entropy, unlike energy, is *not* a physical property of a microstate. Put another way, if you give me a system prepared in a particular microstate, then you *must* ascribe it a state of zero entropy, regardless of *which* microstate you provided me. Section 6 in his paper is fairly clear regarding this; specifically, if we observe an apparent violation of the second law of thermodynamics upon mixing and unmixing our blood, there *must* be additional thermodynamic degrees of freedom of which we are not aware.
 
Andy, I fully agree with you. Specifically, I had in mind your point that the entropy of the system is related to the amount of information we have about the system. Nevertheless, entropy remains an objective and not a subjective quantity. If I learn more about the microstates of a system, then I reduce the entropy of the system. However, in the process of measuring the microstate, the entropy of my memory (or of the measuring device's computer) increases by at least the same amount. I think it was Bennett and Landauer who tried to pinpoint where exactly in a measurement entropy is generated. They found that all the operations of a computer can be made arbitrarily close to reversible, with the exception of the deletion of memory.
 