Entropy Calculation for Small Systems


Homework Help Overview

The discussion revolves around the calculation of entropy for small systems, specifically focusing on examples involving a limited number of molecules and cards. The original poster questions the applicability of standard entropy formulas in these contexts.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to apply the Boltzmann entropy formula to two examples: one involving card shuffling and another involving gas molecules. Some participants question the appropriateness of the binomial coefficient for the card scenario and explore the implications of using the Boltzmann constant.

Discussion Status

Participants are engaging in a conceptual exploration of entropy, with some providing insights into the nature of multiplicity and its relation to entropy. There is an acknowledgment of the need for further clarification, particularly regarding the card example, and the original poster plans to seek additional guidance from a professor.

Contextual Notes

There is a recognition that the results obtained for small systems yield unexpectedly small entropy values, prompting questions about the validity of the methods used. The discussion also highlights the difference in behavior between macroscopic and microscopic systems.

Redsummers

Homework Statement



My question is more global than a specific exercise, but I will illustrate my question with two examples.

My question is: when determining entropy for systems of, say, 1000 molecules, 200 cards, or other small numbers (in contrast to the 6.022 E23 molecules in one mole), is it okay to report the result using the well-known formulas (listed below under Homework Equations)?

Let me give two examples:

a) You have 5 identical sets of cards, each with 24 different cards. You then shuffle the five sets together; calculate the entropy.

b) You have 1000 molecules of N2 and 250 molecules of O2 in two separate containers, and then you mix them together. Assume they reach equilibrium and that the process produces no temperature change, the volume being large enough.

Homework Equations



For a) I would just use:

S = k_B \ln W

where W is the number of possible combinations, so:

S = k_B \ln \frac{n!}{(n-r)!\,r!}

Now for b), similarly:

S = k_B \ln \Omega

where \Omega is the number of distinct arrangements, i.e.:

S = k_B \ln \frac{N!}{N_1!\,N_2!}

where N = N_1 + N_2.

The Attempt at a Solution



The calculations are really straightforward at this point.

For a) I would use n = 24 and r = 5, and at the end it gives me S = 10.66 * k_B = 1.47 E-22 J/K.

And for b) I use N1 = 1000 and N2 = 250, so N = 1250. After plugging the numbers into the computer (since the number of different arrangements is rather large), I get S = 621.93 * k_B = 8.58 E-21 J/K.
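As a sanity check on the arithmetic above, both multiplicities can be evaluated directly. Here is a minimal Python sketch (my own, not part of the original thread), using math.lgamma to obtain ln N! without forming the huge factorial itself:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

# a) Card example: W = C(24, 5)
W_a = math.comb(24, 5)                      # 42504 combinations
S_a = k_B * math.log(W_a)                   # ~1.47e-22 J/K

# b) Mixing example: ln W = ln N! - ln N1! - ln N2!, via lgamma(n + 1) = ln n!
N1, N2 = 1000, 250
lnW_b = (math.lgamma(N1 + N2 + 1)
         - math.lgamma(N1 + 1)
         - math.lgamma(N2 + 1))             # ~621.93
S_b = k_B * lnW_b                           # ~8.58e-21 J/K

print(S_a, S_b)
```

Both numbers match the values quoted above, so the smallness is not an arithmetic slip; it comes from the k_B prefactor.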

As you can see, both results are really small. I don't know if I'm missing something, or if there are other ways to deal with these statistical-entropy cases.
 
I wouldn't worry about the units, as the Boltzmann constant will always produce such an unfamiliar exponent. Rather, think of it in relative terms: the multiplicity of one situation versus another, with the entropy just a scaling of that value. The inherent statistical mechanics is in the multiplicity.

That being said, the binomial coefficient you are using may not be appropriate for the card situation. I agree that it works for the molecules, because you are essentially setting up a binary scenario: each molecule belongs to one species or the other. I'm not sure about card shuffling, though.
 
Oh okay, that makes sense. But yeah, the card-shuffling formula was my guess from what I know about combinations and permutations. I don't know either whether I should be using the Boltzmann constant or whether there is another way of calculating such disorder.

Thanks anyway! Now I'm more relieved about the small quantities results.
 
If this helps make the results seem more sensical,

Entropy is a macroscopic variable that relates to temperature and energy. Imagine you drop an ice cube into a pot of water. The temperature will eventually change, maybe 10 degrees, but think how many new microstates you created by dumping 10^23 new molecules into the system. This is why a huge change in multiplicity doesn't manifest as a huge change in entropy: enormous multiplicities translate into only a few degrees of temperature, etc.
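That insensitivity follows directly from the logarithm in S = k_B ln W: multiplying the multiplicity W by a factor c adds only ln c to S/k_B. A quick illustrative sketch (the boost factor here is an arbitrary number of my own, not from the thread):

```python
import math

# Boost the multiplicity by a factor of 10^20 -- an enormous relative change:
boost = 1e20

# The entropy, in units of k_B, grows only by the log of that factor:
delta_S_over_kB = math.log(boost)   # ~46, a tiny shift compared to the boost
print(delta_S_over_kB)
```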

If you learn anything more about the cards, post back. I think there is a lot of information on it if you google card riffling.
 
That's a good conceptual definition, definitely. And yeah, I will ask my professor in the next lecture, which will be this Friday, so if there are no answers before then, I will post a more logical result concerning the card example.
 
