# Determining entropy

1. May 5, 2010

### Redsummers

1. The problem statement, all variables and given/known data

My question is more global than a specific exercise, but I will illustrate my question with two examples.

My question is, when determining entropy for systems of say, 1000 molecules, 200 cards, and other small amounts (in contrast to the 6.022 E23 molecules in one mole), is it okay to give our result according to the well-known formulas (cited in 2.)?

Let me put two examples:

a) You have 5 identical sets of cards, each set containing 24 different cards. You shuffle the five sets together; calculate the entropy.

b) You have 1000 molecules of N2 and 250 molecules of O2 in two separate containers, and then you mix them together. Assume they reach equilibrium, and that the volume is large enough that the mixing produces no temperature change.

2. Relevant equations

For a) I would just use:

$$S = k_B \ln W$$

where W is the number of possible combinations, so:

$$S = k_B \ln \frac{n!}{(n-r)!\,r!}$$

Now for b), similarly:

$$S = k_B \ln \Omega$$

where $$\Omega$$ is the number of distinct possible arrangements, i.e.:

$$S = k_B \ln \frac{N!}{N_1!\,N_2!}$$

where $$N = N_1 + N_2$$.

3. The attempt at a solution

The calculations are really straightforward at this point.

For a) I would use n = 24 and r = 5, and in the end it gives me S = 10.66 * k_B = 1.47 E-22 J/K.

And for b) I use N1 = 1000 and N2 = 250, so N = 1250. After plugging the numbers into the computer (since the number of different arrangements is rather large), it gives me S = 621.93 * k_B = 8.58 E-21 J/K.

As you can see, both results are really small. I don't know if I'm missing something, or if there are other ways to handle these statistical-entropy cases.
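For reference, here is a quick sketch of how I checked the arithmetic without evaluating the huge factorials directly, using the log-gamma function (this is just my own verification approach, not part of the assignment):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_binomial(n, r):
    """ln of n!/((n-r)! r!) via lgamma, avoiding overflow from huge factorials."""
    return math.lgamma(n + 1) - math.lgamma(n - r + 1) - math.lgamma(r + 1)

# a) n = 24 cards, r = 5 sets
lnW_a = ln_binomial(24, 5)       # ~10.66
S_a = k_B * lnW_a                # ~1.47e-22 J/K

# b) N1 = 1000, N2 = 250, N = 1250
lnW_b = ln_binomial(1250, 250)   # ~621.93
S_b = k_B * lnW_b                # ~8.6e-21 J/K

print(lnW_a, S_a)
print(lnW_b, S_b)
```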

2. May 5, 2010

### mooglue

I wouldn't worry about the units; the Boltzmann constant will always produce such an unfamiliar exponent. Rather, think of it in relative terms, i.e. the multiplicity of one situation versus another; the entropy is just a rescaling of that value. The real statistical mechanics is in the multiplicity.

That being said, the binomial coefficient you are using may not be appropriate for the card situation. I agree that it works for the molecules, because there you are essentially setting up a binary scenario: each molecule is of one type or the other. I'm not sure about card shuffling, though.
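To make the comparison concrete, here is one alternative count you might consider. This is just a guess at a model, not necessarily the right one: if a "shuffle" means an ordering of all 5 × 24 = 120 physical cards in a line, with the 5 copies of each kind indistinguishable, the multiplicity would be 120!/(5!)^24, which gives a much larger ln W than the binomial coefficient:

```python
import math

# Hypothetical alternative multiplicity for the card example (an assumed
# model, not established in this thread): all 120 cards ordered in a line,
# with the 5 copies of each of the 24 distinct cards indistinguishable.
#   W = 120! / (5!)^24
ln_W_ordering = math.lgamma(121) - 24 * math.log(math.factorial(5))

print(ln_W_ordering)  # ~342.9, versus ~10.66 for the binomial coefficient
```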

3. May 5, 2010

### Redsummers

Oh okay, that makes sense. Yeah, the card-shuffling formula was my best guess based on what I know about combinations and permutations. I also don't know whether I should be using the Boltzmann constant there, or whether there is another way of quantifying that kind of disorder.

Thanks anyway! Now I'm more relieved about the small results.

4. May 5, 2010

### mooglue

If this helps make the results seem more sensible:

Entropy is a macroscopic variable that relates to temperature and energy. Imagine you drop an ice cube into a pot of water. The temperature will eventually change, maybe by 10 degrees, but think of how many new microstates you created by dumping ~10^23 new molecules into the system. This is why a huge change in multiplicity doesn't manifest as a huge change in entropy: even enormous multiplicities only translate into a few degrees of temperature, etc.

If you learn anything more about the cards, post back. I think there is a lot of information out there if you google card riffling.

5. May 5, 2010

### Redsummers

That's a good conceptual explanation, definitely. And yeah, I will ask my professor at the next lecture, which is this Friday, so if there are no answers before then, I will post a more complete result for the card example.