# I Confused about "equal a priori"

1. Nov 23, 2016

### Getterdog

I'm new to statistical mechanics, and the more I read and view, the more I get confused.

So far I've got this: at equilibrium a system will have a fluctuating energy, and the specific energies follow Boltzmann's distribution. I also read that within one phase point the molecular velocities also follow a distribution.

Question 1: do all points in phase space that belong to a specific energy have an equal probability of existing?

Question 2 involves Dr. Susskind's YouTube lectures on statistical mechanics, specifically the one deriving Boltzmann's distribution. He starts by considering an ensemble of identical systems and asks how these can be distributed over a set of energies.

He comes up with the formula $N!/(n_1! n_2! n_3! \cdots)$. To my knowledge this is the formula for permutations of multisets, and it implies that some members of the ensemble are distinguishable and some are not.

The formula from combinatorics for the number of ways to place $c$ indistinguishable items into $b$ bags is $\binom{c+b-1}{b-1}$, read as the number of combinations of $c+b-1$ things taken $b-1$ at a time; the assumption is that the number in each bag is unrestricted. Why is he assuming that some members of the ensemble are distinguishable? Totally confused. Thanks for any help.

2. Nov 24, 2016

### haruspex

No, it is taking all members as distinct and all energy levels as distinct. It answers how many ways there are of assigning the distinct members to the levels, given a specified number in each level. A simple example: how many ways are there of assigning either of two energy levels to three atoms such that one gets the first level and two get the second?
Call the atoms A, B, C. You can have A, BC; B, AC; C, AB. 3!/(1!2!)=3.
This is quite a different problem. In the stars-and-bars case, the number to go in each bag is not specified; indeed, the formula tells you how many ways there are of specifying the number to go in each bag.
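The three-atom example above can be checked by brute force. Here is a minimal Python sketch (illustrative only; the atom names and occupancy come from the example in this post) that enumerates every assignment of A, B, C to two levels, keeps those with occupancy (1, 2), and compares against the multinomial coefficient $3!/(1!\,2!)$:

```python
from itertools import product
from math import factorial

# All assignments of 3 distinguishable atoms (A, B, C) to energy levels 1 and 2.
atoms = "ABC"
assignments = [dict(zip(atoms, levels))
               for levels in product((1, 2), repeat=len(atoms))]

# Keep the macrostate "one atom in level 1, two atoms in level 2".
target = [a for a in assignments
          if sum(1 for lvl in a.values() if lvl == 1) == 1]

print(len(target))                                    # brute-force count
print(factorial(3) // (factorial(1) * factorial(2)))  # multinomial 3!/(1! 2!)
```

Both prints give 3, matching the A|BC, B|AC, C|AB enumeration.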

3. Nov 24, 2016

### Getterdog

Yes, upon looking it up, it's called the multinomial coefficient. I guess my problem is that most beginner presentations just state that an ensemble is a "collection of identical systems." Most don't explain that the collection is of identical macrostates, each of which can have a different energy. The same confusion arises when they talk of microstates: initially I identified a microstate with the points in phase space that correspond to a specific energy, then I saw it used to refer to the different energy levels a system can have in thermal equilibrium. Now I see it is a much more flexible term.

Dr. Susskind treats $n_1, n_2$, etc. as variables and then uses Lagrange multipliers to see which combination maximizes the count. But his initial setup shows many identical systems that get distributed over distinct energies. If he stated that each member of the ensemble has a distinct energy, it doesn't make sense to me to ask how an ensemble member with, say, energy $E_1$ can be distributed over different energies $E_1, E_2$, and so forth. Where am I getting confused here? Thanks again for the reply.

4. Nov 25, 2016

### haruspex

I struggle to follow what you say above.
I would not describe an ensemble as a collection of states. It is a collection of entities in states.

The expression "identical systems" may be a bit misleading. The formula applies to classical entities, such as billiard balls. They may appear identical, or be identical in the sense that we do not care to discriminate, but in terms of the equally likely states at the 'atomic' level of probabilities they are distinct entities. That is, ball 1 in bag A and ball 2 in bag B is essentially different from the other way around, so if we randomly place balls one at a time into the bags there is an evens chance that there will be one ball in each. If the balls were truly identical at some fundamental level the probability would be 1/3. I believe this does happen for some types of particle.
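The two counting rules described above can be made concrete. A minimal Python sketch (assuming exactly the two models in this post: classical distinguishable balls versus "truly identical" ones where only occupation numbers count):

```python
from itertools import product

# Distinguishable (classical) balls: each of balls 1 and 2 independently
# lands in bag A or B, giving 4 equally likely microstates.
classical = list(product("AB", repeat=2))
p_one_each = sum(1 for s in classical if set(s) == {"A", "B"}) / len(classical)
print(p_one_each)        # 0.5

# Truly identical balls: only the occupation numbers (nA, nB) matter,
# so the equally likely states are (2, 0), (1, 1), (0, 2).
bose = [(2, 0), (1, 1), (0, 2)]
p_one_each_bose = sum(1 for s in bose if s == (1, 1)) / len(bose)
print(p_one_each_bose)   # 1/3
```

The 1/2 versus 1/3 split is exactly the classical-versus-fundamentally-identical distinction made above.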

As I understand it, microstates are arrangements that are distinct at the fundamental level. Thus, 1 in A and 2 in B is a different microstate from the other way about. We take all microstates as equally likely (atomic in the probabilistic sense). A macrostate is a collection of microstates that we choose to regard as equivalent for whatever purpose. Different purposes may result in different aggregations into macrostates.
In the context you describe, it would appear that a macrostate is a list of numbers, the numbers of subsystems in each of a list of energy levels.

5. Nov 25, 2016

### vanhees71

The trouble is that it is hard to understand Boltzmann's conjecture that the phase-space cells are a priori equally probable, modulo constraints like the mean energy in the canonical ensemble (or the mean energy and mean particle number in the grand-canonical ensemble), without referring to quantum theory. In fact you need a natural phase-space volume, which is provided not by classical but only by quantum physics.

The argument goes roughly as follows. Take an ideal gas (i.e., a many-body system of non-interacting particles) and cut a large box out of the gas (grand-canonical ensemble, i.e., the energy and particle number inside the box can fluctuate). Now think about the particles in terms of wave mechanics. To have a well-defined momentum operator we take the box to be a cube of edge length $L$ and impose periodic boundary conditions. Then the momentum eigenstates are
$$u_{\vec{p}}(\vec{x})=\frac{1}{\sqrt{L^3}} \exp(\mathrm{i} \vec{p} \cdot \vec{x}/\hbar) \quad \text{with} \quad \vec{p} \in \frac{2 \pi \hbar}{L} \mathbb{Z}^3.$$
Now think of a very large volume (to take the "thermodynamic limit") to make the momenta "quasi continuous". Then you count the number of microstates of the particle in a phase-space volume $L^3 \mathrm{d}^3 \vec{p}$ to be
$$\mathrm{d} \rho =\frac{V \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3}.$$
Now note that you can take the cube to be "macroscopically small", i.e., $L$ small compared to the typical length scale across which the macroscopic properties of the gas (like density, energy density, etc.) vary, and still keep it large on the microscopic scale, which, as just argued, is set by the phase-space cell size $(2 \pi \hbar)^3$. So the natural phase-space measure is
$$\mathrm{d}^6 \xi=\frac{\mathrm{d}^3 \vec{x} \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3}.$$
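This measure can be checked numerically: count the discrete momenta $\vec{p} \in (2\pi\hbar/L)\mathbb{Z}^3$ inside a sphere $|\vec{p}| \le p_{\max}$ and compare with $V \cdot \tfrac{4}{3}\pi p_{\max}^3/(2\pi\hbar)^3$. A rough Python sketch (units with $\hbar = 1$; the values of $L$ and $p_{\max}$ are arbitrary illustrative choices):

```python
import math
from itertools import product

# Check d(rho) = V d^3p / (2 pi hbar)^3 by direct lattice counting.
hbar = 1.0
L = 20.0                         # box edge length (illustrative)
pmax = 3.0                       # momentum cutoff (illustrative)
dp = 2 * math.pi * hbar / L      # momentum lattice spacing

nmax = int(pmax / dp) + 1
count = sum(
    1
    for n in product(range(-nmax, nmax + 1), repeat=3)
    if dp * math.sqrt(n[0]**2 + n[1]**2 + n[2]**2) <= pmax
)

V = L**3
estimate = V * (4 / 3) * math.pi * pmax**3 / (2 * math.pi * hbar)**3
print(count, estimate)   # the two agree closely once L is large
```

The agreement improves as $L$ grows and the momenta become "quasi continuous", which is the thermodynamic-limit step in the argument above.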
For further discussion of the entropy and how to determine phase-space distribution functions from the point of view of kinetics, see

http://th.physik.uni-frankfurt.de/~hees/publ/off-eq-qft.pdf

For an information-theoretical approach to statistical physics, see

http://th.physik.uni-frankfurt.de/~hees/publ/stat.pdf

6. Nov 25, 2016

### Getterdog

Again, thanks. I was going by the Wikipedia article on "statistical ensemble": it is "an idealization consisting of a large number of virtual copies (sometimes infinitely many) of a system, considered all at once, each of which represents a possible state that the real system might be in. In other words, a statistical ensemble is a probability distribution for the state of the system." So at first look (with reference to a canonical ensemble) it sounds like the ensemble is just a set of the possible copies, each with a specific allowed energy. But then adding that it's a probability distribution means to me that the ensemble involves degeneracy, with multiple members having the same energy. Is this right, or can you clarify? Thanks.

7. Nov 25, 2016

### Getterdog

Also, I forgot to ask: by subsystem do you mean a member of the ensemble?

8. Nov 25, 2016

### haruspex

A "possible state of the system" would imply it satisfies all the facts you know about it. If you know the total energy then all states have that total. Different elements of the ensemble will have that energy distributed differently amongst its constituent parts.

9. Nov 25, 2016

### Getterdog

Thanks for staying with this. To move on, I'm going to look at other derivations that don't involve this concept, but before I do I will pose one last question. Going back to the $N!/(n_1! n_2! n_3! \cdots)$ formula: we place $N$ distinguishable items into groupings with a prescribed number in each group and ask for the number of ways to do this. What makes the items distinguishable, and what makes the groupings distinguishable? Again, this refers to the canonical distribution discussion.

10. Nov 25, 2016

### haruspex

That's not quite the right characterisation.
For that formula, we are certainly discriminating between groups. We want n1 in the first group, n2 in the second group, etc. If we did not, we would be dealing with partition numbers, which is a much tougher prospect.
But we do not care which n1 go into group 1 etc., so in that sense we are treating the objects as identical. At the same time, we understand that at some more basic level they really are distinct, otherwise we would say there is only one way of having n1 in the first group, n2 in the second group, etc. I.e. we understand that object 1 in group 1 and object 2 in group 2 is a different microstate from the other way around, so both have to be counted in the macrostate.

In the ensemble context, we want all those configurations that have $n_1$ particles in state 1 (energy $E_1$?), $n_2$ in state 2, etc. For this purpose, we see the states as distinct but the particles as interchangeable, although we regard them as distinct at a more fundamental level.
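This microstate/macrostate bookkeeping can be verified directly for a small case. A Python sketch (the numbers $N = 4$ members and $k = 3$ states are hypothetical, chosen just to keep the enumeration small) counts the configurations in one macrostate and compares with the multinomial coefficient:

```python
from collections import Counter
from itertools import product
from math import factorial

N, k = 4, 3
# Each configuration records which state each distinguishable member is in.
configs = list(product(range(k), repeat=N))

# Microstates in the macrostate "2 members in state 0, 1 in state 1, 1 in state 2".
n = (2, 1, 1)
in_macro = [c for c in configs
            if tuple(Counter(c).get(s, 0) for s in range(k)) == n]

multinomial = factorial(N) // (factorial(2) * factorial(1) * factorial(1))
print(len(in_macro), multinomial)   # both are 12
print(len(configs) == k**N)         # every microstate counted exactly once
```

The brute-force count and $N!/(n_1! n_2! n_3!)$ agree, and the total number of configurations is $k^N$, i.e., summing the multinomial over all macrostates recovers every microstate once.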

11. Nov 27, 2016

### Getterdog

Thanks for your help. I did find one text that takes a lot more time clarifying the basic concepts; most introductions just jump over these concepts as if everything were clear from the outset. I can finally move on.