# The Equilibrium Macrostate: Reif's Statistical Physics

In summary, the author discusses the concept of equilibrium in a system and how it can be specified by a few macroscopic parameters, such as volume and energy. He defines equilibrium as the most random situation, in which the particles are uniformly distributed and, on average, share the total energy equally. A uniform distribution is the most random because it can be realized in the greatest number of ways. The author also states that the average energy per particle always equals the total energy divided by the number of particles, by the definition of average, which fits with equating equilibrium to the most random situation, where each particle has the same energy on average.
Kashmir
Reif, statistical physics

"The equilibrium macrostate of a system can be completely specified by very few macroscopic parameters. For example, consider again the isolated gas of ##N## identical molecules in a box. Suppose that the volume of the box is ##V##, while the constant total energy of all the molecules is ##E##. If the gas is in equilibrium and thus known to be in its most random situation, then the molecules must be uniformly distributed throughout the volume ##V## and must, on the average, share equally the total energy ##E## available to them. A knowledge of the macroscopic parameters ##V## and ##E## is, therefore, sufficient to conclude that the average number ##\bar{n}_{s}## of molecules in any subvolume ##V_{s}## of the box is ##\bar{n}_{s}=N\left(V_{s} / V\right)##, and that the average energy ##\bar{\epsilon}## per molecule is ##\bar{\epsilon}=E / N##. If the gas were not in equilibrium, the situation would of course be much more complicated. The distribution of molecules would ordinarily be highly nonuniform and a mere knowledge of the total number ##N## of molecules in the box would thus be completely insufficient for determining the average number ##\bar{n}_{s}## of molecules in any given subvolume ##V_{s}## of the box."
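The subvolume formula in this passage is easy to illustrate numerically. The sketch below is my own (with illustrative sizes, not from the book): it places molecules uniformly at random in a unit box, which is the equilibrium assumption, and counts how many land in a subvolume.

```python
import random

# Check n̄_s = N (V_s / V): place N molecules uniformly at random in a
# unit box (V = 1) and count how many land in the left quarter along x
# (V_s = 0.25). N and the subvolume are illustrative choices.
random.seed(0)
N = 100_000
n_s = sum(1 for _ in range(N) if random.random() < 0.25)
print(n_s / N)   # close to V_s / V = 0.25, i.e. n̄_s ≈ N/4
```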

The author defines the equilibrium state as corresponding to the "**most random situation**".

I understand why on the average the particles are uniformly distributed throughout the volume, because the number of combinations for a uniform distribution is maximum.
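That counting argument can be made concrete with a toy calculation (my own sketch, not from the book): for ##N## molecules distributed between the two equal halves of the box, a configuration with ##k## molecules on the left can be realized in ##\binom{N}{k}## ways, and this count peaks at the uniform 50/50 split.

```python
from math import comb

# Toy count of "ways" for N molecules split between the two halves of
# the box: k molecules on the left can be realized in C(N, k) ways,
# which is largest for the uniform split k = N/2. N is illustrative.
N = 20
ways = {k: comb(N, k) for k in range(N + 1)}
print(max(ways, key=ways.get))   # 10: the 50/50 split is the most random
print(ways[10], ways[2])         # 184756 vs 190: uniform vastly outnumbers lopsided
```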

But I'm having trouble understanding how, using this definition of equilibrium as the most random situation, I can deduce that the average energy of a gas particle in equilibrium is ##\bar{\epsilon}=E / N##.
Where does the idea of "most random situation" enter the argument?

I'm sorry if this question is elementary. Please tell me where to go to learn if this question is too basic.

Thank you.

Kashmir said:
But I'm having trouble understanding how, using this definition of equilibrium as the most random situation, I can deduce that the average energy of a gas particle in equilibrium is ##\bar{\epsilon}=E / N##.
Where does the idea of "most random situation" enter the argument?

I'm sorry if this question is elementary. Please tell me where to go to learn if this question is too basic.

Thank you.
The average energy per particle is always ##\bar{\epsilon}=E / N##, by the definition of average!

PeroK said:
The average energy per particle is always ##\bar{\epsilon}=E / N##, by the definition of average!
Hello sir,
The author says just before this paragraph:

"The time independent equilibrium situation which is ultimately attained corresponds therefore to the most random distribution of the total energy of the gas over all the molecules. Each molecule has then, on the average, the same energy and thus also the same speed"

The author suggests that the most random situation implies that the average energy is ##E/N##.
I'm not able to see the link between the most random distribution and the average energy.

Kashmir said:
The author suggests that the most random situation implies that the average energy is ##E/N##.
I'm not able to see the link between the most random distribution and the average energy.
What definition of "average" are you using?

PeroK said:
What definition of "average" are you using?
I'm following the author's definition, given before this paragraph, that is ##\frac{1}{T} \int E \cdot d t##.

Also, the author's argument on another page for finding the average energy per particle is:
"Hence the basic question becomes: How is the fixed total energy of the gas distributed over the individual molecules? It is possible that one group of molecules might have very high energies while another group might have very low energies. But this kind of situation is quite special and would not persist as the molecules collide with each other and thus exchange energy. The time independent equilibrium situation which is ultimately attained corresponds therefore to the most random distribution of the total energy of the gas over all the molecules. Each molecule has then, on the average, the same energy"

Please note how the author stresses the 'most random distribution' in his argument.

I don't understand how each particle having an average energy of ##E/N## corresponds to the most random distribution.

(By random the author means "A situation, which can be obtained in many different ways, is said to be random or disordered". The most random therefore means the situation that can be obtained in most different ways. )

Kashmir said:
I'm following the author's definition, given before this paragraph, that is ##\frac{1}{T} \int E \cdot d t##.
That's the average over time for the system. That's not the average per particle.

Are you sure you are not way out of your depth with this textbook?

PeroK said:
That's the average over time for the system. That's not the average per particle.

Are you sure you are not way out of your depth with this textbook?
In ##\frac{1}{T} \int E \cdot d t## the ##E## is the energy of one particle; I should have written it as ##\frac{1}{T} \int e \cdot d t##.

Kashmir said:
In ##\frac{1}{T} \int E \cdot d t## the ##E## is the energy of one particle; I should have written it as ##\frac{1}{T} \int e \cdot d t##.
It's still a time average and completely independent of how ##e(t)## is calculated.

Well, assuming ergodicity (i.e., that time-averaging corresponds to ensemble-averaging) you can define the thermodynamical energy of a particle via the time average along the trajectory of the particle. You have to integrate over a time interval large compared to the "typical microscopical time scales" but small compared to the "typical macroscopical time scales". This coarse grains over the random thermal fluctuations. Of course this assumes that you have these separations of time scales. For thermal equilibrium this time average becomes constant of course.
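As a rough illustration of this time-averaging, here is a toy exchange model of my own (a crude stand-in for real collision dynamics, not a model from the book or this post): randomly chosen pairs of particles repartition their combined energy, and every particle's long-time average energy approaches ##E/N## even from a very non-uniform start.

```python
import random

# Toy "gas" of N particles with fixed total energy E. At each step two
# randomly chosen particles "collide" and repartition their combined
# energy uniformly at random -- an illustrative rule, not real dynamics.
random.seed(1)
N, E, steps = 50, 100.0, 100_000
e = [E] + [0.0] * (N - 1)   # start far from equilibrium: one particle has it all
time_sum = [0.0] * N        # running sum of each particle's energy over time

for _ in range(steps):
    i, j = random.sample(range(N), 2)
    pair = e[i] + e[j]
    e[i] = random.uniform(0.0, pair)   # random repartition of the pair's energy
    e[j] = pair - e[i]
    for k in range(N):
        time_sum[k] += e[k]

time_avg = [s / steps for s in time_sum]
print(min(time_avg), max(time_avg))   # both close to E/N = 2.0
```

Each particle's instantaneous energy fluctuates strongly, but the time averages all cluster around ##E/N##, which is the separation of scales described above.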

vanhees71 said:
Well, assuming ergodicity (i.e., that time-averaging corresponds to ensemble-averaging) you can define the thermodynamical energy of a particle via the time average along the trajectory of the particle. You have to integrate over a time interval large compared to the "typical microscopical time scales" but small compared to the "typical macroscopical time scales". This coarse grains over the random thermal fluctuations. Of course this assumes that you have these separations of time scales. For thermal equilibrium this time average becomes constant of course.
Hello sir,
You said "For thermal equilibrium this time average becomes constant of course" and I think you meant the average then equals ##E/N##.

I am not sure how the average equals ##E/N##.

The author first defined equilibrium as the most random situation, and then using this definition he tries to prove that the average energy of each particle is ##E/N##. This is what the author writes about a gas in an isolated box having energy ##E##:
"Hence the basic question becomes: How is the fixed total energy of the gas distributed over the individual molecules? It is possible that one group of molecules might have very high energies while another group might have very low energies. But this kind of situation is quite special and would not persist as the molecules collide with each other and thus exchange energy. The time independent equilibrium situation which is ultimately attained corresponds therefore to the most random distribution of the total energy of the gas over all the molecules. Each molecule has then, on the average, the same energy"

The author is clearly suggesting that because equilibrium is the most random situation, the average energy per particle is ##E/N##.

Can you please help me understand why each particle having an average energy of ##E/N## corresponds to the most random distribution?

vanhees71 said:
Well, assuming ergodicity (i.e., that time-averaging corresponds to ensemble-averaging) you can define the thermodynamical energy of a particle via the time average along the trajectory of the particle. You have to integrate over a time interval large compared to the "typical microscopical time scales" but small compared to the "typical macroscopical time scales". This coarse grains over the random thermal fluctuations. Of course this assumes that you have these separations of time scales. For thermal equilibrium this time average becomes constant of course.
Nevertheless, if a system of ##N## particles has total energy ##E##, then the average energy of a particle is ##\dfrac E N##, by definition. I don't understand how that could be a source of confusion.

Invoking the concept of a time-based average seems to miss the point entirely.

PeroK said:
Nevertheless, if a system of ##N## particles has total energy ##E##, then the average energy of a particle is ##\dfrac E N##, by definition. I don't understand how that could be a source of confusion.

Invoking the concept of a time-based average seems to miss the point entirely.

"This does not mean that each molecule has the same energy at any one time. The energy of any one molecule fluctuates quite appreciably in the course of time as a result of its collisions with other molecules. But when each molecule is observed over a sufficiently long time interval ##\tau##, its average energy over that time interval is the same as that of any other molecule"

The author also gives the definition of the average (here, the average of the number density) as:
" ##[\bar{n}(t)]_{r} \equiv \frac{1}{\tau} \int_{t}^{t+\tau} n\left(t^{\prime}\right) d t^{\prime} .##"

It depends on which average you want to calculate. If you just want the average energy per particle, you are right. If you want an ensemble average in the sense of statistical mechanics it may be different. In thermal equilibrium, however it's clear that each particle has the same average energy, which is ##U/N## (##U## total internal energy).

vanhees71 said:
It depends on which average you want to calculate. If you just want the average energy per particle, you are right. If you want an ensemble average in the sense of statistical mechanics it may be different. In thermal equilibrium, however it's clear that each particle has the same average energy, which is ##U/N## (##U## total internal energy).
I don't know what ensemble average means. I'm just on the first chapter of this undergraduate book, and I'm following his definition of average, i.e. the time average.

How is it clear "that each particle has the same average energy, which is U/N (U total internal energy)" in equilibrium?

Kashmir said:
I don't know what ensemble average means. I'm just on the first chapter of this undergraduate book, and I'm following his definition of average, i.e. the time average.

How is it clear "that each particle has the same average energy, which is U/N (U total internal energy)" in equilibrium?
Perhaps you should start by analysing a system of two particles. Try to do it yourself, rather than relying on someone else to do it and explain it to you.

Kashmir said:
This is what the author writes about a gas in an isolated box having energy ##E##:
"Hence the basic question becomes: How is the fixed total energy of the gas distributed over the individual molecules? It is possible that one group of molecules might have very high energies while another group might have very low energies. But this kind of situation is quite special and would not persist as the molecules collide with each other and thus exchange energy. The time independent equilibrium situation which is ultimately attained corresponds therefore to the most random distribution of the total energy of the gas over all the molecules. Each molecule has then, on the average, the same energy"

The author is clearly suggesting that because equilibrium is the most random situation, the average energy per particle is ##E/N##.

Can you please help me understand why each particle having an average energy of ##E/N## corresponds to the most random distribution?

I think what Reif is explaining is actually very simple, maybe simpler than it sounds.
You could imagine that half the molecules, say the half in the left half of the volume, together have ##3E/4## of the total kinetic energy ##E##, whereas the other half, in the right half of the volume, together have only ##E/4##. This would also mean that the average energy per molecule differs between the two halves (by a factor of 3).
However, this is not equilibrium, and the only (simple) way you can keep the gas like that is by inserting an insulating wall between the two halves.
Without the wall the molecules will collide with each other, and if you wait long enough each half will have a total energy of ##E/2##, with relatively small fluctuations around that value.
Then the gas is in equilibrium, and on average each molecule will have the energy ##E/N##, with relatively large fluctuations around that value.
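This relaxation can be sketched with a toy model (my own illustration; the random-exchange rule is a crude stand-in for real collisions, and all parameters are illustrative): the "left" half of the molecules starts with ##3E/4## of the energy, and random pairwise exchanges drive its share toward ##E/2##.

```python
import random

# Toy relaxation of the lopsided situation described above: the left
# N/2 molecules start with 3E/4 of the energy, the right N/2 with E/4.
# Random pairwise exchanges (a crude stand-in for collisions) drive the
# left half's share toward E/2. Model and numbers are illustrative.
random.seed(3)
N, E = 100, 400.0
e = [1.5 * E / N] * (N // 2) + [0.5 * E / N] * (N // 2)

def collide():
    i, j = random.sample(range(N), 2)
    pair = e[i] + e[j]
    e[i] = random.uniform(0.0, pair)   # random repartition of the pair's energy
    e[j] = pair - e[i]

for _ in range(10_000):   # let the gas equilibrate
    collide()
share = 0.0
for _ in range(10_000):   # then time-average the left half's energy share
    collide()
    share += sum(e[: N // 2]) / E
print(share / 10_000)     # ≈ 0.5: each half ends up with about E/2
```

The total energy is conserved throughout; only its distribution relaxes to the uniform, most random sharing.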


## 1. What is the concept of equilibrium macrostate in Reif's Statistical Physics?

The concept of equilibrium macrostate in Reif's Statistical Physics refers to the state of a system in which all macroscopic variables, such as temperature, pressure, and energy, remain constant over time. In this state, the system is in balance and there is no net flow of energy or matter.

## 2. How does Reif's Statistical Physics explain the concept of entropy?

Reif's Statistical Physics explains entropy as a measure of the disorder or randomness of a system. It states that the entropy of a system tends to increase over time, leading to a state of maximum disorder or equilibrium.

## 3. What are the key principles of Reif's Statistical Physics?

The key principles of Reif's Statistical Physics include the concept of microscopic states, which refers to the different ways in which a system can be arranged on a microscopic level, and the concept of macroscopic states, which refers to the observable properties of a system. It also includes the principle of equal a priori probabilities, which states that all microscopic states are equally likely to occur in an isolated system.

## 4. How does Reif's Statistical Physics relate to thermodynamics?

Reif's Statistical Physics is closely related to thermodynamics, as it provides a microscopic explanation for the macroscopic behavior of thermodynamic systems. It helps to bridge the gap between the microscopic world of atoms and molecules and the macroscopic world of everyday objects, providing a deeper understanding of thermodynamic processes.

## 5. What are the practical applications of Reif's Statistical Physics?

Reif's Statistical Physics has various practical applications in fields such as chemistry, biology, and engineering. It is used to study and predict the behavior of complex systems, such as chemical reactions, phase transitions, and biological processes. It also has applications in the development of new materials and technologies, such as in the design of more efficient energy systems.
