Mathematical detail regarding Boltzmann's H theorem

In summary, the conversation discusses the definition of H and its relation to information entropy, statistical mechanics, and the second law. H is defined as the mean value of ln(P_r); multiplying by -k (the Boltzmann constant) gives Gibbs' entropy, which ties it to statistical mechanics. While this is a correct definition, it may not be the most useful or insightful way of thinking about entropy. Instead, entropy reflects the number of possible ways a macrostate can be realized, with the second law ensuring that the equilibrium state is the one with the most possible realizations.
  • #1
quasar987
In the "proof" of the theorem, my course notes define [itex]P_r(t)[/itex] as the probability of finding the system in state r at time t, and they define H as the mean value of [itex]\ln P_r[/itex] over all accessible states:

[tex]H \equiv \sum_r P_r\ln P_r[/tex]

Is it right to call the above sum the "mean value of ln P_r"?! Because given a quantity u, the mean value of f(u) is defined as

[tex]\sum_i P(u_i)f(u_i)[/tex]

So the mean value of [itex]\ln P_r[/itex] should be

[tex]\sum_r P(P_r)\ln P_r[/tex]

But P(P_r) does not make sense. I confessed my confusion to the professor in more vague terms (at the time, I only thought the equation looked suspicious), but he said there was nothing wrong with it. I say, H could at best be called "some kind" of mean value of ln(P_r).
 
  • #2
The formula you are asking about
H=SUM(P_r * ln(P_r))
is Shannon's definition of information entropy (up to sign); it is related back to statistical mechanics by multiplying by -k, where k is the Boltzmann constant, to give Gibbs' entropy
S=-k*SUM(P_r * ln(P_r)).
H is correctly called the mean of ln(P_r). Don't get confused by your notation--you must sum over states, not variables. P(u_i) is the probability of finding u in the ith state, so your first and second equations are actually the same.
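To make the "sum over states" point concrete, here is a minimal Python sketch. The three-state distribution is hypothetical, chosen only for illustration: the random variable is the state r itself, and f(r) = ln(P_r) is a function of that state, so the generic expectation formula and the H sum coincide term by term.

```python
import math

# Hypothetical toy distribution over three states r = 0, 1, 2
P = [0.5, 0.3, 0.2]

# H = sum_r P_r * ln(P_r): the mean of ln(P_r), where the
# averaged quantity is a function of the state r
H = sum(p * math.log(p) for p in P)

# Generic expectation E[f] = sum over states of P(state) * f(state),
# with f(r) = ln(P_r)
f = [math.log(p) for p in P]
E_f = sum(P[r] * f[r] for r in range(len(P)))

# The two expressions are identical
assert abs(H - E_f) < 1e-12
```

No distribution "P(P_r)" ever enters: the weights in the expectation are the state probabilities P_r themselves.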

You may prefer a less confusing way of writing the mean or expectation
E[f(x)] = SUM(P_r * f(x_r)),
as used in Jaynes, Phys. Rev. 106:620-630 (1957) (who discusses the connection between information H and stat mech S)
or Chandler, Intro to Modern Stat Mech, ch. 3 (1987).

Having defended the correctness of the definition, I have to add that it isn't a very useful way of thinking of H. Take a very simple case, an ideal gas, as an example. One sums r over W equally probable microstates (p = 1/W), so H for a macrostate of the gas system reduces to
H = -lnW;
multiplying by the constant -k gives exactly Boltzmann's entropy S = k lnW. But how is thinking of H as the mean value of the log of probability helpful or insightful? Instead, entropy reflects the number of possible ways W that a macrostate can be realized, and the second law ensures that the macrostate adopted in equilibrium is the one that can be realized in the greatest number of ways. To tie it to information theory, this is the maximum-entropy state.
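A quick numerical check of the uniform-distribution case, assuming W = 1000 equally probable microstates (an arbitrary illustrative choice):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

W = 1000               # number of equally probable microstates
P = [1.0 / W] * W      # p_r = 1/W for every state

# H = sum_r p_r ln p_r collapses to ln(1/W) = -ln W
H = sum(p * math.log(p) for p in P)
assert abs(H - (-math.log(W))) < 1e-9

# Gibbs entropy S = -k H reduces to Boltzmann's S = k ln W
S = -k * H
assert abs(S - k * math.log(W)) < 1e-30
```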

Hope this helps.
 
  • #3


It is correct to call the sum H the mean value of ln P_r over all accessible states. The notation \sum_r P_r\ln P_r is shorthand for \sum_{r} P(r)\ln P(r), where P(r) is the probability of the system being in state r. This notation is standard in statistical mechanics and does not cause any confusion.

The mean value of a function f(u) is defined as \sum_i P(u_i)f(u_i), where P(u_i) is the probability that the system has the value u_i. Here the averaged quantity is a function of the state: take u to be the state label r, with f(r) = \ln P_r. The mean value of \ln P_r is then \sum_r P(r)\ln P_r, which is exactly \sum_r P_r\ln P_r.

As for the notation P(P_r), it does not arise in this context: the distribution is over states, not over the probabilities themselves, so P_r is never the argument of another probability function. It is therefore correct to say that H is the mean value of \ln P_r over all accessible states.
 

1. What is Boltzmann's H theorem?

Boltzmann's H theorem is a fundamental result in statistical mechanics that describes the approach of a dilute gas to equilibrium. It states that the quantity H can only decrease (or stay constant) over time; equivalently, the entropy of an isolated system tends to increase, eventually reaching a maximum value.

2. How does Boltzmann's H theorem relate to the second law of thermodynamics?

Boltzmann's H theorem is closely related to the second law of thermodynamics, which states that the total entropy of an isolated system will always increase over time. This means that as particles within a system interact and move around, their disorder and randomness, or entropy, will increase.

3. What is the mathematical formula for Boltzmann's H theorem?

The central quantity in Boltzmann's H theorem is H = ∑ p_i ln(p_i), where p_i represents the probability of finding the system in a particular state i. H is related to the entropy by S = -kH, so it quantifies (with a minus sign) the randomness or disorder of a system.

4. What is the significance of Boltzmann's H theorem in physics?

Boltzmann's H theorem has significant implications in physics, specifically in the fields of thermodynamics, statistical mechanics, and quantum mechanics. It provides a mathematical explanation for the second law of thermodynamics and helps us understand the behavior of gas particles in a closed system.

5. How is Boltzmann's H theorem related to the concept of equilibrium?

Boltzmann's H theorem is closely related to the concept of equilibrium in thermodynamics. It states that in a closed system, the entropy of the system will tend to increase until it reaches a maximum value, also known as equilibrium. This means that at equilibrium, the system will have the maximum possible disorder or randomness.
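The "maximum disorder at equilibrium" statement can be checked numerically. This sketch (with arbitrary illustrative distributions over W = 4 states) shows that the uniform distribution maximizes the entropy -∑ p ln p, reaching the value ln W:

```python
import math

def entropy(P):
    """Shannon entropy -sum p ln p (terms with p = 0 contribute 0)."""
    return -sum(p * math.log(p) for p in P if p > 0)

# Among all distributions over W states, the uniform one maximizes
# the entropy -- the information-theoretic analogue of equilibrium.
W = 4
uniform = [1.0 / W] * W
skewed = [0.7, 0.1, 0.1, 0.1]
peaked = [1.0, 0.0, 0.0, 0.0]

assert entropy(uniform) > entropy(skewed) > entropy(peaked)
assert abs(entropy(uniform) - math.log(W)) < 1e-12
```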
