Derivation of the Boltzmann constant

In summary, the conversation discusses the derivation of the Boltzmann distribution in thermal physics. The system considered is a small two-state system connected to an energy reservoir. The key step in the derivation invokes the fundamental assumption: quantum states are either accessible or inaccessible to the system, and the system is equally likely to be in any one accessible state. It follows that the ratio of the probability of the system being in the excited state to the probability of it being in the ground state equals the ratio of the numbers of states available to the reservoir at the two energies. The conversation also touches on the microcanonical ensemble and the importance of specifying a finite energy range when calculating entropy in thermodynamics.
  • #1
Hellabyte
53
0
Hey, so I'm just looking through a thermal physics textbook at a quick derivation of the Boltzmann distribution.
It says to consider a small two-state system with states of energies E = 0 and E = [tex]\epsilon[/tex]. This system is connected to a reservoir which has energy [tex]U_0[/tex] when the small system is in the zero-energy state. When the system is in the [tex]\epsilon[/tex] energy state, the reservoir has energy [tex]U_0 - \epsilon[/tex]. If the number of states available to the reservoir is denoted by g, then the numbers of states available to the reservoir will be [tex]g(U_0)[/tex] and [tex]g(U_0 - \epsilon)[/tex] respectively.

The part I don't understand is the next step. He makes an argument from "the fundamental assumption", which states that "quantum states are either accessible or inaccessible to the system, and the system is equally likely to be in one accessible state as in another." He says that the ratio of the probability of the system being in the E = [tex]\epsilon[/tex] state to the probability of it being in the E = 0 state will be

[tex] \frac{P(\epsilon )}{P(0)} = \frac{g(U_0 - \epsilon)}{g(U_0)} [/tex]

I am struggling to see how one can come to this conclusion. Any help? Does the fundamental assumption carry the hidden meaning that the probability of being in an energy state is proportional to the number of states accessible at that energy? That is what this relationship seems to imply, I would think. But I can't see why this would be true. Thanks.

Also, I don't know what is wrong with my epsilons, but they aren't supposed to be superscripts like that.
 
  • #2
Hi Hellabyte,
in fact we suppose that the probabilities of degenerate states of the reservoir (states with the same energy) are all equal. This is the fundamental assumption, which is quite plausible if we do not know anything else about these states. In classical statistical physics we do the same thing and suppose that the probability density is uniform on the energy surface in phase space.

Jano
 
  • #3
Note that the states are not assumed to be exactly degenerate (i.e. of exactly the same energy). The microcanonical ensemble is defined by specifying a small energy range which represents how accurately we know the internal energy of the system. One assumes that all energy eigenstates with an energy within this interval are equally likely.
 
  • #4
I've never understood this point. The system is isolated, so its energy should be constant and definite. Our knowledge or ignorance of it should not matter.
If we allow the energy to fluctuate by means of an additional reservoir, the postulate of equal probabilities is clearly wrong.
But suppose we allow the energy to make a random walk through the states with the energy in the interval [tex](E, E+ \Delta)[/tex].
What is then the value of [tex]\Delta[/tex]? Can we calculate it somehow?
 
  • #5
Jano L. said:
I've never understood this point. The system is isolated, so its energy should be constant and definite. Our knowledge or ignorance of it should not matter.
If we allow the energy to fluctuate by means of an additional reservoir, the postulate of equal probabilities is clearly wrong.
But suppose we allow the energy to make a random walk through the states with the energy in the interval [tex](E, E+ \Delta)[/tex].
What is then the value of [tex]\Delta[/tex]? Can we calculate it somehow?

In statistical mechanics, we consider not a single system but a large number of "identically" prepared systems, where "identically" does not mean exactly the same, but only the same as far as certain macroscopic properties are concerned.

The microcanonical ensemble is a set of such systems, each with a definite energy but such that the energy of each member can be within a range of Delta E with equal probability. The value of Delta E is an arbitrary choice.

You can easily see that specifying a finite value for Delta E is of crucial importance, even though we tend to ignore this in actual computations (it doesn't seem to matter how you choose Delta E). Suppose we have an isolated system that is not degenerate. Being isolated, it has well-defined energy eigenstates. Specifying the energy exactly for such a system would thus specify the exact microstate the system is in. Therefore the entropy would have to be zero.

This zero entropy of a system is called the "fine grained entropy". It always stays zero due to unitary time evolution. Information does not get lost, so if you have complete information about a system at t =0, you'll still have complete information any time later. Clearly, this is not the relevant notion of entropy we want to work with in thermodynamics.

The entropy should reflect the lack of knowledge we have about a system. We deal with this by considering an ensemble of systems that are in a range of different microstates compatible with the specified macrostate. Now, due to the way quantum mechanics works, we would only need to specify the energy exactly in order for the exact microstate of the system to be specified. What this means in practice is that you have energy levels that are astronomically close to each other. Specifying the exact energy eigenstate would thus require an astronomical amount of information.

So, we simply specify that the energy of a system is between E and E + Delta E, consider the ensemble of such systems and define thermodynamic quantities as ensemble averages. The entropy can be interpreted as the amount of information you would have to specify in order to point out some specific member of the ensemble. The choice of Delta E will only affect the entropy by a few bits and can thus be ignored.

In contrast, zero Delta E means that the ensemble consists of only one member and you need zero bits of information to point out that member.
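The claim that the choice of Delta E shifts the entropy by only a few bits can be illustrated with rough numbers; a minimal sketch, assuming an order-of-magnitude value for the log state count of a macroscopic system:

```python
import math

# Assumed order of magnitude: for a macroscopic system, ln(number of
# accessible states) is of order 1e23. (Illustrative, not computed.)
ln_g = 1e23

# In units of k, S = ln(g(E) * Delta_E) = ln g + ln Delta_E, so making
# Delta_E a thousand times larger shifts the entropy by only ln(1000).
shift = math.log(1000.0)

print(shift)          # about 6.9 -- "a few bits", as stated above
print(shift / ln_g)   # relative change of order 1e-22: negligible
```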
 
  • #6
Yes, I understand what the fundamental assumption is saying, but how does that relationship come about?
 
  • #7
Hellabyte said:
Yes, I understand what the fundamental assumption is saying, but how does that relationship come about?

All accessible states are equally likely. If the subsystem is in the excited state, the reservoir has g(U0 - epsilon) accessible states, and if the subsystem is in the ground state, the reservoir has g(U0) accessible states. This is like throwing two dice: the sum of the numbers can range from 2 to 12, but the probabilities of the possible outcomes are not the same. 7 is more likely than 2 because there are more ways to throw 7 than there are to throw 2. There are 6 times as many outcomes for 7 as there are for 2, so the ratio of the probabilities is 6.

In this case, the number of states of the combined system (reservoir plus subsystem) that correspond to the subsystem being in the excited state is g(U0 - epsilon).

The number of states of the combined system that correspond to the subsystem being in the ground state is g(U0).

But all these states are equally likely, so the ratio of the probabilities is:

g(U0-epsilon)/g(U0)
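The dice analogy can be checked by brute force; a minimal sketch in Python, counting outcomes of two fair dice:

```python
from itertools import product

# Count how many (die1, die2) outcomes give each possible sum.
counts = {}
for a, b in product(range(1, 7), repeat=2):
    counts[a + b] = counts.get(a + b, 0) + 1

# 6 of the 36 outcomes sum to 7, but only 1 sums to 2, so
# P(7)/P(2) = 6 -- the same state-counting logic as g(U0 - eps)/g(U0).
print(counts[7], counts[2], counts[7] / counts[2])
```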
 
  • #8
Great I think I got it. Thanks a lot!
 
  • #9
Hi Count,

I still do not understand this [tex]\Delta[/tex]. A nondegenerate system would cease to evolve in both classical and quantum theory, and we would have no need to describe it statistically, since we already know the state.

I think we can just say that the level [tex]E[/tex] is [tex]g(E)[/tex] times degenerate and entropy is the logarithm of this number. Then we make the hypothesis that all of these states are equally probable and introduce the ensemble of states with sharp energy [tex]E[/tex].

Right?
 
  • #10
Jano L. said:
Hi Count,

I still do not understand this [tex]\Delta[/tex]. A nondegenerate system would cease to evolve in both classical and quantum theory, and we would have no need to describe it statistically, since we already know the state.

I think we can just say that the level [tex]E[/tex] is [tex]g(E)[/tex] times degenerate and entropy is the logarithm of this number. Then we make the hypothesis that all of these states are equally probable and introduce the ensemble of states with sharp energy [tex]E[/tex].

Right?

The problem here is that if the system is not exactly degenerate, the definition will, strictly speaking, not apply. Note that a system that is not degenerate can still evolve in time, because it doesn't have to be in an exact energy eigenstate. The spacing between the energy eigenvalues must necessarily be astronomically small if you have a system with an astronomically large number of degrees of freedom. This means that the system can evolve almost arbitrarily slowly.

Then, to evaluate thermal averages we can average over some ensemble in which each member is in a "realistic" state, which is then not an exact energy eigenstate. But this average will then be a trace over certain basis states with almost well-defined energies. Due to the invariance of the trace, you can then evaluate the trace over the exact energy eigenstates.
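The trace-invariance point can be illustrated numerically; a minimal sketch (the matrices here are random stand-ins for a Hamiltonian and a density matrix, not a real system):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random Hermitian "Hamiltonian" H and a density-matrix-like rho.
A = rng.standard_normal((4, 4))
H = (A + A.T) / 2
B = rng.standard_normal((4, 4))
rho = B @ B.T                # positive semidefinite
rho /= np.trace(rho)         # normalize to trace 1

# Rotate rho into the eigenbasis of H: the trace is unchanged, so an
# ensemble average can be evaluated over the exact energy eigenstates.
_, U = np.linalg.eigh(H)
rho_in_eigenbasis = U.T @ rho @ U
print(np.trace(rho), np.trace(rho_in_eigenbasis))  # both 1, up to rounding
```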
 
  • #11
I see the problem - in fact, we never know the energy precisely, so we never have the situation where the system is in a nondegenerate state (if it even has one).

I think the key point is that the isolated system is not necessarily in an energy eigenstate. In fact, the initial state should depend on the preparation of the system.
Now the question is, why do we suppose that the probability of measuring the energy E is uniform in the interval [tex](E, E+\Delta)[/tex]? It seems to be just a mathematical convenience, since in reality this distribution should reflect the preparation process. I would guess that a distribution peaked around E would describe the results of energy measurements better.
 

1. What is the Boltzmann constant and why is it important in science?

The Boltzmann constant, denoted by k, is a physical constant that relates the average kinetic energy of particles in a gas to the temperature of that gas. It is important in science because it allows us to understand and predict the behavior of gases and their interactions with other substances.

2. How was the Boltzmann constant derived?

The Boltzmann constant was derived by Austrian physicist Ludwig Boltzmann in the late 19th century. He used statistical mechanics, a branch of physics that studies the behavior of large systems of particles, to derive the equation k = R/N, where R is the gas constant and N is the Avogadro constant.

3. What is the relationship between the Boltzmann constant and the gas constant?

The Boltzmann constant is directly related to the gas constant, which is a physical constant that relates the amount of energy in a gas to its temperature and pressure. The relationship between the two constants is given by the equation k = R/N, where N is the Avogadro constant.
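The relation k = R/N can be checked with the exact 2019 SI defining values of these constants:

```python
# Exact defining values from the 2019 SI redefinition.
R = 8.31446261815324    # molar gas constant, J/(mol K)
N_A = 6.02214076e23     # Avogadro constant, 1/mol

# Boltzmann constant from k = R / N_A.
k = R / N_A
print(k)                # ~1.380649e-23 J/K
```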

4. How is the Boltzmann constant used in thermodynamics?

The Boltzmann constant is used in thermodynamics to calculate the average kinetic energy of particles in a gas, as well as their probability of occupying a particular energy state. It is also used in the calculation of entropy, a measure of the disorder or randomness of a system.
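As a numerical illustration of the first point, the average translational kinetic energy per particle of an ideal gas is (3/2)kT; at room temperature (an assumed T = 300 K):

```python
k = 1.380649e-23   # Boltzmann constant, J/K (exact since the 2019 SI)
T = 300.0          # an assumed room-temperature value, in kelvin

# Average translational kinetic energy per particle of an ideal gas.
E_avg = 1.5 * k * T
print(E_avg)       # ~6.2e-21 J
```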

5. Is the value of the Boltzmann constant exact or approximate?

Since the 2019 redefinition of the SI units, the value of the Boltzmann constant is exact by definition: k = 1.380649 x 10^-23 joules per kelvin. It is no longer a measured quantity; instead, the kelvin itself is now defined in terms of this fixed value of k.
