Boltzmann distribution 
#19
Mar18-12, 12:22 PM

P: 789

The bottom line for the Boltzmann distribution is there are many ways to distribute energy among the particles while keeping a particular value of the total energy. The Boltzmann distribution just says that when you have lots of particles and a total energy high enough, almost all of these many ways look about the same. That has nothing to do with temperature.
Boltzmann just says that the number in level i ([itex]N_i[/itex]) is proportional to [itex]N e^{-\beta \epsilon_i}[/itex], where [itex]\epsilon_i[/itex] is the energy of the i-th level, N is the total number of particles, and [itex]\beta[/itex] is some number that depends on the total energy. That makes no reference to temperature. To be exact, we have to have the [itex]N_i[/itex] add up to [itex]N[/itex], so that means [tex]N_i = N\, \frac{e^{-\beta \epsilon_i}}{Z(\beta)}[/tex] where [tex]Z(\beta)=\sum_i e^{-\beta \epsilon_i}[/tex]

We also have to have the total energy equal to some fixed value, call it [itex]U[/itex]: [tex]U=\sum_i \epsilon_i N_i[/tex] If you know the energies of all the levels, the total energy, and the total number of particles, this lets you solve for the value of [itex]\beta[/itex]. Still no mention of temperature, only that [itex]\beta[/itex] is a function of the total energy, and we can solve for its value if we know [itex]\epsilon_i,\,N[/itex] and [itex]U[/itex].

Now you have to get into thermodynamics, because that is where temperature is defined. It turns out, by using thermodynamics, you can show that [itex]\beta=1/kT[/itex]. That at least separates things into non-temperature ideas and temperature ideas. Do you need to know how Boltzmann came up with his no-temperature equation? Do you also need to know why [itex]\beta=1/kT[/itex]?
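The step "solve for [itex]\beta[/itex] knowing [itex]\epsilon_i,\,N[/itex] and [itex]U[/itex]" can be done numerically. Here is a minimal sketch in plain Python (not from the post; the particular levels and numbers are made up for illustration), using bisection on the fact that the mean energy per particle decreases monotonically in [itex]\beta[/itex]:

```python
import math

def average_energy(beta, energies):
    """Mean energy per particle under the Boltzmann distribution
    p_i = exp(-beta * e_i) / Z(beta)."""
    weights = [math.exp(-beta * e) for e in energies]
    Z = sum(weights)
    return sum(e * w for e, w in zip(energies, weights)) / Z

def solve_beta(energies, N, U, lo=-50.0, hi=50.0, tol=1e-12):
    """Find beta such that N * <e>(beta) = U, by bisection.
    <e>(beta) is monotonically decreasing in beta, so the
    bracket [lo, hi] always contains the root."""
    target = U / N
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if average_energy(mid, energies) > target:
            lo = mid   # too much energy per particle: need larger beta
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Illustrative numbers: levels 0..4, 100 particles sharing 50 energy units.
levels = [0, 1, 2, 3, 4]
beta = solve_beta(levels, N=100, U=50)
```

Since U/N = 0.5 is below the flat-distribution average of 2, the solver returns a positive [itex]\beta[/itex], i.e. a population that drops off with energy.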


#20
Mar18-12, 01:58 PM

P: 1,005




#21
Mar19-12, 12:16 AM

P: 789

Check out the Wikipedia article at http://en.wikipedia.org/wiki/Maxwell...ann_statistics
Basically, you have the number of particles N and a bunch of energy levels with energies, say, 0, 1, 2, 3... for example. Then you have the total energy E. Now you want to know how many ways you can fill those energy levels with N particles to get that total energy. Suppose you have 3 particles and the total energy is 4. There are 4 ways to do this:

0 1 2 3 4  = energy of level
2 0 0 0 1  = number of particles in each level
1 1 0 1 0  "
1 0 2 0 0  "
0 2 1 0 0  "
4 3 3 1 1  = 4 times the average number in each level

You can see the distribution is high at level 0, dropping off for higher energy levels.

Boltzmann (and Gibbs) carried out this analysis for N particles with total energy E, and figured out that if you have [itex]N_i[/itex] particles in energy level i with energy [itex]\epsilon_i[/itex], the number of ways ([itex]W[/itex]) this could be done is, up to a constant factor, [tex]W=\prod_i \frac{1}{N_i!}[/tex] for large N. Now we want to find the [itex]N_i[/itex] such that the sum of all the [itex]N_i[/itex] equals N and the sum of all the [itex]\epsilon_i N_i[/itex] equals E. In the table above, it was done for N=3 and E=4; now we want to do it for the general case.

Since N is large, we can use Stirling's approximation for the factorial, [itex]x!\approx x^xe^{-x}[/itex]. It's better to work with the log of W, so we can do sums instead of products: [tex]\ln(W)=\sum_i \left(N_i-N_i\ln N_i\right) [/tex]

Now we want to find the [itex]N_i[/itex] where W is a maximum. It turns out that that maximum is a HUGE maximum. The set of [itex]N_i[/itex] that gives the largest number of ways gives a number of ways that is MUCH larger than any other configuration. The way we find this maximum is to form the function: [tex]f=\sum_i \left(N_i-N_i\ln N_i\right) +\alpha\left(N-\sum_iN_i\right) +\beta\left(E-\sum_i \epsilon_i N_i\right)[/tex] You can see that if you find the maximum of this function, it will give you the W you are looking for, and it also gives you the right total N and E.
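The counting in the table above is small enough to check by brute force. This sketch (not from the post; it just enumerates the same N=3, E=4 example) lists every occupation pattern and recovers the column totals:

```python
from itertools import combinations_with_replacement
from collections import Counter

# Enumerate every way 3 particles can sit in levels 0..4 with total energy 4.
N, E, levels = 3, 4, range(5)

ways = []
for combo in combinations_with_replacement(levels, N):
    if sum(combo) == E:
        # occupation numbers [N_0, N_1, N_2, N_3, N_4] for this way
        occ = [Counter(combo)[lvl] for lvl in levels]
        ways.append(occ)

# Column sums = (number of ways) x (average occupation of each level)
totals = [sum(col) for col in zip(*ways)]
```

Running this gives 4 ways and column totals [4, 3, 3, 1, 1], matching the table: the average occupation already drops off with energy even for this tiny system.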
So now take the derivative with respect to [itex]N_i[/itex] and set it to zero: [tex]\frac{\partial f}{\partial N_i}=0=-\ln N_i-\alpha-\beta \epsilon_i[/tex] or [tex]N_i=e^{-\alpha-\beta\epsilon_i}[/tex] and that's Boltzmann's equation. Now you solve for [itex]\alpha[/itex] using the fact that [itex]\sum_i N_i=N[/itex] to get [tex]N_i=N e^{-\beta\epsilon_i}/Z(\beta)[/tex] and you can solve for [itex]\beta[/itex] knowing E.

PLEASE NOTE: there are a lot more things that go into the derivation. The above leaves a lot out, but if you can follow it, then you will be very ready for the real derivation.

As far as showing that [itex]\beta=1/kT[/itex], you have to use Boltzmann's famous equation for entropy, [itex]S=k\ln(W)[/itex]. Using the [itex]N_i[/itex] that you found above and substituting into the expression for [itex]\ln W[/itex], you get [tex]S/k=N\alpha+\beta E[/tex] Differentiating and rearranging, you get [tex]dE=\frac{1}{k\beta}\,dS-\frac{\alpha}{\beta}\,dN[/tex] which is just the fundamental equation of thermodynamics at constant volume: [tex]dE=T\,dS+\mu\, dN[/tex] which shows that [itex]T=1/k\beta[/itex] and that [itex]\mu=-\alpha/\beta[/itex] is the chemical potential.


#22
Mar19-12, 05:42 AM

P: 1,005

Neat! I had actually looked up the derivation on Wikipedia, where they use Lagrange multipliers to determine the maximum.
I'm ashamed to say that I'm still a bit confused, however. In your derivation, β is a function of the CHANGE in total energy, not the total energy itself. So how is temperature determined from total energy with that argument?


#23
Mar19-12, 09:21 AM

P: 789

So total energy is not proportional to temperature. In the big picture, the "constant of proportionality" (C) is not constant at all; it is itself a function of temperature.
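A concrete way to see that C depends on temperature is a two-level system. The following sketch (my illustration, not from the post; units with [itex]\epsilon = k = 1[/itex]) differentiates the mean energy numerically and shows that C is small at low T, peaks at intermediate T, and falls off again at high T:

```python
import math

def mean_energy(T, eps=1.0, k=1.0):
    """Mean energy per particle of a two-level system (levels 0 and eps)
    under the Boltzmann distribution: <e> = eps / (1 + exp(eps/kT))."""
    return eps / (1.0 + math.exp(eps / (k * T)))

def heat_capacity(T, dT=1e-6):
    """Numerical C(T) = d<e>/dT via a central difference."""
    return (mean_energy(T + dT) - mean_energy(T - dT)) / (2 * dT)

# C is not constant: it rises, peaks, and falls as T increases.
C_low  = heat_capacity(0.1)
C_mid  = heat_capacity(0.4)
C_high = heat_capacity(10.0)
```

So even in this simplest case, E = C·T with a fixed C fails; C(T) has a bump (the two-level "Schottky" shape) and is tiny at both extremes.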


#24
Mar19-12, 11:30 AM

P: 1,005

I think that, all in all, by looking through your posts I have learned what I came for: that I can't just say "suppose you have this and this energy and this and this temperature," since the temperature is something that comes from the combinatorics and the energy. Indeed, I'm almost tempted to say that the Boltzmann distribution merely defines what the temperature is.
So with high energy density (which concerned my original question) you for instance get a very high temperature, which means the exponential curve approaches a very flat curve, meaning nearly equal probability for all states. I hope what I said so far was correct, because now I want to ask you a final question (you have been immensely helpful so far): Can you make it intuitive for me that the probabilities for the higher energy levels always approach the probability of the lowest state, but never exceed it? I can't quite make it intuitive by my own arguments, but maybe you could put up an example like: suppose our atom acquires one unit of energy, then the probability for acquiring another one must always be a little less, bla bla bla... Hope you understand what I mean :)
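Both claims in this question can be checked directly from the formula: since [itex]p_{i+1} = p_i\, e^{-\beta(\epsilon_{i+1}-\epsilon_i)} \le p_i[/itex] for any [itex]\beta > 0[/itex], each level is at most as populated as the one below it, and as [itex]\beta \to 0[/itex] (high T) the distribution flattens. A small sketch (my own illustration; the β values are arbitrary):

```python
import math

def populations(beta, levels):
    """Boltzmann probabilities p_i = exp(-beta * e_i) / Z(beta)."""
    w = [math.exp(-beta * e) for e in levels]
    Z = sum(w)
    return [x / Z for x in w]

levels = [0, 1, 2, 3, 4]

cold = populations(beta=2.0, levels=levels)   # low T: ground state dominates
hot  = populations(beta=0.01, levels=levels)  # high T: nearly flat

# For beta > 0, each level is at most as populated as the one below it.
monotone = all(cold[i] >= cold[i + 1] for i in range(len(cold) - 1))
```

At β = 2 the ground state holds most of the probability; at β = 0.01 all five levels are within about 1% of each other, but the ground state is still (barely) the largest.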


#25
Mar20-12, 02:20 AM

P: 789

About the intuitive explanation, I'm not sure. If you have 100 particles and 200 energy units, you could have all but one at zero, and one at 200. That one at 200 has a higher population number than all the levels below it except the ground state, but there is only one way this can happen. You can see that there are many more ways if you move some particles out of the ground state and drop that 200-unit particle down in energy. By the same token, if any energy level has more particles in it than some level below it, you can always find many more ways to distribute that energy by moving some of its particles, some up and more down, than by leaving things more or less the same. Whatever intuition you come up with, it must contain the idea that the equilibrium distribution has the most ways of being realized; it's the most likely distribution. If you come up with a better intuition, let me know.
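The "only one way this can happen" point can be made quantitative by counting. This sketch (my illustration; the spread-out configuration is one I made up with the same N = 100 and E = 200) uses the multinomial count W = N!/∏ N_i!, which differs from the product in the derivation above only by the constant N! and so compares configurations the same way:

```python
from math import factorial

def multiplicity(occupations):
    """Number of ways N distinguishable particles can realize the given
    occupation numbers {energy_level: count}: W = N! / prod(N_i!)."""
    N = sum(occupations.values())
    W = factorial(N)
    for n in occupations.values():
        W //= factorial(n)
    return W

# 100 particles, 200 energy units (dict key = level energy).
extreme = {0: 99, 200: 1}                 # one particle carries everything
spread  = {0: 50, 2: 30, 4: 10, 10: 10}   # same N and E, spread out

W_extreme = multiplicity(extreme)   # only 100 ways (choice of which particle)
W_spread  = multiplicity(spread)    # astronomically many more ways
```

The extreme configuration can be realized in just 100 ways, while the spread-out one has more than 10^40 realizations: the equilibrium-looking distribution wins by an overwhelming margin.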


#26
Mar20-12, 03:17 AM

P: 1,005

1/T = ∂S/∂U, where S = k ln(W). Isn't it exactly this? Because this is the definition that makes sense in, for instance, the Boltzmann distribution. And thus the distribution more or less defines what temperature is...
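The identification 1/T = ∂S/∂U can be checked numerically for a two-level system, where both sides are computable exactly. In this sketch (my own example, with level spacing and k set to 1), S(U)/k = ln C(N, n) with n excited particles, and its finite difference matches the β obtained from the Boltzmann population ratio n/(N-n) = e^{-β}:

```python
from math import comb, log

# Two-level system: N particles, levels 0 and 1; U = n (n particles excited).
N = 10_000

def S_over_k(n):
    """Entropy/k = ln W, with W = C(N, n) ways to pick the excited particles."""
    return log(comb(N, n))

n = 2_000  # i.e. U = 2000 energy units

# 1/kT = dS/dU / k, estimated by a central difference in n (= U).
beta_from_dSdU = (S_over_k(n + 1) - S_over_k(n - 1)) / 2.0

# Boltzmann: n/(N-n) = exp(-beta)  =>  beta = ln((N-n)/n)
beta_from_boltzmann = log((N - n) / n)
```

The two numbers agree to a few parts in 10^4 at N = 10,000 and get closer as N grows, which is the sense in which the distribution and 1/T = ∂S/∂U are two views of the same definition.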


#27
Mar21-12, 12:54 AM

P: 789

The definition of temperature is an experimental definition, as all classical thermodynamic definitions are. The definition of temperature tells you how to measure temperature, and makes no reference to atoms or molecules or statistical mechanics. The laws of classical thermodynamics put constraints on the results of measurements. Then Boltzmann comes along and assumes that classical thermodynamics is explained by atoms and molecules and their statistics, and develops (along with others) the explanation of classical thermodynamics called statistical mechanics. What falls out of this explanation is an account of many things that are just an unexplained set of measurements in classical thermodynamics, such as the specific heat.

