
Homework Help: Statistical/thermal physics problem

  1. Feb 23, 2005 #1
Ok, so I've got this problem I'm solving. It goes something like this (I'm translating it to English, so it might look a bit odd):

"N atoms are placed in a rigid lattice, and each atom has 4 energy states, numbered 1-4, with the energy levels [tex]\epsilon_1=0 \epsilon_2=d \epsilon_3=\epsilon_4=2d[/tex]. The probability of an atom being in the i'th state is given by [tex]n_i[/tex]. With a powerful electromagnetic field, the system is prepared so that the probabilities are as follows: [tex]n_1=n_2=1/2,\ n_3=n_4=0[/tex]."

    Then the first question.

"a) What is the internal energy [tex]U_0[/tex] and the entropy [tex]S_0[/tex] of this system?"

So first I look at the energy; since [tex]U=N\langle\epsilon\rangle[/tex], I assumed that:

[tex]U_0=N\left(\tfrac{1}{2}\cdot 0+\tfrac{1}{2}\cdot d\right)=\tfrac{1}{2}Nd[/tex]
Next the entropy. Since we have no information about the spin excess, I assume I should use the approximation [tex]g=2^N[/tex], and hence [tex]S_0=kN\ln(2)[/tex].

Then on to question b):

b) "The system is now isolated from the environment, and after some time it is in thermal equilibrium.

Write the probabilities [tex]n_i[/tex] as functions of [tex]x=e^{-d/kT}[/tex], where T is the system's final temperature (and k the Boltzmann constant). Find the temperature T, the free energy F and the entropy S at thermal equilibrium, expressed as functions of N and d. Compare S with the initial value [tex]S_0[/tex], as well as with the system's maximal entropy [tex]S_\infty[/tex]."

So now I can see that the partition function Z must be:

[tex]Z=1+x+2x^2[/tex]
And consequently the probabilities:

[tex]n_1=1/Z,\ n_2=x/Z,\ n_3=n_4=x^2/Z[/tex]
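As a quick numerical sanity check (a small Python sketch; the sample value x = 0.3 is arbitrary, just for illustration), these four probabilities should sum to 1:

```python
# Occupation probabilities for the four states with energies 0, d, 2d, 2d,
# written in terms of x = exp(-d/kT). x = 0.3 is an arbitrary sample value.
x = 0.3
Z = 1 + x + 2 * x**2                      # partition function
n = [1 / Z, x / Z, x**2 / Z, x**2 / Z]    # n_1, n_2, n_3, n_4
print(sum(n))                             # should be 1 (up to rounding)
```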

And using the same method as in a) to find the energy (with these new probabilities and the old energy states), I get U to be:

[tex]U=Nd\,\frac{x+4x^2}{Z}[/tex]
Now I'm not 100% sure I can do this next step, but I assume that the energy before is equal to the energy after, so [tex]U_0=U[/tex]. By doing that, I eliminate Nd from both sides and end up with a second-order equation that I can easily solve. So:

[tex]Nd\,\frac{4x^2+x}{Z}=\tfrac{1}{2}Nd\ \Rightarrow\ 4x^2+x=\tfrac{1}{2}Z\ \Rightarrow\ 3x^2+\tfrac{1}{2}x-\tfrac{1}{2}=0[/tex]

    Which has the solutions

[tex]x_1=1/3,\ x_2=-1/2[/tex]
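A quick check of the quadratic and of energy conservation (a Python sketch; N and d are scaled out since they cancel):

```python
import math

# Solve 3x^2 + x/2 - 1/2 = 0 and confirm that x = 1/3 reproduces
# the initial energy per atom, U/(N d) = 1/2.
a, b, c = 3.0, 0.5, -0.5
disc = math.sqrt(b * b - 4 * a * c)
roots = sorted([(-b + disc) / (2 * a), (-b - disc) / (2 * a)])
print(roots)                       # roots -1/2 and 1/3

x = 1 / 3
Z = 1 + x + 2 * x**2
print((x + 4 * x**2) / Z)          # energy per atom in units of d, approx. 1/2
```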

Of which obviously x=1/3 is the solution we can use, to find T from the x equation:

[tex]x=e^{-d/kT}=1/3\ \Rightarrow\ \ln(1/3)=-\frac{d}{kT}\ \Rightarrow\ T=\frac{-d}{k\ln(1/3)}\quad\text{or}\quad\tau=\frac{-d}{\ln(1/3)}[/tex]

Substituting x=1/3 into the Z function above, I get Z=14/9. Then I have tau and Z, so using the equation [tex]F=-\tau\ln(Z)[/tex] I get F to be:

[tex]F=-\tau\ln(14/9)=\frac{d\ln(14/9)}{\ln(1/3)}[/tex]
Then using the definition of F, [tex]F=U-\tau\sigma[/tex], where I have also substituted x=1/3 into U, I get:

[tex]F=U-\tau\sigma\ \Rightarrow\ \sigma=\frac{U-F}{\tau}=\frac{\tfrac{1}{2}Nd-\frac{d\ln(14/9)}{\ln(1/3)}}{-d/\ln(1/3)}\ \Rightarrow\ S=k\sigma=k\left(\ln(14/9)-\tfrac{1}{2}N\ln(1/3)\right)[/tex]
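To make the chain of substitutions concrete, here is the whole calculation as a Python sketch, in units where d = k = 1 (that unit choice, and the example size N = 10, are assumptions for illustration, following the conventions used above with tau = kT and F = -tau ln Z):

```python
import math

# Equilibrium values in units d = k = 1. N = 10 is an arbitrary example size.
N = 10
x = 1 / 3
Z = 1 + x + 2 * x**2                  # = 14/9
tau = -1 / math.log(1 / 3)            # tau = -d / ln(1/3) = d / ln 3
F = -tau * math.log(Z)
U = 0.5 * N                           # U = Nd/2
sigma = (U - F) / tau                 # from F = U - tau*sigma
print(sigma)
# should agree with ln(14/9) - (N/2) ln(1/3):
print(math.log(14 / 9) - 0.5 * N * math.log(1 / 3))
```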

And this is where I'm stuck. I wasn't sure if I could do the [tex]U_0=U[/tex] move, but I'm relatively sure it was ok. But I have real problems with the last part, where I am asked to compare the expressions for S. I can't see what the answer should be; I could of course make obvious comments such as "the entropy is bigger/smaller" and so on, but I feel like they're looking for a more specific answer, and I'm not seeing the relation.

Also, they ask me to compare to the maximal S. So my question is: can I simply take [tex]S_\infty=kN\ln(4)[/tex], or do I need to derive it from a g where there is an equal number of particles in all states? I have done that, and ended up with the expression:

[tex]g=\frac{N!}{\left((N/4)!\right)^4}\quad\text{and}\quad S=k\ln(g)\approx kN\left(\ln(N)-\ln(N/4)\right)[/tex]

using the Stirling approximation. Either way, I also have problems comparing this to [tex]S_0[/tex]; I'm not sure what to say on that question.
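One way to test how good the Stirling result is here is to evaluate the exact multiplicity numerically (a Python sketch using log-gamma to avoid huge factorials; the sample sizes are arbitrary):

```python
import math

# Exact ln g for g = N! / ((N/4)!)^4, compared with N ln 4.
# math.lgamma(n + 1) = ln(n!), which avoids overflowing factorials.
for N in (4, 40, 400):
    ln_g = math.lgamma(N + 1) - 4 * math.lgamma(N / 4 + 1)
    print(N, ln_g / (N * math.log(4)))   # ratio approaches 1 as N grows
```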
  3. Feb 23, 2005 #2
hmm, big post...
First, tex errors. I know they are not horribly important, but fixing them will make it easier for others to help too. You can click on the tex to see the code.
[tex]\epsilon_1=0 \epsilon_2=d \epsilon_3=\epsilon_4=2d[/tex]
needs commas:
[tex]\epsilon_1=0{,}\ \epsilon_2=d{,}\ \epsilon_3=\epsilon_4=2d[/tex]
I follow your method, and I think you are right, including the energy equivalence, up until
[tex]Nd\,\frac{4x^2+x}{Z}=\tfrac{1}{2}Nd\ \Rightarrow\ 4x^2+x=\tfrac{1}{2}Z\ \Rightarrow\ 3x^2+\tfrac{1}{2}x-\tfrac{1}{2}=0[/tex]
There I think it should be:
[tex]4x^2+x=\frac{1}{2}(1+x+x^2)[/tex]
It looks like an algebra mistake, and it leads to different solutions for x.

I follow the steps to the free energy, though as of now I think the x values you found are wrong.

    A useful derivative relation:
[tex]\left(\frac{\partial F}{\partial T}\right)_{V,N}=-S[/tex]

    V and N must be held constant for this relationship to hold true.

    Does this help?
  4. Feb 23, 2005 #3
I'm sorry, that wasn't an algebra mistake; I typed it in wrong. It's supposed to be:

[tex]Z=1+x+2x^2[/tex]

since it's the sum of Boltzmann factors, and there are two [tex]x^2[/tex] terms since [tex]\epsilon_3=\epsilon_4[/tex].

I'm afraid the differentiation relationship doesn't help me much. I think I got all the S's (although, as I said, I'm not sure if I can use the shortcut of saying [tex]g=4^N[/tex], or if I need to derive it as I did above).

And once I have all the S's, I'm not sure what it is they want. They ask me to compare the different entropies, but I'm just not sure how to formulate what the difference between them is. I can compare them; I tried dividing them, and plotting the 3 different entropies as functions of N. Maybe it's enough to show the plot to show how they differ from each other as N rises?

Btw, an interesting thing happened when I plotted those 3 S's (I used kN*ln(4) for S_max): the entropy in thermal equilibrium seemed to be less than in the binary system. I would've thought it was the other way around. Is this perhaps ok, or an indication that my calculations are wrong?
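One way to cross-check the plot is to compute the per-atom entropies directly from the Gibbs form S/k = -Σ nᵢ ln nᵢ (using that form here is my assumption for the check, not part of the derivation above):

```python
import math

# Per-atom entropies (in units of k) from the Gibbs form -sum(n ln n).
x = 1 / 3
Z = 1 + x + 2 * x**2
n = [1 / Z, x / Z, x**2 / Z, x**2 / Z]
s_eq = -sum(p * math.log(p) for p in n)   # thermal equilibrium
s_bin = math.log(2)                       # binary start: two equal states
s_max = math.log(4)                       # four equally occupied states
print(s_bin, s_eq, s_max)                 # s_bin < s_eq < s_max
```

On this measure the equilibrium entropy lands between the binary and maximal values, so a plot showing the opposite ordering would be worth double-checking.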

    Sorry for the tex mistakes, not too used to this :)
  5. Feb 23, 2005 #4
Right, I should have caught that missing [itex]x^2[/itex].
Does g stand for the Gibbs free energy in this case, or what does it stand for?
Another entropy equation (there are so many, you have to love it):
[tex]S=-k\sum_s P(s)\ln P(s)[/tex]
where P(s) is the probability of being in a particular microstate.

In my understanding, the entropy should go up after leaving the binary state, since more energy levels have become available. The total number of possible states should have increased, right?
  6. Feb 24, 2005 #5
That's what I intuitively would've thought, but I only just started this course 3 weeks ago, so I assumed my intuition could be way off :)

Sorry about the g, I assumed it was a standard notation; perhaps it differs from book to book. The g I'm referring to is the number of states available to the system. So the basic formulation for a binary system would be (where [tex]N_1[/tex] is the number of particles in the second state):

[tex]g=\frac{N!}{N_1!\,(N-N_1)!}[/tex]
And then the entropy would be:

[tex]S=k\ln(g)[/tex]
So one way I got the entropy up there was by assuming that at maximal entropy there would be an equal number of particles in each energy state, so g had to be:

[tex]g=\frac{N!}{\left((N/4)!\right)^4}[/tex]
And then taking ln(g) and using the Stirling approximation, I got the result:

[tex]S=k\ln(g)\approx kN\left(\ln(N)-\ln(N/4)\right)[/tex]
But it would of course be easiest if I could just assume that at maximal entropy, where all particles are divided equally between the energy states, the number of available states would be [tex]g=4^N[/tex]. But I'm not sure if I can do that.

And once I do find that S :) I'm still left with the problem of how to compare these 3 different entropies to each other in an intuitive way.
    Last edited: Feb 24, 2005
  7. Feb 24, 2005 #6
Now I see where the g is coming from. Putting an equal number of atoms in each state is not a valid way to find the entropy.
Take the equation below.
g is often called the multiplicity of a system, or sometimes the cardinality of the system; multiplicity is what I have most commonly read in thermo books. The multiplicity is the number of possible ways of dividing energy U among N atoms. So all the energy could go to one atom, but this state would be so unlikely that it would take longer than the current age of the universe to occur.

The Einstein solid consists of N independent quantum harmonic oscillators and is a classic example of how to find the multiplicity. Each level in the quantum harmonic oscillator is separated by [itex]\hbar\omega[/itex]. So if we had [itex]N\hbar\omega[/itex] in energy and r atoms, then we would have:
[tex]g=\frac{(N+r-1)!}{(r-1)!\,N!}[/tex]
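A tiny sketch of that counting formula (the example numbers are mine, chosen so the states can be listed by hand):

```python
import math

# Einstein-solid multiplicity g = (N + r - 1)! / ((r - 1)! N!):
# the number of ways to share N energy units among r oscillators.
def multiplicity(N, r):
    return math.comb(N + r - 1, N)   # binomial form of the same factorial ratio

# 3 energy units shared among 2 oscillators: (0,3), (1,2), (2,1), (3,0)
print(multiplicity(3, 2))   # 4
```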

Many textbooks will interchange the maximal entropy and the total entropy of the system. These two entropies are nearly the same for most systems.

    I do not know how much of this is old hat for you.

Why does
[tex]\left(\frac{\partial F}{\partial T}\right)_{V,N}=-S[/tex]
not work to find the entropy for you?

[tex]-S=\left(\frac{\partial F}{\partial T}\right)_{V,N}=\frac{\partial}{\partial T}\left(-\tau\ln(Z)\right)=\frac{\partial}{\partial T}\left(-Tk\ln(Z)\right)[/tex]
Then use the product rule:
[tex]\frac{\partial}{\partial T}\left(-Tk\ln(Z)\right)=\left(-k\ln(Z)\right)\frac{\partial T}{\partial T}+\left(-Tk\right)\frac{\partial\ln(Z)}{\partial T}[/tex]
where [itex]Z=1+x+2x^2=1+e^{-\beta d}+2\left(e^{-\beta d}\right)^2[/itex]. You have already solved for T, so this should not be a problem. The solution for the entropy in this case will still depend on d; in this problem I do not think it is possible to separate d from the equation for the entropy.
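As a numerical check of this route (a Python sketch, per atom and in units d = k = 1, which are my assumed units), a finite-difference derivative of F matches the entropy from sigma = (U - F)/tau:

```python
import math

# Compare S = -(dF/dT)_{V,N} (central finite difference) with (U - F)/T,
# per atom and in units d = k = 1, using Z = 1 + x + 2x^2.
def F(T):
    x = math.exp(-1.0 / T)
    return -T * math.log(1 + x + 2 * x**2)

T0 = 1 / math.log(3)                     # the equilibrium temperature found earlier
h = 1e-6
S_deriv = -(F(T0 + h) - F(T0 - h)) / (2 * h)

x = math.exp(-1.0 / T0)                  # = 1/3
Z = 1 + x + 2 * x**2
U = (x + 4 * x**2) / Z                   # per-atom energy, = 1/2 here
S_def = (U - F(T0)) / T0
print(S_deriv, S_def)                    # the two routes agree
```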

Hmmm. I don't think using [itex]g=2^N[/itex] is valid for the first part of the problem, because not all of those states are available, due to the energy restriction: [itex]g=2^N[/itex] allows for all of the atoms to be in [itex]\epsilon_2[/itex], which should not be possible when [itex]U=\tfrac{1}{2}Nd[/itex]. The [itex]\epsilon_2[/itex] state corresponds to d energy per atom, so if all N atoms were in [itex]\epsilon_2[/itex] then [itex]U=Nd[/itex]. This energy is clearly larger than what you found.

Rather, I think you should be using:
[tex]g=\frac{N!}{N_1!\,(N-N_1)!}[/tex]
where [itex]N_1[/itex] corresponds to [itex]\epsilon_2[/itex], which means it should equal N/2.

    Does this make sense to you?
  8. Feb 24, 2005 #7
Yes, that does make sense to me (although I had not seen that [tex]N_1[/tex] should correspond to [tex]\epsilon_2[/tex]). But if you calculate g from that, you get:

[tex]g=\frac{N!}{\left((N/2)!\right)^2}[/tex]

Which then becomes (with the Stirling approximation):

[tex]\sigma=\ln(g)\approx N\left(\ln(N)-\ln(N/2)\right)[/tex]

Which gives the same result as if I'd used [tex]\sigma=N\ln(2)[/tex]. For example, just throwing the arbitrary number 10 in there gives the same sigma. Which leads me to believe that I could perhaps use [tex]g=4^N[/tex] for the maximal entropy that could be achieved in a system of 4 different states (although, given the probabilities for each state, the probability of the system being in that configuration would be very small).
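For what it's worth, the exact binomial g sits a bit below the Stirling value N ln 2 for small N, and only closes the gap as N grows; a quick Python check (the sample sizes are arbitrary):

```python
import math

# Exact ln g for g = N! / ((N/2)!)^2 versus the Stirling value N ln 2.
# math.lgamma(n + 1) = ln(n!).
for N in (10, 100, 1000):
    ln_g = math.lgamma(N + 1) - 2 * math.lgamma(N / 2 + 1)
    print(N, ln_g / (N * math.log(2)))   # ratio approaches 1 as N grows
```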

Btw, is there something wrong with the way I found the entropy originally, where I used the definition [tex]F=U-\tau\sigma[/tex] and isolated sigma? You seem to want me to do it another way, but I'm not sure if that's because it's an easier (perhaps more correct) way of doing it, or if the method I used was wrong. I do see now that I could get S from the partial derivative there, but as I said before, it didn't help me because I believed I had already found it :)
  9. Feb 24, 2005 #8
I am not sure that your method for finding the entropy after isolation takes the different energy levels into account. [itex]S_\infty=kN\ln(4)[/itex] is the answer I would imagine getting if all the energy levels were equally spaced and without any degeneracies. I do not know if your method takes this into account; I know the one I posted does.

I think the two entropies map so well onto each other because:
1. The state of maximum entropy is almost the same as the total entropy of the system, at least for all practical purposes.
2. The binary state is simple, with only two energy levels and no degenerate states.

Using [itex]g=4^N[/itex] is only true if all four states are equally likely, which is not true in the isolated case. Both states were equally likely in the binary case, though, which is why [itex]g=2^N[/itex] was not a bad assumption there.
  10. Feb 24, 2005 #9
Hmm ok, but I was under the impression that the entropy does not necessarily have anything to do with the probability. I'm asked to find the maximal entropy of the system. And the maximal entropy of a system with 4 energy states would be one where all particles in the system are divided equally between the energy states, right? I realize that this state of the system is highly improbable given the probabilities, but it would still be the maximal-entropy state for this system, wouldn't it?

Btw, I'm not sure I'm making myself clear: I'm saying that [tex]g=4^N[/tex] is the solution for the maximal value of S (which I am asked to find towards the end of the problem). I agree with you about finding the entropy S in thermal equilibrium, where I have to take the probabilities into account.
  11. Feb 24, 2005 #10
Your logic here does not make sense to me. The state of highest entropy is the most probable state for the system to be in; in other words, the state with the highest entropy is also the most probable state.

This is for the isolated system, right? So your energy would be limited to [itex]\tfrac{1}{2}Nd[/itex]. You cannot even explore all of the [tex]g=4^N[/tex] states, because you do not have enough energy to have all of the atoms in the [itex]\epsilon_3[/itex] or [itex]\epsilon_4[/itex] state.

Please define what you mean by "maximal value"? This may well be where we are misunderstanding each other.
  12. Feb 24, 2005 #11
Hmm, you're probably right, but that just doesn't make sense to me.

Intuitively I'd think that, for example in this system, if I use the x=1/3 substitution to find the probabilities, I get n1=0.64, n2=0.22 and n3=n4=0.07. So to me it would be obvious to conclude that, in an equilibrium with a large number N of particles, the particles should be split between the energy states approximately as the probabilities state, right? So you have 0.64N, 0.22N, 0.07N and 0.07N as the (approximate) distribution between the states.

Well, if that's so, then the maximal entropy is not at equilibrium, is it? Since the system has more entropy if it is set up with N/4 in each state, right? This would be a highly improbable state given the probabilities in the example, but nonetheless one with more entropy than equilibrium.

Well, thanks for all the help man, I appreciate it! But class is tomorrow :)
  13. Feb 24, 2005 #12
    Ok good luck. It was a pleasure.

To my understanding, the state of highest entropy is the most probable state.

If an isolated system is out of equilibrium and moves toward equilibrium, then it is also increasing its entropy. When the system reaches equilibrium, it has also reached the state of highest entropy.

    Again good luck.