Most Probable Value of 3 fair choices

Poop-Loops

Homework Statement



Okay, so the problem asks to show that if you have N O2 molecules, each with spin projection 1, 0, or -1, then the most probable number of molecules in any one of those states is N/3, assuming they don't interact at all and there's no B field, so basically 3-sided coins.
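Not part of the assignment, but here's a quick numerical sanity check of that claim (a sketch in Python; N and the number of trials are arbitrary choices just for illustration):

import numpy as np

N = 30            # number of molecules (kept small so the mode stands out clearly)
trials = 200_000  # number of random samples

rng = np.random.default_rng(0)
# Each trial assigns every molecule one of the 3 spin states with equal probability;
# counts[:, 0] is the number of molecules found in the first state in each trial.
counts = rng.multinomial(N, [1/3, 1/3, 1/3], size=trials)

values, freq = np.unique(counts[:, 0], return_counts=True)
print("most frequent occupation of one state:", values[np.argmax(freq)])
print("N/3 =", N / 3)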

Homework Equations



We're asked to start with the multinomial coefficient formula:
http://en.wikipedia.org/wiki/Multinomial_theorem

And Stirling's crude approximation, ln(N!) ≈ N*ln(N) - N.
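Written out for three states (with n_1, n_2, n_3 my labels for the occupation numbers), that coefficient is

\frac{N!}{n_1! \, n_2! \, n_3!}, \qquad n_1 + n_2 + n_3 = N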

The Attempt at a Solution



So Probability = \frac{\Omega}{\Omega_t}, right?

In my case \Omega_t = 3^N, so I substituted that. And for \Omega I substituted the multinomial expansion coefficient, which gave me my probability. Then I simplified the factorial on top against the ones in the product on the bottom of the fraction, and then multiplied them out, since I'm only dealing with 3 choices, which I label x, y, z for clarity, so they are easy to multiply out.
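Written out explicitly (just restating the above, with x + y + z = N), the probability I'm working with should be

P(x, y, z) = \frac{\Omega}{\Omega_t} = \frac{1}{3^N} \, \frac{N!}{x! \, y! \, z!}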

Now I'm stuck. I want to take the derivative of this and set it to 0, but I'm not sure what to take the derivative with respect to. N? Also, I'm still stuck with an x, y, and z, but I'm thinking I can set those equal to 1.

(x+y+z)^N = 3^N => x=y=z=1

But that also seems dubious to me.
 
I really hate to bump my own thread, especially since it's only been a day, but this thread has been buried already and I haven't figured out my problem yet. :frown:
 
If the number of molecules in each state is k1, k2, and k3, the usual way to handle this is to enforce the constraint k1 + k2 + k3 = N with a Lagrange multiplier, along with the crude approximation log(k!) ≈ k*log(k). Can you handle that?
 
Wow! I haven't used Lagrange multipliers in... well over a year. But yeah, if it's as simple as you say then I'll have no problem because I've used them before.

Yes, now I see what you are saying. I think I'll be fine now. Thanks. :)
 
I spent several hours today trying to figure it out and I still can't.

What I'm trying to do is this:

\frac{N!}{(3^N)(x!)(y!)(z!)} + \lambda(x + y + z - N) = F

Right? I'm trying to maximize the probability (first part) with the constraint that x + y + z = N.

Then I'd do the Lagrange multipliers by taking partial derivatives, setting them to 0, and solving for the variables. But I don't see where the natural log comes into play here. If I try to take the natural log of both sides and then take the partials... I just get a huge mess and none of it makes any sense.

I thought about substituting for example x with N - y - z, but that still doesn't help me much.
 
You've got the setup right. Now let's maximize the log of the combinatorial thing instead of maximizing the thing. Same deal, right? Use Stirling to write this to the highest order as: constant*(x*ln(x)+y*ln(y)+z*ln(z))-lambda*(x+y+z-N)=F

dF/dx=0 gives constant*(1+ln(x))-lambda=0.
dF/dy=0 gives constant*(1+ln(y))-lambda=0.
...

Right? What does this tell you about the relation between x and y?
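Spelling out where those two conditions lead (just a sketch; "constant" is whatever overall factor Stirling leaves out front):

\text{constant}\,(1 + \ln x) = \lambda = \text{constant}\,(1 + \ln y) = \text{constant}\,(1 + \ln z) \;\Rightarrow\; x = y = z

Combined with the constraint x + y + z = N, that forces x = y = z = N/3, which is the claimed most probable occupation.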
 
Yes, that's what I did at first, but then I noticed that taking the log of the combinatorial would mean I have to take the log of everything, including the \lambda term and the F, so I'd end up with ln(F).

EDIT: UNLESS... you are saying that I can just decide (it's my choice) to maximize the log of the combinatorial instead of the combinatorial itself, with the same constraint in place.
 
You don't want to do that. As you've observed, it leads to a complete mess. If you want to maximize G(x,y,z) subject to the constraint x+y+z=N, you can equally well choose to maximize ln(G(x,y,z)), since the log is monotonically increasing and peaks at the same point. It's called entropy.
 
Yes, thank you. I will proceed to do that now. I sort of remember Entropy from my thermodynamics class, but we haven't gotten to it yet in my stat mech class so I didn't think of it at all.

Thanks again for your help. :D
 