
Entropy of a loaded dice throw

  1. Nov 8, 2014 #1

    ZetaOfThree

    User Avatar
    Gold Member

    1. The problem statement, all variables and given/known data
    A 6-sided die is loaded such that a 6 occurs twice as often as a 1. What is the probability of rolling a 6 if the Shannon entropy is a maximum?

    2. Relevant equations
    Shannon Entropy:
    $$S=-\sum_i p_i \ln{p_i}$$
    where ##p_i## is the probability that we roll ##i##.

    3. The attempt at a solution
    We know that ##\sum_i p_i = 1 ## and we are given that ##p_1=p_6/2##. So $$p_2+p_3+p_4+p_5+\frac{3}{2} p_6 =1 \Rightarrow p_5 = 1-p_2-p_3-p_4-\frac{3}{2} p_6$$ Therefore we can write the Shannon entropy as $$S=-\left( \frac{p_6}{2} \ln{\frac{p_6}{2}} + p_2 \ln{p_2}+p_3 \ln{p_3}+p_4 \ln{p_4}+(1-p_2-p_3-p_4-\frac{3}{2} p_6) \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) + p_6 \ln{p_6}\right)$$
    $$=-\left( \frac{3 p_6}{2} \ln{p_6} + p_2 \ln{p_2}+p_3 \ln{p_3}+p_4 \ln{p_4}+(1-p_2-p_3-p_4-\frac{3}{2} p_6) \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) - \frac{p_6}{2} \ln{2}\right)$$
    To find an extremum, we take partial derivatives and set them equal to zero:
    $$\frac{\partial S}{\partial p_2} = - \left(\ln{p_2} +1 - \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) -1 \right) = -\left(\ln{p_2} - \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) \right) =0$$
    $$\Rightarrow \ln p_2 = \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6)$$
    Differentiating with respect to ##p_3## and ##p_4##, we find the same condition, so we conclude that ##p_2=p_3=p_4=p_5##. We now write the Shannon entropy as $$S=- \left(\frac{3 p_6}{2} \ln{p_6}+4p_2 \ln{p_2} - \frac{p_6}{2} \ln{2}\right)$$ So $$\frac{\partial S}{\partial p_6}= -\left(\frac{3}{2} \ln{p_6}+\frac{3}{2} - \frac{1}{2} \ln{2}\right) = 0$$
    Therefore we find that ##p_6 = \frac{2^{1/3}}{e}##. But this is wrong, because if we plug these probabilities into the Shannon entropy formula, we do not get a maximum. For example, we get a higher Shannon entropy if we plug in ##p_1=p_2=p_3=p_4=p_5=1/7## and ##p_6=2/7##. Where did I go wrong? Maybe I found a minimum or something? If so, how do I get the maximum?
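    As a quick numerical check of that claim (a Python sketch; the probabilities are just the ones above):

    ```python
    import math

    def shannon_entropy(ps):
        # S = -sum_i p_i ln p_i, natural log as in the formula above
        return -sum(p * math.log(p) for p in ps)

    # Critical point from the derivation above: p6 = 2**(1/3)/e
    p6 = 2 ** (1 / 3) / math.e
    p1 = p6 / 2                  # loading condition p1 = p6/2
    p2 = (1 - 1.5 * p6) / 4      # normalization: 4*p2 + (3/2)*p6 = 1
    S_crit = shannon_entropy([p1, p2, p2, p2, p2, p6])

    # Comparison distribution: p1 = ... = p5 = 1/7, p6 = 2/7
    S_guess = shannon_entropy([1/7] * 5 + [2/7])

    print(S_crit, S_guess)  # S_guess comes out larger, so the critical point is not the max
    ```

    This gives roughly 1.480 versus 1.748, confirming that the ##1/7## distribution beats the critical point found above.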
    Thanks for any help.
     
  3. Nov 9, 2014 #2

    BvU

    User Avatar
    Science Advisor
    Homework Helper
    Gold Member

    It looks as if you put ##\displaystyle {d p_2\over d p_6}=0##. But isn't there a constraint that ##4p_2+{3\over 2}p_6 = 1 \ \Rightarrow \ 4d p_2 + {3\over 2}dp_6 = 0 ## ?
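    A quick finite-difference check of that (a Python sketch, using the numbers from post #1):

    ```python
    import math

    def S(ps):
        # Shannon entropy with natural log
        return -sum(p * math.log(p) for p in ps)

    # The critical point from post #1: p6 = 2**(1/3)/e
    p6 = 2 ** (1 / 3) / math.e
    base = [p6 / 2] + [(1 - 1.5 * p6) / 4] * 4 + [p6]

    # Step along the constraint: changing p6 by eps forces
    # dp2 = -(3/8)*eps and dp1 = eps/2, keeping the sum equal to 1
    eps = -1e-4  # decrease p6 slightly
    p6n = p6 + eps
    moved = [p6n / 2] + [(1 - 1.5 * p6n) / 4] * 4 + [p6n]

    print(S(moved) - S(base))  # positive: S still increases along the constraint
    ```

    So the point from post #1 is not an extremum of the constrained problem: moving along the constraint surface still raises ##S##.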
     
  4. Nov 9, 2014 #3

    ZetaOfThree

    User Avatar
    Gold Member

    What step are you referring to? I don't think I used this.
    Yes, you can use that condition to find ##p_2## after ##p_6## is found. So in the case above, we have ##p_1 = \frac{1}{2^{2/3}e}##, ##p_2=p_3=p_4=p_5= \frac{1}{8} \left(2- \frac{3 \sqrt [3] {2}}{e} \right)## and ##p_6 = \frac{\sqrt[3]{2}}{e}##.
     
  5. Nov 9, 2014 #4

    mfb

    User Avatar

    Staff: Mentor

    The partial derivative gives you an optimum if p6 is independent of the other probabilities. It is not. You should get the correct result if you express p2 as function of p6 and then calculate the partial derivative.
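    For instance, a rough Python sketch of that substitution (reusing ##p_2=p_3=p_4=p_5## from above; the derivative formula is worked out in the comment):

    ```python
    import math

    # Express everything in terms of p6: p1 = p6/2 and, from normalization
    # with p2 = p3 = p4 = p5, p2 = (1 - (3/2)*p6)/4.
    def dS(p6):
        # dS/dp6 after the substitution simplifies to
        # -(3/2)*ln(p6/p2) + (1/2)*ln 2
        p2 = (1 - 1.5 * p6) / 4
        return -1.5 * math.log(p6 / p2) + 0.5 * math.log(2)

    # dS is decreasing on 0 < p6 < 2/3, so bisect for its root
    lo, hi = 1e-6, 2 / 3 - 1e-6
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if dS(mid) > 0 else (lo, mid)

    print(mid)  # the maximizing p6
    ```

    Setting ##dS/dp_6 = 0## by hand gives ##p_6 = 2^{1/3} p_2##, and normalization then gives ##p_6 = \frac{2^{4/3}}{8+3\cdot 2^{1/3}} \approx 0.214##, which the bisection reproduces.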
     
  6. Nov 9, 2014 #5

    Dick

    User Avatar
    Science Advisor
    Homework Helper

    Given that you know ##p_2=p_3=p_4=p_5##, and given the two other constraints you could write the entropy as a function of a single variable like ##p_6## and maximize that as a single variable problem. That should be straightforward.
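    In Python, that single-variable maximization might look like this (a minimal sketch; entropy restricted to a line in the probability simplex is concave, so a ternary search suffices):

    ```python
    import math

    def S(p6):
        # entropy as a function of p6 alone, using p1 = p6/2 and
        # p2 = p3 = p4 = p5 = (1 - (3/2)*p6)/4
        p1, p2 = p6 / 2, (1 - 1.5 * p6) / 4
        return -(p1 * math.log(p1) + 4 * p2 * math.log(p2) + p6 * math.log(p6))

    # ternary search for the maximum of a concave function on 0 < p6 < 2/3
    lo, hi = 1e-9, 2 / 3 - 1e-9
    for _ in range(200):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if S(m1) < S(m2):
            lo = m1
        else:
            hi = m2

    print((lo + hi) / 2, S((lo + hi) / 2))  # maximizing p6 and the entropy there
    ```

    This lands on ##p_6 \approx 0.214## with ##S \approx 1.773##, higher than the entropy of the ##1/7## distribution from post #1, as it should be.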
     
  7. Nov 9, 2014 #6

    ZetaOfThree

    User Avatar
    Gold Member

    Sounds about right. I did that, and got an answer I'm pretty sure is correct. Thank you all for the help!
     