
Show that the extremum of the entropy is a maximum

  Sep 23, 2013 #1
    1. The problem statement, all variables and given/known data

    The entropy of a probability distribution is given by,

    [itex] S = -k_B \sum _{i=1}^N p(i)\ln{p(i)} [/itex]

    I've shown that the extremum of such a function is given by,

    [itex] S' = k_B \ln{N} [/itex] (which is a positive quantity)

    Now I want to show that this is a maximum by showing that

    [itex] S' - S = k_B\left(\ln{N} + \sum _{i=1}^N p(i)\ln{p(i)}\right) > 0 [/itex]



    2. Relevant equations

    The [itex] p(i) [/itex]'s are constrained by

    [itex] \sum _{i=1}^N p(i) = 1 [/itex]


    3. The attempt at a solution

    I'm kind of stuck here. The second term is inherently negative, so it's not a priori obvious that [itex] S' - S > 0 [/itex]. I could instead try to show the ratio satisfies [itex] \frac{S'}{S} \geq 1 [/itex], but I'm not sure how to do that either.
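
    One way to massage the difference, using only the normalization constraint, is to combine both terms into a single sum:

    [tex] S' - S = k_B\left(\ln{N} + \sum _{i=1}^N p(i)\ln{p(i)}\right) = k_B\sum _{i=1}^N p(i)\ln{\left(N\,p(i)\right)} [/tex]

    since [itex] \sum _{i=1}^N p(i) = 1 [/itex], but I don't see how to bound this form either.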

    Any ideas?
     
  Sep 24, 2013 #2
    I suggest writing a Taylor series for [itex]S[/itex] in terms of the [itex]p_i[/itex] and looking at the second-order term.
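
    As a sketch of that route: at the extremum all [itex]p(i) = 1/N[/itex], and the matrix of second derivatives of [itex]S[/itex] is

    [tex] \frac{\partial^2 S}{\partial p(i)\,\partial p(j)} = -\frac{k_B}{p(i)}\,\delta_{ij} [/tex]

    so for a variation [itex]\delta p(i)[/itex] satisfying [itex]\sum_i \delta p(i) = 0[/itex] (to preserve normalization), the second-order change is

    [tex] \delta^2 S = -\frac{k_B}{2}\sum _{i=1}^N \frac{(\delta p(i))^2}{p(i)} = -\frac{k_B N}{2}\sum _{i=1}^N (\delta p(i))^2 < 0 [/tex]

    for any nonzero variation, which is what makes the extremum a local maximum.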
     
  Sep 24, 2013 #3

    Dick (Science Advisor, Homework Helper)

    You only found one extremum which has a positive entropy (using Lagrange multipliers, I presume). The only other place you could have an extremum is on the boundary. The boundary of your set of p(i) would be the case where one of the p(i) is 1 and the rest are 0. What's the entropy there? You have to think about limits, since log(0) is undefined.
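
    To spell out the limit: by L'Hôpital's rule,

    [tex] \lim_{p\to 0^+} p\ln{p} = \lim_{p\to 0^+} \frac{\ln{p}}{1/p} = \lim_{p\to 0^+} \frac{1/p}{-1/p^2} = \lim_{p\to 0^+} (-p) = 0 [/tex]

    so the vanishing p(i) contribute nothing to the sum, and at such a vertex [itex] S = -k_B \cdot 1 \cdot \ln{1} = 0 [/itex], which is strictly less than [itex] k_B \ln{N} [/itex] for [itex] N \geq 2 [/itex].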
     
  Sep 25, 2013 #4
    Hello Dick!

    The boundary of the space in question is actually much bigger than that: it's the set of points for which at least one p(i) is zero. So doing it this way, some work remains.
     
  Sep 25, 2013 #5

    Dick (Science Advisor, Homework Helper)

    Good point. But if you fix one of your N p(i) to be zero, then you are in the N-1 case with the remaining p(i). That suggests using induction. In the base case N=2, the boundary consists of the two points (p(1),p(2)) = (1,0) and (0,1).
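
    In that base case, writing [itex]p(1) = p[/itex] and [itex]p(2) = 1-p[/itex], the entropy reduces to a function of one variable,

    [tex] S(p) = -k_B\left[p\ln{p} + (1-p)\ln{(1-p)}\right], \qquad 0 \leq p \leq 1 [/tex]

    which vanishes at both endpoints (in the limiting sense above) and peaks at [itex] k_B\ln{2} [/itex] when [itex] p = 1/2 [/itex].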
     
  Sep 25, 2013 #6
    Correct :smile:

    But (at the risk of giving away the plot), I think the easiest way to demonstrate the global nature of the max is to notice that you can always increase the entropy by transferring probability from somewhere with a large p(i) to somewhere with a small p(i).
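
    To make that quantitative: moving an infinitesimal probability [itex]d\varepsilon[/itex] from a state j to a state i with [itex]p(j) > p(i)[/itex] changes the entropy by

    [tex] dS = \left(\frac{\partial S}{\partial p(i)} - \frac{\partial S}{\partial p(j)}\right) d\varepsilon = k_B \ln{\frac{p(j)}{p(i)}}\, d\varepsilon > 0 [/tex]

    so the entropy strictly increases until all the p(i) are equal, i.e. until p(i) = 1/N.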
     
  Sep 25, 2013 #7

    Dick (Science Advisor, Homework Helper)

    That is a simpler way to look at it.
     