Show the extremum of the entropy is a maximum

  • #1

Homework Statement



The entropy of a probability distribution is given by,

[itex] S = -k_B \sum _{i=1}^N p(i)\ln{p(i)} [/itex]

I've shown that the value of S at its extremum is

[itex] S' = k_B \ln{N} [/itex] (which is a positive quantity)
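
(For the record, one way to get this, e.g. with a Lagrange multiplier: extremizing [itex] S - \lambda \left( \sum_{i=1}^N p(i) - 1 \right) [/itex] gives [itex] -k_B (\ln{p(i)} + 1) - \lambda = 0 [/itex] for every i, so all the p(i) are equal; the constraint then forces [itex] p(i) = 1/N [/itex], and substituting back gives [itex] S' = k_B \ln{N} [/itex].)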

Now I want to show that this is a maximum by showing that

[itex] S' - S = k_B \ln{N} + k_B \sum _{i=1}^N p(i)\ln{p(i)} > 0[/itex]



Homework Equations



The [itex] p(i) [/itex]'s are constrained by

[itex] \sum_{i=1}^N p(i) = 1 [/itex]


The Attempt at a Solution



I'm kind of stuck here. The second term is inherently negative, so it's not a priori obvious that [itex] S' - S > 0 [/itex]. I would probably want to take the ratio and show [itex] \frac{S'}{S} \geq 1 [/itex] but I'm not sure how to do this.
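
As a quick sanity check (just a numerical sketch with NumPy, taking [itex] k_B = 1 [/itex], not a proof), random distributions do seem to obey [itex] S \leq \ln{N} [/itex]:

[code]
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility
N = 10
for _ in range(5):
    p = rng.random(N)
    p /= p.sum()                # normalize so the p(i) sum to 1
    S = -(p * np.log(p)).sum()  # entropy with k_B = 1
    print(S, "<=", np.log(N))   # S stayed below ln(N) in every trial
[/code]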

Any ideas?
 

Answers and Replies

  • #2
I suggest writing a Taylor series for [itex]S[/itex] in terms of the [itex]p_i[/itex] and looking at the second-order term.
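Concretely (sketching the step): the Hessian is [itex] \partial^2 S / \partial p_i \partial p_j = -(k_B / p_i)\, \delta_{ij} [/itex], which is diagonal with strictly negative entries, so the second-order term [itex] -\tfrac{1}{2} k_B \sum_i (\delta p_i)^2 / p_i [/itex] is negative for any nonzero displacement, including those with [itex] \sum_i \delta p_i = 0 [/itex] that respect the constraint. That makes the extremum a local maximum.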
 
  • #3

You only found one extremum which has a positive entropy (using Lagrange multipliers, I presume). The only other place you could have an extremum is on the boundary. The boundary of your set of p(i) would be the case where one of the p(i) is 1 and the rest are 0. What's the entropy there? You have to think about limits, since ln(0) is undefined.
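
(The relevant limit is [itex] \lim_{p \to 0^+} p \ln{p} = 0 [/itex], e.g. by L'Hôpital applied to [itex] \ln{p} / (1/p) [/itex], so a vanishing p(i) contributes nothing to S. At the corner where one p(i) is 1 and the rest are 0, the entropy is therefore 0, which is less than [itex] k_B \ln{N} [/itex].)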
 
  • #4
The boundary of your set of p(i) would be the case where one of the p(i) is 1 and the rest are 0.

Hello Dick!

The boundary of the space in question is actually much bigger than this - it's the set of points for which at least one p(i) is zero. So doing it this way, some work remains.
 
  • #5
Hello Dick!

The boundary of the space in question is actually much bigger than this - it's the set of points for which at least one p(i) is zero. So doing it this way, some work remains.

Good point. But if you fix one of your N p(i) to be zero, then you are in the N-1 case with the remaining p(i). That suggests using induction. In the case N=2, the boundary is the two points (p(1), p(2)) equal to (1,0) or (0,1).
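
(Spelling out the N=2 base case: with [itex] p(1) = p [/itex] and [itex] p(2) = 1 - p [/itex], the entropy is [itex] S(p) = -k_B [\, p \ln{p} + (1-p) \ln{(1-p)} \,] [/itex], which tends to 0 at both boundary points and equals [itex] k_B \ln{2} [/itex] at the interior extremum [itex] p = 1/2 [/itex], so the extremum wins there.)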
 
  • #6
Good point. But if you fix one of your N p(i) to be zero, then you are in the N-1 case with the remaining p(i). That suggests using induction. In the case N=2, the boundary is the two points (p(1), p(2)) equal to (1,0) or (0,1).

Correct :smile:

But (at the risk of giving away the plot), I think the easiest way to demonstrate the global nature of the max is to notice that you can always increase the entropy by transferring probability from somewhere with a large p(i) to somewhere with a small p(i).
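
(A sketch of that: transfer a small amount [itex] \epsilon [/itex] from p(j) to p(i) with [itex] p(j) > p(i) [/itex]; to first order the entropy changes by [itex] \Delta S = k_B\, \epsilon\, \ln{[p(j)/p(i)]} > 0 [/itex], so S can only be stationary when all the p(i) are equal, i.e. at [itex] p(i) = 1/N [/itex].)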
 
  • #7
Correct :smile:

But (at the risk of giving away the plot), I think the easiest way to demonstrate the global nature of the max is to notice that you can always increase the entropy by transferring probability from somewhere with a large p(i) to somewhere with a small p(i).

That is a simpler way to look at it.
 
