Homework Statement
The entropy of a probability distribution is given by
[itex] S = -k_B \sum _{i=1}^N p(i)\ln{p(i)} [/itex]
I've shown that the extremum of such a function is given by,
[itex] S' = k_B \ln{N} [/itex] (which is a positive quantity)
Now I want to show that this is a maximum by showing that
[itex] S' - S = k_B \ln{N} + k_B \sum _{i=1}^N p(i)\ln{p(i)} > 0[/itex]
Homework Equations
The [itex] p(i) [/itex]'s are constrained by
[itex] \sum_{i=1}^N p(i) = 1 [/itex]
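As a sanity check before attempting the proof, here is a quick numerical experiment (my own sketch, plain Python with [itex]k_B = 1[/itex]): it draws random distributions on the simplex and confirms that none of them exceeds [itex]\ln N[/itex].

```python
import math
import random

def entropy(p, k_B=1.0):
    """Gibbs entropy -k_B * sum_i p_i ln p_i (terms with p_i = 0 contribute 0)."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

def random_distribution(n, rng):
    """A random point on the probability simplex: normalize positive draws."""
    x = [rng.random() for _ in range(n)]
    s = sum(x)
    return [xi / s for xi in x]

# The uniform distribution p_i = 1/N gives S = k_B ln N; every other
# distribution should give strictly less.
N = 10
rng = random.Random(42)
S_max = math.log(N)  # k_B = 1
for _ in range(1000):
    p = random_distribution(N, rng)
    assert entropy(p) <= S_max + 1e-12
print("all random distributions satisfy S <= k_B ln N =", S_max)
```

This doesn't prove anything, of course, but it makes it plausible that the extremum really is the maximum.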
The Attempt at a Solution
I'm kind of stuck here. The second term is inherently negative, so it's not a priori obvious that [itex] S' - S > 0 [/itex]. My first thought was to take the ratio and show [itex] \frac{S'}{S} \geq 1 [/itex], but I'm not sure how to do this (or whether the ratio is even the right approach).
Any ideas?
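For what it's worth, one standard route (my own suggestion, not from the problem statement) avoids the ratio entirely and uses the elementary inequality [itex]\ln x \leq x - 1[/itex] with [itex]x = \frac{1}{N p(i)}[/itex]:

[tex]
S' - S = k_B\left(\ln N + \sum_{i=1}^N p(i)\ln p(i)\right)
= -k_B \sum_{i=1}^N p(i)\ln\frac{1}{N\,p(i)}
\geq -k_B \sum_{i=1}^N p(i)\left(\frac{1}{N\,p(i)} - 1\right)
= -k_B\,(1 - 1) = 0,
[/tex]

using [itex]\sum_i p(i) = 1[/itex] in the last step, with equality if and only if [itex]p(i) = 1/N[/itex] for all [itex]i[/itex].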