# How can one prove that the maximum entropy occurs

1. Dec 8, 2007

### ehrenfest

1. The problem statement, all variables and given/known data
I am calculating entropy using the formula:

$$S=-\sum_i P_i \ln{P_i}$$

where the sum is over all of the microstates of my system and $P_i$ is the probability of finding the system in microstate $i$.

How can one prove that the maximum entropy occurs when P_i is the same for all i?

2. Relevant equations

3. The attempt at a solution

2. Dec 8, 2007

### Avodyne

Enforce $\sum_i P_i=1$ with a Lagrange multiplier; that is, extremize
$$-\sum_i P_i\ln P_i + k\bigl(\sum_i P_i-1\bigr)$$
with respect to both $P_i$ and $k$.
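As a sanity check (not a proof), here is a minimal numerical sketch: for $N=4$ microstates it compares the entropy of randomly drawn distributions against the uniform one, which should always win. The function name `entropy` and the choice $N=4$ are just for illustration.

```python
import math
import random

def entropy(p):
    """Shannon entropy S = -sum_i p_i ln p_i, with the 0 ln 0 term taken as 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

N = 4
uniform = [1.0 / N] * N  # all microstates equally likely

random.seed(0)
for _ in range(1000):
    # draw a random probability distribution over N microstates
    w = [random.random() for _ in range(N)]
    total = sum(w)
    p = [x / total for x in w]
    # no random distribution should beat the uniform one
    assert entropy(p) <= entropy(uniform) + 1e-12

# the maximum itself is ln N
print(round(entropy(uniform), 6), round(math.log(N), 6))
```

Every random trial stays at or below $\ln N$, consistent with the uniform distribution being the maximizer.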

3. Dec 8, 2007

### Dick

The sum of the $P_i$ equals 1; that's a constraint. Add a Lagrange multiplier to enforce it, e.g. $\alpha\bigl(\sum_i P_i-1\bigr)$. Now take the partial derivatives with respect to all variables and set them equal to zero. The partial with respect to $\alpha$ gives you $\sum_i P_i-1=0$. The partial with respect to $P_i$ gives you an expression for $P_i$ in terms of $\alpha$. Since $\alpha$ is a constant, all the $P_i$ are equal.
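Carrying that recipe out explicitly (a sketch, for $N$ microstates): differentiating the augmented function with respect to a particular $P_j$ gives

$$\frac{\partial}{\partial P_j}\Bigl[-\sum_i P_i\ln P_i + \alpha\bigl(\sum_i P_i - 1\bigr)\Bigr] = -\ln P_j - 1 + \alpha = 0,$$

so $P_j = e^{\alpha-1}$, the same value for every $j$. The constraint $\sum_i P_i = 1$ then fixes $P_j = 1/N$, and the entropy at this point is $S = \ln N$. Since $-x\ln x$ is concave, this extremum is in fact the maximum.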

4. Dec 8, 2007

### Dick

Great minds think alike. But some are faster.