Show that the extremum of the entropy is a maximum

In summary: one route is induction on N; a simpler way to see it is that transferring probability from a state with large p(i) to one with small p(i) always increases the entropy.
  • #1
dipole

Homework Statement



The entropy of a probability distribution is given by,

[itex] S = -k_B \sum _{i=1}^N p(i)\ln{p(i)} [/itex]

I've shown that the extremum of such a function is given by,

[itex] S' = k_B \ln{N} [/itex] (which is a positive quantity)

Now I want to show that this is a maximum by showing that

[itex] S' - S = k_B \ln{N} + k_B \sum _{i=1}^N p(i)\ln{p(i)} > 0[/itex]



Homework Equations



The [itex] p(i) [/itex]'s are constrained by

[itex] \sum_{i=1}^N p(i) = 1 [/itex]

The Attempt at a Solution



I'm kind of stuck here. The second term is inherently negative, so it's not a priori obvious that [itex] S' - S > 0 [/itex]. I would probably want to take the ratio and show [itex] \frac{S'}{S} \geq 1 [/itex] but I'm not sure how to do this.

Any ideas?
 
  • #2
I suggest writing a Taylor series for [itex]S[/itex] in terms of the [itex]p_i[/itex] and looking at the second-order term.
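As a rough sketch of what that looks like (assuming the extremum found is the uniform distribution [itex] p(i) = 1/N [/itex], and writing [itex] p(i) = 1/N + \delta(i) [/itex] with [itex] \sum_i \delta(i) = 0 [/itex], where [itex] \delta(i) [/itex] is just a perturbation variable introduced here):

[tex] S \approx k_B \ln N - \frac{k_B N}{2}\sum_{i=1}^N \delta(i)^2 + \mathcal{O}(\delta^3) [/tex]

The first-order term vanishes because the perturbations sum to zero, and the second-order term is negative for any nonzero perturbation, which is what identifies the critical point as a local maximum.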
 
  • #3
dipole said:

Now I want to show that this is a maximum by showing that [itex] S' - S = k_B \ln{N} + k_B \sum _{i=1}^N p(i)\ln{p(i)} > 0[/itex]. I'm kind of stuck here. The second term is inherently negative, so it's not a priori obvious that [itex] S' - S > 0 [/itex]. Any ideas?

You only found one extremum which has a positive entropy (using Lagrange multipliers, I presume). The only other place you could have an extremum is on the boundary. The boundary of your set of p(i) would be the case where one of the p(i) is 1 and the rest are 0. What's the entropy there? You have to think about limits, since log(0) is undefined.
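Concretely, the limit that matters here is

[tex] \lim_{p \to 0^+} p\ln p = 0, [/tex]

so at a vertex of the boundary, where one [itex] p(i) = 1 [/itex] and the rest vanish, the entropy is [itex] S = -k_B(1\cdot\ln 1 + 0) = 0 [/itex], which is strictly below [itex] k_B \ln N [/itex] for [itex] N \geq 2 [/itex].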
 
  • #4
Dick said:
The boundary of your set of p(i) would be the case where one of the p(i) is 1 and the rest are 0.

Hello Dick!

The boundary of the space in question is actually much bigger than this - it's the set of points for which at least one p(i) is zero. So doing it this way, some work remains.
 
  • #5
Oxvillian said:
Hello Dick!

The boundary of the space in question is actually much bigger than this - it's the set of points for which at least one p(i) is zero. So doing it this way, some work remains.

Good point. But if you fix one of your N p(i) to be zero, then you are in the N-1 case with the remaining p(i), which suggests using induction. In the case N=2, the boundary is just the two points (p(1),p(2)) equal to (1,0) or (0,1).
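As a sketch of that base case (writing [itex] p(1) = p [/itex] and [itex] p(2) = 1-p [/itex]):

[tex] S(p) = -k_B\left[p\ln p + (1-p)\ln(1-p)\right], \qquad \frac{dS}{dp} = k_B\ln\frac{1-p}{p}, [/tex]

which vanishes only at [itex] p = 1/2 [/itex], where [itex] S = k_B\ln 2 [/itex]; at the two boundary points the entropy tends to 0, so the interior critical point is the maximum.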
 
  • #6
Dick said:
Good point. But if you fix one of your N p(i) to be zero, then you are in the N-1 case with the remaining p(i), which suggests using induction. In the case N=2, the boundary is just the two points (p(1),p(2)) equal to (1,0) or (0,1).

Correct :smile:

But (at the risk of giving away the plot), I think the easiest way to demonstrate the global nature of the max is to notice that you can always increase the entropy by transferring probability from somewhere with a large p(i) to somewhere with a small p(i).
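To make that quantitative (moving an amount [itex] \varepsilon [/itex] of probability from a state [itex] j [/itex] with the larger [itex] p(j) [/itex] to a state [itex] i [/itex] with the smaller [itex] p(i) [/itex], all other probabilities held fixed):

[tex] \frac{dS}{d\varepsilon} = -k_B\left[\ln\bigl(p(i)+\varepsilon\bigr) - \ln\bigl(p(j)-\varepsilon\bigr)\right] = k_B\ln\frac{p(j)-\varepsilon}{p(i)+\varepsilon} > 0 [/tex]

as long as [itex] p(j) - \varepsilon > p(i) + \varepsilon [/itex]. So the entropy keeps increasing until the two probabilities are equal, and repeating this over all pairs pushes the distribution toward [itex] p(i) = 1/N [/itex].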
 
  • #7
Oxvillian said:
Correct :smile:

But (at the risk of giving away the plot), I think the easiest way to demonstrate the global nature of the max is to notice that you can always increase the entropy by transferring probability from somewhere with a large p(i) to somewhere with a small p(i).

That is a simpler way to look at it.
 

1. What is the definition of entropy?

Entropy is a measure of the disorder or randomness in a system. In other words, it quantifies the level of uncertainty or unpredictability in a system.

2. What is an extremum of entropy?

An extremum of entropy is a point where the entropy function reaches a maximum or minimum value. In the context of thermodynamics, it represents the state of maximum disorder or randomness in a system.

3. Why is it important to show that the extremum of entropy is a maximum?

This is important because it allows us to understand the behavior of a system and make predictions about its future state. It also provides insights into the fundamental laws of thermodynamics, which govern the behavior of energy and matter.

4. How is the extremum of entropy determined?

The extremum of entropy is found by maximizing the entropy function subject to the normalization constraint, typically by introducing a Lagrange multiplier and setting the derivatives of the resulting function to zero. This gives the critical points where the extremum can occur.
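For the discrete distribution above, a minimal sketch using a Lagrange multiplier [itex] \lambda [/itex] (introduced here to enforce [itex] \sum_i p(i) = 1 [/itex]):

[tex] \frac{\partial}{\partial p(i)}\left[-k_B\sum_j p(j)\ln p(j) - \lambda\left(\sum_j p(j) - 1\right)\right] = -k_B\bigl(\ln p(i) + 1\bigr) - \lambda = 0, [/tex]

so [itex] p(i) = e^{-1-\lambda/k_B} [/itex] is the same for every [itex] i [/itex]; the constraint then forces [itex] p(i) = 1/N [/itex] and [itex] S = k_B\ln N [/itex].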

5. Does the extremum of entropy always occur at a maximum?

Yes. The entropy is a concave function of the probabilities, so any critical point in the interior of the probability simplex is a global maximum; the minimum value of zero occurs only on the boundary, where one p(i) equals 1 and the rest are 0.
