Maximum Entropy Distribution Given Marginals

Legendre
Hi all,

I'm a pure mathematician (graph theory) working through a physics paper, and I'm stuck on one part of it. Maybe you can point me in the right direction:

Let P(x,y) be a joint probability distribution over discrete variables x and y.

Let [itex]H = -\sum_{x,y} P(x,y) \log P(x,y)[/itex] be the entropy.

We are given the marginal distributions [itex]q_1(x)[/itex] and [itex]q_2(y)[/itex].

To get the maximum entropy distribution consistent with the marginals, we solve this problem:

max [itex]H[/itex]

subject to the constraints

[itex]\sum_y P(x,y) = q_1(x)[/itex] for each x,
[itex]\sum_x P(x,y) = q_2(y)[/itex] for each y.
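
To make sure I understand the setup, here is a small numerical sketch (Python; the marginals and all names are made up by me, not from the paper) that solves this constrained problem directly on a tiny grid:

[code]
import numpy as np
from scipy.optimize import minimize

# Hypothetical marginals on small discrete supports (illustrative only).
q1 = np.array([0.2, 0.5, 0.3])   # marginal of x
q2 = np.array([0.6, 0.4])        # marginal of y
nx, ny = len(q1), len(q2)

def neg_entropy(p_flat):
    # Minimize -H, i.e. maximize the entropy H = -sum P log P.
    p = np.clip(p_flat, 1e-12, None)  # avoid log(0)
    return np.sum(p * np.log(p))

constraints = (
    # Row sums must equal q1(x): sum_y P(x,y) = q1(x)
    {"type": "eq", "fun": lambda p: p.reshape(nx, ny).sum(axis=1) - q1},
    # Column sums must equal q2(y): sum_x P(x,y) = q2(y)
    {"type": "eq", "fun": lambda p: p.reshape(nx, ny).sum(axis=0) - q2},
)

p0 = np.full(nx * ny, 1.0 / (nx * ny))  # uniform starting point
res = minimize(neg_entropy, p0, method="SLSQP",
               constraints=constraints, bounds=[(0.0, 1.0)] * (nx * ny))
P = res.x.reshape(nx, ny)

print(P)
print(np.outer(q1, q2))  # product of the marginals, for comparison
[/code]

The optimizer's answer should match np.outer(q1, q2): with only the marginals fixed, the maximum entropy joint distribution is the product of the marginals, i.e. x and y independent.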

---------------------

The Lagrangian, with one multiplier for each constraint (a [itex]\lambda_1(x)[/itex] for every x and a [itex]\lambda_2(y)[/itex] for every y), is

[tex]L = -\sum_{x,y} P(x,y) \log P(x,y) + \sum_x \lambda_1(x) \left( \sum_y P(x,y) - q_1(x) \right) + \sum_y \lambda_2(y) \left( \sum_x P(x,y) - q_2(y) \right)[/tex]

To find the stationary point of L, we differentiate it with respect to a particular P(x',y') while keeping the other variables constant.

I know how to do this for the first sum, which gives [itex]-(1 + \log P(x',y'))[/itex].
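
Spelling that step out: only the [itex](x,y) = (x',y')[/itex] term depends on [itex]P(x',y')[/itex], so

[tex]\frac{\partial}{\partial P(x',y')} \left( -\sum_{x,y} P(x,y) \log P(x,y) \right) = -\log P(x',y') - 1[/tex]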

---------------------

How do we differentiate the second and third sums with respect to P(x',y')?

The paper gives the derivative of L with respect to P(x',y') as

[tex]-(1 + \log P(x',y')) + \lambda_1(x') + \lambda_2(y')[/tex]


Thanks!
 
