Hi all,

I'm a pure mathematician (Graph Theory) who has to go through a Physics paper, and I am having trouble getting through a part of it. Maybe you guys can point me in the right direction:

Let P(x,y) be a joint probability distribution.

Let [itex]H = -\sum_{x,y} P(x,y) \log P(x,y)[/itex], which is the entropy.

We are given the marginal distributions q_{1}(x) and q_{2}(y).

To get the maximum entropy distribution consistent with the marginals, we solve this problem:

max H

subject to constraints,

[itex]\sum_{y} P(x,y) = q_{1}(x)[/itex] for each x,

[itex]\sum_{x} P(x,y) = q_{2}(y)[/itex] for each y.

---------------------

The Lagrangian is,

[tex]L = -\sum_{x,y} P(x,y) \log P(x,y) + \sum_{x}\lambda_{1}(x)\Big(\sum_{y}P(x,y) - q_{1}(x)\Big) + \sum_{y}\lambda_{2}(y)\Big(\sum_{x}P(x,y) - q_{2}(y)\Big)[/tex]

(there is one constraint per value of x and one per value of y, so there is one multiplier [itex]\lambda_{1}(x)[/itex] for each x and one [itex]\lambda_{2}(y)[/itex] for each y).

To find the stationary point of L, we differentiate it with respect to a particular P(x',y') while keeping the other variables constant.

I know how to do this for the first sum, which gives [itex]-(1 + \log P(x',y'))[/itex].
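As a quick sanity check on that partial derivative, here is a finite-difference comparison on an arbitrary small joint table of my own (not from the paper): perturb a single entry P(x',y') while holding the others fixed, exactly as in the Lagrangian calculation.

```python
import math

# A small 2x2 joint distribution (illustrative values).
P = [[0.1, 0.2], [0.3, 0.4]]

def entropy(P):
    # H = -sum_{x,y} P(x,y) log P(x,y)
    return -sum(p * math.log(p) for row in P for p in row)

# Central finite difference with respect to the single entry P[0][1],
# holding the other entries constant.
eps = 1e-6
P_plus  = [row[:] for row in P]; P_plus[0][1]  += eps
P_minus = [row[:] for row in P]; P_minus[0][1] -= eps
numeric  = (entropy(P_plus) - entropy(P_minus)) / (2 * eps)
analytic = -(1 + math.log(P[0][1]))

print(numeric, analytic)  # both ≈ 0.6094
```
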

---------------------

How do we differentiate the second and third sum with respect to P(x',y')?

The paper gave the solution to differentiating L with respect to P(x',y') as:

[tex]-(1 + \log P(x',y')) + \lambda_{1} + \lambda_{2}[/tex]
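For what it's worth, here is my own working of where that leads (not quoted from the paper). With one multiplier per constraint, the multipliers appearing at the point (x',y') are [itex]\lambda_{1}(x')[/itex] and [itex]\lambda_{2}(y')[/itex], and setting the derivative to zero gives

[tex]\log P(x',y') = \lambda_{1}(x') + \lambda_{2}(y') - 1,[/tex]

so [itex]P(x',y') = e^{\lambda_{1}(x') - 1/2}\, e^{\lambda_{2}(y') - 1/2}[/itex] factorizes into a function of x' times a function of y'. Imposing the marginal constraints then yields [itex]P(x,y) = q_{1}(x)\, q_{2}(y)[/itex].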

Thanks!

**Physics Forums | Science Articles, Homework Help, Discussion**


# Maximum Entropy Distribution Given Marginals
