
Maximum Entropy Distribution Given Marginals

  Jul 5, 2011 #1
    Hi all,

    I'm a pure mathematician (graph theory) working through a physics paper, and I'm having trouble with one part of it. Maybe you can point me in the right direction:

    Let P(x,y) be a joint probability distribution (discrete, so the sums below run over the values of x and y).

    Let [itex]H = -\sum_{x,y} P(x,y) \log P(x,y)[/itex], which is the entropy.

    We are given the marginal distributions [itex]q_1(x)[/itex] and [itex]q_2(y)[/itex].

    To get the maximum entropy distribution consistent with the marginals, we solve this problem:

    max H

    subject to the constraints,

    [tex]\sum_y P(x,y) = q_1(x) \quad \text{for each } x,[/tex]
    [tex]\sum_x P(x,y) = q_2(y) \quad \text{for each } y.[/tex]
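
    (Not part of the paper; just a numerical sanity check I put together to make sure I'm reading the problem right. It maximizes H on a made-up 2×2 instance with scipy, and it should converge to the product of the marginals, [itex]q_1(x)\,q_2(y)[/itex]. The numbers in q1 and q2 are arbitrary.)

    [code]
    import numpy as np
    from scipy.optimize import minimize

    q1 = np.array([0.3, 0.7])  # made-up marginal for x
    q2 = np.array([0.6, 0.4])  # made-up marginal for y

    def neg_entropy(p_flat):
        p = p_flat.reshape(2, 2)
        return np.sum(p * np.log(p))  # this is -H; we minimize it

    constraints = [
        # rows sum to q1(x): sum over y of P(x,y)
        {'type': 'eq', 'fun': lambda p: p.reshape(2, 2).sum(axis=1) - q1},
        # columns sum to q2(y): sum over x of P(x,y)
        {'type': 'eq', 'fun': lambda p: p.reshape(2, 2).sum(axis=0) - q2},
    ]

    res = minimize(neg_entropy, np.full(4, 0.25),
                   bounds=[(1e-9, 1.0)] * 4, constraints=constraints)

    print(res.x.reshape(2, 2))  # numerical maximizer
    print(np.outer(q1, q2))     # product of the marginals, for comparison
    [/code]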

    ---------------------

    The Lagrangian, with one multiplier [itex]\lambda_1(x)[/itex] per x-constraint and one multiplier [itex]\lambda_2(y)[/itex] per y-constraint, is

    [tex]L = -\sum_{x,y} P(x,y) \log P(x,y) + \sum_x \lambda_1(x) \Big( \sum_y P(x,y) - q_1(x) \Big) + \sum_y \lambda_2(y) \Big( \sum_x P(x,y) - q_2(y) \Big).[/tex]

    To find the stationary point of L, we differentiate it with respect to a particular P(x',y') while keeping the other variables constant.

    I know how to do this for the first sum, which gives [itex]-(1 + \log P(x',y'))[/itex].
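
    Explicitly: only the single term with [itex](x,y) = (x',y')[/itex] depends on [itex]P(x',y')[/itex], so

    [tex]\frac{\partial}{\partial P(x',y')} \left( -\sum_{x,y} P(x,y) \log P(x,y) \right) = -\frac{\partial}{\partial P(x',y')} \Big( P(x',y') \log P(x',y') \Big) = -\big(1 + \log P(x',y')\big).[/tex]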

    ---------------------

    How do we differentiate the second and third sums with respect to P(x',y')?

    The paper gives the derivative of L with respect to P(x',y') as:

    [tex]-\big(1 + \log P(x',y')\big) + \lambda_1(x') + \lambda_2(y')[/tex]
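
    If I take that at face value and set the derivative to zero, I get

    [tex]\log P(x',y') = \lambda_1(x') + \lambda_2(y') - 1, \qquad \text{i.e.} \qquad P(x',y') = e^{\lambda_1(x') + \lambda_2(y') - 1},[/tex]

    which factorizes into a function of x' times a function of y', consistent with the product distribution my sanity check above converges to. What I can't see is the differentiation step on the constraint sums that produces the [itex]\lambda_1(x') + \lambda_2(y')[/itex] terms.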


    Thanks!
     