Recent content by Legendre

  1. Lagrange Multiplier Help

    Nice! Thanks a lot for the help! Problem resolved. :D
  2. Lagrange Multiplier Help

    No, I meant to ask you about the problem on page 10. Your original reply solved my original problem. To make it more precise: ignore everything that I have asked so far. The new problem is: max -\sum_{x,y} P(x,y) \log P(x,y), subject to \sum_y P(x,y) = r(x) for all x, and \sum_x P(x,y)...
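A sketch of how the product form falls out of this problem via stationarity (assuming the truncated second constraint is \sum_x P(x,y) = q(y), and using my own labels \lambda_x, \mu_y for one multiplier per constraint):

```latex
\frac{\partial}{\partial P(x,y)}\left[-\sum_{x,y} P(x,y)\log P(x,y)
  + \sum_x \lambda_x \Big(\sum_y P(x,y) - r(x)\Big)
  + \sum_y \mu_y \Big(\sum_x P(x,y) - q(y)\Big)\right] = 0
\;\Longrightarrow\; -\log P(x,y) - 1 + \lambda_x + \mu_y = 0
\;\Longrightarrow\; P(x,y) = e^{\lambda_x - 1/2}\, e^{\mu_y - 1/2}.
```

So the stationary P(x,y) factors as a function of x times a function of y; imposing the two marginal constraints then forces P(x,y) = r(x)\,q(y).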
  3. Lagrange Multiplier Help

    Yes thanks! There is something else I don't get... Page 10 of these lecture notes: http://pillowlab.cps.utexas.edu/teaching/CompNeuro10/slides/slides16_EntropyMethods.pdf [Broken] says we should get the solution P(x,y) = P(x) P(y). I have no idea how to get this. Furthermore, they seem to...
  4. Maximum Entropy Distribution Given Marginals

    Hi all, I'm a pure mathematician (Graph Theory) who has to go through a Physics paper, and I am having trouble getting through a part of it. Maybe you guys can point me in the right direction: Let P(x,y) be a joint distribution function. Let H = -\sum_{x,y} P(x,y) \log P(x,y), which is...
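Not from the paper — a minimal numeric sketch (with made-up binary marginals r, q of my own choosing) of the claim that, among joints with fixed marginals, the product r(x)q(y) has the largest entropy H:

```python
import numpy as np

# Made-up binary marginals r(x), q(y) for illustration (not from the paper)
r = np.array([0.3, 0.7])
q = np.array([0.6, 0.4])

def entropy(P):
    """Shannon entropy H = -sum P log P, with the 0 log 0 = 0 convention."""
    p = P[P > 0]
    return -np.sum(p * np.log(p))

P_prod = np.outer(r, q)  # the product distribution r(x) q(y)

# Perturb the joint while keeping BOTH marginals fixed: in the 2x2 case,
# adding eps on the diagonal and subtracting it off the diagonal works.
eps = 0.05
P_alt = P_prod + eps * np.array([[1.0, -1.0], [-1.0, 1.0]])

assert np.allclose(P_alt.sum(axis=1), r) and np.allclose(P_alt.sum(axis=0), q)
print(entropy(P_prod) > entropy(P_alt))  # True: the product has larger entropy
```

A single perturbation is of course not a proof, but it is consistent with the Lagrange-multiplier result that the product of the marginals is the maximizer.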
  5. Lagrange Multiplier Help

    Homework Statement L = -\sum_{x,y} P(x,y) \log P(x,y) + \lambda \sum_y (P(x,y) - q(x)) This is the Lagrangian. I need to maximize the first term in the sum with respect to P(x,y), subject to the constraint in the second term. The first term is a sum over all possible values of x, y...
  6. Unbiased Estimator for Gamma Distribution Parameter

    Homework Statement I need to show that c / (sample mean of X) is unbiased for b, for some c, where the X_i are iid Gamma(a, b). Homework Equations N.A. The Attempt at a Solution I know how to show that (sample mean of X) / a is unbiased for 1/b: 1) E(sample mean of X / a) =...
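The snippet cuts off before c is pinned down. A Monte Carlo sketch under two assumptions of mine: b is a rate parameter, and c = (na - 1)/n, which is the standard choice that follows from E(1/S) = b/(na - 1) when S, the sum of n iid Gamma(a, b) variables, is itself Gamma(na, b):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, n = 2.0, 3.0, 10  # shape a, rate b (rate parametrization assumed)

# If S = sum of n iid Gamma(a, b), then S ~ Gamma(na, b) and E(1/S) = b/(na - 1),
# so c / Xbar = c*n / S is unbiased for b when c = (na - 1)/n.
c = (n * a - 1) / n
samples = rng.gamma(shape=a, scale=1 / b, size=(100_000, n))
est = c / samples.mean(axis=1)
print(abs(est.mean() - b) < 0.05)  # True: the estimator averages to b
```

The simulation only checks the algebra; the actual homework step is computing E(1/S) from the Gamma(na, b) density.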
  7. Counting Outcomes - Probability Question

    Doh! X = 0. That's one more outcome. :( But is my approach correct?
  8. Counting Outcomes - Probability Question

    Homework Statement Z plays a game where independent flips of a coin are recorded until two heads in succession are encountered. Z wins if two heads in succession occur. Z loses if after 5 flips we have not encountered two heads in succession. 1) What is the probability that Z wins the game...
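A brute-force check of part 1 (assuming a fair coin): enumerate all 2^5 equally likely flip sequences and count those in which two heads in succession appear.

```python
from itertools import product

# Brute force over all 2^5 equally likely flip sequences (fair coin assumed);
# Z wins iff "HH" appears somewhere within the first 5 flips.
wins = sum("HH" in "".join(seq) for seq in product("HT", repeat=5))
print(wins, "/ 32")  # 19 / 32
```

Equivalently, the 13 losing sequences are the length-5 H/T strings with no two consecutive H's, counted by the Fibonacci recurrence a(n) = a(n-1) + a(n-2).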
  9. Probability Mass Function of { Z | Z < 1 }, Z given Z is less than 1

    hgfalling, thank you so much for all the probability help!
  10. Probability Mass Function of { Z | Z < 1 }, Z given Z is less than 1

    Probability Mass Function of { Z | Z < 1 }, "Z given Z is less than 1" Homework Statement Given Z = X + Y, find the probability density function of Z | Z < 1. Homework Equations N.A. The Attempt at a Solution f(z) = P(Z=z | Z<1) = P(Z=z and Z<1) / P(Z<1). I thought the...
  11. Density Function for Sums of Random Variables

    I came up with an answer on my own; is the process correct? I looked at the two arguments of f(x, z-x) and tried to deduce which bounds are significant. 1) Argument "x": 0 < x < z - x, so 0 < x < z/2. The z/2 bound is significant across all z in (0,2), since for z < 2 we have z/2 < 1: x < z/2 < 1, so...
  12. Density Function for Sums of Random Variables

    Homework Statement Given the joint density f(x,y), derive the probability density function for Z = X + Y and V = Y - X. Homework Equations f(x,y) = 2 for 0 < x < y < 1, f(x,y) = 0 otherwise. The Attempt at a Solution For Z = X + Y, I can derive that f_Z(z) =...
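Not a derivation, but a quick Monte Carlo sanity check of my own for f_Z: with this joint, (X, Y) can be sampled as the min and max of two iid uniforms (that gives exactly density 2 on the triangle 0 < x < y < 1), and the convolution integral, if I have done it right, gives the triangular density f_Z(z) = z on (0,1) and 2 - z on (1,2).

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample (X, Y) uniformly from the triangle 0 < x < y < 1 (joint density 2)
# by taking the min and max of two independent Uniform(0,1) draws.
u = rng.random((200_000, 2))
x, y = u.min(axis=1), u.max(axis=1)
z = x + y

# Compare the empirical CDF at z = 0.5 with the triangular density's CDF:
# F_Z(0.5) = integral of z dz from 0 to 0.5 = 0.125
print(abs(np.mean(z < 0.5) - 0.125) < 0.01)  # True for this seed
```

Note that Z = min + max = sum of the two uniforms, so the triangular shape is also the classic sum-of-two-uniforms density.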
  13. Support of Continuous Conditional Density Functions (Probability)

    I just realized my mistake: x is fixed, since we are conditioning Y on a particular value of X = x. So the 0 < x < 1 is irrelevant. Or rather, the conditional probability density for a given X = x is a function of y only, so only the value of y matters in defining the support. I feel like...
  14. Support of Continuous Conditional Density Functions (Probability)

    f(x,y) = x + y is the joint probability density function for continuous random variables X and Y. The support of this function is {0 < x < 1, 0 < y < 1}, which means it takes positive values over this region and zero elsewhere. g(x) = x + (1/2) is the probability density function of X...
  15. Jacobian of the linear transform Y = AX

    Thanks guys. I wrote Ax, for a constant matrix A, as a linear combination of its columns, then deduced that each g_i(X) is a linear combination of the entries in the ith row of A. Then the Jacobian is the determinant of A transposed, which is equal to the determinant of A!
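A quick numeric confirmation of the idea (my own check, with an arbitrary invertible A): for the linear map g(X) = AX, the matrix of partial derivatives is A itself, so the Jacobian determinant equals det(A) = det(A^T).

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.5, 3.0]])  # arbitrary invertible matrix
x0 = np.array([0.3, -0.7])              # arbitrary base point

# Build the Jacobian of g(x) = A x by central finite differences;
# for a linear map every column comes out as the matching column of A.
J = np.empty((2, 2))
h = 1e-6
for j in range(2):
    e = np.zeros(2)
    e[j] = h
    J[:, j] = (A @ (x0 + e) - A @ (x0 - e)) / (2 * h)

print(np.isclose(np.linalg.det(J), np.linalg.det(A)))  # True
```

The base point x0 is irrelevant here, which is exactly the point: a linear map has the same Jacobian everywhere.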