Mathematical statistics

  Sep 21, 2009 #1
    1. The problem statement, all variables and given/known data

    Let f(x) be a density on R+ (so f(x) = 0 if x < 0). Let g(x,y) = f(x+y)/(x+y) for x > 0, y > 0, and let (X, Y) be a random vector with joint density g.
    a) Show that g is a density on R^2.
    b) Assume that the expectation mu and variance sigma^2 associated with the univariate density f exist, and that mu^2 does not equal 2sigma^2. Show that X and Y are dependent.

    2. Relevant equations



    3. The attempt at a solution

    I have part a) done. As for part b) I am terribly confused. I first read the question as saying E[X] = mu and E[Y] = mu. Then I assumed that X and Y were independent, aiming for a proof by contradiction. That gives E[XY] = E[X]E[Y] = mu^2 and Var(X+Y) = 2sigma^2 (since Cov(X,Y) = 0, and also assuming Var(X) = sigma^2 = Var(Y)). Then, since the question says mu^2 does not equal 2sigma^2, I would conclude E[XY] does not equal Var(X+Y), and from there I got nowhere.
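    To convince myself the dependence is real, I also tried a quick numerical check with a concrete f. This is only a sanity check, and the choice f = Exp(1), i.e. f(s) = e^(-s) on s > 0, is mine, not from the problem; it has mu = 1 and sigma^2 = 1, so mu^2 = 1 does not equal 2sigma^2 = 2.

    import numpy as np
    from scipy import integrate

    f = lambda s: np.exp(-s)             # my toy density on R+, not from the problem
    g = lambda x, y: f(x + y) / (x + y)  # the joint density from the problem

    # E[XY]: integrate x*y*g(x,y) over the first quadrant
    # (dblquad's integrand takes the inner variable first)
    E_XY, _ = integrate.dblquad(lambda y, x: x * y * g(x, y), 0, np.inf, 0, np.inf)

    # E[X]: integrate x*g(x,y); by the symmetry of g, E[Y] = E[X]
    E_X, _ = integrate.dblquad(lambda y, x: x * g(x, y), 0, np.inf, 0, np.inf)

    print(E_XY, E_X * E_X)  # I get about 0.333 vs 0.25

    So for this particular f, E[XY] and E[X]E[Y] really are different, which is consistent with dependence, but it obviously isn't the general argument the problem wants.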

    So I thought I would have to go back and play with the joint density g(x,y) and use the definition of independence for a joint density, g(x,y) = g_X(x) g_Y(y), except I have no clue how to do the integral for either marginal.
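    The best I could manage was to spot-check the factorization numerically, again with my toy f = Exp(1) from above (so this only illustrates the failure for one f; it proves nothing in general):

    import numpy as np
    from scipy import integrate

    f = lambda s: np.exp(-s)
    g = lambda x, y: f(x + y) / (x + y)

    # numerical marginal: f_X(x) = integral over y in (0, inf) of g(x, y);
    # by the symmetry of g, the marginal of Y is the same function
    def marginal(x):
        val, _ = integrate.quad(lambda y: g(x, y), 0, np.inf)
        return val

    # independence would force g(x, y) = f_X(x) f_X(y) for (almost) all x, y;
    # spot-check a single point
    x0, y0 = 1.0, 2.0
    print(g(x0, y0), marginal(x0) * marginal(y0))  # about 0.017 vs 0.011 for me

    But I still don't see how to evaluate the marginal integral by hand, or how to turn any of this into a proof.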

    Any hints as to how to attack this problem would be greatly appreciated :)
     