
Covariance - Bernoulli Distribution

  1. Sep 17, 2012 #1
    1. Consider the random variables X, Y where X ~ B(1, p) and
    f(y|x=0) = 1/2 for 0 < y < 2
    f(y|x=1) = 1 for 0 < y < 1

    Find cov(X,Y)




    2. Relevant equations
    Cov(X,Y) = E(XY) - E(X)E(Y) = E[(X-E(X))(Y-E(Y))]
    E(XY)=E[XE(Y|X)]
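    (The second equation is just the law of iterated expectations applied here: E(XY) = E[E(XY|X)] = E[X E(Y|X)].)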



    3. The attempt at a solution
    E(X) = p (known since it's Bernoulli; it can also be proven)
    E(Y) = [itex]\int_0^2 \frac{y}{2}\,dy + \int_0^1 y\,dy[/itex] = 3/2
    I'm not sure E(Y) is right.

    If this is right, I still don't know how to solve E(XY).

    Could we do cov(X,Y) = ∫∫ (x-p)(y-3/2) dx dy, with x from 0 to 1 and y from 0 to 2?

    Thoughts?
     
  3. Sep 18, 2012 #2

    lanedance

    Homework Helper

    Your E(Y) is not correct. Rather than plugging in the conditional distributions right away, try writing the formula for E(Y) and working from that to see where the conditional distributions can be used.
     
  4. Sep 18, 2012 #3
    Ya I thought so.
    Well, E(Y) = ∫y*f(y)dy = ∫y*(∫f(x,y)dx)dy or = ∫y*f(x,y)*f(x|y)dy
    But, I can't see how to use f(y|x=0) and f(y|x=1)

    We do know that f(x) = p^x(1-p)^(1-x)
     
  5. Sep 18, 2012 #4

    lanedance

    Homework Helper

    Though equivalent, the discrete viewpoint for the probability mass function may be simpler to envisage here:
    f(x) = p, if x=1
    f(x) = (1-p), if x=0
    f(x) = 0, otherwise

    Now the expectation of a function of x, say g(x) will be:
    [tex] E[g(x)] = \sum_{x_i} g(x_i)f(x_i) = pg(1)+(1-p)g(0)[/tex]
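
    (For example, taking g(x) = x gives E[g(x)] = p*1 + (1-p)*0 = p, which matches the E(X) you already noted.)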

    If x were continuously distributed, then the marginal distribution for Y is given by
    [tex] f_Y(y) = \int f_{X,Y}(x,y)dx =\int f_{Y}(y|X=x)f_X(x)dx [/tex]

    As x is a discrete variable, can you write the marginal distribution of Y in terms of the discrete possibilities for x and the probabilities p and 1-p? It may help to think of the integrand above as a function of x...
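
    In other words, because x can only take the values 0 and 1, that integral collapses to a two-term sum (this is just the hint above spelled out):
    [tex] f_Y(y) = \sum_{x_i} f_{Y}(y|X=x_i)f_X(x_i) = f_{Y}(y|X=0)f_X(0) + f_{Y}(y|X=1)f_X(1) [/tex]
    Now substitute the given conditional densities and the Bernoulli probabilities, remembering that each conditional density is zero outside its stated range.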
     
  6. Sep 18, 2012 #5
    So are you saying Y is g(x)? We know f(y|x=0) and f(y|x=1), but do we know g(1) and g(0)?
    I tried thinking of it as discrete, but couldn't we just use
    [tex] f_Y(y) = \int f_{X,Y}(x,y)dx =\int f_{Y}(y|X=x)f_X(x)dx [/tex]
    since we would get [tex] \int_0^2 \frac{1}{2}\cdot 1\cdot(1-p)\,dx + \int_0^1 1\cdot p\,dx[/tex], given that f(x) = p^x(1-p)^(1-x)
    Sorry, I'm just struggling to understand how to use the conditional pdf in this case.
     
  7. Sep 18, 2012 #6

    Ray Vickson

    Science Advisor
    Homework Helper

    You cannot integrate over x because X is a discrete random variable and so does not have a probability density.

    You wrote the formula E(XY)=E[XE(Y|X)] in your original post. Do you understand what it MEANS? Can you write it out explicitly in terms of the possible values of X and their probabilities? Figuring out how to do that is Step 1 in the solution (or, at least, Step 1 in one approach to the solution).

    RGV
     
  8. Sep 19, 2012 #7
    This is where I am now:

    f(y|x=0) = 1/2 for 0<y<2 and f(y|x=1) = 1 for 0<y<1 → both uniform → E(Y|x=0) = 1 ; E(Y|x=1) = 1/2
    Also, E(Y) = E(Y|x=0)P(x=0) + E(Y|x=1)P(x=1) = 1 - p/2
    So, P(x=0,y) = (1-p)/3 for any y=0,1,2
    P(x=1,y) = 1/2 for y=0,1 and 0 for y=2

    Then,
    [tex]cov(x,y) = \sum_{y=0}^2 \sum_{x=0}^1 (x-p)(y-(1-p/2))P(x,y) = \frac{p(p-1)}{2}[/tex]

    Is this right?
     
  9. Sep 19, 2012 #8

    Ray Vickson

    Science Advisor
    Homework Helper

    Yes, it's OK. But, an easier way would be to compute E(XY) = P(X=0)*E(XY|X=0) + P(X=1)*E(XY|X=1) and to use Cov(X,Y) = E(XY) - (EX)(EY).

    RGV
     
  10. Sep 19, 2012 #9
    Hmm, I thought about that, but at the time it seemed difficult to calculate E(XY|X=0), for example. That is probably an oversight on my part, though.

    But, [tex]cov(x,y) = \frac{p(p-1)}{2}[/tex] is correct?
     
  11. Sep 19, 2012 #10

    Ray Vickson

    Science Advisor
    Homework Helper

    Easiest thing in the world: E(XY|X=0) = 0 (!) and E(XY|X=1) = 1*E(Y|X=1).

    I do not wish to answer the last question you asked.

    RGV
     
  12. Sep 19, 2012 #11
    Doh, that was easy. It also makes a lot of sense. Both give the same answer, so it's nice to see I did my summations correctly.

    Thanks a lot!
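
    In case it helps anyone checking their own work later, here is a quick Monte Carlo sketch (just an illustrative script, assuming numpy is available; the function name is made up) that agrees with the p(p-1)/2 above:
[code]
import numpy as np

rng = np.random.default_rng(0)

def simulate_cov(p, n=1_000_000):
    """Estimate cov(X, Y) where X ~ Bernoulli(p),
    Y|X=0 ~ Uniform(0, 2) and Y|X=1 ~ Uniform(0, 1)."""
    x = rng.random(n) < p                                 # X = 1 with probability p
    y = np.where(x, rng.uniform(0.0, 1.0, n), rng.uniform(0.0, 2.0, n))
    return np.cov(x.astype(float), y)[0, 1]               # off-diagonal entry = cov(X, Y)

for p in (0.2, 0.5, 0.8):
    print(p, round(simulate_cov(p), 4), p * (p - 1) / 2)  # simulated vs. exact p(p-1)/2
[/code]
    For p = 0.5, for instance, both values come out near -0.125.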
     
  13. Nov 17, 2012 #12
    I think I'm being really stupid here, but for the E(XY) part can you write down the full working? I'm really stuck on it, thanks.
     
  14. Nov 17, 2012 #13
    What do you mean?
     
  15. Nov 17, 2012 #14
    Easiest thing in the world: E(XY|X=0) = 0 (!) E(XY|X=1) = 1*E(Y|X=1)

    From this help that you got, what do you do next? What's the final answer?
     
  16. Nov 17, 2012 #15
    [tex]E(XY)=E(XY|X=0)P(X=0) + E(XY|X=1)P(X=1)[/tex]
     
  17. Nov 17, 2012 #16
    Yep, I get that original equation; it's what to do after that. Am I expecting an answer in terms of p and q? I understand the part E(XY|X=0) = 0 (!) and that E(XY|X=1) = 1*E(Y|X=1),
    so does that mean I'm left with E(XY) = E(Y|X=1)?? It's from here that I am stuck, sorry if I'm completely missing the point!
     
  18. Nov 17, 2012 #17
    Yes, you should get an answer in terms of p.
    Just use the E(XY) formula above and recall the formula for cov(X,Y).
     
  19. Nov 17, 2012 #18
    Cov(X,Y) = E(XY) - E(X)E(Y), if I'm not mistaken?
    Sorry, I'm being slow, but I don't understand how to get E(Y|X=1) in terms of p and q.
     
  20. Nov 17, 2012 #19
    Yes, [tex]cov(X,Y) = E(XY)-E(X)E(Y)[/tex].
    Moreover, [tex]E(XY) = E(XY|X=0)P(X=0) + E(XY|X=1)P(X=1)[/tex]
    Now, find P(X=0), P(X=1) (this should be easy).
    But you are asking how to find E(Y|X=1)? Well, we have a formula for f(y|x=1), don't we? It's a horizontal line at height 1 from 0 to 1, so the expected value E(Y|X=1) should be right in the middle.
    Same idea for E(Y|X=0).
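    Written out, those two uniform means are just
    [tex]E(Y|X=1)=\int_0^1 y\cdot 1\,dy=\frac{1}{2}, \qquad E(Y|X=0)=\int_0^2 y\cdot\frac{1}{2}\,dy=1[/tex]
    which matches what was found earlier in the thread.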
     
  21. Nov 17, 2012 #20
    OK, I'll give it a go now, thank you so much!!
     