Covariance - Bernoulli Distribution

  • Thread starter Scootertaj
  • #1
Scootertaj
1. Consider the random variables X, Y where X ~ B(1,p) and
f(y|x=0) = 1/2 for 0 < y < 2
f(y|x=1) = 1 for 0 < y < 1

Find cov(x,y)




Homework Equations


Cov(X,Y) = E(XY) - E(X)E(Y) = E[(X - E(X))(Y - E(Y))]
E(XY) = E[X E(Y|X)]



The Attempt at a Solution


E(X) = p (known since it's Bernoulli; this can also be proven).
E(Y) = [itex]\int_0^2 y\cdot\tfrac{1}{2}\,dy + \int_0^1 y\cdot 1\,dy[/itex] = 3/2
I'm not sure E(Y) is right.

If this is right, I still don't know how to solve E(XY).

Could we do cov(x,y) = [itex]\int_0^2 \int_0^1 (x-p)(y-3/2)\,dx\,dy[/itex]?

Thoughts?
 

Answers and Replies

  • #2
lanedance
Homework Helper
Your E(Y) is not correct. Rather than plugging in the conditional distributions right away, try writing the formula for E(Y) and work from that to see where the conditional distributions can be used.
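
In symbols, one way to read that hint (a sketch, assuming the intended route is the law of total expectation) is to expand E(Y) over the two possible values of X before touching the conditional densities:
[tex] E(Y) = \sum_{x\in\{0,1\}} E(Y\mid X=x)\,P(X=x) = (1-p)\,E(Y\mid X=0) + p\,E(Y\mid X=1) [/tex]
Each conditional expectation can then be computed from the corresponding conditional density.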
 
  • #3
Scootertaj
Ya I thought so.
Well, E(Y) = [itex]\int y\,f(y)\,dy = \int y \left(\int f(x,y)\,dx\right) dy[/itex], or [itex]= \int y\,f(x,y)\,f(x|y)\,dy[/itex]
But, I can't see how to use f(y|x=0) and f(y|x=1)

We do know that f(x) = p^x(1-p)^{1-x}
 
  • #4
lanedance
Homework Helper
Though equivalent, the discrete viewpoint for the probability mass function may be simpler to envisage here:
f(x) = p, if x=1
f(x) = (1-p), if x=0
f(x) = 0, otherwise

Now the expectation of a function of x, say g(x) will be:
[tex] E[g(x)] = \sum_{x_i} g(x_i)f(x_i) = pg(1)+(1-p)g(0)[/tex]

If x were continuously distributed, then the marginal distribution for Y is given by
[tex] f_Y(y) = \int f_{X,Y}(x,y)dx =\int f_{Y}(y|X=x)f_X(x)dx [/tex]

As x is a discrete variable, can you write the marginal distribution of Y in terms of the discrete possibilities for x and the probabilities p? It may help to think of the integrand above as a function of x...
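
Concretely, since X only takes the values 0 and 1, the integral above collapses to a two-term sum (a sketch of what the hint points to, using the pmf written above):
[tex] f_Y(y) = \sum_{x\in\{0,1\}} f_Y(y\mid X=x)\,f_X(x) = (1-p)\,f_Y(y\mid X=0) + p\,f_Y(y\mid X=1) [/tex]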
 
  • #5
Scootertaj

So are you saying Y is g(x)? But we know f(y|x=0) and f(y|x=1); do we know g(1) and g(0)?
I tried thinking of it as discrete, but couldn't we just use
[tex] f_Y(y) = \int f_{X,Y}(x,y)dx =\int f_{Y}(y|X=x)f_X(x)dx [/tex]
since we would get [tex] \int_0^2 \tfrac{1}{2}\cdot 1\cdot(1-p)\,dx + \int_0^1 1\cdot p\,dx[/tex], since f(x) = p^x(1-p)^{1-x}?
Sorry, I'm just struggling to understand how to use the conditional pdf in this case.
 
  • #6
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed

You cannot integrate over x because X is a discrete random variable and so does not have a probability density.

You wrote the formula E(XY)=E[XE(Y|X)] in your original post. Do you understand what it MEANS? Can you write it out explicitly in terms of the possible values of X and their probabilities? Figuring out how to do that is Step 1 in the solution (or, at least, Step 1 in one approach to the solution).
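
Spelled out over the possible values of X, that formula reads something like this (a sketch; the x = 0 term drops out because of the factor x):
[tex] E(XY) = E[X\,E(Y\mid X)] = \sum_{x\in\{0,1\}} x\,E(Y\mid X=x)\,P(X=x) = p\,E(Y\mid X=1) [/tex]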

RGV
 
  • #7
Scootertaj
This is where I am now:

f(y|x=0) = 1/2 for 0<y<2 and f(y|x=1) = 1 for 0<y<1 → uniform distributions → E(Y|x=0) = 1; E(Y|x=1) = 1/2
Also, E(Y) = E(Y|x=0)P(x=0) + E(Y|x=1)P(x=1) = 1 - p/2
So, P(x=0,y) = (1-p)/3 for any y = 0, 1, 2
P(x=1,y) = 1/2 for y = 0, 1 and 0 for y = 2

Then,
[tex]cov(x,y) = \sum_{y=0}^2 \sum_{x=0}^1 (x-p)\left(y-\left(1-\tfrac{p}{2}\right)\right)P(x,y) = \frac{p(p-1)}{2}[/tex]

Is this right?
 
  • #8
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed

Yes, it's OK. But, an easier way would be to compute E(XY) = P(X=0)*E(XY|X=0) + P(X=1)*E(XY|X=1) and to use Cov(X,Y) = E(XY) - (EX)(EY).
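
Carried through with the values found earlier (E(Y|X=1) = 1/2 and E(Y) = 1 - p/2), that shortcut gives, as a check:
[tex] E(XY) = (1-p)\cdot 0 + p\cdot\tfrac{1}{2} = \tfrac{p}{2}, \qquad Cov(X,Y) = \tfrac{p}{2} - p\left(1-\tfrac{p}{2}\right) = \frac{p(p-1)}{2} [/tex]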

RGV
 
  • #9
Scootertaj
Hmm, I thought about that, but it seemed difficult to calculate E(XY|X=0), for example. That is probably an oversight on my part, though.

But, [tex]cov(x,y) = \frac{p(p-1)}{2}[/tex] is correct?
 
  • #10
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed

Easiest thing in the world: E(XY|X=0) = 0 (!), and E(XY|X=1) = 1*E(Y|X=1).

I do not wish to answer the last question you asked.

RGV
 
  • #11
Scootertaj
Doh, that was easy. It also makes a lot of sense. Both give the same answer, so it's nice to see I did my summations correctly.

Thanks a lot!
 
  • #12
cg7193
I think I am being really stupid here, but for the E(XY) part can you write down the full working? I am really stuck on it, thanks.
 
  • #13
Scootertaj
What do you mean?
 
  • #14
cg7193
Easiest thing in the world: E(XY|X=0) = 0 (!), and E(XY|X=1) = 1*E(Y|X=1).

From this hint that you got, what do you do next? What's the final answer?
 
  • #15
Scootertaj
[tex]E(XY)=E(xy|x=0)P(x=0) + E(xy|x=1)P(x=1)[/tex]
 
  • #16
cg7193
Yep, I get that original equation; it's what to do after that. Am I expecting an answer in terms of p and q? I understand that E(XY|X=0) = 0 (!) and that E(XY|X=1) = 1*E(Y|X=1),
so does that mean I am left with E(XY) = E(Y|X=1)? It's from here that I am stuck. Sorry if I am completely missing the point!
 
  • #17
Scootertaj
Yes, you should get an answer in terms of p.
Just use the E(XY) formula above and recall the formula for cov(X,Y).
 
  • #18
cg7193
Cov(X,Y) = E(XY) - E(X)E(Y), if I am not mistaken?
Sorry, I am being slow, but I don't understand how you get E(Y|X=1) in terms of p and q.
 
  • #19
Scootertaj
Yes, [tex]cov(X,Y) = E(XY)-E(X)E(Y)[/tex].
Moreover, [tex]E(XY) = E(XY|X=0)P(X=0) + E(XY|X=1)P(X=1)[/tex]
Now, find P(X=0), P(X=1) (this should be easy).
But you are asking how to find E(Y|X=1)? Well, we have a formula for f(y|X=1), don't we? It's a horizontal line at height 1 from 0 to 1, so the expected value E(Y|X=1) should be right in the middle.
Same idea for E(Y|X=0).
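
Written as integrals, that "middle of the interval" argument is just (a quick check):
[tex] E(Y\mid X=1) = \int_0^1 y\cdot 1\,dy = \tfrac{1}{2}, \qquad E(Y\mid X=0) = \int_0^2 y\cdot\tfrac{1}{2}\,dy = 1 [/tex]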
 
  • #20
cg7193
OK, I'll give it a go now. Thank you so much!
 
