Covariance - Bernoulli Distribution

SUMMARY

This discussion focuses on calculating the covariance between two random variables, X and Y, where X follows a Bernoulli distribution X~B(1,p) and Y has conditional distributions based on the value of X. The key equations used include Cov(X,Y) = E(XY) - E(X)E(Y) and E(XY) = E[X E(Y|X)]. The final conclusion reached is that Cov(X,Y) = p(p-1)/2, derived from the conditional expectations and probabilities of X and Y.
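As a quick numerical sanity check of that result (a sketch, not part of the original discussion; the parameter value and seed are arbitrary):

```python
# Simulate X ~ Bernoulli(p) and Y | X from the given uniforms,
# then compare the sample covariance against p*(p-1)/2.
import numpy as np

rng = np.random.default_rng(0)
p, n = 0.3, 1_000_000
x = rng.random(n) < p                   # X ~ Bernoulli(p)
y = np.where(x, rng.uniform(0, 1, n),   # Y | X=1 ~ Uniform(0, 1)
                rng.uniform(0, 2, n))   # Y | X=0 ~ Uniform(0, 2)
print(np.cov(x, y)[0, 1])  # sample covariance, close to -0.105
print(p * (p - 1) / 2)     # closed form: -0.105
```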

PREREQUISITES
  • Understanding of Bernoulli distribution and its properties (X~B(1,p))
  • Knowledge of covariance and expectation formulas (Cov(X,Y), E(XY))
  • Familiarity with conditional probability distributions (f(y|x=0), f(y|x=1))
  • Ability to perform integration for continuous random variables
NEXT STEPS
  • Study the derivation of covariance in discrete random variables
  • Learn about conditional expectation and its applications in probability theory
  • Explore the properties of the Bernoulli distribution in greater detail
  • Investigate the relationship between joint and marginal distributions
USEFUL FOR

Students and professionals in statistics, data science, or any field involving probability theory, particularly those working with random variables and their relationships.

Scootertaj
1. Consider the random variables X, Y where X ~ B(1,p) and
f(y|x=0) = 1/2, 0 < y < 2
f(y|x=1) = 1, 0 < y < 1

Find Cov(X,Y).




Homework Equations


Cov(X,Y) = E(XY) - E(X)E(Y) = E[(X - E(X))(Y - E(Y))]
E(XY) = E[X E(Y|X)]



The Attempt at a Solution


E(X) = p (known since it's Bernoulli; it can also be proven).
E(Y) = \int_0^2 y \cdot \tfrac{1}{2}\,dy + \int_0^1 y \cdot 1\,dy = 3/2
I'm not sure E(Y) is right.

If this is right, I still don't know how to solve E(XY).

Could we do \operatorname{cov}(x,y) = \int_0^2 \int_0^1 (x-p)(y-3/2)\,dx\,dy ?

Thoughts?
 
Scootertaj said:
1. Consider the random variables X, Y where X ~ B(1,p) and
f(y|x=0) = 1/2, 0 < y < 2
f(y|x=1) = 1, 0 < y < 1

Find Cov(X,Y).




Homework Equations


Cov(X,Y) = E(XY) - E(X)E(Y) = E[(X - E(X))(Y - E(Y))]
E(XY) = E[X E(Y|X)]



The Attempt at a Solution


E(X) = p (known since it's Bernoulli; it can also be proven).
E(Y) = \int_0^2 y \cdot \tfrac{1}{2}\,dy + \int_0^1 y \cdot 1\,dy = 3/2
I'm not sure E(Y) is right.
Your E(Y) is not correct. Rather than plugging in the conditional distributions at the start, try writing the formula for E(Y) and working from that to see where the conditional distributions can be used.
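(For reference, the formula being hinted at is the law of total expectation, stated here without plugging in any numbers:
E(Y) = E[E(Y|X)] = \sum_{x \in \{0,1\}} E(Y|X=x)\,P(X=x).)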
Scootertaj said:
If this is right, I still don't know how to solve E(XY).

Could we do \operatorname{cov}(x,y) = \int_0^2 \int_0^1 (x-p)(y-3/2)\,dx\,dy ?

Thoughts?
 
Ya, I thought so.
Well, E(Y) = \int y\,f_Y(y)\,dy, where f_Y(y) = \int f(x,y)\,dx = \int f(y|x)\,f_X(x)\,dx.
But I can't see how to use f(y|x=0) and f(y|x=1).

We do know that f(x) = p^x (1-p)^{1-x}.
 
Though equivalent, the discrete viewpoint for the probability mass function may be simpler to envisage here:
f(x) = p, if x=1
f(x) = (1-p), if x=0
f(x) = 0, otherwise

Now the expectation of a function of x, say g(x), will be:
E[g(x)] = \sum_{x_i} g(x_i) f(x_i) = p\,g(1) + (1-p)\,g(0)
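(For instance, taking g(x) = x recovers E(X) = p \cdot 1 + (1-p) \cdot 0 = p, matching the earlier claim.)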

If x were continuously distributed, then the marginal distribution for Y would be given by
f_Y(y) = \int f_{X,Y}(x,y)\,dx = \int f_Y(y|X=x)\,f_X(x)\,dx

As x is a discrete variable, can you write the marginal distribution of Y in terms of the discrete possibilities for x and the probabilities involving p? It may help to think of the integrand above as a function of x...
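(Carrying out that replacement of the integral by a sum — a worked step consistent with the values derived later in the thread — gives
f_Y(y) = \sum_x f(y|x)\,f_X(x) = (1-p)\cdot\tfrac{1}{2}\,\mathbf{1}(0<y<2) + p\cdot 1\cdot\mathbf{1}(0<y<1).)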
 
lanedance said:
Though equivalent, the discrete viewpoint for the probability mass function may be simpler to envisage here:
f(x) = p, if x=1
f(x) = (1-p), if x=0
f(x) = 0, otherwise

Now the expectation of a function of x, say g(x), will be:
E[g(x)] = \sum_{x_i} g(x_i) f(x_i) = p\,g(1) + (1-p)\,g(0)

If x were continuously distributed, then the marginal distribution for Y would be given by
f_Y(y) = \int f_{X,Y}(x,y)\,dx = \int f_Y(y|X=x)\,f_X(x)\,dx

As x is a discrete variable, can you write the marginal distribution of Y in terms of the discrete possibilities for x and the probabilities involving p? It may help to think of the integrand above as a function of x...

So are you saying Y is g(x)? But we know f(y|x=0) and f(y|x=1); do we know g(1), g(0)?
I tried thinking of it as discrete, but couldn't we just use
f_Y(y) = \int f_{X,Y}(x,y)\,dx = \int f_Y(y|X=x)\,f_X(x)\,dx
since we would get \int_0^2 \tfrac{1}{2}\,(1-p)\,dx + \int_0^1 1 \cdot p\,dx, since f(x) = p^x (1-p)^{1-x}?
Sorry, I'm just struggling to understand how to use the conditional pdf in this case.
 
Scootertaj said:
So are you saying Y is g(x)? But we know f(y|x=0) and f(y|x=1); do we know g(1), g(0)?
I tried thinking of it as discrete, but couldn't we just use
f_Y(y) = \int f_{X,Y}(x,y)\,dx = \int f_Y(y|X=x)\,f_X(x)\,dx
since we would get \int_0^2 \tfrac{1}{2}\,(1-p)\,dx + \int_0^1 1 \cdot p\,dx, since f(x) = p^x (1-p)^{1-x}?
Sorry, I'm just struggling to understand how to use the conditional pdf in this case.

You cannot integrate over x, because X is a discrete random variable and so does not have a probability density.

You wrote the formula E(XY)=E[XE(Y|X)] in your original post. Do you understand what it MEANS? Can you write it out explicitly in terms of the possible values of X and their probabilities? Figuring out how to do that is Step 1 in the solution (or, at least, Step 1 in one approach to the solution).

RGV
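(Written out explicitly over the two possible values of X — the expansion being asked for here — the identity reads
E(XY) = E[X\,E(Y|X)] = 0 \cdot E(Y|X=0)\,P(X=0) + 1 \cdot E(Y|X=1)\,P(X=1).)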
 
This is where I am now:

f(y|x=0) = 1/2 for 0 < y < 2 and f(y|x=1) = 1 for 0 < y < 1 → uniform distributions → E(Y|x=0) = 1; E(Y|x=1) = 1/2.
Also, E(Y) = E(Y|x=0)P(x=0) + E(Y|x=1)P(x=1) = 1 - p/2.
So, treating Y as if it were discrete: P(x=0, y) = (1-p)/3 for y = 0, 1, 2,
and P(x=1, y) = p/2 for y = 0, 1, and 0 for y = 2.

Then,
cov(x,y) = \sum_{y=0}^{2} \sum_{x=0}^{1} (x-p)\left(y - \left(1 - \tfrac{p}{2}\right)\right) P(x,y) = \frac{p(p-1)}{2}

Is this right?
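A quick symbolic check of that double sum (a sketch, not from the thread; it uses the discrete surrogate distribution written above and assumes sympy is available):

```python
# Verify the double-sum covariance symbolically with sympy,
# using the discrete surrogate joint pmf from the post above.
import sympy as sp

p = sp.symbols('p')
mu_y = 1 - p/2  # E(Y) from the law of total expectation

# Joint pmf: P(0, y) = (1-p)/3 for y in {0, 1, 2}; P(1, y) = p/2 for y in {0, 1}
P = {(0, 0): (1 - p)/3, (0, 1): (1 - p)/3, (0, 2): (1 - p)/3,
     (1, 0): p/2, (1, 1): p/2}

cov = sp.simplify(sum((x - p) * (y - mu_y) * prob for (x, y), prob in P.items()))
print(cov)  # expect p*(p - 1)/2, i.e. p**2/2 - p/2
```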
 
Scootertaj said:
This is where I am now:

f(y|x=0) = 1/2 for 0 < y < 2 and f(y|x=1) = 1 for 0 < y < 1 → uniform distributions → E(Y|x=0) = 1; E(Y|x=1) = 1/2.
Also, E(Y) = E(Y|x=0)P(x=0) + E(Y|x=1)P(x=1) = 1 - p/2.
So, treating Y as if it were discrete: P(x=0, y) = (1-p)/3 for y = 0, 1, 2,
and P(x=1, y) = p/2 for y = 0, 1, and 0 for y = 2.

Then,
cov(x,y) = \sum_{y=0}^{2} \sum_{x=0}^{1} (x-p)\left(y - \left(1 - \tfrac{p}{2}\right)\right) P(x,y) = \frac{p(p-1)}{2}

Is this right?

Yes, it's OK. But an easier way would be to compute E(XY) = P(X=0)·E(XY|X=0) + P(X=1)·E(XY|X=1) and to use Cov(X,Y) = E(XY) - E(X)E(Y).

RGV
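(Carrying that route through with the numbers established above:
E(XY) = (1-p) \cdot 0 + p \cdot \tfrac{1}{2} = \tfrac{p}{2}, \qquad \operatorname{cov}(X,Y) = \tfrac{p}{2} - p\left(1 - \tfrac{p}{2}\right) = \frac{p(p-1)}{2},
which matches the double-sum answer.)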
 
Ray Vickson said:
Yes, it's OK. But an easier way would be to compute E(XY) = P(X=0)·E(XY|X=0) + P(X=1)·E(XY|X=1) and to use Cov(X,Y) = E(XY) - E(X)E(Y).

RGV
Hmm, I thought about that, but when I was thinking about it, it seemed difficult to calculate E(XY|X=0), for example. That is probably an oversight of mine, though.

But, cov(x,y) = \frac{p(p-1)}{2} is correct?
 
  • #10
Scootertaj said:
Hmm, I thought about that, but when I was thinking about it, it seemed difficult to calculate E(XY|X=0), for example. That is probably an oversight of mine, though.

But, cov(x,y) = \frac{p(p-1)}{2} is correct?

Easiest thing in the world: E(XY|X=0) = 0 (!), and E(XY|X=1) = 1·E(Y|X=1).

I do not wish to answer the last question you asked.

RGV
 
  • #11
Doh, that was easy. It also makes a lot of sense. Both give the same answer, so it's nice to see I did my summations correctly.

Thanks a lot!
 
  • #12
I think I am being really stupid here, but for the E(XY) part, can you write down the full working? I am really stuck on it. Thanks.
 
  • #13
What do you mean?
 
  • #14
Easiest thing in the world: E(XY|X=0) = 0 (!), and E(XY|X=1) = 1·E(Y|X=1).

From this help that you got, what do you do next? What's the final answer?
 
  • #15
E(XY) = E(XY|X=0)P(X=0) + E(XY|X=1)P(X=1)
 
  • #16
Yep, I get the original equation; it's what to do after that where I'm stuck. Am I expecting an answer in terms of p and q? I understand the part that E(XY|X=0) = 0 (!) and that E(XY|X=1) = 1·E(Y|X=1).
So does that mean I am left with E(XY) = E(Y|X=1)? It's from here that I am stuck. Sorry if I am completely missing the point!
 
  • #17
Yes, you should get an answer in terms of p.
Just use the E(XY) formula above and recall the formula for cov(X,Y).
 
  • #18
Cov(X,Y) = E(XY) - E(X)E(Y), if I am not mistaken?
Sorry I am being slow, but I don't understand how you get E(Y|X=1) in terms of p and q.
 
  • #19
cg7193 said:
Cov(X,Y) = E(XY) - E(X)E(Y), if I am not mistaken?
Sorry I am being slow, but I don't understand how you get E(Y|X=1) in terms of p and q.
Yes, cov(X,Y) = E(XY) - E(X)E(Y).
Moreover, E(XY) = E(XY|X=0)P(X=0) + E(XY|X=1)P(X=1).
Now, find P(X=0) and P(X=1) (this should be easy).
But you are asking how to find E(Y|X=1)? Well, we have a formula for f(y|X=1), don't we? It's a horizontal line at height 1 from 0 to 1, so the expected value E(Y|X=1) is right in the middle.
Same idea for E(Y|X=0).
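(Concretely, since the conditionals are uniform:
E(Y|X=1) = \int_0^1 y \cdot 1\,dy = \tfrac{1}{2}, \qquad E(Y|X=0) = \int_0^2 y \cdot \tfrac{1}{2}\,dy = 1,
the midpoints of (0,1) and (0,2) respectively.)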
 
  • #20
OK, I'll give it a go now. Thank you so much!
 
