Always possible to obtain marginals from joint pmf?

Thread starter: jimholt
Obtain marginal probability mass function (pmf) given joint pmf

Not really a homework question, but it does have a homeworky flavor, doesn't it...

Homework Statement

Given a joint probability mass function of two variables, is it always possible to obtain the marginals?
E.g., if I have a joint mass function for two Bernoulli random variables X and Y, like this:

$$f(x,y) = \begin{cases} 1/2 & \mbox{if } (x,y) = (0, 1) \\ 1/2 & \mbox{if } (x,y) = (1, 0) \\ 0 & \mbox{otherwise} \end{cases}$$

Can I obtain the marginals for X and Y?


The attempt at a solution

I want to say yes, but if the marginals for X and Y are

$$f(x) = \begin{cases} 1/2 & \mbox{if } x = 0 \\ 1/2 & \mbox{if } x = 1 \\ 0 & \mbox{otherwise} \end{cases}$$

and
$$f(y) = \begin{cases} 1/2 & \mbox{if } y = 0 \\ 1/2 & \mbox{if } y = 1 \\ 0 & \mbox{otherwise} \end{cases}$$

Then that produces a joint mass function

$$f(x,y) = \begin{cases} 1/4 & \mbox{if } (x,y) \in \{(0, 0), (0, 1), (1, 0), (1, 1) \} \\ 0 & \mbox{otherwise} \end{cases}$$

which is clearly wrong (for instance, it assigns probability 1/4 to (0, 0), which the given joint pmf says has probability 0).

So what's the right way to get at the marginals, assuming they, er, exist?
 
Your marginal probabilities are correct. But you don't get the joint distribution from the marginals by multiplying them together, because X and Y aren't independent. You can see that easily: for example, if you know X = 1, then the joint distribution tells you that Y = 0.
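In general the marginals always exist: just sum the joint pmf over the other variable, ##f_X(x) = \sum_y f(x,y)##. Here is a minimal Python sketch of that summation (the dict representation and function names are my own illustration, not from the thread):

```python
# A joint pmf stored as a dict mapping (x, y) -> probability.
# This is the pmf from the question: mass 1/2 at (0, 1) and (1, 0).
joint = {(0, 1): 0.5, (1, 0): 0.5, (0, 0): 0.0, (1, 1): 0.0}

def marginal_x(joint):
    """f_X(x) = sum over y of f(x, y)."""
    fx = {}
    for (x, y), p in joint.items():
        fx[x] = fx.get(x, 0.0) + p
    return fx

def marginal_y(joint):
    """f_Y(y) = sum over x of f(x, y)."""
    fy = {}
    for (x, y), p in joint.items():
        fy[y] = fy.get(y, 0.0) + p
    return fy

# Both marginals come out uniform on {0, 1}: f_X(0) = f_X(1) = 1/2,
# and likewise for Y, matching the marginals computed in the question.
```

Note that summing recovers exactly the uniform marginals written above, even though the product of those marginals does not recover the joint.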
 
Ah yes, thanks for that. I am rustier than I thought. Using the equation ##f(x,y) = f(x \mid y) \cdot f(y)##, then since ##f(x=0 \mid y=1) = f(x=1 \mid y=0) = 1## and ##f(x=0 \mid y=0) = f(x=1 \mid y=1) = 0##, I do indeed get the correct joint pmf.

But now, I am a little bit confused about the covariance between X and Y...

We know ##E(X)=1/2## and ##E(Y)=1/2##. And since ##XY## can only take the values 0 or 1, ##E(XY) = f(x=1)\,f(y=1)=(1/2)(1/2)=1/4## (is this right?).

Then using the equation ##Cov(X,Y) = E(XY) - E(X)E(Y)##, I get ##Cov(X,Y)=1/4-1/4=0##, implying that X and Y are uncorrelated. Could that be right?
 
From your joint distribution, the probability that XY = 1 is zero: the only outcomes with positive probability are (0, 1) and (1, 0), so XY is always 1·0 or 0·1. Hence E(XY) = 0.
 
Doh. Made the same mistake twice, didn't I?

I think I have it now. ##Cov(X,Y) = E(XY) - E(X)E(Y) = 0 - 1/4 = -1/4##, and since ##\sigma_X = \sigma_Y = 1/2##,

$$\rho_{XY}=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}=\frac{-1/4}{(1/2)(1/2)}=-1.$$

Makes a lot more sense.

Thanks for your help, LCKurtz!
 