# Always possible to obtain marginals from joint pmf?

1. May 11, 2010

### jimholt

Obtain marginal probability mass function (pmf) given joint pmf

Not really a homework question, but it does have a homeworky flavor, doesn't it...

The problem statement, all variables and given/known data

Given a joint probability mass function of two variables, is it always possible to obtain the marginals?
E.g., if I have a joint mass function for two Bernoulli random variables X and Y, like this:

$$f(x,y) = \begin{cases} 1/2 & \mbox{if } (x,y) = (0, 1) \\ 1/2 & \mbox{if } (x,y) = (1, 0) \\ 0 & \mbox{otherwise} \end{cases}$$

Can I obtain the marginals for X and Y?

The attempt at a solution

I want to say yes, but if the marginals for X and Y are

$$f(x) = \begin{cases} 1/2 & \mbox{if } x = 0 \\ 1/2 & \mbox{if } x = 1 \\ 0 & \mbox{otherwise} \end{cases}$$

and
$$f(y) = \begin{cases} 1/2 & \mbox{if } y = 0 \\ 1/2 & \mbox{if } y = 1 \\ 0 & \mbox{otherwise} \end{cases}$$

Then that produces a joint mass function

$$f(x,y) = \begin{cases} 1/4 & \mbox{if } (x,y) \in \{(0, 0), (0, 1), (1, 0), (1, 1) \} \\ 0 & \mbox{otherwise} \end{cases}$$

which is clearly wrong.

So what's the right way to get at the marginals, assuming they, er, exist?

Last edited: May 11, 2010
2. May 11, 2010

### LCKurtz

Your marginal probabilities are correct — you get each one by summing the joint pmf over the other variable, $$f_X(x) = \sum_y f(x,y).$$ But you don't get the joint distribution back from the marginals by multiplying them together, because X and Y aren't independent. You can see that easily: for example, if you know X = 1, then Y must be 0, since (1, 1) has probability zero under the joint distribution.
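For anyone who wants to check the marginals numerically: here's a quick Python sketch (the dict-of-tuples representation of the pmf is just for illustration), summing the joint over the other variable.

```python
from collections import defaultdict

# Joint pmf from the problem: P(X=0,Y=1) = P(X=1,Y=0) = 1/2
joint = {(0, 1): 0.5, (1, 0): 0.5}

# Marginal of X: sum the joint over all values of y (and vice versa for Y)
fx = defaultdict(float)
fy = defaultdict(float)
for (x, y), p in joint.items():
    fx[x] += p
    fy[y] += p

print(dict(fx))  # {0: 0.5, 1: 0.5}
print(dict(fy))  # {1: 0.5, 0: 0.5}
```

Both marginals come out uniform on {0, 1}, matching the pmfs written above.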

3. May 11, 2010

### jimholt

Ah yes, thanks for that. I am rustier than I thought. Using the equation $$f(x,y) = f(x | y) \cdot f(y)$$, and since $$f(x=0|y=1) = f(x=1|y=0) = 1$$ and $$f(x=0|y=0) = f(x=1|y=1) = 0$$, I do indeed get the correct joint pmf.
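That reconstruction is easy to verify in a few lines of Python (again just a toy dict representation, with conditional-pmf keys written as (x, y) pairs):

```python
# Marginal of Y and conditional pmf of X given Y, as worked out above
fy = {0: 0.5, 1: 0.5}
fx_given_y = {(0, 1): 1.0, (1, 0): 1.0, (0, 0): 0.0, (1, 1): 0.0}

# f(x, y) = f(x | y) * f(y)
joint = {(x, y): fx_given_y[(x, y)] * fy[y] for (x, y) in fx_given_y}
print(joint)  # {(0, 1): 0.5, (1, 0): 0.5, (0, 0): 0.0, (1, 1): 0.0}
```

The nonzero entries land on (0, 1) and (1, 0) with probability 1/2 each, matching the joint pmf in the problem statement.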

But now, I am a little bit confused about the covariance between X and Y...

We know $$E(X)=1/2$$ and $$E(Y)=1/2$$. And since $$XY$$ can only take the values 0 or 1, $$E(XY) = f(x=1)f(y=1)=(1/2)(1/2)=1/4$$ (is this right?).

Then using the equation $$Cov(X,Y) = E(XY) - E(X)E(Y)$$, I get $$Cov(X,Y)=1/4-1/4=0$$, implying that X and Y are uncorrelated. Could that be right?

4. May 11, 2010

### LCKurtz

The probability that XY = 1 is zero under your joint distribution: the only outcomes with positive probability are (0, 1) and (1, 0), which give XY = 0·1 or 1·0. So XY = 0 always, and E(XY) = 0.
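Spelled out numerically (same toy dict representation of the joint pmf), E(XY) is just the probability-weighted sum of the products:

```python
# Joint pmf: P(X=0,Y=1) = P(X=1,Y=0) = 1/2
joint = {(0, 1): 0.5, (1, 0): 0.5}

# E(XY) = sum over outcomes of x*y*p; every positive-probability outcome has x*y = 0
e_xy = sum(x * y * p for (x, y), p in joint.items())
print(e_xy)  # 0.0
```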

5. May 11, 2010

### jimholt

Doh. Made the same mistake twice, didn't I?

I think I have it now. Cov(X,Y)=-1/4 and then

$$\rho_{XY}=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}=-1.$$

Makes a lot more sense.
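As a final sanity check, the covariance and correlation can be computed straight from the joint pmf (a minimal Python sketch, using the same dict representation as before):

```python
import math

# Joint pmf: P(X=0,Y=1) = P(X=1,Y=0) = 1/2
joint = {(0, 1): 0.5, (1, 0): 0.5}

e_x  = sum(x * p for (x, _), p in joint.items())      # E(X)  = 0.5
e_y  = sum(y * p for (_, y), p in joint.items())      # E(Y)  = 0.5
e_xy = sum(x * y * p for (x, y), p in joint.items())  # E(XY) = 0.0

cov = e_xy - e_x * e_y                                             # -0.25
var_x = sum(x * x * p for (x, _), p in joint.items()) - e_x ** 2   # 0.25
var_y = sum(y * y * p for (_, y), p in joint.items()) - e_y ** 2   # 0.25
rho = cov / (math.sqrt(var_x) * math.sqrt(var_y))                  # -1.0
print(cov, rho)  # -0.25 -1.0
```

Cov(X, Y) = -1/4 and ρ = -1, as worked out above: X and Y are perfectly negatively correlated, which makes sense since Y = 1 - X with probability one.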