
Always possible to obtain marginals from joint pmf?

  1. May 11, 2010 #1
    Obtain marginal probability mass function (pmf) given joint pmf

    Not really a homework question, but it does have a homeworky flavor, doesn't it...

    The problem statement, all variables and given/known data

    Given a joint probability mass function of two variables, is it always possible to obtain the marginals?
    E.g., if I have a joint mass function for two Bernoulli random variables X and Y, like this:

    [tex]
    f(x,y) = \begin{cases} 1/2 & \mbox{if } (x,y) = (0, 1) \\
    1/2 & \mbox{if } (x,y) = (1, 0) \\
    0 & \mbox{otherwise} \end{cases}
    [/tex]

    Can I obtain the marginals for X and Y?


    The attempt at a solution

    I want to say yes, but if the marginals for X and Y are

    [tex]
    f(x) = \begin{cases} 1/2 & \mbox{if } x = 0 \\
    1/2 & \mbox{if } x = 1 \\
    0 & \mbox{otherwise} \end{cases}
    [/tex]

    and
    [tex]
    f(y) = \begin{cases} 1/2 & \mbox{if } y = 0 \\
    1/2 & \mbox{if } y = 1 \\
    0 & \mbox{otherwise} \end{cases}
    [/tex]

    then multiplying them together produces the joint mass function

    [tex]
    f(x,y) = \begin{cases} 1/4 & \mbox{if } (x,y) \in \{(0, 0), (0, 1), (1, 0), (1, 1) \} \\ 0 & \mbox{otherwise} \end{cases}
    [/tex]

    which is clearly wrong.

    So what's the right way to get at the marginals, assuming they, er, exist?
     
  3. May 11, 2010 #2

    LCKurtz


    Your marginal probabilities are correct. But you don't get the joint distribution from the marginals by multiplying them together, because X and Y aren't independent. You can see that easily: for example, if you know X = 1, then the joint distribution tells you Y must be 0.
     
  4. May 11, 2010 #3
    Ah yes, thanks for that. I am rustier than I thought. Using the equation [tex] f(x,y) = f(x | y) \cdot f(y), [/tex] and since [tex]f(x=0|y=1) = f(x=1|y=0) = 1[/tex] and [tex]f(x=0|y=0) = f(x=1|y=1) = 0[/tex], I do indeed get the correct joint pmf.

    But now, I am a little bit confused about the covariance between X and Y...

    We know [tex]E(X)=1/2[/tex] and [tex]E(Y)=1/2[/tex]. And since [tex]XY[/tex] can only take the values 0 or 1, [tex]E(XY) = f(x=1)f(y=1)=(1/2)(1/2)=1/4[/tex] (is this right?).

    Then using the equation [tex]Cov(X,Y) = E(XY) - E(X)E(Y)[/tex], I get [tex]Cov(X,Y)=1/4-1/4=0[/tex], implying that X and Y are uncorrelated. Could that be right?
     
  5. May 11, 2010 #4

    LCKurtz


    From your joint distribution, the probability that XY = 1 is zero: the only products that occur are 1*0 and 0*1 (and 0*0). So E(XY) = 0.
     
  6. May 11, 2010 #5
    Doh. Made the same mistake twice, didn't I?

    I think I have it now. Cov(X,Y) = 0 - (1/2)(1/2) = -1/4, and since X and Y are each Bernoulli(1/2), [tex]\sigma_X = \sigma_Y = \sqrt{(1/2)(1/2)} = 1/2[/tex], so

    [tex]\rho_{XY}=\frac{Cov(X,Y)}{\sigma_X\sigma_Y}=\frac{-1/4}{1/4}=-1.[/tex]

    Makes a lot more sense.

    Thanks for your help, LCKurtz!
     