Probability Mass Function and Marginal Probability


Discussion Overview

The discussion revolves around the computation of marginal probability mass functions from a joint probability mass function, specifically exploring the transition from \( p_{X,Y}(x,y) \) to \( p_X(x) \) and \( p_Y(y) \). Participants also delve into the joint probability mass function of transformed variables \( X^2 \) and \( Y^2 \), examining the necessary summations and conditions involved.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Exploratory

Main Points Raised

  • Some participants inquire whether marginal probability mass functions can be derived solely from the joint probability mass function without knowledge of conditional probabilities.
  • One participant suggests that if \( p_{X,Y}(x,y) \) is known, computing \( p_X(x) \) through summation over \( y \) should be straightforward.
  • Another participant provides an example with specific values for \( p_{X,Y}(x,y) \) to illustrate the computation of \( p_X(x) \).
  • There is a proposal to find the joint probability mass function \( p_{X^2,Y^2}(x^2,y^2) \) from \( p_{X,Y}(x,y) \), with a focus on the necessary summation over combinations of \( (x,y) \).
  • Participants discuss the formulation of the equation for \( g(a,b) \) and clarify the correct variables to use, specifically noting the need for square roots in the context of transformations.

Areas of Agreement / Disagreement

Participants generally agree on the methods for computing marginal and joint probability mass functions, but there are nuances in understanding the transformations and the specific conditions required for summation. The discussion remains exploratory with no definitive consensus on all aspects.

Contextual Notes

Some assumptions about the values and ranges of \( X \) and \( Y \) are not explicitly stated, which may affect the generalizability of the proposed methods. The discussion also does not resolve the complexities involved in the transformations of the variables.

EngWiPy
Hi,

If I have a joint probability mass function [tex]p_{X,Y}(x,y)[/tex], can I get the marginal probability mass functions [tex]p_X(x)[/tex] and [tex]p_Y(y)[/tex] without any knowledge of the conditional probability function of either variable, or of the probability of each event? I mean, I know that:

[tex]p_X(x)=\sum_y p_{X,Y}(x,y)=\sum_y p(x\mid Y=y)\,\text{Pr}\{Y=y\}[/tex]

But I just have [tex]p_{X,Y}(x,y)[/tex]. Can I?

Thanks
 
S_David said:
Hi,

[tex]p_X(x)=\sum_y p_{X,Y}(x,y)=\sum_y p(x\mid Y=y)\,\text{Pr}\{Y=y\}[/tex]

But I just have [tex]p_{X,Y}(x,y)[/tex]. Can I?

Thanks

If you know [tex]p_{X,Y}(x,y)[/tex], what would stop you from computing [tex]\sum_y p_{X,Y}(x,y)[/tex] ?
 
Stephen Tashi said:
If you know [tex]p_{X,Y}(x,y)[/tex], what would stop you from computing [tex]\sum_y p_{X,Y}(x,y)[/tex] ?

How? Let us assume for the sake of the argument that [tex]p_{X,Y}(x,y)[/tex] is 0.4 when x=y=1, and 0.6 when x=y=2. Now according to the equation you pointed to, we have:

[tex]p_X(x)=p_{X,Y}(x,1)+p_{X,Y}(x,2)[/tex]

OK, then what?
 
If you "have" [tex]P_{XY}(x,y)[/tex] then you know the value of things like [tex]P_{XY}(1,2)[/tex] and [tex]P_{XY}(2,1)[/tex] so there is no problem doing those sums.

For example, if the only possible values of the variables are 1 and 2, then
[tex]P_X(1) = P_{XY}(1,1) + P_{XY}(1,2)[/tex]
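Stephen Tashi's recipe above can be sketched in a few lines of Python (a minimal illustration, not part of the original thread; it assumes the joint p.m.f. is stored as a dictionary mapping (x, y) pairs to probabilities, using the 0.4/0.6 example values from this thread):

```python
from collections import defaultdict

# Joint p.m.f. from the example in this thread:
# p_{X,Y}(1,1) = 0.4 and p_{X,Y}(2,2) = 0.6; pairs with probability 0 are omitted.
p_xy = {(1, 1): 0.4, (2, 2): 0.6}

def marginal(p_xy, axis=0):
    """Sum the joint p.m.f. over the other variable: axis=0 gives p_X, axis=1 gives p_Y."""
    p = defaultdict(float)
    for pair, prob in p_xy.items():
        p[pair[axis]] += prob
    return dict(p)

print(marginal(p_xy, axis=0))  # p_X: {1: 0.4, 2: 0.6}
print(marginal(p_xy, axis=1))  # p_Y: {1: 0.4, 2: 0.6}
```

Summing over the missing (zero-probability) pairs contributes nothing, so omitting them from the dictionary is harmless.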
 
Stephen Tashi said:
If you "have" [tex]P_{XY}(x,y)[/tex] then you know the value of things like [tex]P_{XY}(1,2)[/tex] and [tex]P_{XY}(2,1)[/tex] so there is no problem doing those sums.

For example, if the only possible values of the variables are 1 and 2, then
[tex]P_X(1) = P_{XY}(1,1) + P_{XY}(1,2)[/tex]

Good. Now what if we need to find the joint p.m.f. of [tex]X^2\text{ and }Y^2[/tex]? That is, [tex]p_{X^2,Y^2}(x^2,y^2)[/tex] from [tex]p_{X,Y}(x,y)[/tex]?

Thanks for helping.
 
S_David said:
Good. Now what if we need to find the joint p.m.f. of [tex]X^2\text{ and }Y^2[/tex]? That is, [tex]p_{X^2,Y^2}(x^2,y^2)[/tex] from [tex]p_{X,Y}(x,y)[/tex]?

Thanks for helping.

If [itex]g(r,s)[/itex] is the joint p.m.f of [itex](X^2,Y^2)[/itex], to find [itex]g(a,b)[/itex], you must sum [itex]p_{XY}(x,y)[/itex] over all combinations of [itex](x,y)[/itex] that give [itex]x^2 = a[/itex] and [itex]y^2 = b[/itex].
 
Stephen Tashi said:
If [itex]g(r,s)[/itex] is the joint p.m.f of [itex](X^2,Y^2)[/itex], to find [itex]g(a,b)[/itex], you must sum [itex]p_{XY}(x,y)[/itex] over all combinations of [itex](x,y)[/itex] that give [itex]x^2 = a[/itex] and [itex]y^2 = b[/itex].

OK, I am not quite getting the idea. Let me try to write this as an equation. Using your notation, we have:

[tex]g(a,b)=\sum_{(x,y)\,:\,x^2=a,\;y^2=b}p_{X,Y}(x,y)[/tex]

where the colon means "such that". So, we have the following choices of x and y that satisfy the conditions:

[tex]x=\pm a \text{ and }y=\pm b[/tex]

Am I right so far?
 
You meant [itex]\sqrt{a}[/itex] and [itex]\sqrt{b}[/itex], but yes, that's the general idea.
 
Stephen Tashi said:
You meant [itex]\sqrt{a}[/itex] and [itex]\sqrt{b}[/itex], but yes, that's the general idea.

Yes, you are right: the square roots of the values. So, for the example I gave previously, we have the following:

[tex]g(a,b)=\left\{\begin{array}{cc}0.4&a=b=1\\0.6&a=b=4\end{array}\right.[/tex]

right?
 
Right
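The whole transformation can also be checked numerically (a minimal Python sketch, not part of the original thread; it assumes the joint p.m.f. is stored as a dictionary and sums [itex]p_{X,Y}(x,y)[/itex] over all [itex](x,y)[/itex] with [itex]x^2=a[/itex] and [itex]y^2=b[/itex], as described above):

```python
from collections import defaultdict

# Joint p.m.f. from the example in this thread.
p_xy = {(1, 1): 0.4, (2, 2): 0.6}

def transformed_pmf(p_xy):
    """Joint p.m.f. g(a, b) of (X^2, Y^2): accumulate p_{X,Y}(x, y)
    into the bucket (x^2, y^2), which automatically merges the
    contributions of x = +sqrt(a) and x = -sqrt(a) (same for y)."""
    g = defaultdict(float)
    for (x, y), prob in p_xy.items():
        g[(x ** 2, y ** 2)] += prob
    return dict(g)

print(transformed_pmf(p_xy))  # {(1, 1): 0.4, (4, 4): 0.6}, matching g(a,b) above
```

If the variables could take negative values, e.g. [itex]x=\pm 1[/itex], the two probabilities would land in the same [itex](1, b)[/itex] bucket and be added, which is exactly the summation over [itex]x=\pm\sqrt{a}[/itex] discussed above.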
 
Stephen Tashi said:
Right

Thank you so much.
 
