Marginal Probability Mass Functions

In summary, two discrete random variables X and Y taking positive integer values have the joint probability mass function pXY(x,y) = 2^(-x-y). The marginal probability mass functions pX(x) and pY(y) are found by summing the joint pmf over all values of y and of x, respectively. Independence is checked by testing whether pXY(x,y) = pX(x)pY(y) for all x and y; the expectations follow from E[X] = ∑_x x pX(x) and, once independence is established, from the theorem that E[XY] = E[X]E[Y] for independent X and Y.
  • #1
twoski

Homework Statement



Discrete random variables X and Y , whose values are positive integers, have the joint
probability mass function pXY(x,y) = 2^(-x-y). Determine the marginal probability mass
functions pX(x) and pY(y). Are X and Y independent? Determine E[X], E[Y], and E[XY].

Homework Equations



Independence is determined by whether p(x,y) = p(x)p(y) for all x and y.

The Attempt at a Solution



My notes don't have much on the topic of determining marginal PMFs from a joint PMF... I was hoping someone could point me in the right direction.
 
  • #2
twoski said:

Homework Statement



Discrete random variables X and Y , whose values are positive integers, have the joint
probability mass function pXY(x,y) = 2^(-x-y). Determine the marginal probability mass
functions pX(x) and pY(y). Are X and Y independent? Determine E[X], E[Y], and E[XY].

Homework Equations



Independence is determined by whether p(x,y) = p(x)p(y) for all x and y.

The Attempt at a Solution



My notes don't have much on the topic of determining marginal PMFs from a joint PMF... I was hoping someone could point me in the right direction.

The marginal pmf of X is
[tex] p_X(x) = P\{ X = x, Y \leq \infty\} = \sum_{\text{all }y} p(x,y).[/tex]
BTW: it is bad form to use the same symbol p to stand for three different things in the same problem. Instead, use subscripts, like this (##p_X(x), p_Y(y)##) or different letters, like this: ##g(x)## and ##h(y)##.
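That summation can be checked numerically. A minimal sketch (the helper names and the truncation depth of 60 terms are my own choices) that approximates ##p_X(x) = \sum_y 2^{-x-y}## by truncating the geometric tail:

```python
from fractions import Fraction

# Joint pmf p(x, y) = 2^(-x - y) for positive integers x, y.
def joint(x, y):
    return Fraction(1, 2 ** (x + y))

# Marginal p_X(x) = sum over all y of p(x, y); the tail is geometric,
# so truncating after a few dozen terms is essentially exact.
def marginal_x(x, terms=60):
    return sum(joint(x, y) for y in range(1, terms + 1))

for x in range(1, 5):
    # The truncated sum matches the closed form 2^(-x).
    print(x, float(marginal_x(x)), 2.0 ** -x)
```

The same sum in closed form: ##\sum_{y \geq 1} 2^{-x-y} = 2^{-x} \sum_{y \geq 1} 2^{-y} = 2^{-x}##.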
 
  • #3
Ray Vickson said:
The marginal pmf of X is
[tex] p_X(x) = P\{ X = x, Y \leq \infty\} = \sum_{\text{all }y} p(x,y).[/tex]
BTW: it is bad form to use the same symbol p to stand for three different things in the same problem. Instead, use subscripts, like this (##p_X(x), p_Y(y)##) or different letters, like this: ##g(x)## and ##h(y)##.

So if that's the marginal PMF of X, then for Y...

[tex] p_Y(y) = P\{ X \leq \infty\ , Y = y} = \sum_{\text{all }x} p(x,y).[/tex]
 
  • #4
twoski said:
So if that's the marginal PMF of X, then for Y...

[tex] p_Y(y) = P\{ X \leq \infty\ , Y = y\} = \sum_{\text{all }x} p_{X,Y}(x,y).[/tex]

Fixed it, and yes that is correct.
 
  • #5
So using these two PMFs I have to determine whether X and Y are independent. Going by the definition, I'd say they are independent.

Is this right?

[tex] E[X] = \sum_{\text{k}} x_k * p_X(x_k) = \sum_{\text{k}} x_k * \sum_{\text{x}} p(x,y).[/tex]
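The independence claim can be verified directly from the definition p(x,y) = p_X(x)p_Y(y). A sketch (function names are mine, and only a finite grid of pairs is checked) using the candidate marginals ##p_X(x) = 2^{-x}## and ##p_Y(y) = 2^{-y}##:

```python
from fractions import Fraction

def joint(x, y):          # p(x, y) = 2^(-x - y)
    return Fraction(1, 2 ** (x + y))

def pX(x):                # marginal of X: 2^(-x)
    return Fraction(1, 2 ** x)

def pY(y):                # marginal of Y: 2^(-y)
    return Fraction(1, 2 ** y)

# Independence requires p(x, y) = p_X(x) p_Y(y) for every pair (x, y).
ok = all(joint(x, y) == pX(x) * pY(y)
         for x in range(1, 30) for y in range(1, 30))
print("joint pmf factorises on the checked grid:", ok)
```

Exact `Fraction` arithmetic avoids any floating-point doubt: the factorisation 2^(-x-y) = 2^(-x) · 2^(-y) holds identically.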
 
  • #6
Your ##E(X)## is right. For the next question, use the theorem: If ##X,Y## are independent, then ##E(XY) = E(X)E(Y)##.
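A numeric sanity check of the theorem for this particular joint pmf (the truncation bound N = 60 is my own choice; the geometric tails make the truncation error negligible):

```python
N = 60  # truncation bound for the infinite sums

def p(x, y):
    return 2.0 ** (-(x + y))

pairs = [(x, y) for x in range(1, N + 1) for y in range(1, N + 1)]
EX = sum(x * p(x, y) for x, y in pairs)
EY = sum(y * p(x, y) for x, y in pairs)
EXY = sum(x * y * p(x, y) for x, y in pairs)

# Numerically EX and EY come out ~2 and EXY ~4, consistent with
# E[XY] = E[X] E[Y] for independent variables.
print(EX, EY, EXY)
```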
 
  • #7
twoski said:
So using these two PMFs I have to determine whether X and Y are independent. Going by the definition, I'd say they are independent.

Is this right?

[tex] E[X] = \sum_{\text{k}} x_k * p_X(x_k) = \sum_{\text{k}} x_k * \sum_{\text{x}} p(x,y).[/tex]

Your equation makes no sense: it is essentially summing over x twice, and not doing anything with y.
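For reference, a version of that two-step computation with consistent indices would sum over y on the inside, so that the inner sum produces the marginal before x is weighted:

[tex] E[X] = \sum_{x} x \, p_X(x) = \sum_{x} x \sum_{\text{all } y} p_{X,Y}(x,y). [/tex]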
 

1. What is a marginal probability mass function?

A marginal probability mass function gives the probabilities of the outcomes of a single random variable on its own, without reference to any other variables. It is obtained from a joint probability mass function, which assigns probabilities to combinations of several variables, by summing out the variables that are not of interest.

2. How is a marginal probability mass function different from a joint probability mass function?

A marginal probability mass function describes the distribution of a single variable, while a joint probability mass function assigns probabilities to combinations of values of several variables at once. Marginal probability mass functions are used when we are only interested in one variable; joint probability mass functions are used when we want to understand the relationship between multiple variables.

3. How is a marginal probability mass function calculated?

A marginal probability mass function is calculated by fixing the value of the variable of interest and summing the joint probabilities over all values of the other variable. This can be represented with a formula: P(X = x) = ∑_y P(X = x, Y = y), where X is the random variable we are interested in and the sum runs over every value y of the other variable Y.
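That formula can be illustrated with a small, made-up joint table (the numbers here are purely illustrative):

```python
# Hypothetical joint pmf over two small discrete variables.
joint = {
    (1, 1): 0.25, (1, 2): 0.25,
    (2, 1): 0.30, (2, 2): 0.20,
}

# P(X = x) = sum over y of P(X = x, Y = y).
pX = {}
for (x, y), prob in joint.items():
    pX[x] = pX.get(x, 0.0) + prob

print(pX)  # each x value carries total probability 0.5
```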

4. What is the purpose of using marginal probability mass functions?

Marginal probability mass functions are used to understand the probability of a specific outcome of a random variable. They are helpful in simplifying complex relationships between variables and making predictions about individual outcomes. They also allow us to compare the probabilities of different outcomes of the same variable.

5. Can marginal probability mass functions be used for continuous random variables?

No, marginal probability mass functions can only be used for discrete random variables, which have a finite or countably infinite number of possible outcomes. For continuous random variables, we use marginal probability density functions instead.
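For completeness, the continuous analogue replaces the sum with an integral over the joint density:

[tex] f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy. [/tex]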
