Marginal probability & law of iterated expectations for three jointly distributed RVs

In summary, you have correctly explained how the marginal probability of Y can be calculated from the joint probability distribution, and you have shown that E(Y) = E[E(Y|X,Z)].
  • #1
slakedlime

Homework Statement


Consider three random variables [itex]X, Y, [/itex] and [itex]Z[/itex]. Suppose that:
[itex]Y[/itex] takes on [itex]k[/itex] values [itex]y_{1}... y_{k}[/itex]
[itex]X[/itex] takes on [itex]l[/itex] values [itex]x_{1}... x_{l}[/itex]
[itex]Z[/itex] takes on [itex]m[/itex] values [itex]z_{1}... z_{m}[/itex]

The joint probability distribution of [itex]X, Y, [/itex] and [itex]Z[/itex] is [itex]Pr(X=x, Y=y, Z=z)[/itex], and the conditional probability distribution of Y given X and Z is:
[itex]Pr(Y=y|X=x, Z=z) = \frac{Pr(Y=y, X=x, Z=z)}{Pr(X=x,Z=z)}[/itex]

(a) Explain how the marginal probability that [itex]Y=y[/itex] can be calculated from the joint probability distribution.
(b) Show that [itex]E(Y) = E[E(Y|X,Z)][/itex]

I have an intuitive grasp of how to reach the answers. However, I'd appreciate knowing if I need to show more steps/rigor and more precise notation in my proof for full credit. I just started learning about probability, summation, etc. and have trouble with notation and rigor. Thank you!

Relevant Formulae
Law of total expectation for two jointly distributed, discrete RVs:
[tex]E(Y) = \sum_{i = 1}^{l}E(Y|X=x_{i})Pr(X=x_{i})[/tex]

The Attempt at a Solution



Part (a)

We know:
[tex]Pr(Y=y|X=x, Z=z) = \frac{Pr(Y=y, X=x, Z=z)}{Pr(X=x,Z=z)}[/tex]
[tex]Pr(Y=y, X=x, Z=z) = Pr(Y=y|X=x, Z=z)*Pr(X=x,Z=z)[/tex]

The marginal probability of Y is:
[tex]Pr(Y=y_{i}) = \sum_{j = 1}^{l}\sum_{h = 1}^{m}Pr(Y=y_{i}, X=x_{j}, Z=z_{h})[/tex]
That is, to get the marginal probability that [itex]Y=y_{i}[/itex], we sum the joint probabilities over all possible values of X and Z [is this assumption correct?]

Using this intuition, we can expand each joint probability with the conditional-probability identity above to get a more explicit expression:
[tex]Pr(Y=y_{i}) = \sum_{j = 1}^{l}[\sum_{h = 1}^{m} Pr(Y=y_{i} | X = x_{j}, Z = z_{h})*Pr(X=x_{j}, Z=z_{h})][/tex]

In the inner summation, holding X fixed at [itex]x_{j}[/itex], we sum the joint probabilities [itex]Pr(Y=y_{i}, X=x_{j}, Z=z_{h})[/itex] over all values of Z. Then, with the outer summation, we add up these results over ALL values of X.

Would this be sufficient for the proof? Am I making the correct assumptions?
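
As a sanity check (not part of the proof), here is a quick numerical sketch with NumPy on a small made-up joint pmf; the shapes and values are arbitrary, chosen only to test that summing the joint pmf over X and Z, or equivalently weighting the conditionals by Pr(X, Z), gives the marginal of Y:

[code]
import numpy as np

# Made-up joint pmf Pr(Y=y_i, X=x_j, Z=z_h) on a k x l x m grid (k=3, l=2, m=4).
rng = np.random.default_rng(0)
joint = rng.random((3, 2, 4))
joint /= joint.sum()  # normalize so all probabilities sum to 1

# Marginal of Y: sum the joint pmf over all values of X and Z.
marginal_Y = joint.sum(axis=(1, 2))

# Same marginal via the conditional decomposition:
# Pr(Y=y_i) = sum_j sum_h Pr(Y=y_i | X=x_j, Z=z_h) * Pr(X=x_j, Z=z_h)
pr_XZ = joint.sum(axis=0)   # Pr(X=x_j, Z=z_h), shape (l, m)
cond_Y = joint / pr_XZ      # Pr(Y=y_i | X=x_j, Z=z_h), broadcast over the Y axis
marginal_Y_alt = (cond_Y * pr_XZ).sum(axis=(1, 2))

print(np.allclose(marginal_Y, marginal_Y_alt))  # True
[/code]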

Part (b)
From the definition of E(Y), we know:
[tex]E(Y) = \sum_{i = 1}^{k}y_{i}Pr(Y=y_{i})[/tex]

Plugging in the marginal probability of Y derived in part (a):
[tex]E(Y) = \sum_{i = 1}^{k}y_{i}[\sum_{j = 1}^{l}\sum_{h = 1}^{m} Pr(Y=y_{i} | X = x_{j}, Z = z_{h})*Pr(X=x_{j}, Z=z_{h})][/tex]

Rearranging (the sums are all finite, so we can move the sum over i inside and pull the factor [itex]Pr(X=x_{j}, Z=z_{h})[/itex] out of it):
[tex]E(Y) = \sum_{j = 1}^{l}\sum_{h = 1}^{m} [\sum_{i = 1}^{k}y_{i} Pr(Y=y_{i} | X = x_{j}, Z = z_{h})]Pr(X=x_{j}, Z=z_{h})[/tex]

The bracketed part is, by definition, the conditional expectation of Y given [itex]X = x_{j}[/itex] and [itex]Z = z_{h}[/itex]. Hence:
[tex]E(Y) = \sum_{j = 1}^{l}\sum_{h = 1}^{m} E(Y|X = x_{j}, Z=z_{h})Pr(X=x_{j}, Z=z_{h})[/tex]

Comparing this against the law of total expectation for two variables, with the pair [itex](X, Z)[/itex] playing the role of the single conditioning variable, we can see:
[tex]E(Y) = E[E(Y|X,Z)][/tex]

Do I need to show any additional steps before making this conclusion?
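
The same kind of numerical check (again, not a substitute for the algebra) for part (b), comparing E(Y) computed from the marginal against the iterated expectation; the y-values below are arbitrary:

[code]
import numpy as np

rng = np.random.default_rng(1)
y_vals = np.array([-1.0, 0.5, 2.0])   # arbitrary values y_1..y_k (k=3)
joint = rng.random((3, 2, 4))         # made-up joint pmf Pr(Y=y_i, X=x_j, Z=z_h)
joint /= joint.sum()

# Left side: E(Y) = sum_i y_i * Pr(Y=y_i), using the marginal of Y
E_Y = (y_vals * joint.sum(axis=(1, 2))).sum()

# Right side: E[E(Y|X,Z)] = sum_j sum_h E(Y|X=x_j, Z=z_h) * Pr(X=x_j, Z=z_h)
pr_XZ = joint.sum(axis=0)                                          # Pr(X=x_j, Z=z_h)
E_Y_given_XZ = (y_vals[:, None, None] * joint).sum(axis=0) / pr_XZ
E_of_E = (E_Y_given_XZ * pr_XZ).sum()

print(np.isclose(E_Y, E_of_E))  # True
[/code]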
 
  • #2
Thank you for showing your work!

Your proof for part (a) looks good. Just one clarification: when you say "sum up all the joint probabilities for X, Y, and Z", you should specify that you are summing over all possible values of X and Z, with Y held fixed at the value of interest. This is because the joint probability distribution assigns a probability to each possible combination of values of X, Y, and Z, and the marginal of Y is obtained by adding up those probabilities across every combination of the other two variables.

For part (b), your proof is correct. Just a minor suggestion: when you write "hence", explain how you arrived at that conclusion. In this case, you can say "hence, by the law of total expectation, we have...". This will make your proof clearer and more rigorous. Overall, your proof looks good and covers all the necessary steps.
 

1. What is "marginal probability"?

Marginal probability refers to the probability of a single event occurring, without considering any other events. In the context of three jointly distributed random variables (RVs), it refers to the probability of one of the RVs taking on a specific value, regardless of the values of the other two RVs.

2. How is marginal probability calculated for three jointly distributed RVs?

To calculate the marginal probability that one of the RVs takes a particular value, you sum the joint probabilities over all possible values of the other two RVs. This can be represented mathematically as P(X=x) = ∑ P(X=x, Y=y, Z=z), where X, Y, and Z are the three RVs and the sum runs over every value of Y and Z.
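
For a small made-up illustration: suppose Y and Z each take two values, and the four joint probabilities involving X = x₁ are 0.10, 0.05, 0.20, and 0.15. Then P(X=x₁) = 0.10 + 0.05 + 0.20 + 0.15 = 0.50.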

3. What is the "law of iterated expectations"?

The law of iterated expectations (also called the law of total expectation) states that the unconditional expected value of a random variable equals the expected value of its conditional expectation. In the context of three jointly distributed RVs, it says that averaging the conditional expectation of one RV, given the other two, over the distribution of those two recovers the unconditional expectation: E(X) = E[E(X|Y, Z)].

4. How is the law of iterated expectations applied to three jointly distributed RVs?

To apply the law of iterated expectations to three jointly distributed RVs, you first calculate the conditional expectation of one of the RVs for each fixed pair of values of the other two. You then weight each of these conditional expectations by the probability of that pair of values and sum them up. This can be represented mathematically as E(X) = ∑ E(X|Y=y, Z=z) * P(Y=y, Z=z), where X, Y, and Z are the three RVs and the sum runs over all pairs (y, z).
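
For a made-up numerical illustration: if (Y, Z) takes only two value pairs, with P(Y=y₁, Z=z₁) = 0.4 and P(Y=y₂, Z=z₂) = 0.6, and the conditional expectations are E(X|Y=y₁, Z=z₁) = 1 and E(X|Y=y₂, Z=z₂) = 3, then E(X) = 0.4·1 + 0.6·3 = 2.2.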

5. Why are marginal probability and the law of iterated expectations important in probability theory?

Marginal probability and the law of iterated expectations are important concepts in probability theory because they allow for the calculation of probabilities and expected values for individual random variables in a joint distribution. This is useful in many applications, such as risk assessment and decision making, where understanding the behavior of individual variables is crucial.
