1. The problem statement, all variables and given/known data

Consider three random variables [itex]X, Y, [/itex] and [itex]Z[/itex]. Suppose that:

[itex]Y[/itex] takes on [itex]k[/itex] values [itex]y_{1}, \dots, y_{k}[/itex]

[itex]X[/itex] takes on [itex]l[/itex] values [itex]x_{1}, \dots, x_{l}[/itex]

[itex]Z[/itex] takes on [itex]m[/itex] values [itex]z_{1}, \dots, z_{m}[/itex]

The joint probability distribution of [itex]X, Y, [/itex] and [itex]Z[/itex] is [itex]Pr(X=x, Y=y, Z=z)[/itex], and the conditional probability distribution of Y given X and Z is:

[itex]Pr(Y=y|X=x, Z=z) = \frac{Pr(Y=y, X=x, Z=z)}{Pr(X=x,Z=z)}[/itex]
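(As a quick sanity check of this definition, it can be verified numerically on a small joint table. The 2×3×2 distribution below is made up purely for illustration; NumPy's axis-wise sums do the marginalizing.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint distribution Pr(X, Y, Z) on a 2 x 3 x 2 grid,
# axis order (X, Y, Z); entries are arbitrary positives normalized to sum to 1.
joint = rng.random((2, 3, 2))
joint /= joint.sum()

# Pr(X=x, Z=z): marginalize Y out (axis 1).
pr_xz = joint.sum(axis=1)

# Pr(Y=y | X=x, Z=z) = Pr(X=x, Y=y, Z=z) / Pr(X=x, Z=z)
cond = joint / pr_xz[:, None, :]

# Each conditional distribution over Y must itself sum to 1.
assert np.allclose(cond.sum(axis=1), 1.0)
```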

(a) Explain how the marginal probability that [itex]Y=y[/itex] can be calculated from the joint probability distribution.

(b) Show that [itex]E(Y) = E[E(Y|X,Z)][/itex]

I have an intuitive grasp of how to reach the answers. However, I'd appreciate knowing if I need to show more steps/rigor and more precise notation in my proof for full credit. I just started learning about probability, summation, etc. and have trouble with notation and rigor. Thank you!

2. Relevant Formulae

Law of total expectation for two jointly distributed, discrete RVs:

[tex]E(Y) = \sum_{j = 1}^{l}E(Y|X=x_{j})Pr(X=x_{j})[/tex]

3. The attempt at a solution

Part (a)

We know:

[tex]Pr(Y=y|X=x, Z=z) = \frac{Pr(Y=y, X=x, Z=z)}{Pr(X=x,Z=z)}[/tex]

[tex]Pr(Y=y, X=x, Z=z) = Pr(Y=y|X=x, Z=z)\cdot Pr(X=x,Z=z)[/tex]

The marginal probability of Y is:

[tex]Pr(Y=y_{i}) = \sum_{j = 1}^{l}\sum_{h = 1}^{m}Pr(Y=y_{i}, X=x_{j}, Z=z_{h})[/tex]

That is, to get the marginal probability that [itex]Y=y_{i}[/itex], we sum the joint probabilities over all values of X and Z [is this assumption correct?]

Using this intuition, we can expand the marginal probability to get a more precise answer:

[tex]Pr(Y=y_{i}) = \sum_{j = 1}^{l}\left[\sum_{h = 1}^{m} Pr(Y=y_{i} | X = x_{j}, Z = z_{h})\cdot Pr(X=x_{j}, Z=z_{h})\right][/tex]

In the inner summation, Y = y_i and X = x_j are held fixed while we sum over all values of Z; each summand is the joint probability Pr(Y=y_i, X=x_j, Z=z_h), so the inner sum gives Pr(Y=y_i, X=x_j). The outer summation then adds these over ALL values of X, leaving the marginal probability Pr(Y=y_i).

Would this be sufficient for the proof? Am I making the correct assumptions?
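(One way to convince yourself the expansion is right is to check it numerically on a small made-up joint table, with k=3, l=4, m=2 chosen arbitrarily; summing the joint directly and summing the conditional-times-marginal product should agree.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical joint table Pr(Y=y_i, X=x_j, Z=z_h), axis order (Y, X, Z),
# with k=3, l=4, m=2; entries normalized to sum to 1.
joint = rng.random((3, 4, 2))
joint /= joint.sum()

# Direct marginal: sum the joint over X and Z.
marginal_direct = joint.sum(axis=(1, 2))

# Via the expansion: Pr(Y|X,Z) * Pr(X,Z), then sum over X and Z.
pr_xz = joint.sum(axis=0)
cond = joint / pr_xz
marginal_expanded = (cond * pr_xz).sum(axis=(1, 2))

assert np.allclose(marginal_direct, marginal_expanded)
```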

Part (b)

From the definition of E(Y), we know:

[tex]E(Y) = \sum_{i = 1}^{k}y_{i}Pr(Y=y_{i})[/tex]

Plugging in the marginal probability of Y derived in part (a):

[tex]E(Y) = \sum_{i = 1}^{k}y_{i}\left[\sum_{j = 1}^{l}\sum_{h = 1}^{m} Pr(Y=y_{i} | X = x_{j}, Z = z_{h})\cdot Pr(X=x_{j}, Z=z_{h})\right][/tex]

Rearranging:

[tex]E(Y) = \sum_{j = 1}^{l}\sum_{h = 1}^{m} [\sum_{i = 1}^{k}y_{i} Pr(Y=y_{i} | X = x_{j}, Z = z_{h})]Pr(X=x_{j}, Z=z_{h})[/tex]

The bracketed sum is the expectation of Y conditional on X = x_j and Z = z_h. Hence:

[tex]E(Y) = \sum_{j = 1}^{l}\sum_{h = 1}^{m} E(Y|X = x_{j}, Z=z_{h})Pr(X=x_{j}, Z=z_{h})[/tex]

Comparing this against the law of total expectation for two variables, we can see:

[tex]E(Y) = E[E(Y|X,Z)][/tex]

Do I need to show any additional steps before making this conclusion?
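(The identity can also be checked numerically. The sketch below uses the same kind of made-up 3×4×2 joint table as above, with arbitrary y-values, and compares E(Y) computed from the marginal against E[E(Y|X,Z)] computed from the conditional expectations.)

```python
import numpy as np

rng = np.random.default_rng(2)

y_vals = np.array([-1.0, 0.5, 2.0])   # hypothetical values y_1, ..., y_k
joint = rng.random((3, 4, 2))         # axis order (Y, X, Z), made-up table
joint /= joint.sum()

# Left side: E(Y) from the marginal distribution of Y.
e_y = (y_vals * joint.sum(axis=(1, 2))).sum()

# Right side: E[E(Y | X, Z)].
pr_xz = joint.sum(axis=0)             # Pr(X=x_j, Z=z_h)
cond = joint / pr_xz                  # Pr(Y=y_i | X=x_j, Z=z_h)
e_y_given_xz = (y_vals[:, None, None] * cond).sum(axis=0)
e_e = (e_y_given_xz * pr_xz).sum()

assert np.allclose(e_y, e_e)
```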

**Physics Forums | Science Articles, Homework Help, Discussion**

# Homework Help: Marginal probability & law of iterated expectations for three jointly distributed RVs
