# Conditional Expectation of Multiple Independent Random Variables

## Homework Statement

Given X,Y,Z are 3 N(1,1) random variables,

(1)
Find E[ XY | Y + Z = 1]

## The Attempt at a Solution

I'm honestly completely lost in statistics... I never quite grasped the intuition behind expectation, because my professor lives on the numerical side rather than relating the ideas to each other. So these procedures are turning into foggy abstract generalities, and any light would be superb.

My book's explanation of the multi-variable case is that it's essentially the same as the joint case, which is in turn the same as the single-variable case.

So:
For two jointly continuous random variables,
$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x \mid y)\,dx$$
*note that ##X## could easily be replaced by ##g(X)##, and that ##f_{X|Y}(x \mid y)## is defined as ##f_{XY}(x,y)/f_Y(y)##
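As a concrete sanity check of that definition, here is a minimal sketch (not part of the original problem; the correlation ##\rho##, slice point ##y_0##, and grid bounds are arbitrary choices) that evaluates ##E[X \mid Y = y_0]## numerically for a standard bivariate normal and compares it with the known closed form ##\rho y_0##:

```python
import numpy as np

# Hypothetical example: standard bivariate normal with correlation rho,
# sliced at Y = y0 (both values are arbitrary choices for illustration).
rho, y0 = 0.6, 1.2
x = np.linspace(-8.0, 8.0, 4001)

# Joint density f_XY(x, y0) evaluated along the slice Y = y0.
c = 1.0 / (2.0 * np.pi * np.sqrt(1.0 - rho**2))
f_xy = c * np.exp(-(x**2 - 2.0 * rho * x * y0 + y0**2) / (2.0 * (1.0 - rho**2)))

# E[X | Y = y0] = (integral of x * f_XY(x, y0) dx) / f_Y(y0); the dx factors
# cancel in the discrete ratio, so plain sums suffice.
e_cond = np.sum(x * f_xy) / np.sum(f_xy)
print(e_cond)   # should match the closed form rho * y0 = 0.72
```

The point is just that conditioning on ##Y = y_0## means renormalising the slice of the joint density, exactly as in the formula above.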

My thoughts are to define two variables
W = XY
V = Y + Z
**note: V = Y + Z ~ N(2,2), and we are conditioning on V = 1
This allows us to write (1) as
E[W | V = 1]
which would be defined as listed above for the X,Y case.

At this point I'm stuck, because finding the pdf seems to be the essential struggle, and I don't know how to do it.
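One way to get unstuck without any pdf at all is a quick Monte Carlo sanity check: sample X, Y, Z independently (the independence stated in the title), keep only the draws where Y + Z lands in a narrow band around 1, and average XY over the survivors. A minimal sketch (the seed, sample count, and band half-width are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 2_000_000, 0.02          # sample count and band half-width (arbitrary)

# X, Y, Z independent N(1, 1), as the thread title assumes.
x, y, z = rng.normal(1, 1, (3, n))

# Condition on Y + Z falling in a narrow band around 1.
keep = np.abs(y + z - 1) < h
est = np.mean(x[keep] * y[keep])
print(est)   # should land near 0.5 as h shrinks
```

Shrinking the band width h recovers the exact conditional expectation in the limit, at the cost of keeping fewer samples.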

andrewkirk
Homework Helper
Gold Member
Does the problem statement specify that X, Y and Z are mutually independent?

Ray Vickson
Homework Helper
Dearly Missed


For an event ##A## you can think of ##P(A|Y+Z=1)## as
$$\begin{array}{rcl}P(A \mid Y+Z=1) &=&\displaystyle \lim_{h \to 0} P(A \mid 1 < Y+Z < 1+h)\\ &=&\displaystyle \lim_{h \to 0} \frac{P(A \: \& \: Y+Z \in (1,1+h) )}{P(Y+Z \in (1,1+h))} \end{array}$$
If ##X,Y,Z## are independent, and ##A = \{ XY \in (w,w+ \Delta w) \}## we have
$$P( XY \in (w,w+\Delta w )\: \& \: Y+Z \in (1,1+h) ) \approx \int_{y=-\infty}^{\infty}f_{YZ}(y,1-y) P(Xy \in (w,w+\Delta w) |Y=y, Z=1-y)\, dy$$
(with errors of order ##h \cdot \Delta w##, because ##Z## is not exactly ##1-y##, but differs from it by less than ##h##). Anyway, when we take ratios and go to the limit of small ##h## and ##\Delta w##, this all becomes exact. Now, if ##X,Y,Z## are independent, everything becomes a lot simpler from this point onward.
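In the limit, that ratio reduces to a one-dimensional integral along the line ##y + z = 1## weighted by ##f_Y(y) f_Z(1-y)##. A minimal numerical sketch of that ratio (the grid range and resolution are arbitrary choices; independence lets us replace ##E[XY \mid Y=y, Z=1-y]## by ##E[X]\,y = y##):

```python
import numpy as np

def phi(t, mu=1.0, sig=1.0):
    """Density of N(mu, sig^2)."""
    return np.exp(-0.5 * ((t - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

# Weight along the line Y + Z = 1 is f_Y(y) * f_Z(1 - y); the common dy
# (and the band width h) cancel when we take the ratio.
y = np.linspace(-10, 10, 20001)
w = phi(y) * phi(1 - y)

# With X independent of (Y, Z): E[XY | Y = y, Z = 1 - y] = E[X] * y = y.
e_xy = np.sum(y * w) / np.sum(w)
print(e_xy)
```

Note that the weight ##w## is symmetric about ##y = \tfrac{1}{2}##, which is why the ratio comes out so cleanly.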

$$\int_{y=-\infty}^{\infty} f_{YZ}(y,1-y)\, P(XY \in (w,w+\Delta w) \mid Y=y, Z=1-y)\,dy$$
this looks quite complicated! I said they were independent in the title; how does knowing they are independent make this simpler? Looking at it,
it seems like I'm expanding the problem into a series of problems: the first demands that I find the pdf ##f_{YZ}## and multiply it by ##P(XY \in (w,w+\Delta w) \mid Y=y, Z=1-y)##, and each of these alone sounds difficult to extract from the expectation alone.

andrewkirk
Homework Helper
Gold Member
A simplification that can be done right at the start is to notice that if X is independent from Y and Z then we have

$$E[XY|Y+Z=1]=E[X]\cdot E[Y|Y+Z=1]=1\cdot E[Y|Y+Z=1]=E[Y|Y+Z=1]$$

The problem then reduces to finding the mean of y along the line y+z=1 in the yz number plane, when the distribution of the pair (y,z) is a bivariate normal with no correlation. If you note that the pdf of that distribution is rotationally symmetric about the point (y=1,z=1) and plot both that point and the line y+z=1 on a number plane, it should be easy enough to find the location of the probability centre of mass of the line with respect to the distribution.
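The geometric claim above can be checked with a few lines of vector algebra: the foot of the perpendicular from (1,1) to the line y+z=1 is where the symmetric distribution balances. A minimal sketch (the projection formula is standard; nothing here is specific to this thread):

```python
import numpy as np

# The pdf of (Y, Z) is rotationally symmetric about (1, 1), so the
# conditional mean along the line y + z = 1 sits at the foot of the
# perpendicular from (1, 1) to that line.
p = np.array([1.0, 1.0])                    # centre of symmetry
n_hat = np.array([1.0, 1.0]) / np.sqrt(2)   # unit normal of y + z = 1
dist = p @ n_hat - 1.0 / np.sqrt(2)         # signed distance from p to the line
foot = p - dist * n_hat
print(foot)   # the y-coordinate of the foot gives E[Y | Y+Z = 1]
```

The foot lands at (1/2, 1/2), so E[Y|Y+Z=1] = 1/2, and by the factorisation above E[XY|Y+Z=1] = 1 · 1/2 = 1/2.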

If you note that the pdf of that distribution is rotationally symmetric about the point (y=1,z=1) and plot both that point and the line y+z=1 on a number plane, it should be easy enough to find the location of the probability centre of mass of the line with respect to the distribution.
I don't really understand the probability centre of mass... but the centre of mass of the line ##z = 1-y## would be:
$$\int_0^1 ds = \int_0^1 \sqrt{dz^2 + dy^2} = \int_0^1 \sqrt{1+1}\,dy = \sqrt{2}$$

This answer doesn't make sense unless it's scaled by the distribution to be smaller... but I didn't quite comprehend your analogy of rotational symmetry and finding the mean along the line as a method of solving this. It's probably a very intuitive way to derive the answer, but I'm not quite experienced enough to follow it.

andrewkirk
Homework Helper
Gold Member
I don't really understand the probability centre of mass
By this I meant to refer to the point on that line at which the integral of the pdf of y and z from that point to the 'end' of the line is the same in both directions. That is, if we parametrise the line by the y coordinate, it is the point ##(y=\alpha,z=1-\alpha)## such that ##\int_{-\infty}^\alpha p_Y(y)p_Z(1-y)\,dy=\int_\alpha^{\infty} p_Y(y)p_Z(1-y)\,dy##.

Since the joint pdf of Y and Z, which is ##p_Y(y)p_Z(z)##, is rotationally symmetric around (1,1), that centre of mass must be at the foot of the perpendicular from (1,1) to the line, so the expected value of y must be the value of y at that foot.
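As an algebraic cross-check on the geometric argument, the same point drops out of the standard formula for conditioning one jointly normal variable on another. With the independence assumed above, ##\operatorname{Cov}(Y, Y+Z) = \operatorname{Var}(Y) = 1## and ##\operatorname{Var}(Y+Z) = 2##, so

$$E[Y \mid Y+Z = s] = E[Y] + \frac{\operatorname{Cov}(Y,\,Y+Z)}{\operatorname{Var}(Y+Z)}\,\bigl(s - E[Y+Z]\bigr) = 1 + \tfrac{1}{2}(s - 2),$$

which at ##s = 1## gives ##E[Y \mid Y+Z=1] = \tfrac{1}{2}##, and hence ##E[XY \mid Y+Z=1] = E[X] \cdot \tfrac{1}{2} = \tfrac{1}{2}##.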

Ray Vickson