Conditional Expectation of Multiple Independent Random Variables

  • #1

Homework Statement


Given that X, Y, Z are three independent N(1,1) random variables,

(1)
Find E[XY | Y + Z = 1]

Homework Equations




The Attempt at a Solution


I'm honestly completely lost in statistics... I never quite grasped the intuition behind expectation because my professor lives on the computational side rather than relating the ideas to one another. So these procedures are turning into foggy abstract generalities, and any light would be superb.

My book's explanation of the multi-variable case is that it's essentially the same as the joint case, which in turn is the same as the single-variable case.

So:
For two jointly continuous random variables,
##E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x|y)\, dx##
*note that X could easily be replaced by g(X), and that ##f_{X|Y}(x|y)## is defined as ##f_{XY}(x,y) / f_Y(y)##
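
(As a sanity check on that formula, here's a quick numerical sketch in Python with numpy/scipy. The correlated bivariate normal and the value ##\rho = 0.5## are just my own illustration, not part of the homework; the point is that the integral above reproduces the textbook conditional mean ##1 + \rho(y - 1)## for unit-variance normals with means 1.)

[code]
# Sanity check of E[X | Y = y] = ∫ x f_{X|Y}(x|y) dx for a bivariate normal
# with means (1, 1), unit variances, and correlation rho (rho is an
# illustrative parameter; the homework's variables are independent).
import numpy as np
from scipy import integrate

rho, y0 = 0.5, 2.0  # example correlation and conditioning value

def f_xy(x, y):
    """Joint pdf of a bivariate normal: means 1, variances 1, correlation rho."""
    det = 1.0 - rho**2
    q = ((x - 1)**2 - 2*rho*(x - 1)*(y - 1) + (y - 1)**2) / det
    return np.exp(-q / 2) / (2 * np.pi * np.sqrt(det))

# Marginal f_Y(y0) by integrating x out, then the conditional mean.
f_y, _ = integrate.quad(lambda x: f_xy(x, y0), -np.inf, np.inf)
cond_mean, _ = integrate.quad(lambda x: x * f_xy(x, y0) / f_y, -np.inf, np.inf)

print(cond_mean)           # ~1.5
print(1 + rho * (y0 - 1))  # known closed form for this case: also 1.5
[/code]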

My thoughts are to define two variables
W = XY
V = Y + Z
**note: ##Y + Z \sim N(2,2)##; we are conditioning on the event ##Y + Z = 1##
This allows us to write (1) as
E[W | V = 1]
which would be defined as listed above for the X,Y case.

At this point I'm stuck, because finding the relevant pdf seems to be the real struggle and I don't know how to get it.
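
(One small thing I can verify numerically: the pdf of ##V = Y + Z## itself. A sketch in Python with numpy/scipy, using the standard convolution formula for the density of a sum of independent variables; it matches the N(2,2) claim in the note above.)

[code]
# Numerical check of the pdf of V = Y + Z for independent Y, Z ~ N(1,1):
# the convolution of the two N(1,1) densities should match the N(2, 2)
# density exactly.
import numpy as np
from scipy import integrate
from scipy.stats import norm

def f_V(v):
    # f_V(v) = ∫ f_Y(y) f_Z(v - y) dy  -- the convolution formula for the
    # density of a sum of independent random variables
    val, _ = integrate.quad(lambda y: norm.pdf(y, 1, 1) * norm.pdf(v - y, 1, 1),
                            -np.inf, np.inf)
    return val

for v in (0.0, 1.0, 2.0):
    print(v, f_V(v), norm.pdf(v, loc=2, scale=np.sqrt(2)))  # columns 2 and 3 agree
[/code]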
 

Answers and Replies

  • #2
andrewkirk
Does the problem statement specify that X, Y and Z are mutually independent?
 
  • #3
Ray Vickson


For an event ##A## you can think of ##P(A|Y+Z=1)## as
[tex] \begin{array}{rcl}
P(A \mid Y+Z=1) &=& \displaystyle \lim_{h \to 0} P(A \mid 1 < Y+Z < 1+h)\\
&=& \displaystyle \lim_{h \to 0} \frac{P(A \;\&\; Y+Z \in (1,1+h))}{P(Y+Z \in (1,1+h))}
\end{array}[/tex]
If ##X,Y,Z## are independent, and ##A = \{ XY \in (w,w+ \Delta w) \}## we have
[tex] P( XY \in (w,w+\Delta w) \;\&\; Y+Z \in (1,1+h) ) \approx h \int_{y=-\infty}^{\infty} f_{YZ}(y,1-y)\, P(Xy \in (w,w+\Delta w) \mid Y=y,\, Z=1-y)\, dy [/tex]
(with errors of order ##h \cdot \Delta w##, because ##Z## is not exactly ##1-y##, but differs from it by order ##< h##). Anyway, when we take ratios and go to the limit of small ##h## and ##\Delta w##, this will all become exact. Now if ##X,Y,Z## are independent, everything becomes a lot simpler from this point onward.
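
Here is a small Monte Carlo sketch of that limiting-window idea (Python/numpy; the sample size and the ##h## values are arbitrary choices for illustration). It estimates ##E[XY \mid Y+Z \in (1, 1+h)]## for shrinking ##h##, and the estimates should stabilize:

[code]
# Monte Carlo sketch of the limiting-window definition above: estimate
# E[XY | Y + Z in (1, 1+h)] for shrinking h and watch it stabilize.
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
x, y, z = (rng.normal(1.0, 1.0, n) for _ in range(3))

for h in (0.5, 0.1, 0.02):
    window = (y + z > 1) & (y + z < 1 + h)   # the event 1 < Y+Z < 1+h
    print(h, window.sum(), np.mean(x[window] * y[window]))
[/code]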
 
  • #4
##h \int_{y=-\infty}^{\infty} f_{YZ}(y, 1-y)\, P(Xy \in (w,w+\Delta w) \mid Y = y,\, Z = 1-y)\, dy##
This looks quite complicated! I said they were independent in the title, so how does knowing they're independent make this simpler? Looking at it, it seems like I'm expanding the problem into a series of problems: first I have to find the pdf ##f_{YZ}##, and then multiply it by ##P(Xy \in (w, w+\Delta w) \mid Y = y, Z = 1-y)##, and both of those sound difficult to extract from the expectation alone.
 
  • #5
andrewkirk
A simplification that can be done right at the start is to notice that if X is independent of Y and Z, then we have

$$E[XY|Y+Z=1]=E[X]\cdot E[Y|Y+Z=1]=1\cdot E[Y|Y+Z=1]=E[Y|Y+Z=1]$$

The problem then reduces to finding the mean of y along the line y+z=1 in the yz number plane, when the distribution of the pair (y,z) is a bivariate normal with no correlation. If you note that the pdf of that distribution is rotationally symmetric about the point (y=1,z=1) and plot both that point and the line y+z=1 on a number plane, it should be easy enough to find the location of the probability centre of mass of the line with respect to the distribution.
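
If it helps to see that construction concretely, here is a minimal numpy sketch of the perpendicular-foot idea (the particular parametrisation of the line is just one convenient choice):

[code]
# Minimal sketch of the geometry described above: the foot of the
# perpendicular from the pdf's centre of symmetry (1, 1) onto the line
# y + z = 1 in the (y, z) plane.
import numpy as np

centre = np.array([1.0, 1.0])             # rotational-symmetry centre of the pdf
p0 = np.array([0.0, 1.0])                 # a point on the line y + z = 1
d = np.array([1.0, -1.0]) / np.sqrt(2.0)  # unit direction along the line
foot = p0 + np.dot(centre - p0, d) * d    # orthogonal projection onto the line
print(foot)  # its y-coordinate is the conditional mean E[Y | Y+Z = 1]
[/code]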
 
  • #6
If you note that the pdf of that distribution is rotationally symmetric about the point (y=1,z=1) and plot both that point and the line y+z=1 on a number plane, it should be easy enough to find the location of the probability centre of mass of the line with respect to the distribution.
I don't really understand the probability centre of mass... but the center of mass of the line z = 1-y would be:
##\int_0^1 ds = \int_0^1 \sqrt{dz^2 + dy^2} = \int_0^1 \sqrt{1 + 1}\, dy = \sqrt{2}##

This answer doesn't make sense unless it's scaled by the distribution to be smaller... but I didn't quite comprehend your analogy of rotational symmetry and finding the mean along the line as a method of solving this. It's probably a very intuitive way to derive my answer... but I'm not quite experienced enough to follow it.
 
  • #7
andrewkirk
I don't really understand the probability centre of mass
By this I meant to refer to the point on that line at which the integral of the pdf of y and z from that point to the 'end' of the line is the same in both directions. That is, if we parametrise the line by the y coordinate, it is the point ##(y=\alpha,z=1-\alpha)## such that ##\int_{-\infty}^\alpha p_Y(y)p_Z(1-y)\,dy=\int_\alpha^{\infty} p_Y(y)p_Z(1-y)\,dy##.

Since the joint pdf of Y and Z, which is ##p_Y(y)p_Z(z)##, is rotationally symmetric around (1,1), that centre of mass must be at the foot of the perpendicular from (1,1) to the line, so the expected value of y must be the value of y at that foot.
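
As a numerical check on that balancing condition, one can solve for ##\alpha## directly. A sketch using scipy (the bracketing interval ##[-10, 10]## is an arbitrary choice):

[code]
# Numerical check of the balancing condition above: find alpha such that
# the integral of p_Y(y) p_Z(1-y) from -inf to alpha equals the integral
# from alpha to +inf, with p_Y = p_Z the N(1,1) pdf.
import numpy as np
from scipy import integrate, optimize
from scipy.stats import norm

def g(y):
    # unnormalised density of Y restricted to the line z = 1 - y
    return norm.pdf(y, 1, 1) * norm.pdf(1 - y, 1, 1)

total, _ = integrate.quad(g, -np.inf, np.inf)

def imbalance(alpha):
    left, _ = integrate.quad(g, -np.inf, alpha)
    return left - total / 2.0  # zero exactly when alpha splits the mass evenly

print(optimize.brentq(imbalance, -10.0, 10.0))  # the balancing alpha
[/code]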
 
  • #8
Ray Vickson
This looks quite complicated! I said they were independent in the title, so how does knowing they're independent make this simpler? Looking at it, it seems like I'm expanding the problem into a series of problems: first I have to find the pdf ##f_{YZ}##, and then multiply it by ##P(Xy \in (w, w+\Delta w) \mid Y = y, Z = 1-y)##, and both of those sound difficult to extract from the expectation alone.
Typically, there are several ways to do such problems. One way is to just calculate everything in detail and apply the definitions; that was essentially what I was outlining in the previous post. Even if that is not necessarily the fastest or easiest way to do it, I think it is important for you to grasp all the concepts involved, as part of the learning process regarding probability.

Of course, there are easier, trickier ways to do it, essentially without calculation and almost "by inspection". These, too, use probability concepts, but different ones from the detailed grind-it-out approach of computing conditional density functions. Post #5 shows you the start of such an approach; however, I cannot show you the rest of the solution without violating PF rules about being too helpful. Rest assured, though, that practically no computations are needed to finish the problem.
 
