Convolution and Probability Distributions

In summary: using the symmetry of the problem, together with the result that for iid continuous random variables X1 and X2, P(X1+X2≤t | X1<X2) = P(X1+X2≤t | X1>X2) = P(X1+X2≤t), the conditional expectation E[X1+X2 | X1<X2] reduces to the unconditional one, E[X1+X2] = 2E[X1].
  • #1

O_o


Homework Statement


Have 2 iid random variables following the distribution [tex] f(x) = \frac{\lambda}{2}e^{-\lambda |x|}, x \in\mathbb{R}[/tex]

I'm asked to solve for [tex] E[X_1 + X_2 | X_1 < X_2] [/tex]

Homework Equations




The Attempt at a Solution


So what I'm trying to do is create a new random variable [tex] Z = X_1 + X_2 [/tex]. When I do this I get the following convolution formula for its density: [tex] g(z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z- x_1|} e^{-\lambda |x_1|} dx_1[/tex]

I'd really only like some advice on how to go about attacking this integral. It looks to me like I need to break it down into cases depending on the signs of z - x1 and x1, but that doesn't seem like it will produce a clean solution.

Or, if you can see that I'm attacking this problem completely the wrong way and shouldn't even be trying this, please let me know. No alternative method is needed; I can try to figure out other approaches if this is completely off base.

Thanks

edit:
I've had a thought. If X1 > 0 then [tex] Z = X_1 + X_2 > 2X_1 > X_1[/tex]. So now if I can do something similar for X1 < 0, I can evaluate the integral.
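For a quick numerical check of the convolution approach, here is a minimal sketch (assuming λ = 1, chosen arbitrarily; the helper names are just for illustration). It evaluates g(z) by quadrature, splitting the range at the kinks of the integrand at x1 = 0 and x1 = z, which is exactly the case split described above, and compares the result against the closed form ##\frac{\lambda}{4}(1+\lambda|z|)e^{-\lambda|z|}## that the piecewise integration produces.

[code]
import numpy as np
from scipy.integrate import quad

lam = 1.0  # rate parameter, arbitrary choice for this check

def f(x):
    """Density f(x) = (lam/2) * exp(-lam*|x|)."""
    return 0.5 * lam * np.exp(-lam * abs(x))

def g(z):
    """Density of Z = X1 + X2 via the convolution integral, splitting the
    range at the kinks of the integrand (x1 = 0 and x1 = z)."""
    integrand = lambda x1: f(z - x1) * f(x1)
    edges = [-np.inf] + sorted({0.0, z}) + [np.inf]
    return sum(quad(integrand, a, b)[0] for a, b in zip(edges[:-1], edges[1:]))

# Compare against the closed form (lam/4) * (1 + lam*|z|) * exp(-lam*|z|).
for z in (-2.0, -0.5, 0.0, 1.0, 3.0):
    closed = 0.25 * lam * (1.0 + lam * abs(z)) * np.exp(-lam * abs(z))
    print(f"z = {z:5.1f}   quadrature = {g(z):.6f}   closed form = {closed:.6f}")
[/code]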
 
  • #2
It is possible to use an integral, but you can also use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
 
  • #3
mfb said:
It is possible to use an integral, but you can also use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
I don't see how that gets one out of doing the integral. But certainly it is worth considering symmetries. Forgetting the (irrelevant) X1<X2, there are six orderings of 0, z, x1. Symmetries get it down to only two integrals. E.g., consider g(-z).
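One way to spell out that last hint (a small filling-in): substituting ##u = -x_1## in the convolution integral gives
[tex] g(-z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |-z- x_1|} e^{-\lambda |x_1|} dx_1 = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z- u|} e^{-\lambda |u|} du = g(z), [/tex]
so the density of the sum is even and only one sign of z needs to be integrated directly.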
 
  • #4
Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

[tex] E[X_1 + X_2]\\
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\
= \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\
= P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)} dx_1 dx_2 + P(X_2 < X_1)\int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)} dx_1 dx_2 \\
= P(X_1 < X_2) E[X_1 + X_2 | X_1 < X_2] + P(X_2 < X_1) E[X_1 + X_2 | X_2 < X_1] \\
= \frac{1}{2} \left(E[X_1 + X_2 | X_1 < X_2] + E[X_1 + X_2| X_2 < X_1] \right) \\
= E[X_1 + X_2 | X_1 < X_2]
[/tex]

So now I can solve for [tex] E[X_1 + X_2] = 2E[X_1][/tex] instead to get my answer, which looks easier.
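As a sanity check, here is a minimal Monte Carlo sketch of this conclusion (assuming λ = 1, chosen arbitrarily, and using the fact that a difference of two independent Exp(λ) variables has density (λ/2)e^{-λ|x|}):

[code]
import numpy as np

rng = np.random.default_rng(0)
lam = 1.0       # rate parameter, arbitrary choice for this check
n = 1_000_000

# A variable with density (lam/2) * exp(-lam*|x|) can be sampled as the
# difference of two independent Exp(lam) variables.
x1 = rng.exponential(1.0 / lam, n) - rng.exponential(1.0 / lam, n)
x2 = rng.exponential(1.0 / lam, n) - rng.exponential(1.0 / lam, n)

s = x1 + x2
cond = x1 < x2

print("P(X1 < X2)           ~", cond.mean())     # should be close to 1/2
print("E[X1 + X2]           ~", s.mean())        # = 2*E[X1], which is 0 here by symmetry
print("E[X1 + X2 | X1 < X2] ~", s[cond].mean())  # should agree with the line above
[/code]

Since the density is symmetric about zero, both sample means should land near 2E[X1] = 0.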
 
  • #5
I think you can skip the integration steps: ##P(X_1 < X_2) = P(X_1 > X_2) = \frac{1}{2}## follows from symmetry, and the line where those probabilities appear is simply the weighted average over the two cases. But the working looks valid and the result is right.
 
  • #6
o_O said:
So now I can solve for [tex] E[X_1 + X_2] = 2E[X_1][/tex] instead to get my answer, which looks easier.

Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
[tex]P(X_1+X_2 \leq t \,|\, X_1 < X_2)\\
= P(X_1+X_2 \leq t \,|\, X_1 > X_2) \\
= P(X_1 + X_2 \leq t) [/tex]
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
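
For completeness, a short justification of that result, using only the exchangeability of ##X_1## and ##X_2## and the fact that ##P(X_1 = X_2) = 0## for continuous variables:
[tex] P(X_1+X_2 \leq t \,|\, X_1 < X_2) = \frac{P(X_1+X_2 \leq t,\ X_1 < X_2)}{P(X_1 < X_2)} = \frac{P(X_1+X_2 \leq t,\ X_2 < X_1)}{P(X_2 < X_1)} = P(X_1+X_2 \leq t \,|\, X_1 > X_2), [/tex]
where the middle equality swaps the labels ##X_1 \leftrightarrow X_2## (allowed because they are iid, and the sum is unchanged by the swap). Since the events ##\{X_1 < X_2\}## and ##\{X_1 > X_2\}## partition the sample space up to a null set and the two conditional probabilities agree, each must equal the unconditional probability ##P(X_1+X_2 \leq t)##.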
 
  • #7
Ray Vickson said:
Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
[tex]P(X_1+X_2 \leq t \,|\, X_1 < X_2)\\
= P(X_1+X_2 \leq t \,|\, X_1 > X_2) \\
= P(X_1 + X_2 \leq t) [/tex]
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
That's really neat, thanks for sharing.
 
