Convolution and Probability Distributions

Summary
The discussion revolves around calculating the expected value ##E[X_1 + X_2 \mid X_1 < X_2]## for two independent, identically distributed (iid) random variables with a two-sided exponential (Laplace) distribution. The initial approach uses convolution to find the density of ##Z = X_1 + X_2##, but the complexity of the integral leads to considerations of symmetry. Because the variables are iid, the expectations conditioned on the two orderings (##X_1 < X_2## and ##X_2 < X_1##) are equal, which simplifies the calculation. The conclusion is that ##E[X_1 + X_2 \mid X_1 < X_2]## equals ##E[X_1 + X_2]##, making the problem much more straightforward. The discussion highlights how symmetry in probability distributions can simplify calculations.
O_o

Homework Statement


I have two iid random variables following the distribution ##f(x) = \frac{\lambda}{2}e^{-\lambda |x|}##, ##x \in \mathbb{R}##.

I'm asked to solve for ##E[X_1 + X_2 \mid X_1 < X_2]##.

Homework Equations

The Attempt at a Solution


So what I'm trying to do is create a new random variable ##Z = X_1 + X_2##. When I do this I get the following convolution formula for its density: $$g(z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z- x_1|} e^{-\lambda |x_1|}\, dx_1$$

I'd really only like some advice on how to go about attacking this integral. It looks to me like I need to break it into cases depending on whether ##z < x_1## or ##z > x_1##, but that doesn't seem like it will produce a clean solution.

Or, if you can see that I'm attacking this problem completely the wrong way and shouldn't even be trying this, please let me know. No alternative method of attack needed; I can try to figure out other ways if this is completely off base.

Thanks

edit:
I've had a thought. If ##X_1 > 0## then, since ##X_2 > X_1## on the event being conditioned on, ##Z = X_1 + X_2 > 2X_1 > X_1##. So now if I can do something similar for ##X_1 < 0## I can evaluate the integral.
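
For what it's worth, here is a minimal numerical sketch of this convolution (Python with NumPy/SciPy; the choice ##\lambda = 1##, the sample size, and the helper names `f` and `g` are just illustrative assumptions). It evaluates the integral by quadrature and compares it with a Monte Carlo estimate of the density of ##Z = X_1 + X_2##:

```python
import numpy as np
from scipy import integrate

lam = 1.0                      # arbitrary choice of lambda for the check
rng = np.random.default_rng(0)

def f(x):
    """Two-sided exponential (Laplace) density f(x) = (lambda/2) exp(-lambda |x|)."""
    return 0.5 * lam * np.exp(-lam * np.abs(x))

def g(z):
    """Density of Z = X1 + X2 via the convolution integral, evaluated numerically."""
    val, _ = integrate.quad(lambda x1: f(z - x1) * f(x1), -np.inf, np.inf)
    return val

# Monte Carlo estimate of the density of Z for comparison.
# NumPy's Laplace sampler uses scale = 1/lambda for this density.
n = 1_000_000
x1 = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)
x2 = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)
z = x1 + x2

for point in [-2.0, 0.0, 1.0, 3.0]:
    h = 0.05  # half-width of the bin around the evaluation point
    mc_density = np.mean(np.abs(z - point) < h) / (2 * h)
    print(f"z={point:5.1f}  quadrature g(z)={g(point):.4f}  Monte Carlo ~ {mc_density:.4f}")
```

The quadrature values and the Monte Carlo estimates should agree to within sampling error, whichever way the integral is eventually split into cases.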
 
It is possible to use an integral, but you can use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
 
mfb said:
It is possible to use an integral, but you can use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
I don't see how that gets one out of doing the integral. But certainly it is worth considering symmetries. Forgetting the (irrelevant) ##X_1 < X_2##, there are six orderings of ##0##, ##z##, ##x_1##. Symmetries get it down to only two integrals. E.g. consider ##g(-z)##.
 
Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

$$\begin{aligned}
E[X_1 + X_2] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)\, \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)}\, dx_1\, dx_2 + P(X_2 < X_1) \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2)\, \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)}\, dx_1\, dx_2 \\
&= P(X_1 < X_2)\, E[X_1 + X_2 \mid X_1 < X_2] + P(X_2 < X_1)\, E[X_1 + X_2 \mid X_2 < X_1] \\
&= \tfrac{1}{2} \left(E[X_1 + X_2 \mid X_1 < X_2] + E[X_1 + X_2 \mid X_2 < X_1] \right) \\
&= E[X_1 + X_2 \mid X_1 < X_2]
\end{aligned}$$

So now I can solve for ##E[X_1 + X_2] = 2E[X_1]## instead to get my answer, which looks easier.
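
As a sanity check on the identity above, a short Monte Carlo sketch (Python/NumPy; ##\lambda = 1## and the sample size are arbitrary choices for illustration). Since this density is symmetric about zero, ##E[X_1] = 0##, so both the conditional and unconditional means should come out near zero:

```python
import numpy as np

lam = 1.0
rng = np.random.default_rng(1)

# Sample two iid Laplace variables (NumPy's scale parameter is 1/lambda).
n = 2_000_000
x1 = rng.laplace(scale=1.0 / lam, size=n)
x2 = rng.laplace(scale=1.0 / lam, size=n)
z = x1 + x2

cond = x1 < x2                      # the conditioning event X1 < X2
print("E[Z]           ~", z.mean())
print("E[Z | X1 < X2] ~", z[cond].mean())
print("2 E[X1]        ~", 2 * x1.mean())
```

All three printed values should agree up to Monte Carlo noise.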
 
I think you can skip the integration steps: ##P(X_1 < X_2) = P(X_1 > X_2) = \frac{1}{2}## follows from symmetry, and the line where it appears is simply a weighted average (the law of total expectation). But your derivation works and the result is right.
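
That symmetry claim is easy to check empirically; a tiny Python/NumPy sketch (##\lambda = 1## is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.laplace(scale=1.0, size=1_000_000)  # lambda = 1, so scale = 1/lambda = 1
x2 = rng.laplace(scale=1.0, size=1_000_000)

# Both probabilities should be close to 1/2 since X1, X2 are iid and continuous.
print("P(X1 < X2) ~", np.mean(x1 < x2))
print("P(X1 > X2) ~", np.mean(x1 > x2))
```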
 
o_O said:
So now I can solve for ##E[X_1 + X_2] = 2E[X_1]## instead to get my answer, which looks easier.

Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
$$P(X_1+X_2 \leq t \mid X_1 < X_2) = P(X_1+X_2 \leq t \mid X_1 > X_2) = P(X_1 + X_2 \leq t)$$
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
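
A quick empirical illustration of this distributional equality (a Python sketch with NumPy/SciPy; ##\lambda = 1##, the sample size, and the half-split of the sample are arbitrary choices): the conditional and unconditional samples of ##X_1 + X_2## should look the same.

```python
import numpy as np
from scipy import stats

lam = 1.0
rng = np.random.default_rng(3)

n = 1_000_000
x1 = rng.laplace(scale=1.0 / lam, size=n)
x2 = rng.laplace(scale=1.0 / lam, size=n)
z = x1 + x2

# Use one half of the sample unconditionally and condition the other half,
# so the two empirical samples are independent of each other.
z_all = z[: n // 2]
half = slice(n // 2, None)
z_cond = z[half][x1[half] < x2[half]]        # sample of [X1 + X2 | X1 < X2]

qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print("quantiles, unconditional:", np.round(np.quantile(z_all, qs), 3))
print("quantiles, given X1 < X2:", np.round(np.quantile(z_cond, qs), 3))
print(stats.ks_2samp(z_all, z_cond))         # two-sample Kolmogorov-Smirnov test
```

The quantiles should match and the KS test should report a large p-value, consistent with the two samples coming from the same distribution.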
 
Ray Vickson said:
Even easier: the random variables ##X_1 + X_2##, ##[X_1 + X_2 \mid X_1 < X_2]## and ##[X_1+X_2 \mid X_1 > X_2]## all have the same distribution, hence the same expectation!
That's really neat, thanks for sharing.
 
