# Convolution and Probability Distributions

1. Jan 23, 2015

### O_o

1. The problem statement, all variables and given/known data
I have two iid random variables $X_1, X_2$ with common density $$f(x) = \frac{\lambda}{2}e^{-\lambda |x|}, \quad x \in\mathbb{R}$$

I'm asked to solve for $$E[X_1 + X_2 | X_1 < X_2]$$

2. Relevant equations

3. The attempt at a solution
So what I'm trying to do is create a new random variable $$Z = X_1 + X_2$$ When I do this I get the following convolution formula for its density: $$g(z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z- x_1|} e^{-\lambda |x_1|} dx_1$$

I'd really only like some advice on how to go about attacking this integral. It looks like I need to break it into cases depending on whether $z < x_1$ or $z > x_1$, but that doesn't seem like it will produce a clean solution.

Or, if you can see that I'm attacking this problem completely the wrong way and shouldn't even be trying this, please let me know. No alternative method of attack needed; I can try to figure out other ways if this is completely off base.

Thanks

edit:
I've had a thought. On the event $X_1 < X_2$ we have $Z = X_1 + X_2 \gt 2X_1$, so if $X_1 > 0$ then $Z \gt 2X_1 \gt X_1$. Now if I can do something similar for $X_1 < 0$ I can evaluate the integral.
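The case split can at least be sanity-checked numerically. Below is a quick scipy sketch (fixing $\lambda = 1$ for concreteness; the closed form $\frac{\lambda}{4}(1 + \lambda|z|)e^{-\lambda|z|}$ for the density of a sum of two iid Laplace variables is a standard result, quoted here only as a reference value). It splits the integral at the kinks $x_1 = 0$ and $x_1 = z$:

```python
import numpy as np
from scipy.integrate import quad

lam = 1.0  # assumed rate for the check

def f(x):
    # Laplace density f(x) = (lam/2) * exp(-lam * |x|)
    return 0.5 * lam * np.exp(-lam * np.abs(x))

def g_numeric(z):
    # Convolution integral for the density of Z = X1 + X2,
    # split at the kinks of the integrand (x1 = 0 and x1 = z)
    a, b = sorted((0.0, z))
    total = 0.0
    for lo, hi in [(-np.inf, a), (a, b), (b, np.inf)]:
        val, _ = quad(lambda x: f(x) * f(z - x), lo, hi)
        total += val
    return total

def g_closed(z):
    # Known density of the sum of two iid Laplace(lam) variables
    return 0.25 * lam * (1.0 + lam * np.abs(z)) * np.exp(-lam * np.abs(z))

for z in (-3.0, -0.5, 0.0, 1.0, 2.5):
    print(z, g_numeric(z), g_closed(z))
```

Splitting into the three smooth pieces avoids the accuracy loss `quad` suffers at the absolute-value kinks.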

Last edited: Jan 23, 2015
2. Jan 23, 2015

### Staff: Mentor

It is possible to evaluate the integral directly, but you can also use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$

3. Jan 23, 2015

### haruspex

I don't see how that gets one out of doing the integral, but it is certainly worth considering symmetries. Setting aside the (irrelevant) condition $X_1 < X_2$, there are six orderings of $0$, $z$, $x_1$. Symmetries get it down to only two integrals; e.g., consider $g(-z)$.

4. Jan 24, 2015

### O_o

Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

$$E[X_1 + X_2]\\ = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\ = \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\ = P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)} dx_1 dx_2 + P(X_2 < X_1)\int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)} dx_1 dx_2 \\ = P(X_1 < X_2) E[X_1 + X_2 | X_1 < X_2] + P(X_2 < X_1) E[X_1 + X_2 | X_2 < X_1] \\ = \frac{1}{2} \left(E[X_1 + X_2 | X_1 < X_2] + E[X_1 + X_2| X_2 < X_1] \right) \\ = E[X_1 + X_2 | X_1 < X_2]$$

So now I can compute $$E[X_1 + X_2] = 2E[X_1]$$ instead to get my answer, which looks easier (and since the density is symmetric about zero, $E[X_1] = 0$).
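As a sanity check, here is a small Monte Carlo sketch (fixing $\lambda = 1$, which doesn't affect the symmetry argument) comparing the conditional and unconditional means; both should come out near $2E[X_1] = 0$:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
lam = 1.0          # assumed rate; numpy's scale parameter is 1/lambda
n = 1_000_000

# Two iid Laplace(0, 1/lam) samples
x1 = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)
x2 = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)

s = x1 + x2
full_mean = s.mean()            # estimate of E[X1 + X2] = 2 E[X1] = 0
cond_mean = s[x1 < x2].mean()   # estimate of E[X1 + X2 | X1 < X2]

print(full_mean, cond_mean)
```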

Last edited: Jan 24, 2015
5. Jan 24, 2015

### Staff: Mentor

I think you can skip the integration steps: $P(X_1 < X_2) = P(X_1 > X_2) = \frac{1}{2}$ follows from symmetry, and the formula where it appears is simply a weighted average (the law of total expectation). But the derivation works, and the result is right.

6. Jan 24, 2015

### Ray Vickson

Even easier: you can use the surprising result that for iid continuous $X_1, X_2$ and for all $t \in \mathbb{R}$ we have
$$P(X_1+X_2 \leq t \,|\, X_1 < X_2)\\ = P(X_1+X_2 \leq t \,|\, X_1 > X_2) \\ = P(X_1 + X_2 \leq t)$$
In other words, the random variables $X_1 + X_2$, $[X_1 + X_2 | X_1 < X_2]$ and $[X_1+X_2|X_1 > X_2]$ all have the same distribution, hence the same expectation!
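A short justification, using only exchangeability of $(X_1, X_2)$ and the fact that ties have probability zero for continuous variables: swapping $(X_1, X_2) \mapsto (X_2, X_1)$ preserves the joint distribution and the sum, so for any $t$
$$P(X_1 + X_2 \leq t,\; X_1 < X_2) = P(X_1 + X_2 \leq t,\; X_1 > X_2)$$
Adding the two sides and using $P(X_1 = X_2) = 0$ gives
$$P(X_1 + X_2 \leq t) = 2\,P(X_1 + X_2 \leq t,\; X_1 < X_2) = P(X_1 + X_2 \leq t \,|\, X_1 < X_2)$$
where the last equality uses $P(X_1 < X_2) = \frac{1}{2}$.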

7. Jan 24, 2015

### O_o

That's really neat, thanks for sharing.