Convolution and Probability Distributions


Homework Help Overview

The discussion revolves around the expectation of the sum of two independent, identically distributed (iid) random variables, specifically the conditional expectation given that one variable is less than the other. The distribution in question is a Laplace (double exponential) distribution with density ##f(x) = \frac{\lambda}{2}e^{-\lambda |x|}## for ##x \in \mathbb{R}##.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to derive the convolution of the two random variables to find the density function of their sum. They express uncertainty about how to approach the integral involved and consider breaking it into cases based on the values of z and x1. Some participants suggest leveraging the symmetry of the problem to simplify the calculations, while others explore the implications of the orderings of the random variables.

Discussion Status

Participants are actively discussing various approaches to the problem, including direct integration and symmetry arguments. There is recognition that symmetry can simplify the problem considerably, and several participants have offered insights that may guide the original poster toward a more manageable solution, though no consensus has been reached.

Contextual Notes

There is an emphasis on the iid nature of the random variables and the implications of their distribution on the calculations. The discussion includes considerations of conditional probabilities and the relationships between different expectations based on the ordering of the random variables.

O_o

Homework Statement


Have 2 iid random variables following the distribution ##f(x) = \frac{\lambda}{2}e^{-\lambda |x|}##, ##x \in \mathbb{R}##.

I'm asked to solve for ##E[X_1 + X_2 \mid X_1 < X_2]##.

Homework Equations

The Attempt at a Solution


So what I'm trying to do is create a new random variable ##Z = X_1 + X_2##. When I do this I get the following convolution formula for its density: $$g(z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z - x_1|} e^{-\lambda |x_1|} \, dx_1$$

I'd really only like some advice on how to go about attacking this integral. It looks to me like I need to break it into cases depending on whether ##z < x_1## or ##z > x_1##, but that doesn't seem like it will produce a clean solution.

Or if you can see that I'm attacking this problem completely the wrong way and I shouldn't even be trying to do this please let me know. No alternative method of attack needed. I can try to figure out other ways if this is completely off base.

Thanks

edit:
I've had a thought. If ##X_1 > 0## then ##Z = X_1 + X_2 > 2X_1 > X_1##. So now if I can do something similar for ##X_1 < 0## I can evaluate the integral.
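Not part of the thread, but the case split can be probed numerically. Carrying the three cases ##x_1 < 0##, ##0 < x_1 < z##, ##x_1 > z## (for ##z > 0##) through the integral gives the candidate density ##g(z) = \frac{\lambda}{4} e^{-\lambda |z|} (1 + \lambda |z|)## (my algebra, not verified in the thread); a Monte Carlo sketch with ##\lambda = 1##, assuming NumPy is available:

```python
import numpy as np

lam = 1.0
rng = np.random.default_rng(0)

# f(x) = (lam/2) * exp(-lam*|x|) is a Laplace density with scale 1/lam,
# so we can sample X1, X2 directly and histogram their sum Z = X1 + X2.
n = 500_000
z = rng.laplace(scale=1 / lam, size=n) + rng.laplace(scale=1 / lam, size=n)

# Candidate closed form from the case split (my algebra, not from the thread):
def g(t, lam=lam):
    return (lam / 4.0) * np.exp(-lam * np.abs(t)) * (1 + lam * np.abs(t))

# Compare the empirical density of Z against g at the bin centers.
hist, edges = np.histogram(z, bins=80, range=(-8, 8), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - g(centers)))
print(max_err)  # small (sampling noise only) if the closed form is right
```

With half a million samples the discrepancy stays at the level of Monte Carlo noise, which supports (but of course does not prove) the closed form.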
 
It is possible to use an integral, but you can use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
 
mfb said:
It is possible to use an integral, but you can use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
I don't see how that gets one out of doing the integral. But certainly it is worth considering symmetries. Forgetting the (irrelevant) condition ##X_1 < X_2##, there are six orderings of ##0##, ##z##, ##x_1##. Symmetries get it down to only two integrals, e.g. consider ##g(-z)##.
 
Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

$$\begin{aligned}
E[X_1 + X_2] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)\, \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)}\, dx_1\, dx_2 + P(X_2 < X_1) \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2)\, \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)}\, dx_1\, dx_2 \\
&= P(X_1 < X_2)\, E[X_1 + X_2 \mid X_1 < X_2] + P(X_2 < X_1)\, E[X_1 + X_2 \mid X_2 < X_1] \\
&= \tfrac{1}{2} \left( E[X_1 + X_2 \mid X_1 < X_2] + E[X_1 + X_2 \mid X_2 < X_1] \right) \\
&= E[X_1 + X_2 \mid X_1 < X_2]
\end{aligned}$$

So now I can solve for ##E[X_1 + X_2] = 2E[X_1]## instead to get my answer, which looks easier.
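Not in the thread: a quick Monte Carlo sanity check of this symmetry argument, a sketch assuming NumPy and ##\lambda = 1##. Conditioning on ##X_1 < X_2## should not move the mean, and both means should match ##2E[X_1] = 0##, since the Laplace density here is symmetric about zero:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Two iid Laplace samples (scale = 1/lam with lam = 1).
x1 = rng.laplace(size=n)
x2 = rng.laplace(size=n)
s = x1 + x2

# By the symmetry argument, E[X1+X2 | X1 < X2] = E[X1+X2] = 2*E[X1] = 0.
cond_mean = s[x1 < x2].mean()
print(s.mean(), cond_mean)  # both close to 0
```

Both empirical means land within sampling error of zero, consistent with the derivation above.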
 
Last edited:
I think you can skip the integration steps, because ##P(X_1 < X_2) = P(X_1 > X_2) = \frac{1}{2}## follows from symmetry and the formula where it appears is simply a weighted average, but your derivation works and the result is right.
 
o_O said:
Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

$$\begin{aligned}
E[X_1 + X_2] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)\, \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)}\, dx_1\, dx_2 + P(X_2 < X_1) \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2)\, \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)}\, dx_1\, dx_2 \\
&= P(X_1 < X_2)\, E[X_1 + X_2 \mid X_1 < X_2] + P(X_2 < X_1)\, E[X_1 + X_2 \mid X_2 < X_1] \\
&= \tfrac{1}{2} \left( E[X_1 + X_2 \mid X_1 < X_2] + E[X_1 + X_2 \mid X_2 < X_1] \right) \\
&= E[X_1 + X_2 \mid X_1 < X_2]
\end{aligned}$$

So now I can solve for ##E[X_1 + X_2] = 2E[X_1]## instead to get my answer, which looks easier.

Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
$$P(X_1+X_2 \leq t \mid X_1 < X_2) = P(X_1+X_2 \leq t \mid X_1 > X_2) = P(X_1 + X_2 \leq t)$$
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
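Not from the thread, but this distributional identity is easy to probe empirically by comparing empirical CDFs of the conditional and unconditional sums at a few points (a sketch assuming NumPy; necessary evidence, not a proof):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x1 = rng.laplace(size=n)
x2 = rng.laplace(size=n)
s = x1 + x2
s_cond = s[x1 < x2]  # realizations of [X1 + X2 | X1 < X2]

# The claim: the conditional and unconditional sums share one CDF,
# so their empirical CDFs should agree up to sampling noise.
ts = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
cdf_all = np.array([(s <= t).mean() for t in ts])
cdf_cond = np.array([(s_cond <= t).mean() for t in ts])
cdf_gap = np.max(np.abs(cdf_all - cdf_cond))
print(cdf_gap)  # near 0: the two empirical CDFs agree
```

Note the result relies on ##X_1, X_2## being continuous (ties have probability zero) as well as iid, which is exactly the setting of this problem.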
 
Ray Vickson said:
Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
$$P(X_1+X_2 \leq t \mid X_1 < X_2) = P(X_1+X_2 \leq t \mid X_1 > X_2) = P(X_1 + X_2 \leq t)$$
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
That's really neat, thanks for sharing.
 
