
Convolution and Probability Distributions

  • Thread starter O_o
  • #1
O_o

Homework Statement


I have two iid random variables with common density [tex] f(x) = \frac{\lambda}{2}e^{-\lambda |x|}, \quad x \in\mathbb{R}.[/tex]

I'm asked to compute [tex] E[X_1 + X_2 | X_1 < X_2]. [/tex]

Homework Equations




The Attempt at a Solution


So what I'm trying to do is define a new random variable [tex] Z = X_1 + X_2. [/tex] When I do this I get the following convolution formula for its density: [tex] g(z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z- x_1|} e^{-\lambda |x_1|} \, dx_1.[/tex]

I'd really only like some advice on how to go about attacking this integral. It looks to me like I need to break it into cases depending on the signs of ##x_1## and ##z - x_1##, but that doesn't seem like it will produce a clean solution.

Or, if you can see that I'm attacking this problem completely the wrong way and shouldn't even be trying this, please let me know. No alternative method of attack is needed; I can try to figure out other approaches if this one is completely off base.

Thanks

edit:
I've had a thought. If ##X_1 > 0## then, using ##X_2 > X_1##, [tex] Z = X_1 + X_2 > 2X_1 > X_1.[/tex] So now if I can do something similar for ##X_1 < 0## I can evaluate the integral.
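For what it's worth, pushing the case split through does give a clean answer. A sketch for ##z > 0##, splitting the line at ##x_1 = 0## and ##x_1 = z## (the ##z < 0## case then follows by symmetry):

[tex]
g(z) = \frac{\lambda^2}{4}\left[ \int_{-\infty}^{0} e^{-\lambda(z - 2x_1)}\, dx_1 + \int_{0}^{z} e^{-\lambda z}\, dx_1 + \int_{z}^{\infty} e^{-\lambda(2x_1 - z)}\, dx_1 \right] = \frac{\lambda}{4}(1 + \lambda z)e^{-\lambda z}.
[/tex]

Since each ##X_i## is symmetric about 0, ##g(-z) = g(z)##, so ##g(z) = \frac{\lambda}{4}(1 + \lambda |z|)e^{-\lambda |z|}## for all ##z##.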
 

Answers and Replies

  • #2
It is possible to use an integral, but you can use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
 
  • #3
haruspex
It is possible to use an integral, but you can use the symmetry of the problem. Consider $$E[X_1 + X_2 | X_1 > X_2]$$
I don't see how that gets one out of doing the integral. But certainly it is worth considering symmetries. Forgetting the (irrelevant) ##X_1 < X_2##, there are six orderings of ##0, z, x_1##. Symmetries get it down to only two integrals. E.g. consider ##g(-z)##.
 
  • #4
O_o
Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

[tex] E[X_1 + X_2]\\
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\
= \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\
= P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)} dx_1 dx_2 + P(X_2 < X_1)\int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)} dx_1 dx_2 \\
= P(X_1 < X_2) E[X_1 + X_2 | X_1 < X_2] + P(X_2 < X_1) E[X_1 + X_2 | X_2 < X_1] \\
= \frac{1}{2} \left(E[X_1 + X_2 | X_1 < X_2] + E[X_1 + X_2| X_2 < X_1] \right) \\
= E[X_1 + X_2 | X_1 < X_2]
[/tex]

The last equality holds because swapping ##X_1## and ##X_2## (they are iid) shows the two conditional expectations are equal. So now I can compute [tex] E[X_1 + X_2] = 2E[X_1][/tex] instead to get my answer, which looks easier.
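Since the density is symmetric about 0, ##E[X_1] = 0##, so the answer comes out to 0. A quick Monte Carlo sanity check (a minimal sketch; ##\lambda = 1## is an arbitrary choice, and numpy's Laplace scale parameter is ##1/\lambda##):

[code]
# Monte Carlo check that E[X1+X2 | X1<X2] = E[X1+X2] = 2*E[X1] = 0.
import numpy as np

rng = np.random.default_rng(0)
lam = 1.0          # arbitrary choice of lambda for the check
n = 1_000_000

# numpy's Laplace density is (1/2b) * exp(-|x|/b), so b = 1/lam
# matches f(x) = (lam/2) * exp(-lam*|x|).
x1 = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)
x2 = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)

s = x1 + x2
cond = x1 < x2     # the conditioning event X1 < X2

print("E[X1+X2]         ~", s.mean())        # close to 0
print("E[X1+X2 | X1<X2] ~", s[cond].mean())  # also close to 0
[/code]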
 
  • #5
I think you can skip the integration steps: ##P(X_1 < X_2) = P(X_1 > X_2) = \frac{1}{2}## follows from symmetry, and the formula where they appear is just the weighted average of the two conditional expectations. But the approach works and the result is right.
 
  • #6
Ray Vickson
So now I can compute [tex] E[X_1 + X_2] = 2E[X_1][/tex] instead to get my answer, which looks easier.
Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
[tex]P(X_1+X_2 \leq t \,|\, X_1 < X_2)\\
= P(X_1+X_2 \leq t \,|\, X_1 > X_2) \\
= P(X_1 + X_2 \leq t) [/tex]
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
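One way to see this (a sketch, using only that ##(X_1, X_2)## is exchangeable and ##P(X_1 = X_2) = 0##): swapping ##X_1## and ##X_2## maps the event ##\{X_1 + X_2 \leq t, X_1 < X_2\}## onto ##\{X_1 + X_2 \leq t, X_1 > X_2\}## without changing its probability, so
[tex]
P(X_1 + X_2 \leq t,\, X_1 < X_2) = P(X_1 + X_2 \leq t,\, X_1 > X_2) = \tfrac{1}{2} P(X_1 + X_2 \leq t),
[/tex]
and dividing by ##P(X_1 < X_2) = \tfrac{1}{2}## gives the stated equalities.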
 
  • #7
O_o
In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 | X_1 < X_2]## and ##[X_1+X_2|X_1 > X_2]## all have the same distribution, hence the same expectation!
That's really neat, thanks for sharing.
 
