
Convolution and Probability Distributions

  1. Jan 23, 2015 #1

    O_o

    1. The problem statement, all variables and given/known data
    Have two iid random variables ##X_1, X_2## with common density [tex] f(x) = \frac{\lambda}{2}e^{-\lambda |x|}, \quad x \in\mathbb{R}[/tex]

    I'm asked to solve for [tex] E[X_1 + X_2 | X_1 < X_2] [/tex]

    2. Relevant equations


    3. The attempt at a solution
    So what I'm trying to do is create a new random variable [tex] Z = X_1 + X_2 [/tex] When I do this, I get the following convolution formula for its density: [tex] g(z) = \int_{-\infty}^{\infty} \frac{\lambda^2}{4} e^{-\lambda |z- x_1|} e^{-\lambda |x_1|} dx_1[/tex]

    I'd really only like some advice on how to attack this integral. It looks like I need to break it into cases depending on whether ##z < x_1## or ##z > x_1##, but that doesn't seem like it will produce a clean solution.

    Or, if you can see that I'm attacking this problem completely the wrong way and shouldn't even be trying this, please let me know. No alternative method of attack needed; I can try to figure out other ways if this is completely off base.

    Thanks

    edit:
    I've had a thought. If ##X_1 > 0## and ##X_1 < X_2##, then [tex] Z = X_1 + X_2 > 2X_1 > X_1[/tex] So now if I can do something similar for ##X_1 < 0## I can evaluate the integral.
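
    As a numerical sanity check, here is a minimal sketch that evaluates this convolution integral by quadrature (assuming ##\lambda = 1## for concreteness, since the problem keeps it symbolic):

[code]
import numpy as np
from scipy.integrate import quad

lam = 1.0  # assumed rate parameter; the problem keeps lambda symbolic

def g(z):
    """Density of Z = X1 + X2 via the convolution integral above."""
    integrand = lambda x1: (lam**2 / 4.0) * np.exp(-lam * (abs(z - x1) + abs(x1)))
    a, b = sorted((0.0, z))  # the integrand has kinks at x1 = 0 and x1 = z
    pieces = [(-np.inf, a), (a, b), (b, np.inf)]
    return sum(quad(integrand, lo, hi)[0] for lo, hi in pieces)

# symmetry and normalization checks
print(g(1.5), g(-1.5))                         # should agree: g is even
print(quad(g, -np.inf, np.inf, limit=200)[0])  # should be very close to 1
[/code]

    Note the symmetry ##g(z) = g(-z)## in the output.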
     
    Last edited: Jan 23, 2015
  2. Jan 23, 2015 #2

    mfb


    Staff: Mentor

    It is possible to do it with an integral, but you can also use the symmetry of the problem. Consider $$E[X_1 + X_2 \,|\, X_1 > X_2]$$
     
  3. Jan 23, 2015 #3

    haruspex

    Science Advisor
    Homework Helper
    Gold Member

    I don't see how that avoids doing the integral, but it is certainly worth considering symmetries. Forgetting the (irrelevant) condition ##X_1 < X_2##, there are six orderings of ##0, z, x_1##. Symmetries get it down to only two distinct integrals. E.g. consider ##g(-z)##.
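
    Concretely, for ##z \ge 0## the absolute values resolve on three ranges of ##x_1## (a sketch; the case ##z < 0## then follows from ##g(-z) = g(z)##):
    [tex]
    g(z) = \frac{\lambda^2}{4}\left[ \int_{-\infty}^{0} e^{-\lambda(z-x_1)}e^{\lambda x_1}\,dx_1 + \int_{0}^{z} e^{-\lambda(z-x_1)}e^{-\lambda x_1}\,dx_1 + \int_{z}^{\infty} e^{-\lambda(x_1-z)}e^{-\lambda x_1}\,dx_1 \right]
    [/tex]
    The first and third integrals both evaluate to ##e^{-\lambda z}/(2\lambda)##, so only two distinct integrals need to be computed.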
     
  4. Jan 24, 2015 #4

    O_o


    Thanks guys. I hadn't considered the symmetry of the problem. Does this look alright:

    [tex] E[X_1 + X_2]\\
    = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\
    = \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2)f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 + \int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) f_{X_1,X_2}(x_1, x_2) dx_1 dx_2 \\
    = P(X_1 < X_2) \int_{-\infty}^{\infty}\int_{-\infty}^{x_2} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_1 < X_2)} dx_1 dx_2 + P(X_2 < X_1)\int_{-\infty}^{\infty}\int_{x_2}^{\infty} (x_1 + x_2) \frac{f_{X_1,X_2}(x_1, x_2)}{P(X_2 < X_1)} dx_1 dx_2 \\
    = P(X_1 < X_2) E[X_1 + X_2 | X_1 < X_2] + P(X_2 < X_1) E[X_1 + X_2 | X_2 < X_1] \\
    = \frac{1}{2} \left(E[X_1 + X_2 | X_1 < X_2] + E[X_1 + X_2| X_2 < X_1] \right) \\
    = E[X_1 + X_2 | X_1 < X_2]
    [/tex]

    The last step uses the symmetry of the iid pair: swapping ##X_1## and ##X_2## shows the two conditional expectations are equal. So now I can solve for [tex] E[X_1 + X_2] = 2E[X_1][/tex] instead to get my answer, which looks easier.
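
    A quick simulation supports this (a minimal sketch assuming ##\lambda = 1##; note numpy parametrizes the Laplace distribution by the scale ##b = 1/\lambda##):

[code]
import numpy as np

rng = np.random.default_rng(0)
lam = 1.0                 # assumed rate; numpy's scale is b = 1/lambda
n = 1_000_000
x1 = rng.laplace(scale=1.0 / lam, size=n)
x2 = rng.laplace(scale=1.0 / lam, size=n)
s = x1 + x2

# conditional vs. unconditional mean: both should be near 2*E[X1] = 0
print(s.mean(), s[x1 < x2].mean())
[/code]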
     
    Last edited: Jan 24, 2015
  5. Jan 24, 2015 #5

    mfb


    Staff: Mentor

    I think you can skip the integration steps: ##P(X_1 < X_2) = P(X_1 > X_2) = \frac{1}{2}## follows from symmetry, and the line where it appears is simply a weighted average (the law of total expectation). But the approach works and the result is right.
     
  6. Jan 24, 2015 #6

    Ray Vickson

    Science Advisor
    Homework Helper

    Even easier: you can use the surprising result that for iid continuous ##X_1, X_2## and for all ##t \in \mathbb{R}## we have
    [tex]P(X_1+X_2 \leq t \,|\, X_1 < X_2)\\
    = P(X_1+X_2 \leq t \,|\, X_1 > X_2) \\
    = P(X_1 + X_2 \leq t) [/tex]
    In other words, the random variables ##X_1 + X_2##, ##[X_1 + X_2 \,|\, X_1 < X_2]## and ##[X_1+X_2 \,|\, X_1 > X_2]## all have the same distribution, hence the same expectation! (Why: swapping ##X_1 \leftrightarrow X_2## gives ##P(X_1+X_2 \leq t,\, X_1 < X_2) = P(X_1+X_2 \leq t,\, X_1 > X_2)##, and since ##P(X_1 = X_2) = 0## these two probabilities sum to ##P(X_1+X_2 \leq t)##, so each conditional probability equals the unconditional one.)
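
    A quantile comparison makes this easy to check numerically (again a sketch with ##\lambda = 1##):

[code]
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.laplace(size=500_000)      # lambda = 1, i.e. scale b = 1
x2 = rng.laplace(size=500_000)
s = x1 + x2
qs = [0.1, 0.25, 0.5, 0.75, 0.9]

# the three rows should agree up to sampling noise
print(np.quantile(s, qs))           # X1 + X2
print(np.quantile(s[x1 < x2], qs))  # X1 + X2 | X1 < X2
print(np.quantile(s[x1 > x2], qs))  # X1 + X2 | X1 > X2
[/code]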
     
  7. Jan 24, 2015 #7

    O_o


    That's really neat, thanks for sharing.
     