Explain p.d.f. of the sum of random variables

  1. Oct 13, 2012 #1
    Hi,
    I need your help,
    Say we have two random variables with some joint pdf f(x,y). How would I go about finding the pdf of their sum?
     
  3. Oct 13, 2012 #2

    mathman

    Science Advisor
    Gold Member

    If the two random variables are dependent, there is no easy solution. (I tried Google!)
     
  4. Oct 13, 2012 #3
    I think I found the answer: google docs... page 10

    Another question: if I end up with an integral like that and there's a constant inside, does the pdf diverge, so that there's no answer?
     
  5. Oct 14, 2012 #4

    chiro

    Science Advisor

    Hey exoCHA.

    The PDF won't diverge if it's a valid PDF.

    The PDF may not have finite moments, however, but that is a completely different story. If you did everything right, your result should be a valid PDF, and for something like Z = X + Y its support will be bounded whenever the domains of X and Y in the joint PDF are bounded.

    Remember that even a Normal distribution is valid on the entire real line, just as Z may also be valid across the entire real line.
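
    For a concrete (made-up) example of a valid PDF without finite moments, here is a quick numerical sketch using the standard Cauchy density; nothing here is specific to this thread's problem.

    Python:
    import numpy as np

    # The standard Cauchy density 1/(pi*(1 + x^2)) is a valid pdf: it integrates to 1...
    x = np.linspace(-1e4, 1e4, 2_000_001)
    cauchy = 1.0 / (np.pi * (1.0 + x**2))
    print("area under the pdf:", np.sum(cauchy) * (x[1] - x[0]))  # close to 1

    # ...yet it has no finite mean: sample averages never settle down as n grows.
    rng = np.random.default_rng(0)
    for n in (10**3, 10**5, 10**7):
        print(n, rng.standard_cauchy(n).mean())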
     
  6. Oct 14, 2012 #5

    Mute

    Homework Helper

    Another way to find the pdf would be to calculate the joint characteristic function

    $$\Gamma(\mu_x,\mu_y) = \langle e^{i\mu_x x + i\mu_y y} \rangle = \int_{-\infty}^\infty dx \int_{-\infty}^\infty dy~\rho(x,y) e^{i\mu_x x + i\mu_y y}.$$

    One can recover the joint pdf by inverse transforming in the two different ##\mu##'s, but you can also find the pdf of the sum by setting ##\mu_x=\mu_y = \mu## and inverse Fourier transforming in ##\mu##:

    $$\rho_Z(z) = \int_{-\infty}^\infty \frac{d\mu}{2\pi} \Gamma(\mu,\mu)e^{-i\mu z},$$
    where z = x + y. (Whether or not this integral is easy to do is of course a separate issue).
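
    Here is a rough numerical sketch of this route, assuming (purely for illustration) a correlated bivariate normal for ##\rho(x,y)##; the grids and ranges are arbitrary choices, and everything is done by brute-force summation rather than anything clever.

    Python:
    import numpy as np

    corr = 0.5  # assumed correlation between X and Y (illustration only)

    def joint_pdf(x, y):
        # Standard bivariate normal density with correlation `corr`.
        norm = 1.0 / (2 * np.pi * np.sqrt(1 - corr**2))
        quad = (x**2 - 2 * corr * x * y + y**2) / (2 * (1 - corr**2))
        return norm * np.exp(-quad)

    # Grids for the double integral and for the inverse transform.
    x = np.linspace(-8, 8, 401)
    y = np.linspace(-8, 8, 401)
    dx, dy = x[1] - x[0], y[1] - y[0]
    X, Y = np.meshgrid(x, y)
    density, S = joint_pdf(X, Y), X + Y
    mu = np.linspace(-10, 10, 801)
    dmu = mu[1] - mu[0]

    # Gamma(mu, mu) = E[exp(i*mu*(X + Y))], approximated on the grid.
    gamma = np.array([np.sum(density * np.exp(1j * m * S)) * dx * dy for m in mu])

    # rho_Z(z) = (1/2pi) * integral of Gamma(mu, mu) * exp(-i*mu*z) dmu
    z = np.linspace(-6, 6, 121)
    pdf_z = np.array([np.sum(gamma * np.exp(-1j * mu * zz)) * dmu
                      for zz in z]).real / (2 * np.pi)

    # Here Z = X + Y is normal with variance 2 + 2*corr, so check against the exact pdf.
    var = 2 + 2 * corr
    exact = np.exp(-z**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    print(np.max(np.abs(pdf_z - exact)))  # should be small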
     
  7. Oct 14, 2012 #6

    mathman

    Science Advisor
    Gold Member

    If you have a density function f(x,y), then the probability that X + Y < z can be expressed as a double integral,
    ∫∫f(x,y)dxdy, where the integration region is x + y < z. This can be evaluated by substituting u = x + y for x, so du = dx. The double integral now has y range (-∞,∞), u range (-∞,z), and integrand g(u,y) = f(u-y,y). Differentiating with respect to z then gives the pdf of the sum: ∫f(z-y,y)dy, with y integrated over (-∞,∞).
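
    As a quick numerical sanity check of that last integral, here is a small sketch assuming (just as an example) a correlated bivariate normal for f(x,y); the grid and correlation value are arbitrary choices.

    Python:
    import numpy as np

    corr = 0.5  # assumed correlation between X and Y (illustration only)

    def f(x, y):
        # Standard bivariate normal density with correlation `corr`.
        norm = 1.0 / (2 * np.pi * np.sqrt(1 - corr**2))
        quad = (x**2 - 2 * corr * x * y + y**2) / (2 * (1 - corr**2))
        return norm * np.exp(-quad)

    y = np.linspace(-10, 10, 2001)
    dy = y[1] - y[0]

    def pdf_sum(z):
        # Density of Z = X + Y at z: integrate f(z - y, y) over y.
        return np.sum(f(z - y, y)) * dy

    # Here Z is normal with mean 0 and variance 2 + 2*corr, so compare at a few points.
    var = 2 + 2 * corr
    for z in (-2.0, 0.0, 1.5):
        exact = np.exp(-z**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        print(z, pdf_sum(z), exact)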
     