# Explain p.d.f. of the sum of random variables

1. Oct 13, 2012

### exoCHA

Hi,
Say we have two random variables with some joint pdf f(x,y). How would I go about finding the pdf of their sum?

2. Oct 13, 2012

### mathman

If the two random variables are independent, the pdf of the sum is the convolution of the two marginal pdfs. If they are dependent there is no easy general solution. (I tried Google!)
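For independent variables the convolution formula can be checked numerically. A minimal sketch, assuming a concrete case the thread never fixes: two independent Exp(1) variables, whose sum is Gamma(2, 1) with pdf $z e^{-z}$.

```python
import numpy as np

# For INDEPENDENT X and Y the pdf of Z = X + Y is the convolution of the
# marginal pdfs.  Assumed concrete case: two independent Exp(1) variables,
# whose sum is Gamma(2, 1) with pdf z * exp(-z).

dz = 0.001
z = np.arange(0.0, 20.0, dz)
f = np.exp(-z)                          # Exp(1) marginal pdf on the grid

# The discrete convolution approximates (f * f)(z) = ∫ f(u) f(z - u) du
pdf_sum = np.convolve(f, f)[:len(z)] * dz

exact = z * np.exp(-z)                  # Gamma(2, 1) pdf
err = np.max(np.abs(pdf_sum - exact))
print(err)
```

The agreement is limited only by the grid spacing; refining `dz` shrinks the error.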

3. Oct 13, 2012

### exoCHA

Another question: if I end up with an integral like that with a constant inside, does the pdf of the sum diverge, so that there is no answer?

4. Oct 14, 2012

### chiro

Hey exoCHA.

The PDF won't diverge if it's a valid PDF.

The PDF may fail to have finite moments, but that is a completely different story. If you did everything correctly, your result should be a valid PDF, and for something like Z = X + Y its support will be finite whenever the domains of X and Y in the joint PDF are finite.

Remember that the Normal distribution is supported on the entire real line, and Z can likewise be supported on the entire real line.
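The distinction between "valid PDF" and "finite moments" can be made concrete. A small sketch using the standard Cauchy density (my choice of example, not one from the thread): it integrates to 1, yet its mean integral grows like $\log L$ with the cutoff $L$ and so diverges.

```python
import numpy as np

# A pdf can be perfectly valid (it integrates to 1) while having no
# finite moments.  The standard Cauchy density is the classic example
# (chosen here just for illustration).

def cauchy_pdf(x):
    return 1.0 / (np.pi * (1.0 + x**2))

x = np.linspace(-1e4, 1e4, 2_000_001)
total = np.sum(cauchy_pdf(x)) * (x[1] - x[0])   # ~ 1: a valid pdf

# The mean integral E|X| = ∫ |x| pdf(x) dx grows like log(L) with the
# cutoff L, so it diverges: the first moment does not exist.
partial_means = []
for L in (1e2, 1e4, 1e6):
    xs = np.linspace(0.0, L, 1_000_001)
    partial_means.append(2 * np.sum(xs * cauchy_pdf(xs)) * (xs[1] - xs[0]))
print(total, partial_means)
```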

5. Oct 14, 2012

### Mute

Another way to find the pdf would be to calculate the joint characteristic function

$$\Gamma(\mu_x,\mu_y) = \langle e^{i\mu_x x + i\mu_y y} \rangle = \int_{-\infty}^\infty dx \int_{-\infty}^\infty dy~\rho(x,y) e^{i\mu_x x + i\mu_y y}.$$

One can recover the joint pdf by inverse transforming in the two $\mu$'s separately, but you can also find the pdf of the sum by setting $\mu_x = \mu_y = \mu$ and inverse Fourier transforming in $\mu$:

$$\rho_Z(z) = \int_{-\infty}^\infty \frac{d\mu}{2\pi} \Gamma(\mu,\mu)e^{-i\mu z},$$
where $z = x + y$. (Whether or not this integral is easy to do is, of course, a separate issue.)
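This inversion can be sketched numerically for an assumed concrete case: X, Y independent standard normals, so the joint characteristic function factorizes and $\Gamma(\mu,\mu) = e^{-\mu^2}$; inverting in $\mu$ should recover the N(0, 2) density of Z = X + Y.

```python
import numpy as np

# Assumed instance of the recipe above: X, Y independent standard
# normals, so Gamma(mu, mu) = exp(-mu**2).  Inverting in mu recovers
# the N(0, 2) density of Z = X + Y.

mu = np.linspace(-20.0, 20.0, 40001)
dmu = mu[1] - mu[0]
gamma = np.exp(-mu**2)                       # Gamma(mu, mu)

z = np.linspace(-5.0, 5.0, 201)
# rho_Z(z) = (1/2pi) ∫ Gamma(mu, mu) e^{-i mu z} dmu; the integrand's
# imaginary part is odd in mu, so only the cosine part survives.
rho = np.array([np.sum(gamma * np.cos(mu * zi)) for zi in z]) * dmu / (2 * np.pi)

exact = np.exp(-z**2 / 4.0) / np.sqrt(4.0 * np.pi)   # N(0, 2) density
err = np.max(np.abs(rho - exact))
print(err)
```

Here the quadrature is trivial because $e^{-\mu^2}$ decays fast; for heavier-tailed characteristic functions the inversion integral needs more care.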

6. Oct 14, 2012

### mathman

If you have a density function f(x,y), then the probability that X+Y < z can be expressed as a double integral,
∫∫f(x,y)dxdy, taken over the region x+y < z. This can be evaluated by substituting u = x+y for x, so du = dx. The double integral now has y range (-∞,∞), u range (-∞,z), and integrand g(u,y) = f(u-y,y). Differentiating with respect to z then gives the density of the sum: ∫f(z-y,y)dy, with y ranging over (-∞,∞).
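This substitution can be checked numerically for an assumed concrete case: X, Y independent Uniform(0, 1). Differentiating P(X + Y < z) with respect to z gives the density ∫ f(z - y, y) dy, which for independent uniforms is the triangular density on [0, 2].

```python
import numpy as np

# mathman's substitution, carried one step further for an assumed case:
# X, Y independent Uniform(0, 1).  The density of the sum is
#     rho_Z(z) = ∫ f(z - y, y) dy,
# which for independent uniforms is triangular on [0, 2].

def f(x, y):
    # joint pdf of two independent Uniform(0, 1) variables
    return ((0.0 <= x) & (x <= 1.0) & (0.0 <= y) & (y <= 1.0)).astype(float)

y = np.linspace(-1.0, 3.0, 400001)
dy = y[1] - y[0]

results = {}
for z_val, expected in [(0.5, 0.5), (1.0, 1.0), (1.5, 0.5)]:
    results[z_val] = np.sum(f(z_val - y, y)) * dy
    print(z_val, results[z_val], expected)
```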