Explain p.d.f. of the sum of random variables

AI Thread Summary
To find the probability density function (PDF) of the sum of two random variables with a joint PDF f(x,y), one can use the joint characteristic function or evaluate a double integral. If the variables are dependent, the process is more complex, but a valid PDF will not diverge if calculated correctly. The PDF may not yield finite moments, but it should still represent a valid distribution if the domains of X and Y are finite. The probability that X + Y is less than a certain value can be expressed as a double integral with appropriate limits.
exoCHA
Hi,
I need your help,
Say we have two random variables with some joint pdf f(x,y). How would I go about finding the pdf of their sum?
 
If the two random variables are dependent, there is no easy general solution. (I tried Google!)
 
I think I found the answer: google docs... page 10

Another question: if I end up with an integral like that with a constant inside, does that mean the pdf of the sum diverges and there is no answer?
 
Hey exoCHA.

The PDF won't diverge if it's a valid PDF.

The PDF may not have finite moments, but that is a completely different story. If you did everything correctly, your result should be a valid PDF, and for Z = X + Y it will assign density only to a finite range of z whenever the domains of X and Y in your joint PDF are finite.

Remember that even a Normal distribution is valid on the entire real line, and in the same way Z may be valid across the entire real line.
 
Another way to find the pdf would be to calculate the joint characteristic function

$$\Gamma(\mu_x,\mu_y) = \langle e^{i\mu_x x + i\mu_y y} \rangle = \int_{-\infty}^\infty dx \int_{-\infty}^\infty dy~\rho(x,y) e^{i\mu_x x + i\mu_y y}.$$

One can recover the joint pdf by inverse transforming in the two different mu's, but you can also find the pdf of their sum by setting both ##\mu_x=\mu_y = \mu## and inverse Fourier transforming in ##\mu##:

$$\rho_Z(z) = \int_{-\infty}^\infty \frac{d\mu}{2\pi} \Gamma(\mu,\mu)e^{-i\mu z},$$
where z = x + y. (Whether or not this integral is easy to do is of course a separate issue).
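As a numerical sanity check of the inversion formula above (my own example, not from the thread): if we assume X and Y are independent standard normals, then ##\Gamma(\mu,\mu) = \langle e^{i\mu(X+Y)}\rangle = e^{-\mu^2}##, and the inverse transform should reproduce the N(0, 2) density of Z = X + Y.

```python
import numpy as np

# Assumed example: X, Y independent standard normals, so
# Gamma(mu, mu) = E[exp(i*mu*(X+Y))] = exp(-mu**2).
# Inverting per the formula above should recover the N(0, 2) density of Z.
mu = np.linspace(-10.0, 10.0, 4001)
dmu = mu[1] - mu[0]
z = 1.3
gamma = np.exp(-mu**2)                            # Gamma(mu, mu) for this choice
integrand = gamma * np.exp(-1j * mu * z) / (2 * np.pi)
rho_z = (integrand.sum() * dmu).real              # crude numerical inverse transform

exact = np.exp(-z**2 / 4) / (2 * np.sqrt(np.pi))  # N(0, 2) density at z
print(rho_z, exact)
```

Because the integrand decays like a Gaussian, even this crude Riemann sum matches the exact density to many digits; a harder joint characteristic function would need more careful quadrature.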
 
If you have a density function f(x,y), then the probability that X + Y < z can be expressed as a double integral,
∫∫f(x,y)dxdy, where the integration region is x + y < z. This can be evaluated with the substitution u = x + y (replacing x), so du = dx. The double integral then has y ranging over (-∞,∞), u ranging over (-∞,z), and integrand g(u,y) = f(u-y,y). Differentiating with respect to z then gives the pdf of the sum: ρ_Z(z) = ∫f(z-y,y)dy.
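A minimal numerical sketch of this substitution approach (my own example, not from the thread): assume X and Y are independent Uniform(0, 1), so f(x, y) = 1 on the unit square. The single integral ∫f(z-y, y)dy should then reproduce the triangular density of the sum on (0, 2).

```python
import numpy as np

# Assumed example: X, Y independent Uniform(0, 1), so f(x, y) = 1 on the
# unit square. After substituting u = x + y and differentiating in z, the
# pdf of the sum is g(z) = integral over y of f(z - y, y) dy, which for
# uniforms is the triangular density on (0, 2).

def f(x, y):
    """Joint pdf of two independent Uniform(0, 1) variables."""
    return np.where((x >= 0) & (x <= 1) & (y >= 0) & (y <= 1), 1.0, 0.0)

def pdf_sum(z, n=200_001):
    """Approximate g(z) = integral of f(z - y, y) dy on a fine grid in y."""
    y = np.linspace(0.0, 1.0, n)   # support of Y has length 1,
    return f(z - y, y).mean()      # so the mean approximates the integral

print(pdf_sum(0.5), pdf_sum(1.0), pdf_sum(1.5))
```

The triangular density is z for 0 < z < 1 and 2 - z for 1 < z < 2, so the three printed values should be close to 0.5, 1.0, and 0.5.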
 