The CDF of the Sum of Independent Random Variables

EngWiPy
Hello all,

Suppose I have the sum ##X=\sum_{k=1}^KX_k##, where the ##\{X_k\}## are independent and identically distributed random variables with CDF ##F_{X_k}(x)## and PDF ##f_{X_k}(x)##. How can I find the CDF of ##X##?

Thanks in advance
 
mathman said:
There are two ways: directly, using convolutions, or via Fourier transforms.

https://www.statlect.com/fundamentals-of-probability/sums-of-independent-random-variables (direct)

The Fourier transform method: take the Fourier transform of each density function (in your case they are all the same), multiply them together (in your case, take the ##K##th power), and take the inverse Fourier transform.
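This recipe can be sketched numerically with an FFT. Everything below is illustrative and not from the thread: I assume ##X_k \sim \text{Exp}(1)##, so the ##K##-fold sum is Erlang and the result can be checked in closed form.

```python
import numpy as np
import math

# Illustrative sketch (not from the thread): sum of K iid Exp(1) variables.
# FFT the discretized PDF, raise it to the K-th power (the product of the K
# identical transforms), inverse-FFT back, then cumulative-sum for the CDF.
K = 4
dx = 0.001
x = np.arange(0.0, 60.0, dx)     # grid wide enough to hold the sum's mass
pdf = np.exp(-x)                 # Exp(1) density sampled on the grid

phi = np.fft.fft(pdf * dx)                # approximates the transform of f
pdf_sum = np.fft.ifft(phi**K).real / dx   # density of X = X_1 + ... + X_K
cdf_sum = np.cumsum(pdf_sum) * dx

# Exact check: the sum is Erlang(K, 1), with CDF 1 - e^{-x} sum_{n<K} x^n/n!
def erlang_cdf(t, k):
    return 1.0 - math.exp(-t) * sum(t**n / math.factorial(n) for n in range(k))

print(cdf_sum[int(5.0 / dx)], erlang_cdf(5.0, K))  # should agree closely
```

The FFT's circular wrap-around is harmless here only because the grid extends far past where the sum's density has any mass; in general you would zero-pad.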

Can I use the Laplace transform instead of the Fourier transform? I now remember using the moment generating function (MGF) in the past. So, can I find the MGF of ##X## and then take the inverse Laplace transform to find the CDF?
 
You can use the Laplace transform as you described. The expressions for the Laplace and Fourier transforms are similar. My personal preference is Fourier, since the inverse transform seems easier.
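To make the Laplace route concrete with a worked example (assuming, purely for illustration, ##X_k \sim \text{Exp}(\lambda)##): the transform of each density is ##\int_0^\infty e^{-sx}\lambda e^{-\lambda x}\,dx = \frac{\lambda}{s+\lambda}##, so the sum has transform ##\left(\frac{\lambda}{s+\lambda}\right)^K##. The inverse Laplace transform of that is the Erlang density ##f_X(x)=\frac{\lambda^K x^{K-1}e^{-\lambda x}}{(K-1)!}##, and integrating gives the CDF ##F_X(x)=1-e^{-\lambda x}\sum_{n=0}^{K-1}\frac{(\lambda x)^n}{n!}##.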
 
Hey S_David.

I should point out that if the random variables are discrete random variables (as opposed to continuous ones) then you should look into probability generating functions.
 
chiro said:
Hey S_David.

I should point out that if the random variables are discrete random variables (as opposed to continuous ones) then you should look into probability generating functions.

Thanks. They are actually continuous random variables.
 
One additional advantage of the Fourier transform over the Laplace transform arises when moments are infinite, e.g. the Cauchy distribution.
 
mathman said:
One additional advantage of the Fourier transform over the Laplace transform arises when moments are infinite, e.g. the Cauchy distribution.

The MGF of a random variable ##X## is defined as

$$\mathcal{M}_X(s)=E\left[e^{-sX}\right]=\int_{-\infty}^{\infty}e^{-sx}f_X(x)\,dx$$

In this definition, the PDF of ##X## is involved. How is the Fourier transform similar to this? I mean, there is no PDF involved in the definition of the Fourier transform, right? How do I get the CDF from the Fourier transform then?
 
S_David said:
The MGF of a random variable ##X## is defined as

$$\mathcal{M}_X(s)=E\left[e^{-sX}\right]=\int_{-\infty}^{\infty}e^{-sx}f_X(x)\,dx$$

In this definition, the PDF of ##X## is involved. How is the Fourier transform similar to this? I mean, there is no PDF involved in the definition of the Fourier transform, right? How do I get the CDF from the Fourier transform then?
The limits of integration for a moment generating function are in general ##(-\infty, \infty)##, so it is possible that the integral may not exist.

The Fourier transform is that of the PDF (similar to Laplace, except using ##e^{isx}##). To get the CDF from the Fourier transform, get the PDF (using the inverse transform) and integrate.
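There is also a way to skip the integrate-the-PDF step: the Gil-Pelaez inversion formula recovers the CDF directly from the characteristic function. A minimal numerical sketch (illustrative, not from the thread), assuming ##X## is a sum of ##K## standard normals so both the characteristic function and the exact CDF are known in closed form:

```python
import numpy as np
from math import erf, sqrt, pi

# Illustrative sketch: Gil-Pelaez inversion,
#   F(x) = 1/2 - (1/pi) * integral_0^inf Im[e^{-itx} phi(t)] / t dt,
# gives the CDF directly from the characteristic function phi.
# Here X = X_1 + ... + X_K with X_k ~ N(0,1), so phi(t) = exp(-K t^2 / 2).
K = 3

def phi(t):
    return np.exp(-K * t**2 / 2)

def cdf_gil_pelaez(x, t_max=40.0, n=400_000):
    dt = t_max / n
    t = (np.arange(n) + 0.5) * dt    # midpoint rule; avoids the t = 0 point
    integrand = np.imag(np.exp(-1j * t * x) * phi(t)) / t
    return 0.5 - np.sum(integrand) * dt / pi

x = 1.0
exact = 0.5 * (1 + erf(x / sqrt(2 * K)))   # CDF of N(0, K) at x
print(cdf_gil_pelaez(x), exact)            # should agree closely
```

The truncation point `t_max` works here because the normal characteristic function decays fast; for heavy-tailed cases (e.g. Cauchy) the integration range must be chosen more carefully.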
 
There is a more general formula for when the variable can be either discrete, continuous, or a mixture of the two (or even singular if you wish).

We have
$$P[X+Y\le x] = P[X\le x-Y] = E[F_X(x-Y)] = \int F_X(x-y)\,dF_Y(y)$$

The integral above is a Stieltjes integral, so we recover the standard convolution formulas when ##X## and ##Y## are both purely discrete or both absolutely continuous.
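For instance (a hypothetical discrete example, not from the thread): with ##X## and ##Y## independent fair dice, the Stieltjes integral reduces to the sum ##\sum_y F_X(x-y)\,P[Y=y]##:

```python
from fractions import Fraction

# Illustrative discrete case of P[X+Y <= x] = sum_y F_X(x - y) P[Y = y]:
# X and Y are independent fair six-sided dice.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def F_X(t):
    # CDF of a single die
    return sum(p for v, p in pmf.items() if v <= t)

def cdf_sum(x):
    # the Stieltjes integral reduces to a sum against the PMF of Y
    return sum(F_X(x - y) * p_y for y, p_y in pmf.items())

print(cdf_sum(7))  # -> 7/12 (21 of the 36 outcomes give a sum <= 7)
```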
 