The CDF of the Sum of Independent Random Variables

Discussion Overview

The discussion revolves around finding the cumulative distribution function (CDF) of the sum of independent and identically distributed random variables. Participants explore various methods, including convolutions, Fourier transforms, and moment generating functions (MGFs), while considering both continuous and discrete cases.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant introduces the problem of finding the CDF of the sum of independent random variables and asks for methods to achieve this.
  • Another participant suggests using convolutions and Fourier transforms as two methods to find the CDF.
  • A follow-up question arises about the possibility of using the Laplace transform and MGFs instead of Fourier transforms.
  • Some participants express a preference for Fourier transforms, citing ease of use, especially in cases where moments are infinite, such as with the Cauchy distribution.
  • There is a discussion about the relationship between MGFs and Fourier transforms, particularly regarding the involvement of probability density functions (PDFs) in their definitions.
  • One participant mentions a more general formula applicable to discrete, continuous, or mixed random variables, referencing a Stieltjes integral for convolution.

Areas of Agreement / Disagreement

Participants present multiple competing views on the methods to find the CDF, including the use of Fourier transforms, Laplace transforms, and MGFs. There is no consensus on a single preferred method, and the discussion remains unresolved regarding the best approach.

Contextual Notes

Participants note that the choice of method may depend on whether the random variables are discrete or continuous, and some express concerns about the existence of integrals in the context of MGFs.

EngWiPy
Hello all,

Suppose I have the following summation ##X=\sum_{k=1}^KX_k## where the ##\{X_k\}## are independent and identically distributed random variables with CDF and PDF of ##F_{X_k}(x)## and ##f_{X_k}(x)##, respectively. How can I find the CDF of ##X##?

Thanks in advance
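When no closed form is at hand, a quick Monte Carlo check is a useful baseline. The sketch below is my own illustration, not from the thread: it estimates the CDF of ##X## at a point by simulation, with the ##X_k## assumed Exponential(1) so that ##X## is Erlang/Gamma(##K##, 1) and the exact answer is available for comparison.

```python
import math
import numpy as np

# Monte Carlo sketch: estimate the CDF of X = X_1 + ... + X_K at a point
# and compare with a case where the answer is known in closed form.
# With X_k ~ Exponential(1) (an assumed example), X is Erlang/Gamma(K, 1).
rng = np.random.default_rng(0)
K, n, x0 = 5, 200_000, 4.0

X = rng.exponential(scale=1.0, size=(n, K)).sum(axis=1)  # n draws of X
empirical = float(np.mean(X <= x0))                      # empirical CDF at x0

# Erlang(K, 1) CDF: 1 - e^{-x} * sum_{k < K} x^k / k!
exact = 1 - math.exp(-x0) * sum(x0**k / math.factorial(k) for k in range(K))
```

With 200,000 samples the empirical value agrees with the exact CDF to a few decimal places; the error shrinks like ##1/\sqrt{n}##.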
 
mathman said:
There are two ways: direct, using convolutions, or via Fourier transforms.

https://www.statlect.com/fundamentals-of-probability/sums-of-independent-random-variables (direct)

The Fourier transform method: take the Fourier transform of each density function (in your case, all the same), multiply them together (in your case, the ##K##th power), and take the inverse Fourier transform.
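A numerical sketch of this route (my own example with assumed Exponential(1) terms, not part of the post above): sample the PDF on a grid, use the FFT as a discrete stand-in for the Fourier transform, raise it to the ##K##th power, and invert. The sum of ##K## Exponential(1) variables is Gamma(##K##, 1), which gives an exact PDF to compare against.

```python
import numpy as np

# FFT sketch of the Fourier-transform method: transform the sampled PDF,
# take the K-th power, invert. Exponential(1) terms are assumed so the
# exact answer, the Gamma(3, 1) density x^2 e^{-x} / 2, is known.
K = 3
dx = 0.01
x = np.arange(0, 60, dx)                 # grid wide enough for the PDF to decay
f = np.exp(-x)                           # Exponential(1) PDF samples

phi = np.fft.fft(f) * dx                 # approximate Fourier transform of f
f_sum = np.fft.ifft(phi**K).real / dx    # PDF of the K-fold sum

exact = x**(K - 1) * np.exp(-x) / 2.0    # Gamma(3, 1) PDF
err = float(np.max(np.abs(f_sum - exact)))
```

One caveat of this discretization: FFT convolution is circular, so the grid must extend far enough that the density has decayed to numerical zero, or the tail wraps around.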

Can I use the Laplace transform instead of the Fourier transform? I remember using the moment generating function (MGF) in the past. So, can I find the MGF of ##X##, and then take the inverse Laplace transform to find the CDF?
 
You can use the Laplace transform as you described. The expressions for the Laplace and Fourier transforms are similar. My personal preference is Fourier, since the inverse transform seems easier.
 
Hey S_David.

I should point out that if the random variables are discrete random variables (as opposed to continuous ones) then you should look into probability generating functions.
 
chiro said:
Hey S_David.

I should point out that if the random variables are discrete random variables (as opposed to continuous ones) then you should look into probability generating functions.

Thanks. They are actually continuous random variables.
 
One additional advantage to Fourier transform as opposed to Laplace is when moments are infinite. Example - Cauchy distribution.
 
mathman said:
One additional advantage to Fourier transform as opposed to Laplace is when moments are infinite. Example - Cauchy distribution.

The MGF of a random variable X is defined as

$$\mathcal{M}_X(s)=E\left[e^{-sX}\right]=\int_{-\infty}^{\infty}e^{-sx}f_X(x)\,dx$$

In this definition, the PDF of ##X## is involved. How is the Fourier transform similar to this? I mean, there is no PDF involved in the definition of the Fourier transform, right? How do we get the CDF from the Fourier transform then?
 
S_David said:
The MGF of a random variable X is defined as

$$\mathcal{M}_X(s)=E\left[e^{-sX}\right]=\int_{-\infty}^{\infty}e^{-sx}f_X(x)\,dx$$

In this definition, the PDF of ##X## is involved. How is the Fourier transform similar to this? I mean, there is no PDF involved in the definition of the Fourier transform, right? How do we get the CDF from the Fourier transform then?
The limits of integration for a moment generating function are, in general, ##(-\infty, \infty)##, so it is possible the integral does not exist.

The Fourier transform is that of the PDF (similar to Laplace, except using ##e^{isx}##). To get the CDF from the Fourier transform, recover the PDF using the inverse transform and integrate it.
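The recipe above (invert the transform to get the PDF, then integrate) can be sketched numerically. This is my own illustration with assumed Exponential(1) terms, so the exact CDF of the ##K##-fold sum is the Gamma(##K##, 1) CDF.

```python
import numpy as np

# Invert the (discrete) Fourier transform to get the PDF of the sum,
# then integrate the PDF to get the CDF. Exponential(1) terms assumed,
# so the exact CDF of the 2-fold sum is 1 - e^{-x}(1 + x).
K = 2
dx = 0.01
x = np.arange(0, 60, dx)
f = np.exp(-x)                           # Exponential(1) PDF samples

phi = np.fft.fft(f) * dx                 # transform of the PDF
f_sum = np.fft.ifft(phi**K).real / dx    # inverse transform -> PDF of the sum
F_sum = np.cumsum(f_sum) * dx            # integrate the PDF -> CDF

x0 = 3.0
i = int(round(x0 / dx))
exact = 1 - np.exp(-x0) * (1 + x0)       # Gamma(2, 1) CDF at x0
```

The grid spacing and extent control the accuracy: the cumulative sum is a simple Riemann integration, so the error is of order ##dx##.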
 
  • Like
Likes   Reactions: EngWiPy
There is a more general formula for when the variable can be either discrete, continuous, or a mixture of the two (or even singular if you wish).

We have
$$P[X+Y\le x] = P[X\le x-Y] = E[F_X(x-Y)] = \int F_X(x-y)\,dF_Y(y)$$

The integral above is a Stieltjes integral, so we recover the standard convolution formulas when ##X## and ##Y## are both purely discrete or both absolutely continuous.
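As a sanity check of the Stieltjes formula in a mixed case (my own example, not from the thread): take ##X \sim## Exponential(1) (continuous) and ##Y \sim## Bernoulli(0.3) (discrete), so the integral over ##dF_Y## reduces to a two-point sum, and compare against a Monte Carlo estimate of ##P[X+Y\le x]##.

```python
import numpy as np

def F_X(t):
    # Exponential(1) CDF
    return 1.0 - np.exp(-t) if t > 0 else 0.0

# Stieltjes formula with X ~ Exponential(1) (continuous) and
# Y ~ Bernoulli(p) (discrete): the integral over dF_Y is a two-point sum,
# weighting F_X(x - y) by the probability of each atom y of Y.
p, x0 = 0.3, 2.0
formula = (1 - p) * F_X(x0) + p * F_X(x0 - 1.0)

# Monte Carlo estimate of P[X + Y <= x0] for comparison
rng = np.random.default_rng(1)
n = 200_000
samples = rng.exponential(1.0, n) + rng.binomial(1, p, n)
mc = float(np.mean(samples <= x0))
```

The two estimates agree to Monte Carlo accuracy, and note that ##X+Y## here has a CDF that is continuous but whose density has a jump at ##x=1##, a case the pure convolution formulas do not cover directly.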
 
