Fourier Transform of a Probability Distribution

Hi all!

Yesterday I was thinking about the central limit theorem, and in doing so, I reached a conclusion that I found surprising. It could just be that my arguments are wrong, but this was my process:

1. First, define a continuous random variable ##X## with some probability density function (pdf) ##f(x)##.

2. Define a new random variable ##Y = X_1 + X_2 + X_3 + \dots## (##Y## is the sum of many independent samples of ##X##).

3. To visualise the distribution of ##Y## as each term is added, I imagine starting with the pdf of ##X## and convolving it with itself. Then I take the result and convolve it again with the pdf of ##X##, and so on. According to the central limit theorem, each additional convolution should make the result look more like the normal distribution, and this is indeed the case (I have visualised it myself with graphical convolution animations; a small numerical sketch appears just after this list).

4. It was at this point that I realised that the Fourier transform of a Gaussian is also a Gaussian.

5. I also recalled that convolution in the original domain (i.e. the operations we performed in 3) is equivalent to multiplication in the Fourier domain.

6. Points 4 and 5 together imply the following: the pdf of ##Y## is an ##n##-fold convolution of the pdf of ##X##, so its Fourier transform is ##\left(\mathscr{F}[f]\right)^n##; and since the pdf of ##Y## approaches a Gaussian, whose Fourier transform is again a Gaussian, the Fourier transform of the pdf of ##X##, raised to a large power, should approximate a Gaussian.
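Here is that sketch: a minimal numerical check of step 3, not a proof. The tilted pdf, the grid spacing, and the number of convolutions are all arbitrary choices for illustration.

[CODE=python]
import numpy as np

# A tilted pdf on [0, 1] -- an arbitrary, visibly non-Gaussian choice.
dx = 0.01
x = np.arange(0.0, 1.0, dx)
f = 0.5 + x                        # integrates to 1 on [0, 1]

g = f.copy()
for _ in range(29):                # pdf of X1 + ... + X30
    g = np.convolve(g, f) * dx     # discrete approximation of convolution
    g /= np.sum(g) * dx            # re-normalize to limit numerical drift

# A Gaussian has zero skewness, so the skewness of g is a crude
# "how Gaussian is it?" measure.
xs = np.arange(len(g)) * dx
m = np.sum(xs * g) * dx
var = np.sum((xs - m) ** 2 * g) * dx
skew = np.sum((xs - m) ** 3 * g) * dx / var ** 1.5
print(f"skewness after 30-fold convolution: {skew:.4f}")  # small, shrinking with n
[/CODE]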

What I find surprising about this conclusion is that it is possible to create almost any pdf of your choosing, even a randomly drawn squiggle that is positive and integrates to 1, and ##\left(\mathscr{F}[f]\right)^n## (where ##n## is large) will always be approximately Gaussian. Is this correct? I am trying to find an intuitive explanation for it. The sketch below is my attempt to check the squiggle case numerically.
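Again just a sketch, not a proof: the smoothing window, the grid, and ##n = 50## are arbitrary choices, and I make the squiggle symmetric about 0 so that it has zero mean and a real Fourier transform.

[CODE=python]
import numpy as np

rng = np.random.default_rng(0)

# Build a random, positive, zero-mean (because symmetric) pdf on [-1, 1].
x = np.linspace(-1.0, 1.0, 2001)
dx = x[1] - x[0]
half = np.convolve(rng.random(1001), np.ones(50) / 50, mode="same")
f = np.concatenate([half[::-1], half[1:]]) + 0.05   # even and positive
f /= np.sum(f) * dx                                  # integrates to 1

# Fourier transform of an even real pdf: phi(w) = integral of f(x) cos(wx) dx.
w = np.linspace(-30.0, 30.0, 601)
phi = np.cos(np.outer(w, x)) @ f * dx

n = 50                                  # even, so phi**n is non-negative
var = np.sum(f * x**2) * dx             # variance of the squiggle
gauss = np.exp(-n * var * w**2 / 2)     # the conjectured Gaussian shape
print(np.max(np.abs(phi**n - gauss)))   # small if the hypothesis holds
[/CODE]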

Thanks for the help!

Nat
 

tnich

Homework Helper
The Fourier transform of a Gaussian distribution is the characteristic function ##\exp(i \mu t - \frac {\sigma^2 t^2}2)##, which resembles a Gaussian distribution, but differs from it in a couple of significant ways. First, it has an imaginary component, so it lies in the complex plane. Second, it is not properly normalized, so it does not integrate to 1. You could avoid the first problem by using the Laplace transform (resulting in the moment generating function ##\exp(\mu t + \frac {\sigma^2 t^2}2)##), but in this case the coefficient of ##t^2## is positive, not negative. Perhaps if you can resolve these difficulties, you can check whether your intuition is correct.
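As a quick numerical check of that characteristic function (a sketch; the values of ##\mu##, ##\sigma##, and ##t## are arbitrary):

[CODE=python]
import numpy as np

mu, sigma = 1.5, 0.8
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 20001)
dx = x[1] - x[0]
pdf = np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

t = 2.0
numeric = np.sum(pdf * np.exp(1j * t * x)) * dx        # E[exp(itX)] by quadrature
closed = np.exp(1j * mu * t - sigma**2 * t**2 / 2)     # the formula above
print(abs(numeric - closed))                            # ~0: agrees to precision
[/CODE]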
 
@tnich Thanks for the reply! :)

I have thought about these two points, but I would still say that my conclusion is surprising. Let me address your points:

First, it has an imaginary component, so it lies in the complex plane
Yes, this would be the case for a Gaussian with non-zero mean, or for any function that is shifted (a shift corresponds to a phase term in the Fourier domain, as your equation suggests). I should have stated, for simplicity, that the ##X## distribution has to be zero mean; this is what I had in my head, but I failed to communicate it. In the zero-mean case, the functions in both domains should be real-valued Gaussians centred at 0.

Second, it is not properly normalized, so it does not integrate to 1
For the zero-mean case, it doesn't matter so much that the function doesn't integrate to 1, as long as it approximates a Gaussian shape.


So it still seems to me that it is possible to draw any zero-mean pdf, and ##\left(\mathscr{F}[\text{pdf}]\right)^{n}## will still approximate a real-valued, zero-mean Gaussian.
 

tnich

Homework Helper
OK. Let's try a uniform distribution with pdf
$$f(x) =
\begin{cases} 0 & \text{if } x < 0
\\ 1 & \text{if } 0 \leq x \leq 1
\\ 0 & \text{if } x > 1 \end{cases}$$
Then ##f^n(x) = f(x)## for every ##n## (here I am raising the pdf itself to the ##n##th power).

Another case is an exponential distribution with pdf

$$g(x) =
\begin{cases} 0 & \text{if } x < 0
\\ {\exp(-x)} & \text{if } x\geq 0
\end{cases}$$

Then $$g^n(x) =
\begin{cases} 0 & \text{if } x < 0
\\ {\exp(-nx)} & \text{if } x\geq 0
\end{cases}$$

Neither of these functions approaches a Gaussian form when ##n## is large.
 
Neither of these functions approaches a Gaussian form when ##n## is large.
Perhaps I didn't state my hypothesis as well as I should have. It's the Fourier transform of a zero-mean pdf, raised to a large power ##n##, that will approximate a Gaussian. That is:

##\left( \mathscr{F}\left[f(x)\right]\right)^{n} \approx c_{0}e^{-c_{1}\omega^{2}} \quad \text{for large } n##

We are convolving the pdfs in the original domain to find the distribution of the summed variable; this is the equivalent operation to multiplying the Fourier transforms of the pdf in the frequency domain.
(A modified version of my hypothesis works with non-zero-mean pdfs as well, but in that case we look at the real power spectrum instead of the complex FT; we'll ignore this for now for simplicity and consider only zero-mean pdfs.)

In the case of a zero-mean uniform distribution, the Fourier transform is a real-valued sinc function. When the sinc function is raised to a large power ##n##, the result is almost exactly Gaussian (I didn't prove this analytically, but I plotted both; a numerical version of the comparison follows below).
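For reference, here is a minimal version of that comparison as a sketch (a unit-width uniform centred at 0; the grid and the value of ##n## are arbitrary choices):

[CODE=python]
import numpy as np

# Characteristic function of the uniform pdf on [-1/2, 1/2]:
# phi(t) = sin(t/2) / (t/2); np.sinc(u) = sin(pi*u)/(pi*u), hence u = t/(2*pi).
t = np.linspace(-20.0, 20.0, 4001)
phi = np.sinc(t / (2 * np.pi))

n = 40                                  # even, so phi**n >= 0
gauss = np.exp(-n * t**2 / 24)          # variance of this uniform is 1/12
print(np.max(np.abs(phi**n - gauss)))   # small: the two curves nearly overlap
[/CODE]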
 

tnich

Homework Helper
Perhaps I didn't state my hypothesis as well as I should have. It's the Fourier transform of a zero-mean pdf, raised to a large power ##n##, that will approximate a Gaussian. That is:

##\left( \mathscr{F}\left[f(x)\right]\right)^{n} \approx c_{0}e^{-c_{1}\omega^{2}} \quad \text{for large } n##

We are convolving the pdfs in the original domain to find the distribution of the summed variable; this is the equivalent operation to multiplying the Fourier transforms of the pdf in the frequency domain.
(A modified version of my hypothesis works with non-zero-mean pdfs as well, but in that case we look at the real power spectrum instead of the complex FT; we'll ignore this for now for simplicity and consider only zero-mean pdfs.)

In the case of a zero-mean uniform distribution, the Fourier transform is a real-valued sinc function. When the sinc function is raised to a large power ##n##, the result is almost exactly Gaussian (I didn't prove this analytically, but I plotted both).
Yes, I realized after I sent my reply that you meant the ##n##th power of the Fourier transform of the pdf. The ##\text{sinc}(s)## function, though, has zeros at regular intervals. Wouldn't ##\text{sinc}^n(s)## still have zeros at the same points?
 
Yes, I realized after I sent my reply that you meant the ##n##th power of the Fourier transform of the pdf. The ##\text{sinc}(s)## function, though, has zeros at regular intervals. Wouldn't ##\text{sinc}^n(s)## still have zeros at the same points?
Yes, this is certainly the case. However, the limiting behaviour takes care of it: as ##n## gets large, the Gaussian approximation gets "squashed" towards the centre, so the first zero crossing occurs many standard deviations from the centre, where the exact Gaussian is almost 0 anyway (and the error at all the crossings approaches zero as ##n## goes to infinity).

NOTE: In the attached plots, the approximation overlays the actual Gaussian and appears as a single black curve, when in fact there are two black curves.
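A quick numeric check of how far out that first zero sits, for the same unit-width uniform as before (the value of ##n## is again an arbitrary choice):

[CODE=python]
import numpy as np

n = 40
t0 = 2 * np.pi                     # first zero of sin(t/2) / (t/2)
sigma_t = np.sqrt(12 / n)          # width of the approximation exp(-n t^2 / 24)
print(t0 / sigma_t)                # ~11.5 standard deviations from the centre
print(np.exp(-n * t0**2 / 24))     # ~2.7e-29: the Gaussian is already ~0 there
[/CODE]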
 


tnich

Homework Helper
OK.
Now that I understand what you are trying to say, I agree with your conclusion. I think it is implied by the Central Limit Theorem.
 

tnich

Homework Helper
OK.
Now that I understand what you are trying to say, I agree with your conclusion. I think it is implied by the Central Limit Theorem.
So you can use the assumptions of the CLT to specify what kinds of functions raised to the ##n##th power converge to a Gaussian form. The CLT assumes that the area under the pdf curve ##f(t)## is 1, and that the mean and variance are both finite. So the Fourier transform ##F(s)## of such a function would have ##F(0)=1##, ##F'(0)## and ##F''(0)## both finite, and ##F''(0) < 0##. That means that the value of ##F(s)## at ##s = 0## is a local maximum.

The inverse Fourier transform is itself a Fourier transform, so the area under ##F(s)## must be finite because ##f(0)## is finite. This starts to look like a curve that could behave the way you want it to.
 

tnich

Homework Helper
So you can use the assumptions of the CLT to specify what kinds of functions raised to the ##n##th power converge to a Gaussian form. The CLT assumes that the area under the pdf curve ##f(t)## is 1, and that the mean and variance are both finite. So the Fourier transform ##F(s)## of such a function would have ##F(0)=1##, ##F'(0)## and ##F''(0)## both finite, and ##F''(0) < 0##. That means that the value of ##F(s)## at ##s = 0## is a local maximum.

The inverse Fourier transform is itself a Fourier transform, so the area under ##F(s)## must be finite because ##f(0)## is finite. This starts to look like a curve that could behave the way you want it to.
Forgot to say that since we are assuming the mean is zero, ##F'(0) = 0##.
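Putting these conditions together, a Taylor expansion of ##F(s)## around ##s = 0## makes the mechanism explicit. This is a sketch rather than a rigorous proof; it assumes zero mean and finite variance ##\sigma^2##, and uses ##F''(0) = -\sigma^2## (with the ##e^{isx}## transform convention):

$$F(s) = F(0) + F'(0)\,s + \frac{F''(0)}{2}\,s^2 + o(s^2) = 1 - \frac{\sigma^2 s^2}{2} + o(s^2)$$

so for small ##s##,

$$F(s)^n = \left(1 - \frac{\sigma^2 s^2}{2} + o(s^2)\right)^n \approx e^{-n\sigma^2 s^2/2},$$

which is exactly the Gaussian form ##c_0 e^{-c_1 s^2}## in the hypothesis.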
 
