I Covariance of Fourier conjugates for Gaussian distributions

Given two variables ##x## and ##k##, the covariance between the variables is as follows, where ##E## denotes the expected value:
\begin{equation}
\begin{split}
COV(x,k)&= E[x k]-E[x]E[k]
\end{split}
\end{equation}

If ##x## and ##k## are Fourier conjugates and ##f(x)## and ##\hat{f}(k)## are Gaussian distributions, how does that affect the covariance?

This is not a homework problem. I am just trying to understand the covariance of Fourier conjugates, particularly for Gaussians.
 

stevendaryl

Okay, nobody answered yet, so I'll try to give an answer.

In ordinary probability theory, I would say that what you're asking for doesn't have an answer without clarification.

You have a variable ##x## that has a certain probability distribution, ##f(x)##. For example, suppose ##x## is a person's height in centimeters. What does ##k## mean in that case? You can certainly ask what the expected height is, but what does it mean to be the expected value of the Fourier conjugate of ##x##?

In quantum mechanics, you can make sense of these things in terms of viewing ##x## and ##k## as operators, rather than simple variables. Is that what you mean?
 
I apologize if I was unclear. I am concerned with probability, specifically probability amplitude functions and Fourier transforms. This is not a question about quantum mechanics.

I will restate the question more explicitly. Given the normalized Gaussian probability amplitude in ##x##-space, where ##E(x^2)## denotes the variance of the density ##f(x)^2## (the mean is zero):
\begin{equation}
\begin{split}
f(x) &= \left(2 \pi E(x^2) \right)^{-\frac{1}{4}} e^{-\frac{x^2}{4 E(x^2)}}
\end{split}
\end{equation}

Thus:
\begin{equation}
\begin{split}
\mathscr{F}\left[f(x) \right]&= \hat{f}(k)
\end{split}
\end{equation}

The equation for ##\hat{f}(k)## is the normalized probability amplitude in ##k##-space, where ##E(k^2)## denotes the variance, as follows:
\begin{equation}
\begin{split}
\hat{f}(k) &= \left(2 \pi E(k^2) \right)^{-\frac{1}{4}} e^{-\frac{k^2}{4 E(k^2)}}
\end{split}
\end{equation}

Clearly ##x## and ##k## are related as Fourier conjugates. For example, it is easy to derive the relationship between ##E(x^2)## and ##E(k^2)##. Given:
\begin{equation}
\begin{split}
\hat{f}(k) &\doteq \int_{-\infty}^{\infty} e^{2 \pi i k x} f(x) dx
\end{split}
\end{equation}

Thus ##E(x^2)## and ##E(k^2)## are related as follows (for general functions this is the inequality ##E(k^2) E(x^2) \geq \frac{1}{16 \pi^2}##; it holds with equality for the Gaussian pair defined above):
\begin{equation}
\begin{split}
E(k^2) E(x^2) &= \frac{1}{16 \pi^2}
\end{split}
\end{equation}

However, I am having trouble finding the covariance of ##x## and ##k## where covariance is defined as follows:
\begin{equation}
\begin{split}
COV(x,k)&= E[x k]-E[x]E[k]
\end{split}
\end{equation}

Obviously, ##x## and ##k## are related, but how does that manifest in their covariance?
 

mathman

##x## is a random variable, but what is ##k##?
 
##k## is the Fourier conjugate of ##x##.

My understanding is that ##k## is not completely independent of ##x##. Were ##k## and ##x## completely independent, i.e., ##Cov(x,k) = 0##, then why would ##E(x^2) E(k^2) = \frac{1}{16 \pi^2}##?
 

stevendaryl

If ##f(x)## is an arbitrary probability distribution, then ##\hat{f}(k)## will not necessarily be real. For example, if ##f(x) = 1/A## for ##0 \leq x \leq A## and 0 everywhere else, then ##\hat{f}(k) = \dfrac{e^{ikA} - 1}{ikA}##. So in general, taking the Fourier transform of a probability distribution for ##x## does not give a probability distribution for ##k##, and I don't understand what it means to compute an expectation value for ##k## based on a probability distribution for ##x##. Now, it happens that in your case ##\hat{f}(k)## is real, but why should it be interpreted as a probability distribution for ##k##?
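
To illustrate this numerically (a sketch I am adding, using the normalized uniform density and the ##e^{ikx}## transform convention of the example):

```python
import numpy as np

# Sketch (added example): the Fourier transform of the normalized uniform
# density on [0, A] is complex-valued, so it cannot be read as a density for k.
A = 2.0
x = np.linspace(0.0, A, 20001)
dx = x[1] - x[0]
f = np.full_like(x, 1.0 / A)                   # normalized uniform density

k = 1.3                                        # any nonzero wavenumber
numeric = (np.exp(1j * k * x) * f).sum() * dx  # \int e^{ikx} f(x) dx
closed_form = (np.exp(1j * k * A) - 1) / (1j * k * A)

print(numeric, closed_form)                    # agree; imaginary part is nonzero
```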
 
##f(x)## is not an arbitrary probability distribution. I have defined ##f(x)## as the probability amplitude of the normalized Gaussian. Furthermore, the Fourier transform of a Gaussian probability distribution is also a Gaussian probability distribution (and the Fourier transform of the probability amplitude of the normalized Gaussian is also the probability amplitude of a normalized Gaussian). See for example: http://mathworld.wolfram.com/FourierTransformGaussian.html
 

stevendaryl

##f(x)## is not an arbitrary probability distribution. I have defined ##f(x)## as the probability amplitude of the normalized Gaussian. Furthermore, the Fourier transform of a Gaussian probability distribution is also a Gaussian probability distribution (and the Fourier transform of the probability amplitude of the normalized Gaussian is also the probability amplitude of a normalized Gaussian). See for example: http://mathworld.wolfram.com/FourierTransformGaussian.html
Yes, ##\hat{f}(k)## happens to be a Gaussian, but why do you interpret it as a probability distribution for ##k##?

And ##f(x)## is not a probability amplitude, it is a probability distribution.
 

mathman

Using ##\hat f(k)## as a distribution function and defining a random variable with this distribution says nothing about the relationship between this variable and the one with distribution ##f(x)##.
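
To make this concrete, here is a numerical sketch (my own construction): sample ##x## and ##k## as independent zero-mean Gaussians whose variances are chosen to satisfy ##E(x^2) E(k^2) = \frac{1}{16 \pi^2}##. The variance constraint holds by construction, yet the sample covariance is essentially zero, because a relation between variances says nothing about the joint distribution.

```python
import numpy as np

# Sketch (my construction): x and k sampled as INDEPENDENT zero-mean Gaussians
# whose variances are forced to satisfy E(x^2) * E(k^2) = 1/(16 pi^2).
rng = np.random.default_rng(0)
var_x = 0.7
var_k = 1 / (16 * np.pi**2 * var_x)            # enforce the product constraint

n = 1_000_000
xs = rng.normal(0.0, np.sqrt(var_x), n)
ks = rng.normal(0.0, np.sqrt(var_k), n)

# Sample covariance: E[xk] - E[x]E[k], which is ~0 despite the constraint
cov = np.mean(xs * ks) - xs.mean() * ks.mean()
print(var_x * ks.var(), 1 / (16 * np.pi**2), cov)
```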
 
Yes, ##\hat{f}(k)## happens to be a Gaussian, but why do you interpret it as a probability distribution for ##k##?

And ##f(x)## is not a probability amplitude, it is a probability distribution.
The mathematical uncertainty principle is a description of the relationship between the variances of Fourier conjugates with normalized Gaussian distributions. In order to calculate both variances, one must interpret both ##f(x)## and ##\hat{f}(k)## as probability functions. See for example: http://www.ams.org/publicoutreach/feature-column/fcarc-uncertainty

In order for normalization to hold in both position and wavenumber space, one must transform the square root of the normalized Gaussian probability density function, which I call the probability amplitude function. Happy to demonstrate if necessary.

The relationship between the variances of the two related functions is well-known, i.e., the mathematical uncertainty principle. I am merely trying to understand the relationship between their expected values as described by their covariance.
 
Using ##\hat f(k)## as a distribution function and defining a random variable with this distribution says nothing about the relationship between this variable and the one with distribution ##f(x)##.
The mathematical uncertainty principle says a lot about the relationship between Fourier conjugates, specifically the relationship between their variances. Again, I am merely trying to see what has been discovered regarding the relationship between their expected values as represented by the covariance.
 

stevendaryl

The mathematical uncertainty principle is a description of the relationship between the variances of Fourier conjugates with normalized Gaussian distributions. In order to calculate both variances, one must interpret both ##f(x)## and ##\hat{f}(k)## as probability functions. See for example: http://www.ams.org/publicoutreach/feature-column/fcarc-uncertainty
It is certainly true that if ##f(x) = A e^{-\lambda x^2}##, then its Fourier transform ##\hat{f}(k)## has the same form:

##\hat{f}(k) = A' e^{-\lambda' k^2}##

where ##A' = A/\sqrt{2\lambda}## and ##\lambda' = 1/(4 \lambda)##
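
As a quick numerical spot-check of this pair (a sketch I am adding, not from the post), using the unitary convention ##\hat{f}(k) = \frac{1}{\sqrt{2\pi}} \int e^{-ikx} f(x) dx## that appears below:

```python
import numpy as np

# Spot-check (sketch): A e^{-lam x^2} transforms to (A/sqrt(2 lam)) e^{-k^2/(4 lam)}
# under f_hat(k) = (1/sqrt(2 pi)) \int e^{-ikx} f(x) dx.
A, lam = 1.5, 0.8
x = np.linspace(-30.0, 30.0, 60001)
dx = x[1] - x[0]
f = A * np.exp(-lam * x**2)

for k in (0.0, 0.7, 2.1):
    numeric = (np.exp(-1j * k * x) * f).sum() * dx / np.sqrt(2 * np.pi)
    predicted = A / np.sqrt(2 * lam) * np.exp(-k**2 / (4 * lam))
    print(k, numeric.real, predicted)          # real parts match at each k
```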

That's a fact about functions and their Fourier transforms. But that has nothing to do with interpreting ##f(x)## and ##\hat{f}(k)## as probability distributions. There is no meaning to the expectation value you are asking about, ##E(xk)##, as far as I know. That quantity makes no sense for probability distributions.

It does make sense for probability amplitudes in the sense of quantum mechanics. If ##\psi(x)## is a probability amplitude, so that ##|\psi(x)|^2## is the corresponding probability distribution, then we can Fourier-transform ##\psi(x)## to get an amplitude for ##k##: ##\hat{\psi}(k) = \frac{1}{\sqrt{2\pi}} \int \psi(x) e^{-ikx} dx##. Then we can interpret the operator ##\hat{k}## via:

##\hat{k} \psi(x) = -i \frac{d \psi}{dx}##

Then the expectation value ##\langle x \hat{k} \rangle = \int \psi^*(x) x \hat{k} \psi(x) dx##
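
For the Gaussian amplitude discussed in this thread, that operator expectation value can be evaluated numerically. The sketch below (my own construction, not from the post) computes ##\int \psi(x)\, x\, (-i\, d\psi/dx)\, dx## on a grid; for a real Gaussian the result is purely imaginary, ##i/2##, independent of the chosen variance:

```python
import numpy as np

# Sketch (my construction): <x k_hat> = \int psi(x) x (-i dpsi/dx) dx for the
# real Gaussian amplitude psi. The result is purely imaginary: i/2.
sigma2 = 1.0                                   # E(x^2) for the density psi^2
x = np.linspace(-20.0, 20.0, 40001)
dx = x[1] - x[0]
psi = (2 * np.pi * sigma2) ** -0.25 * np.exp(-x**2 / (4 * sigma2))

dpsi = np.gradient(psi, dx)                    # second-order finite difference
expectation = (psi * x * (-1j) * dpsi).sum() * dx

print(expectation)                             # ~ 0.5j, i.e. i/2
```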
 
I understand what you are saying, and I don't want to make this a debate about definitions. I note that the American Mathematical Society considers the Fourier transform of the square root of the normalized Gaussian also to be the square root of a p.d.f., i.e., both ##f(x)^2## and ##\hat{f}(k)^2## are p.d.f.'s.
See: http://www.ams.org/publicoutreach/feature-column/fcarc-uncertainty

Specifically:
And More Generally ...
This property does not generalize to arbitrary probability distribution functions (p.d.f.'s) because ##\mathscr{F}f## may not be a p.d.f. at all (it was not for our ##f_h## examples) and its variance may not be defined. On the other hand, the Fourier transform preserves the integral of the square of a function (this is the Plancherel Theorem). So if ##f^2## is a p.d.f., then ##(\mathscr{F}f)^2##, which is automatically non-negative, will also have total integral equal to 1: it is also a p.d.f.

Perhaps we can just agree to disagree on this point.

What I really want to understand is the covariance relationship between ##x## and ##k##. One can easily derive the following:

\begin{equation}
\begin{split}
E(x^2) E(k^2) &\geq E(x k)^2
\end{split}
\end{equation}

I am trying to figure out how ##E(x^2) E(k^2)## relates to ##E(x)^2 E(k)^2## and for that I need to determine the covariance of ##x## and ##k##.

Of course, if the covariance is 0, then ##E(x k) = E(x) E(k)##, which is fine. However, considering that ##E(x^2)## and ##E(k^2)## are closely related via uncertainty, I am not inclined to assume ##Cov(x,k) = 0##.
 
