Covariance of Fourier conjugates for Gaussian distributions


Discussion Overview

The discussion revolves around the covariance of Fourier conjugates, specifically focusing on Gaussian distributions. Participants explore the implications of this covariance in the context of probability amplitude functions and Fourier transforms, while clarifying the definitions and relationships between the variables involved.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants question the meaning of the expected value of the Fourier conjugate ##k## in relation to the variable ##x##, suggesting that clarification is needed regarding their definitions.
  • Others assert that ##k## is not independent of ##x##, citing the relationship between their variances as evidence against independence.
  • One participant emphasizes that the Fourier transform of a Gaussian distribution remains a Gaussian distribution, arguing for the interpretation of ##\hat{f}(k)## as a probability distribution.
  • Another participant challenges the interpretation of ##\hat{f}(k)## as a probability distribution, stating that it does not necessarily follow from the properties of arbitrary probability distributions.
  • Some participants reference the mathematical uncertainty principle, discussing its implications for the variances of Fourier conjugates and the need to interpret both functions as probability distributions for covariance calculations.
  • There is a contention regarding the interpretation of ##f(x)## as a probability amplitude versus a probability distribution, with differing views on the implications of this distinction for expected values.

Areas of Agreement / Disagreement

Participants express differing views on the interpretation of the covariance of Fourier conjugates, with no consensus reached on the definitions or implications of the expected values involved. The discussion remains unresolved regarding the relationship between the expected values of ##x## and ##k##.

Contextual Notes

There are unresolved assumptions regarding the definitions of probability amplitude functions versus probability distributions, as well as the implications of these definitions on the calculation of covariance. The mathematical steps leading to the relationship between variances are also not fully explored.

redtree
Given two variables ##x## and ##k##, the covariance between the variables is as follows, where ##E## denotes the expected value:
\begin{equation}
\begin{split}
COV(x,k)&= E[x k]-E[x]E[k]
\end{split}
\end{equation}

If ##x## and ##k## are Fourier conjugates and ##f(x)## and ##\hat{f}(k)## are Gaussian distributions, how does that affect the covariance?

This is not a homework problem. I am just trying to understand the covariance of Fourier conjugates, particularly for Gaussians.
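For intuition, the defining formula ##COV(x,k) = E[xk] - E[x]E[k]## can be checked numerically on ordinary joint samples (a sketch; the bivariate-normal parameters and sample size are illustrative choices, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
# draw correlated samples (x, k) from an illustrative bivariate normal
cov_true = [[1.0, 0.3], [0.3, 0.5]]
x, k = rng.multivariate_normal([0, 0], cov_true, size=100_000).T

# COV(x, k) = E[xk] - E[x]E[k]
cov_formula = (x * k).mean() - x.mean() * k.mean()
cov_numpy = np.cov(x, k, bias=True)[0, 1]
print(cov_formula, cov_numpy)  # both ~ 0.3
```

The two estimates agree to floating-point precision, since the biased sample covariance is algebraically the same quantity.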
 
Okay, nobody answered yet, so I'll try to give an answer.

In ordinary probability theory, I would say that what you're asking for doesn't have an answer without clarification.

You have a variable ##x## that has a certain probability distribution, ##f(x)##. For example, suppose ##x## is a person's height in centimeters. What does ##k## mean in that case? You can certainly ask what the expected height is, but what does it mean to be the expected value of the Fourier conjugate of ##x##?

In quantum mechanics, you can make sense of these things in terms of viewing ##x## and ##k## as operators, rather than simple variables. Is that what you mean?
 
I apologize if I was unclear. I am concerned with probability, specifically probability amplitude functions and Fourier transforms. This is not a question about quantum mechanics.

I will restate the question more explicitly. Given the normalized Gaussian probability amplitude in ##x##-space, where ##E(x^2)## denotes the variance:
\begin{equation}
\begin{split}
f(x) &= \left(2 \pi E(x^2) \right)^{-\frac{1}{4}} e^{-\frac{x^2}{4 E(x^2)}}
\end{split}
\end{equation}

Thus:
\begin{equation}
\begin{split}
\mathscr{F}\left[f(x) \right]&= \hat{f}(k)
\end{split}
\end{equation}

The equation for ##\hat{f}(k)## is the normalized probability amplitude in ##k##-space, where ##E(k^2)## denotes the variance, as follows:
\begin{equation}
\begin{split}
\hat{f}(k) &= \left(2 \pi E(k^2) \right)^{-\frac{1}{4}} e^{-\frac{k^2}{4 E(k^2)}}
\end{split}
\end{equation}

Clearly ##x## and ##k## are related as Fourier conjugates. For example, it is easy to derive the relationship between ##E(x^2)## and ##E(k^2)##. Given:
\begin{equation}
\begin{split}
\hat{f}(k) &\doteq \int_{-\infty}^{\infty} e^{2 \pi i k x} f(x) dx
\end{split}
\end{equation}

Thus ##E(x^2)## and ##E(k^2)## are related as follows (the relationship is an equality given the definition of ##f(x)## and ##\hat{f}(k)## given above):
\begin{equation}
\begin{split}
E(k^2) E(x^2) &= \frac{1}{16 \pi^2}
\end{split}
\end{equation}
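With the ##e^{2 \pi i k x}## kernel above, this product can be checked by brute-force quadrature (a sketch; the grid sizes and the choice ##E(x^2) = 1## are assumptions for illustration):

```python
import numpy as np

sx2 = 1.0                                   # E(x^2), chosen for illustration
x = np.linspace(-8, 8, 2001)
dx = x[1] - x[0]
f = (2 * np.pi * sx2) ** -0.25 * np.exp(-x**2 / (4 * sx2))  # amplitude f(x)

k = np.linspace(-0.5, 0.5, 801)
dk = k[1] - k[0]
# \hat{f}(k) = \int e^{2 pi i k x} f(x) dx, approximated by a Riemann sum
fk = (np.exp(2j * np.pi * np.outer(k, x)) * f).sum(axis=1) * dx

Ex2 = (x**2 * f**2).sum() * dx              # variance under f(x)^2
Ek2 = (k**2 * np.abs(fk) ** 2).sum() * dk   # variance under |f_hat(k)|^2
print(Ex2 * Ek2, 1 / (16 * np.pi**2))       # both ~ 0.00633
```

The product comes out at ##\frac{1}{16 \pi^2}## to within quadrature error, as the equality states.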

However, I am having trouble finding the covariance of ##x## and ##k## where covariance is defined as follows:
\begin{equation}
\begin{split}
COV(x,k)&= E[x k]-E[x]E[k]
\end{split}
\end{equation}

Obviously, ##x## and ##k## are related, but how does that manifest in their covariance?
 
##x## is a random variable, but what is ##k##?
 
##k## is the Fourier conjugate of ##x##.

My understanding is that ##k## is not completely independent of ##x##. Were ##k## and ##x## completely independent, i.e., ##Cov(x,k) = 0##, then why would ##E(x^2) E(k^2) = \frac{1}{16 \pi^2}##?
 
If ##f(x)## is an arbitrary probability distribution, then ##\hat{f}(k)## will not necessarily be real. For example, if ##f(x) = \frac{1}{A}## for ##0 \leq x \leq A## and 0 everywhere else, then ##\hat{f}(k) = \dfrac{e^{ikA} - 1}{ikA}##. So in general, taking the Fourier transform of a probability distribution for ##x## does not give a probability distribution for ##k##. So in general, I don't understand what it means to compute an expectation value for ##k## based on a probability distribution for ##x##. Now, it happens that in your case ##\hat{f}(k)## is real, but why should it be interpreted as a probability distribution for ##k##?
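The non-real transform in this counterexample is easy to exhibit numerically (a sketch using the same ##e^{ikx}## kernel; the value ##A = 2## and the ##k## grid are arbitrary illustrative choices):

```python
import numpy as np

A = 2.0                        # illustrative interval length
k = np.linspace(0.1, 10, 500)  # avoid k = 0, where the formula is 0/0 (limit 1)
fk = (np.exp(1j * k * A) - 1) / (1j * k * A)

# the imaginary part is clearly nonzero: fk cannot be a p.d.f. for k
print(np.abs(fk.imag).max())
```

Since ##\mathrm{Im}\,\hat{f}(k) = \frac{1 - \cos(kA)}{kA}##, the imaginary part peaks well above zero, confirming the point.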
 
##f(x)## is not an arbitrary probability distribution. I have defined ##f(x)## as the probability amplitude of the normalized Gaussian. Furthermore, the Fourier transform of a Gaussian probability distribution is also a Gaussian probability distribution (and the Fourier transform of the probability amplitude of the normalized Gaussian is also the probability amplitude of a normalized Gaussian). See for example: http://mathworld.wolfram.com/FourierTransformGaussian.html
 
redtree said:
##f(x)## is not an arbitrary probability distribution. I have defined ##f(x)## as the probability amplitude of the normalized Gaussian. Furthermore, the Fourier transform of a Gaussian probability distribution is also a Gaussian probability distribution (and the Fourier transform of the probability amplitude of the normalized Gaussian is also the probability amplitude of a normalized Gaussian). See for example: http://mathworld.wolfram.com/FourierTransformGaussian.html

Yes, ##\hat{f}(k)## happens to be a Gaussian, but why do you interpret it as a probability distribution for ##k##?

And ##f(x)## is not a probability amplitude, it is a probability distribution.
 
Using ##\hat f(k)## as a distribution function and defining a random variable with this distribution says nothing about the relationship between this variable and the one with distribution ##f(x)##.
 
  • #10
stevendaryl said:
Yes, ##\hat{f}(k)## happens to be a Gaussian, but why do you interpret it as a probability distribution for ##k##?

And ##f(x)## is not a probability amplitude, it is a probability distribution.

The mathematical uncertainty principle is a description of the relationship between the variances of Fourier conjugates with normalized Gaussian distributions. In order to calculate both variances, one must interpret both ##f(x)## and ##\hat{f}(k)## as probability functions. See for example: http://www.ams.org/publicoutreach/feature-column/fcarc-uncertainty

In order for normalization to hold in both position and wavenumber space, one must transform the square root of the normalized Gaussian probability density function, which I call the probability amplitude function. Happy to demonstrate if necessary.

The relationship between the variances of the two related functions is well-known, i.e., the mathematical uncertainty principle. I am merely trying to understand the relationship between their expected values as described by their covariance.
 
  • #11
mathman said:
Using ##\hat f(k)## as a distribution function and defining a random variable with this distribution says nothing about the relationship between this variable and the one with distribution ##f(x)##.

The mathematical uncertainty principle says a lot about the relationship between Fourier conjugates, specifically the relationship between their variances. Again, I am merely trying to see what has been discovered regarding the relationship between their expected values as represented by the covariance.
 
  • #12
redtree said:
The mathematical uncertainty principle is a description of the relationship between the variances of Fourier conjugates with normalized Gaussian distributions. In order to calculate both variances, one must interpret both ##f(x)## and ##\hat{f}(k)## as probability functions. See for example: http://www.ams.org/publicoutreach/feature-column/fcarc-uncertainty

It is certainly true that if ##f(x) = A e^{-\lambda x^2}##, then its Fourier transform ##\hat{f}(k)## has the same form:

##\hat{f}(k) = A' e^{-\lambda' k^2}##

where ##A' = A/\sqrt{2\lambda}## and ##\lambda' = 1/(4 \lambda)##

That's a fact about functions and their Fourier transforms. But that has nothing to do with interpreting ##f(x)## and ##\hat{f}(k)## as probability distributions. There is no meaning to the expectation value you are asking about: ##E(xk)##, as far as I know. That quantity makes no sense for probability distributions. It does make sense for probability amplitudes in the sense of quantum mechanics. If ##\psi(x)## is a probability amplitude, so that ##|\psi(x)|^2## is the corresponding probability distribution, then we can Fourier-transform ##\psi(x)## to get an amplitude for ##k##: ##\hat{\psi}(k) = \frac{1}{\sqrt{2\pi}} \int \psi(x) e^{-ikx} dx##. Then we can interpret the operator ##\hat{k}## via:

##\hat{k} \psi(x) = -i \frac{d \psi}{dx}##

Then the expectation value ##\langle x \hat{k} \rangle = \int \psi^*(x) x \hat{k} \psi(x) dx##
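For a Gaussian amplitude ##\psi(x) = (2\pi)^{-1/4} e^{-x^2/4}## (unit variance, an assumption for illustration), this operator expectation works out analytically to the purely imaginary value ##i/2##, independent of the width, which is one way to see that ##x\hat{k}## is not an ordinary real-valued statistic. A numerical sketch of the check:

```python
import numpy as np

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
psi = (2 * np.pi) ** -0.25 * np.exp(-x**2 / 4)  # unit-variance amplitude

dpsi = np.gradient(psi, dx)                     # d(psi)/dx, central differences
# <x k_hat> = \int psi(x) * x * (-i d psi/dx) dx for real psi
val = (psi * x * (-1j) * dpsi).sum() * dx
print(val)  # ~ 0.5j: purely imaginary
```

Here ##\psi## is real, so the integrand is real times ##-i## and the expectation has no real part at all.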
 
  • #13
I understand what you are saying, and I don't want to make this a debate about definitions. I note that the American Mathematical Society considers the Fourier transform of the square root of the normalized Gaussian also to be the square root of a p.d.f., i.e., both ##f(x)^2## and ##\hat{f}(k)^2## are p.d.f.'s.
See: http://www.ams.org/publicoutreach/feature-column/fcarc-uncertainty

Specifically:
And More Generally ...
This property does not generalize to arbitrary probability distribution functions (p.d.f.'s) because ##\mathscr{F}f## may not be a p.d.f. at all (it was not for our ##f_h## examples) and its variance may not be defined. On the other hand, the Fourier transform preserves the integral of the square of a function (this is the Plancherel Theorem). So if ##f^2## is a p.d.f., then ##(\mathscr{F}f)^2##, which is automatically non-negative, will also have total integral equal to 1: it is also a p.d.f.

Perhaps, we can just agree to disagree on this point.

What I really want to understand is the covariance relationship between ##x## and ##k##. One can easily derive the following:

\begin{equation}
\begin{split}
E(x^2) E(k^2) &\geq E(x k)^2
\end{split}
\end{equation}

I am trying to figure out how ##E(x^2) E(k^2)## relates to ##E(x)^2 E(k)^2## and for that I need to determine the covariance of ##x## and ##k##.

Of course, if the covariance is 0, then ##E(x k) = E(x) E(k)##, which is fine. However, considering that ##E(x^2)## and ##E(k^2)## are closely related via uncertainty, I am not inclined to assume ##Cov(x,k) = 0##.
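mathman's earlier point can be illustrated directly: nothing stops one from drawing ##x## and ##k## independently from the two Gaussians ##f(x)^2## and ##\hat{f}(k)^2##, and then the covariance is zero by construction even though the variances still satisfy ##E(x^2) E(k^2) = \frac{1}{16 \pi^2}##. A sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
sx2 = 1.0
sk2 = 1 / (16 * np.pi**2 * sx2)  # variances tied by the uncertainty relation

n = 200_000
x = rng.normal(0.0, np.sqrt(sx2), n)  # x ~ f(x)^2
k = rng.normal(0.0, np.sqrt(sk2), n)  # k ~ f_hat(k)^2, drawn independently

print((x**2).mean() * (k**2).mean())         # ~ 1/(16 pi^2)
print((x * k).mean() - x.mean() * k.mean())  # ~ 0: covariance vanishes
```

So the variance constraint alone does not force a nonzero covariance; it constrains the marginals, not the joint distribution.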
 
