
Fourier representation of a random function

  1. Dec 14, 2012 #1

    Jano L.

    Gold Member

    Consider continuous function [itex]x(t)[/itex], which has zero time average:

    [tex]
    \lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2} x(t)\,dt = 0
    [/tex]

    and exponentially decaying autocorrelation function:
    [tex]
    \lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2} x(t)x(t-\tau)\,dt = C_0e^{-\gamma |\tau|}.
    [/tex]

    Is it possible to write such function as the Fourier integral

    [tex]
    \int_{-\infty}^{\infty} x(\omega) e^{i\omega t} d\omega/2\pi
    [/tex]

    for some function/distribution [itex]x(\omega)[/itex]? Fourier analysis seems to be used a lot in the theory of random functions, but on the other hand, when I insert this integral representation into the time-averaging integral, I get an autocorrelation function equal to 0.
     
  3. Dec 14, 2012 #2

    Stephen Tashi

    Science Advisor

    If you are an engineer, someone in the engineering section of the forum might quickly answer your question in engineering terms. I approach the question as a mathematician and, from that point of view, your question mixes two different concepts of integration.

    There is a saying (discussed recently on the forum) that "Random variables are not random and they are not variables". Likewise, "A real valued random function of a real variable isn't a real valued function of a real variable and it isn't random."

    When you write the integral in the autocorrelation function, your notation makes sense as an integral of a deterministic function. Perhaps engineers have some notation where it also makes sense as the integral of a "random function". However, these two integrals do not have the same definition. Likewise, when you write the integral involved in the Fourier transform, you must be clear about what type of integral you mean.

    One way to think about random functions is to imagine a bag containing slips of paper with various formulas for functions on them. A specific "random function x(t)" is defined by a specific bag of formulas and a specific probability distribution that gives the probability of picking each function. (More information is needed to specify a "random function" than is needed to specify a single "function". When you pick a specific function from the bag, you have a realization of the random function and you might call this a "randomly chosen function". But the specific function you picked isn't the "random function" as a whole.)

    If x(t) is a random function, then what does [itex] \int_0^1 x(t) dt [/itex] mean? If you have picked a particular function from the bag, then this can be interpreted as the familiar integral from calculus. The value of the integral would be associated only with that particular realization of the random function; it would not be correct for all possible choices. If you want to define a value that applies to the "random function" as a whole, you could define an integral that is the expected value of [itex] \int_0^1 x(t) dt [/itex] taken over all possible realizations of [itex] x(t) [/itex] as particular functions.
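    As a toy illustration of that last point, here is a minimal sketch in Python (the particular "bag" of formulas, the probabilities, and the helper integral_of_realization are all invented for the example, they are not part of your question):

[code]
import random
import numpy as np

# A "bag" of slips of paper: each slip carries a deterministic formula,
# together with the probability of drawing that slip.
bag = [
    (0.5, lambda t: np.sin(2 * np.pi * t)),
    (0.3, lambda t: t ** 2),
    (0.2, lambda t: np.exp(-t)),
]

def integral_of_realization(f, n=2001):
    """Ordinary calculus integral of one picked formula over [0, 1] (Riemann sum)."""
    t = np.linspace(0.0, 1.0, n)
    return np.mean(f(t))  # the interval has length 1, so the mean approximates the integral

# Expected value of the integral, taken over many realizations of the bag.
probs = [p for p, _ in bag]
draws = random.choices(bag, weights=probs, k=5000)
estimate = np.mean([integral_of_realization(f) for _, f in draws])
print(estimate)  # approximates sum_i p_i * (integral of the i-th formula)
[/code]

    The number printed is a property of the random function (an expectation over the whole bag), not of any single slip of paper.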

    Perhaps you're mixing together the usual kind of integral with an integral that is defined by an average value. There are correct and incorrect ways of using these types of integration together. Show what you did.
     
  4. Dec 15, 2012 #3

    Jano L.

    Gold Member

    Stephen,

    thank you for your response. I understand there are intricacies in the meaning and properties of random functions, and later I would like to ask about this as well.

    However, now I would like to ask something a little bit different.

    I just want to study an ordinary, differentiable and integrable function x(t) fulfilling the requirements given by the above formulae for the average values.

    I am curious whether such a function exists, and if so, whether it can be represented by a Fourier integral.

    I am not sure I need to say it is a "random function" in any technical sense, because it may, as you point out, cause some problems.

    The difficulty with the Fourier integral is that when I perform the time averaging in the indicated way and switch the order of the two integrations, I get

    [tex]
    \frac{1}{T}\int_{-T/2}^{T/2} e^{i\omega t} dt = \frac{\sin(\omega T/2)}{\omega T/2},
    [/tex]

    which, if [itex]x(\omega)[/itex] is continuous at 0, acts in the limit [itex] T\rightarrow \infty[/itex] as the distribution [itex](2\pi / T)\, \delta(\omega)[/itex]. The autocorrelation function is then

    [tex]
    \lim_{T\rightarrow \infty} \frac{x(0)x(0)}{T} = 0
    [/tex]

    for any function [itex]x(\omega)[/itex], since x(0) does not depend on [itex]T[/itex]. So it seems that a function given by the Fourier integral of a continuous [itex]x(\omega)[/itex] cannot have a non-zero autocorrelation function. Is that right?
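    In more detail, the step I am using is that for a continuous, sufficiently decaying test function [itex]g(\omega)[/itex], the substitution [itex]u = \omega T/2[/itex] gives

    [tex]
    \int_{-\infty}^{\infty} g(\omega)\,\frac{\sin(\omega T/2)}{\omega T/2}\,d\omega
    = \frac{2}{T}\int_{-\infty}^{\infty} g\!\left(\frac{2u}{T}\right)\frac{\sin u}{u}\,du
    \approx \frac{2}{T}\,g(0)\int_{-\infty}^{\infty}\frac{\sin u}{u}\,du
    = \frac{2\pi}{T}\,g(0)
    [/tex]

    for large [itex]T[/itex], which is what I mean by saying the kernel acts like [itex](2\pi/T)\,\delta(\omega)[/itex].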
     
  5. Dec 15, 2012 #4

    Stephen Tashi

    Science Advisor

    OK, we consider only functions, not random functions. The number of [itex] x[/itex]'s in your notation confuses me. Perhaps my only contribution will be to restate the question:


    Consider a continuous function [itex]f(t)[/itex], which has zero time average:

    [tex]
    \lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2} f(t)\,dt = 0
    [/tex]

    and exponentially decaying autocorrelation function:
    [tex]
    R(\tau) = \lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2} f(t)f(t-\tau)\,dt = C_0e^{-\gamma |\tau|}.
    [/tex]

    Is it possible to write such function as the Fourier integral

    [tex]
    f(t) = \int_{-\infty}^{\infty} x(\omega) e^{i\omega t} d\omega/2\pi
    [/tex]

    for some function [itex]x(\omega)[/itex]?

    Assume it is. The autocorrelation function of [itex] f(t) [/itex] is

    [tex] R(\tau) = \lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2} \left[ \int_{-\infty}^\infty x(\omega) e^{i\omega t} d\omega/ 2\pi \right] \left[ \int_{-\infty}^{\infty} x(\alpha) e^{i\alpha (t-\tau)} d\alpha/2\pi \right] \ dt [/tex]

    (Is that how the question begins?)
     
  6. Dec 16, 2012 #5

    Jano L.

    Gold Member

    Yes, that is right. I used the same symbol for the function and for its Fourier transform, but your notation is clearer. Sorry for being sloppy.

    Now, when we switch the integrals and integrate the time-dependent function [itex]e^{i(\omega+\alpha)t}[/itex], it seems that we obtain [itex]R(\tau) = 0[/itex] everywhere, which is not equal to [itex]C_0 e^{-\gamma |\tau|}[/itex].

    Where exactly is the problem? In the assumption that the function f(t) exists? Or that the Fourier transform [itex]x(\omega)[/itex] exists?
     
  7. Dec 16, 2012 #6

    Mute

    Homework Helper

    Are you sure you can switch the order of the integrals (and limits)? If your ##x(t)## is not nice enough (e.g., is a true stochastic function), that step is not going to be valid.
     
  8. Dec 16, 2012 #7

    Jano L.

    Gold Member

    Yes, that could be a problem. But that is in fact the question: is there some problem with the standard Fourier transform of the function [itex]f(t)[/itex], given that f(t) has the properties I indicated in the first post?

    Initially, I do not necessarily want f(t) to be a "true stochastic function", because I am not sure I need that.
    All I require are the long-time properties of a single function: the average value and the autocorrelation function.

    Perhaps the answer is that in order to have such properties as I require, I need to consider "true stochastic functions". Is that right? And if so, why?
     
  9. Dec 16, 2012 #8

    marcusl

    Science Advisor
    Gold Member

    E. T. Jaynes explicitly warns that you can never exchange a limit with either summation or integration. He points out that it is one of the favored tricks people use to "prove" that 0=1, and other apparent paradoxes.

    If your signal is square integrable (that is, f(t) has finite energy), then an engineer would rephrase your problem as
    [tex] \int_{-\infty}^{\infty} f(t)\,dt = 0 ,[/tex]
    [tex]R(\tau) = \int_{-\infty }^{\infty } f(t)f(t-\tau)\,dt = C_0e^{-\gamma |\tau|} .[/tex]
    Stephen Tashi’s equation then reads
    [tex]R(\tau) = \int_{-\infty }^{\infty } \left[ \int_{-\infty}^\infty x(\omega) e^{i\omega t} d\omega/ 2\pi \right] \left[ \int_{-\infty}^{\infty} x(\alpha) e^{i\alpha (t-\tau)} d\alpha/2\pi \right] \ dt,[/tex]
    for which it is valid to exchange the order of integration. The key integral is now
    [tex] \int_{-\infty }^{\infty } e^{i(\omega+\alpha)t} dt = 2\pi\,\delta(\omega+\alpha)[/tex] leading to

    [tex]R(\tau) = C_0e^{-\gamma |\tau|} =
    \frac{1}{2\pi} \int_{-\infty }^{\infty } x(\omega) x(-\omega) e^{i\omega \tau} d\omega .[/tex]
    The Fourier transform of the exponential on the left is well known; I don't have my books handy, but I seem to recall that it's a Cauchy distribution?
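    (Checking directly:

    [tex]
    \int_{-\infty}^{\infty} C_0 e^{-\gamma|\tau|}\,e^{-i\omega\tau}\,d\tau
    = \frac{2C_0\gamma}{\gamma^2+\omega^2},
    [/tex]

    which is indeed a Lorentzian/Cauchy shape, so formally [itex]x(\omega)x(-\omega) = 2C_0\gamma/(\gamma^2+\omega^2)[/itex].)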

    If you had worked this problem in the complex domain, then for real f(t) you have [itex]x(-\omega) = x^*(\omega)[/itex], so you would have gotten [itex]x(\omega)x(-\omega) = |x(\omega)|^2[/itex], which is Cauchy. This is the well-known power spectral density of a spectroscopic absorption line; the real and imaginary parts of the corresponding complex lineshape are the absorption and dispersion parts of the Lorentzian. The functions that give rise to it are damped sinusoids multiplied by a Heaviside step.

    Disclaimer: I'm doing this from memory. Check books on nuclear magnetic resonance or optical spectroscopy for the real deal.
     
    Last edited: Dec 16, 2012
  10. Dec 16, 2012 #9

    Jano L.

    Gold Member

    I hope not; it is a very basic rule that makes the mathematics useful! Of course, the interchange of the order of integrations is not possible in every case. More probably he meant that the exchange should not be assumed to be justified in general. Can you please give the reference for that comment of Jaynes?

    Whether the exchange is justified or not depends on the function and on the definition of the integral. So in this case, can you think of some reason why the premises do not allow the exchange?

    Of course, such formulae work mathematically, but the problem is that they give neither the average value nor the correlation function. Such a rephrasing would be incompatible with the premises; I really need the above relations for the time average and the autocorrelation function.

    Consider a stationary fluctuating function f(t) which does not decay with time in either direction.

    It is possible to express such a function on the interval [itex]\langle -T/2,T/2\rangle[/itex] approximately via a Fourier series:

    [tex]
    f(t) = f_0 + \sum_{k=1}^{\infty} f_k \sin (k\omega_0 t - \varphi_k).
    [/tex]

    where

    [tex]
    f_k = \sqrt{\frac{8C_0}{T}\,\frac{\gamma\left(1 - e^{-\gamma T/2} \cos(\pi k)\right)}{(k\omega_0)^2+\gamma^2 }}
    [/tex]

    and the phases [itex]\varphi_k[/itex] are chosen randomly (here [itex]\omega_0 = 2\pi/T[/itex]). The only drawback is that we cannot fulfil both of the above conditions exactly: either we can have the correct average, or the correct autocorrelation function, but not both at the same time, because of the finite interval on which f(t) is expanded.
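    A quick numerical check of this construction, as a minimal sketch in Python (the values of [itex]C_0, \gamma, T[/itex], the truncation K and the grid are my own arbitrary choices for the test):

[code]
import numpy as np

# Arbitrary test parameters (assumptions for this sketch, not part of the problem)
C0, gamma, T = 1.0, 0.5, 400.0
w0 = 2 * np.pi / T            # fundamental frequency of the expansion
K = 2000                      # number of harmonics kept
rng = np.random.default_rng(0)

k = np.arange(1, K + 1)
fk = np.sqrt(8 * C0 / T * gamma * (1 - np.exp(-gamma * T / 2) * np.cos(np.pi * k))
             / ((k * w0) ** 2 + gamma ** 2))
phi = rng.uniform(0.0, 2 * np.pi, K)   # random phases; f_0 = 0 for zero mean

t, dt = np.linspace(-T / 2, T / 2, 8000, endpoint=False, retstep=True)
f = np.zeros_like(t)
for kk, amp, ph in zip(k, fk, phi):
    f += amp * np.sin(kk * w0 * t - ph)

# Time-averaged autocorrelation over one window; the truncated series is
# T-periodic, so the time shift is a circular roll of the samples.
def R(tau):
    return np.mean(f * np.roll(f, int(round(tau / dt))))

for tau in (0.0, 1.0, 2.0, 4.0):
    print(tau, R(tau), C0 * np.exp(-gamma * abs(tau)))
[/code]

    For these parameters the computed autocorrelation matches [itex]C_0 e^{-\gamma|\tau|}[/itex] to within a few percent, while the time average of f over the window comes out as zero.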

    As the limit [itex]T\rightarrow \infty[/itex] is taken, the approximation gets better, so I thought a Fourier integral would make things work exactly, but there is this problem: the autocorrelation function comes out wrong, for some reason.

    It may be an unjustified exchange of operations, or the function may not exist, or it may exist but not have a Fourier transform. I really do not know, but I am inclined to believe that the function f(t) does not have a Fourier integral representation.

    That's right; the Fourier transform of the autocorrelation function is very similar to the absorption spectrum of a damped oscillator, but with zero natural frequency.
     
  11. Dec 16, 2012 #10

    mfb

    2016 Award

    Staff: Mentor

    The function cannot be square integrable; otherwise the autocorrelation would vanish (simply because of the factor 1/T). I think this is the main problem, as you cannot use all those nice theorems about the Fourier transform here.
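    To spell that out: if [itex]\int_{-\infty}^{\infty} f(t)^2\,dt = E < \infty[/itex], then by the Cauchy-Schwarz inequality

    [tex]
    \left|\frac{1}{T}\int_{-T/2}^{T/2} f(t)f(t-\tau)\,dt\right|
    \le \frac{1}{T}\left(\int_{-T/2}^{T/2} f(t)^2\,dt\right)^{1/2}
    \left(\int_{-T/2}^{T/2} f(t-\tau)^2\,dt\right)^{1/2}
    \le \frac{E}{T} \rightarrow 0
    [/tex]

    for every [itex]\tau[/itex].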

    Is there an easy example of a function with that (or a similar) autocorrelation?
     
  12. Dec 16, 2012 #11

    marcusl

    Science Advisor
    Gold Member

    I will look up the Jaynes reference when I get back to work. Perhaps I have quoted it incorrectly.

    In the signal processing world, we don't normalize by 1/T; instead, the autocorrelation is defined as I wrote above. With your definition, the autocorrelation of any finite-length signal (which is always the kind we process!) must be zero for all lags, which is not very useful. I guess I won't be very helpful for your mathematical problem...
     
  13. Dec 17, 2012 #12

    Jano L.

    Gold Member

    Of course, in practical signal processing the signals are finite. But in other areas it is useful to work with stationary functions and their average values (consider the position of a Brownian particle) and to have the possibility of taking the limit [itex]T\rightarrow \infty[/itex]. The normalization by [itex]1/T[/itex] is there because we are often interested in a time average of some quantity.

    That's right, the function f(t) is not square-integrable, so the common proofs of the various manipulations are inapplicable. But recall objects such as [itex]\sin \Omega t[/itex] or [itex]\delta(t-t_0)[/itex]; these are not square-integrable either, yet the exchange of integrals and limits works for them in many cases. For example,

    [tex]
    \lim_{T\rightarrow\infty} \frac{1}{T} \int_{-T/2}^{T/2} \sin(\Omega t)\sin(\Omega (t-\tau))\,dt = \frac{1}{2} \cos \Omega \tau
    [/tex]
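    This holds because

    [tex]
    \sin(\Omega t)\sin(\Omega (t-\tau)) = \tfrac{1}{2}\cos(\Omega\tau) - \tfrac{1}{2}\cos(2\Omega t - \Omega\tau),
    [/tex]

    and the time average of the second, oscillating term vanishes as [itex]T\rightarrow\infty[/itex].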

    This makes me think that perhaps even for more complicated fluctuating function f(t), similar manipulations may work.
     
  14. Dec 17, 2012 #13
    bpet

    Looks right (at least according to Mathematica), and taking the square root and then the inverse transform we get a BesselK function.
     
  15. Dec 17, 2012 #14

    marcusl

    Science Advisor
    Gold Member

    I did misquote Jaynes. In Ch. 15 of "Probability Theory: The Logic of Science" he addresses the need to sum an infinite sequence before taking a limit, but does not discuss integrals. Sorry for that...

    To answer Jano's question to me in post #9: no, I can't think of a reason not to interchange the order of the integrations/limits here. I'll point out that you dropped a factor of alpha from your sinc function in post #3. Interchanging the order in Stephen Tashi's last equation and performing the limit gives
    [tex] \lim_{T\rightarrow\infty}\frac{1}{T}\int_{-T/2}^{T/2} e^{i(\omega+\alpha)t} dt = \lim_{T\rightarrow\infty}\frac{\sin[(\omega + \alpha)T/2]}{(\omega + \alpha)T/2} .[/tex]
    To my non-mathematical brain this evaluates to zero (as Jano said in his first post), suggesting that there is no Fourier representation for your problem as posed. What a difference a "T" makes!

    bpet, I think it's an exponential rather than a Bessel.

    EDIT: I'm having second thoughts. When [itex]\omega+\alpha=0[/itex],
    [tex] \lim_{T\rightarrow\infty} \frac{1}{T}\int_{-T/2}^{T/2} e^0 dt = \lim_{T\rightarrow\infty} \frac{T}{T} = 1.[/tex]
    Is this right? Then the integral becomes, effectively, [tex]\delta(\omega + \alpha)[/tex] and the rest of the comments in my post #8 apply.
     
    Last edited: Dec 17, 2012