Autocorrelation of white noise.

  • #1
vanesch
Staff Emeritus
Science Advisor
Gold Member

Main Question or Discussion Point

I'm stuck on an elementary thing; it must be something obvious, but I can't see what's wrong.

Here goes. I was writing up some elementary course material for an instrumentation course and wanted to quickly introduce "white noise".

Now, the usual definition of white noise is something like: a stationary random process such that E(X(t)) = 0 for all t, with a flat power spectral density.

On the other hand, the autocorrelation function is defined as R(tau) = E(X(t) X(t+tau)) (independent of t).

But here's the problem. The Wiener-Khinchin theorem states that the power spectral density equals the Fourier transform of the autocorrelation function, so a flat power spectral density comes down to a Dirac delta for the autocorrelation. And on Wikipedia, for example, you find that as a defining property of white noise.

But the autocorrelation of white noise at tau = 0, R(0) = E(X(t) X(t)), is nothing but sigma-squared.

So it would seem that the autocorrelation function is everywhere 0, except at 0, where it is a finite number.

What am I missing here?
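
For concreteness, the discrete-time analogue seems perfectly consistent: for an i.i.d. zero-mean sequence, the sample autocorrelation is sigma-squared at lag 0 and roughly 0 elsewhere, while the averaged periodogram is flat at level sigma-squared. A quick numerical sketch of that (assuming NumPy; the variable names are my own):

Code:
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
n = 100_000
x = rng.normal(0.0, sigma, size=n)     # discrete-time "white noise": i.i.d., zero mean

# Sample autocorrelation R(k) = E[X(t) X(t+k)] for a few lags
for k in range(4):
    r_k = np.mean(x[:n - k] * x[k:])
    print(f"R({k}) ~ {r_k:.3f}")       # ~ sigma^2 = 4 at k = 0, ~ 0 otherwise

# Block-averaged periodogram: flat at level ~ sigma^2 across frequencies
blocks = x.reshape(100, -1)
psd = np.mean(np.abs(np.fft.rfft(blocks, axis=1))**2, axis=0) / blocks.shape[1]
print("mean PSD level ~", psd.mean())  # ~ sigma^2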
 

Answers and Replies

  • #2
White noise cannot be defined rigorously in any of these ways. White noise does not exist as a stochastic process, in the same way that the Dirac delta function does not exist as a function.

There is no (measurable) continuous time stochastic process X that satisfies E[X(t)] = 0, var(X(t)) = σ², with X(s) and X(t) independent whenever s ≠ t. If we allow var(X(t)) to be infinite, then we can construct such a process, but of course it cannot be continuous. Such a definition, however, is completely useless, because we need the integral of white noise to be Brownian motion. (And it would not be for such a process.)
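
(To illustrate the requirement that integrated white noise be Brownian motion, here is a small numerical sketch, not part of the rigorous argument: it uses i.i.d. Gaussian increments with variance σ²·dt as a discrete stand-in for σW dt, and assumes NumPy.)

Code:
import numpy as np

rng = np.random.default_rng(1)
sigma, T, n, paths = 1.0, 1.0, 1_000, 5_000
dt = T / n

# Discrete surrogate for sigma*W on a grid: i.i.d. increments with variance sigma^2 * dt
increments = rng.normal(0.0, sigma * np.sqrt(dt), size=(paths, n))

# "Integrating the noise" = cumulative sum, which approximates sigma * B(t)
b = np.cumsum(increments, axis=1)

# The variance grows linearly in t, as it must for Brownian motion
for i in [n // 4, n // 2, n - 1]:
    t = (i + 1) * dt
    print(f"t = {t:.2f}: var ~ {b[:, i].var():.3f} (expect {sigma**2 * t:.3f})")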

To rigorously define white noise, we could start with a Brownian motion, B(t). Each sample path, t → B(ω,t), has a derivative in the space of generalized functions on the positive half-line. Call this derivative W(ω). Then W(ω) is our white noise process. Strictly speaking, it does not have pointwise values. We can only integrate it against test functions. Formally, we would have

∫ φ(t)W(t) dt = -∫ φ'(t)B(t) dt
= ∫ φ(t) dB(t),

where this last integral is the Ito integral.
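
As a sanity check on the last identity, one can approximate ∫ φ(t) dB(t) by the Ito sum Σ φ(t_i)(B(t_{i+1}) - B(t_i)) and verify the Ito isometry, Var = ∫ φ(t)² dt, numerically. A minimal sketch (the test function φ is my own arbitrary choice; assumes NumPy):

Code:
import numpy as np

rng = np.random.default_rng(2)
T, n, paths = 1.0, 1_000, 10_000
dt = T / n
t = np.arange(n) * dt
phi = np.sin(2 * np.pi * t)            # an arbitrary smooth test function (my choice)

# Brownian increments dB ~ N(0, dt), one row per sample path
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, n))

# The Ito sum  sum_i phi(t_i) * dB_i  approximates  int phi(t) dB(t)
I = dB @ phi

# Ito isometry: Var(I) = int_0^1 phi(t)^2 dt, which is 1/2 for this phi
print("E[I]   ~", I.mean())            # ~ 0
print("Var(I) ~", I.var())             # ~ 0.5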

If we want to consider the "process" σW, then this is the distributional derivative of σB, and var(σB(t + h) - σB(t)) = σ²h. If we want to look at difference quotients of σB (which diverge, of course), then we have

var((σB(t + h) - σB(t))/h) = σ²/h.

So even heuristically, the variance of white noise at a single point in time should be infinite. For a more accurate heuristic, we might say that

var((1/h) ∫_t^{t+h} σW(s) ds) = σ²/h.
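
Numerically, this blow-up is easy to see: sample σ(B(t + h) - B(t)), which is N(0, σ²h), divide by h, and watch the variance scale like σ²/h. A small sketch (assumes NumPy):

Code:
import numpy as np

rng = np.random.default_rng(3)
sigma, samples = 1.0, 200_000

# sigma*(B(t+h) - B(t)) is N(0, sigma^2 * h); dividing by h gives the window average
for h in [0.1, 0.01, 0.001]:
    avg = rng.normal(0.0, sigma * np.sqrt(h), size=samples) / h
    print(f"h = {h}: var ~ {avg.var():.1f} (expect {sigma**2 / h:.1f})")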
 
  • #3
vanesch
Staff Emeritus
Science Advisor
Gold Member
Thanks!
 
