# Filter out Gaussian noise from data curve

1. Aug 27, 2008

### Gerenuk

What is the best way to filter out Gaussian noise from points in a data curve? (i.e. each point of the x-y-graph is displaced in y by a random amount)

A simple running mean reduces the noise, but on the other hand it also changes the shape of the underlying real curve.
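To illustrate the trade-off (this sketch is mine, not from the thread): a boxcar running mean suppresses noise roughly by the square root of the window length, but it also broadens and attenuates any real peak of comparable width. The test signal and window size below are arbitrary choices for demonstration.

```python
import numpy as np

def running_mean(y, window=5):
    """Boxcar (moving-average) smoother via convolution."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

# Illustrative data: a unit-height Gaussian bump plus Gaussian noise.
x = np.linspace(-5.0, 5.0, 201)
true = np.exp(-x**2 / 2)
rng = np.random.default_rng(0)
noisy = true + rng.normal(0.0, 0.05, x.size)

# Smoothing: the residual noise shrinks, but so does the peak height.
smooth = running_mean(noisy, window=15)
```

The wider the window, the stronger both effects, which is exactly the shape-distortion problem raised above.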

Is it possible to use the "non-correlation properties" of the noise and some smooth-behaviour property of the data to preserve the shape of the real data curve?

2. Aug 27, 2008

Generally you will have to trade off the two effects. The goal should be lots of noise reduction with minimal disturbance to the 'real' curve. But the only way to take out the noise without changing the 'real' curve is if you already know what the real curve looks like, in which case you don't need to de-noise corrupted data in the first place.

Well, for Gaussian noise, the correlation properties completely characterize it. I.e., there aren't any other independent noise properties. The key will be using assumptions on how the "real" data should behave: smoothness, correlation, monotonicity, something along those lines. The way to proceed is to figure out exactly what you DO know about the data (which is presumably less than exactly what the real curve looks like), and then apply that. But for anyone to help, you'll first have to tell us what you already know about the data.
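One standard way to encode a smoothness assumption (my illustration, not something proposed in the thread) is a Savitzky-Golay filter: instead of averaging each window, it fits a low-order polynomial in a sliding window. By construction it reproduces any polynomial up to the chosen order exactly, so smooth peaks are attenuated much less than with a plain running mean. The window length and polynomial order below are arbitrary demo values.

```python
import numpy as np
from scipy.signal import savgol_filter

# Illustrative data: smooth bump plus Gaussian noise.
x = np.linspace(-5.0, 5.0, 201)
true = np.exp(-x**2 / 2)
rng = np.random.default_rng(1)
noisy = true + rng.normal(0.0, 0.05, x.size)

# Local cubic fits in a 15-point sliding window: noise is averaged down,
# but low-order structure (slopes, gentle curvature) passes through intact.
savgol = savgol_filter(noisy, window_length=15, polyorder=3)
```

The defining property is that a noiseless cubic input comes out unchanged, whereas a boxcar of the same width would distort it.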

3. Aug 28, 2008

### Gerenuk

I have approximately 30 points of data. The real curve is one or two mostly Gaussian bumps spread over the 15 central points. The remaining 15 points usually form a quadratic background, so it's fairly smooth. The amplitude of the big bumps is about 20 times the error bar (standard deviation) of one point.

So I'm usually interested in long wavelength structures.
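Given that description, a natural alternative to smoothing at all (my suggestion, not from the thread) is to fit the known parametric form directly: a quadratic background plus a Gaussian bump. A least-squares fit uses all 30 points per parameter and introduces no smoothing-induced shape distortion. All parameter values below are made up to mimic the stated setup (bump amplitude about 20 times the per-point standard deviation).

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c, amp, mu, sigma):
    """Quadratic background plus one Gaussian bump."""
    return a * x**2 + b * x + c + amp * np.exp(-((x - mu) ** 2) / (2 * sigma**2))

# Synthetic data mimicking the described setup: 30 points, noise sigma = 1,
# bump amplitude 20 (i.e. ~20x the error bar), centred among the middle points.
rng = np.random.default_rng(2)
x = np.arange(30.0)
true_params = (0.01, -0.2, 1.0, 20.0, 15.0, 3.0)
y = model(x, *true_params) + rng.normal(0.0, 1.0, x.size)

# Rough starting guesses are enough at this signal-to-noise ratio.
popt, pcov = curve_fit(model, x, y, p0=(0.0, 0.0, 0.0, 10.0, 14.0, 2.0))
```

With two bumps one would simply add a second Gaussian term; `np.sqrt(np.diag(pcov))` gives the parameter uncertainties.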