The sampling theorem says that we should sample at a rate at least twice the bandwidth of the signal: Fs >= 2B.

Ideally Fs = 2B, i.e. oversampling is unnecessary.

However, there's always noise present in the signal. The Shannon–Hartley theorem connects the SNR and the bandwidth with the achievable rate of a binary digital signal: http://en.wikipedia.org/wiki/Shannon–Hartley_theorem
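As a quick numeric illustration of the Shannon–Hartley relation C = B·log2(1 + S/N), here is a minimal sketch (the 4 kHz bandwidth and 30 dB SNR figures are arbitrary example values, not from the question):

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Channel capacity in bits/s for an AWGN channel (Shannon-Hartley)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example: a 4 kHz channel at 30 dB SNR (linear SNR = 1000)
capacity = shannon_hartley_capacity(4000.0, 1000.0)
# roughly 4000 * log2(1001) ~ 39.9 kbit/s
```

Note the theorem bounds the data rate of a channel; it does not by itself say how oversampling trades off against SNR for an analog signal, which is the question below.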

But I deal with an analog signal, similar to an audio signal. What is the relation between the sampling rate and the SNR?

Wiki about oversampling:

"Noise reduction/cancellation. If multiple samples are taken of the same quantity with a different (and uncorrelated) random noise added to each sample, then averaging N samples reduces the noise variance (or noise power) by a factor of 1/N"
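The 1/N claim quoted above is easy to check numerically. This is a Monte Carlo sketch (not a proof): repeatedly measure the same quantity with independent Gaussian noise, average N samples, and compare the empirical variance of the average with sigma²/N. All parameter values here are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5          # per-sample noise standard deviation
n_trials = 20000     # Monte Carlo repetitions

variances = {}
for n in (1, 4, 16):
    # n independent noisy measurements of the same quantity (true value 1.0)
    samples = 1.0 + sigma * rng.standard_normal((n_trials, n))
    estimates = samples.mean(axis=1)     # average the n samples
    variances[n] = estimates.var()       # empirical variance of the average

# variances[n] comes out close to sigma**2 / n, i.e. the 1/N reduction
```

The underlying fact is just that the variance of a sum of N uncorrelated variables is the sum of their variances, so Var(mean) = N·sigma²/N² = sigma²/N.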

I can't find a proof of this.

To make my question precise: I have an oversampled signal-acquisition system, and the oversampling normally improves the accuracy of the spectrum calculation. A Matlab simulation confirms it.
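A simulation along these lines can be reproduced without Matlab. The sketch below (my own example, with arbitrary signal and noise parameters, not the original simulation) samples a sinusoid plus white per-sample noise for a fixed duration at a base rate and at 8x that rate, then compares the peak-to-noise-floor ratio of the periodogram. With per-sample noise variance fixed, 8x oversampling spreads the same noise over 8x the bandwidth, so the signal bin stands roughly 10·log10(8) ≈ 9 dB further above the floor:

```python
import numpy as np

rng = np.random.default_rng(1)

def peak_to_floor_db(fs, duration, f0, amp, noise_sigma):
    """Periodogram peak-to-noise-floor ratio in dB for a noisy sinusoid."""
    n = int(fs * duration)
    t = np.arange(n) / fs
    x = amp * np.sin(2 * np.pi * f0 * t) + noise_sigma * rng.standard_normal(n)
    spec = np.abs(np.fft.rfft(x)) ** 2
    peak = spec.max()                 # signal bin (f0 falls on an exact bin here)
    floor = np.median(spec)           # robust estimate of the noise floor
    return 10 * np.log10(peak / floor)

base = peak_to_floor_db(8000.0, 1.0, 1000.0, 1.0, 1.0)       # Nyquist-ish rate
over = peak_to_floor_db(8 * 8000.0, 1.0, 1000.0, 1.0, 1.0)   # 8x oversampled
gain_db = over - base   # roughly 9 dB of processing gain
```

This is the "processing gain" view of oversampling: the coherent signal bin grows with the number of samples while white noise is spread across more bins.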

I'd like to have a mathematical proof of this.

Thank you

**Physics Forums - The Fusion of Science and Community**

The friendliest, high quality science and math community on the planet! Everyone who loves science is here!

# Sampling theorem with noise


