Hi all,
I think there is a really obvious answer to this, but I just don't see it yet. Suppose you had N data sets that all measured the same quantity as a function of time. Each data set shows the same signal plus a random noise component, normally distributed about the signal with a constant standard deviation. If you were to average over the N data sets, you would expect to see the same signal with reduced noise. If I'm not mistaken, the more data sets you average, the lower the resulting noise. What would the standard deviation of the noise be in the averaged data? Thanks!
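For independent, identically distributed Gaussian noise of standard deviation σ, averaging N data sets leaves the signal unchanged and reduces the noise standard deviation to σ/√N. A quick simulation sketch (using NumPy; the signal, σ, and N below are arbitrary illustrative choices) that checks this scaling empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100          # number of data sets
T = 10_000       # time samples per data set
sigma = 2.0      # noise standard deviation in each data set

# Same underlying signal in every data set (an arbitrary example waveform)
signal = np.sin(np.linspace(0.0, 2.0 * np.pi, T))

# N noisy copies: signal plus independent Gaussian noise of std sigma
data = signal + rng.normal(0.0, sigma, size=(N, T))

# Average over the N data sets, then measure the residual noise level
averaged = data.mean(axis=0)
residual_std = (averaged - signal).std()

print(f"measured noise std: {residual_std:.4f}")
print(f"predicted sigma/sqrt(N): {sigma / np.sqrt(N):.4f}")
```

With N = 100 and σ = 2, the predicted residual noise standard deviation is 2/√100 = 0.2, and the measured value agrees to within sampling error.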