# Simple question about Error Analysis

1. Feb 17, 2013

### startinallover

I'm doing a report on a set of lab data and am supposed to find the mean and the mean's standard deviation,

$$\bar{x} \pm \sigma_{\bar{x}}$$

The mean is given by

$$\displaystyle{ \bar{x} = \sum_{i=1}^{N} w_{i} x_{i} }$$

Where

$$\displaystyle{ w_{i} = \left( \frac{\sigma}{\sigma_{i}} \right)^2 }$$

and for the error (mean's standard deviation)

$$\displaystyle{ \sigma_{\bar{x}} = \sigma = \frac{1}{ \sqrt{ \sum_{i=1}^{N} \frac{1}{\sigma_{i}^{2} } } } }$$

The problem is that I can't quite figure out what the σi would be. Is it the standard deviation? This might sound very silly, but it's been a long time since I've dealt with this.

Any help is appreciated.

2. Feb 17, 2013

### Staff: Mentor

σi is the uncertainty (standard deviation*) of data point i.

*Any multiple of it will work as well, as long as you use it consistently, since the factor cancels in the weights wi (the quoted σ, and hence σ of the mean, would scale with it, though).
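
If you want to sanity-check your numbers, here is a minimal sketch of the formulas above in Python with NumPy; the measurements and uncertainties are made-up placeholders:

```python
import numpy as np

# Made-up measurements x_i with individual uncertainties sigma_i
x = np.array([10.1, 9.8, 10.4, 10.0])
sigma = np.array([0.2, 0.3, 0.1, 0.2])

# Standard deviation of the mean: sigma_xbar = 1 / sqrt(sum 1/sigma_i^2)
sigma_xbar = 1.0 / np.sqrt(np.sum(1.0 / sigma**2))

# Weights w_i = (sigma_xbar / sigma_i)^2; they sum to 1 by construction
w = (sigma_xbar / sigma) ** 2
assert np.isclose(w.sum(), 1.0)

# Weighted mean: x_bar = sum(w_i * x_i)
xbar = np.sum(w * x)

print(f"x_bar = {xbar:.3f} +/- {sigma_xbar:.3f}")
```

Note that more precise points (smaller σi) get larger weights, so they pull the mean toward themselves, and the assert on the weight sum is a quick consistency check.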