Hi everyone, when people present their data in plots, they always talk about 1-sigma errors, which I don't really understand. Don't we use the standard deviation to tell us how far a data point is from the "mean"? Why would I care about the mean when discussing errors? Shouldn't one care about the deviation from the "true" value? Do they still use the "sigma" terminology when referring to the deviation from the true value? Thank ya
Often, when making a measurement, one does not know the "true" value. That's why one has to do measurements in the first place. Therefore, we repeat the experiment many times and take the mean value as an approximation for the "true" value (the idea being, of course, that our measurements were rather precise). The size of the standard deviation tells us something about the accuracy of the measurements. If we do not miss any systematic effects (for example, a misaligned detector, or something more subtle, such as the velocity of the earth with respect to the aether :tongue:), then this approximation is reasonable.
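As a concrete sketch of the "repeat and take the mean" idea (the measurement values below are made up for illustration): the sample mean is the estimate of the true value, and the sample standard deviation quantifies the 1-sigma scatter of individual measurements around it.

```python
import statistics

# Hypothetical repeated measurements of the same quantity (made-up numbers)
measurements = [9.81, 9.79, 9.83, 9.80, 9.82, 9.78, 9.81]

# The sample mean is our best estimate of the "true" value
mean = statistics.mean(measurements)

# The sample standard deviation quantifies the scatter of individual
# measurements around that mean -- the "1-sigma" spread
sigma = statistics.stdev(measurements)

print(f"mean  = {mean:.4f}")
print(f"sigma = {sigma:.4f}")
```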
I think the size of the standard deviation tells us about the precision rather than the accuracy of the measurements. So if there are a couple of data points with some errors (+/-), and one says that all the errors are 1-sigma errors, what does that mean? The data points are not multiple measurements of the same thing (for example, flux versus time), so there is no point in talking about a mean and the deviation from that mean. Is "1-sigma errors" used to mean something else, maybe? Thank you
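One way to read a 1-sigma error bar on a single data point: if the measurement error is Gaussian with that sigma, the interval (value - sigma, value + sigma) contains the true value about 68% of the time. A minimal simulation of this coverage, with made-up numbers:

```python
import random

random.seed(0)

true_value = 5.0
sigma = 0.3  # assumed Gaussian measurement error

# Simulate many independent measurements; for Gaussian noise, the
# interval (measurement - sigma, measurement + sigma) should contain
# the true value roughly 68% of the time.
n = 100_000
hits = sum(
    abs(random.gauss(true_value, sigma) - true_value) < sigma
    for _ in range(n)
)
coverage = hits / n
print(f"fraction within 1 sigma: {coverage:.3f}")  # should be close to 0.68
```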