The Sigma Terminology in Data Plots: Confusion and Questions

AI Thread Summary
The discussion centers on the confusion surrounding the use of 1-sigma errors in data plots. It highlights that 1-sigma refers to the standard deviation, which indicates the precision of measurements rather than their accuracy. Participants note that the mean value is often used as an approximation for the "true" value, especially when the true value is unknown. The conversation questions the relevance of discussing deviations from the mean when data points represent different measurements rather than repeated trials. Overall, the thread seeks clarity on the application of sigma terminology in the context of error analysis.
Lorna
Hi everyone,
When people present their data in plots, they always talk about 1-sigma errors, which I totally don't understand. Don't we use the standard deviation to tell us how far the a data point is from the "mean"? Why would I care about the mean when discussing errors? Shouldn't one care about the deviation from the "true" value? do they still use the "Sigma" terminology when referring to the deviation from the true value?

Thank ya
 
Often, when making a measurement, one does not know the "true" value. That's why one has to do measurements in the first place. Therefore, we repeat the experiment many times and we take the mean value as an approximation for the "true" value (the idea being, of course, that our measurements were rather precise). How large the standard deviation is tells us something about the accuracy of the measurements. If we do not miss any systematic effects (for example, a mis-aligned detector or something more subtle, such as the velocity of the Earth with respect to the aether :-p) then this approximation is reasonable.
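The idea above can be sketched numerically. This is a minimal illustration, not anyone's actual data: the "true" value of 10.0 and the noise level of 0.5 are made up, and Gaussian noise is assumed.

```python
import numpy as np

# Minimal sketch: repeat a measurement many times and use the sample mean
# as an estimate of the unknown "true" value. The true value (10.0) and
# noise level (0.5) are invented for illustration; noise is assumed Gaussian.
rng = np.random.default_rng(0)
true_value = 10.0
measurements = rng.normal(loc=true_value, scale=0.5, size=1000)

mean = measurements.mean()         # estimate of the true value
sigma = measurements.std(ddof=1)   # sample standard deviation: spread of single measurements

# For Gaussian noise, roughly 68% of measurements land within 1 sigma of the mean.
frac_within_1sigma = np.mean(np.abs(measurements - mean) < sigma)
print(mean, sigma, frac_within_1sigma)
```

Here the mean recovers the true value far more tightly than any single measurement, while sigma quantifies the scatter of individual measurements.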
 
I think the size of the standard deviation tells us about the precision rather than the accuracy of the measurements.

So if there are a couple of data points with some errors +/-, and one says that all the errors are 1-sigma errors, what do they mean? The data points are not multiple measurements of the same thing (for example flux with time), so there is no point in talking about a mean and deviation from that mean. Are 1-sigma errors used to mean something else, maybe?
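One common reading of the situation described above can be sketched as follows, assuming Gaussian errors. Each point measures a different quantity and carries its own uncertainty sigma_i (from the instrument, counting statistics, etc.); a 1-sigma error bar then means the interval y_i +/- sigma_i covers that point's true value about 68% of the time. All numbers below are invented for illustration.

```python
import numpy as np

# Sketch of per-point 1-sigma error bars under a Gaussian-error assumption:
# each point y_i measures a *different* quantity with its own uncertainty
# sigma_i, and y_i +/- sigma_i covers the point's true value ~68% of the time.
rng = np.random.default_rng(1)
true_values = rng.uniform(0.0, 100.0, size=5000)  # a different quantity per point
sigmas = rng.uniform(0.5, 2.0, size=5000)         # per-point 1-sigma uncertainties
measured = rng.normal(loc=true_values, scale=sigmas)

coverage = np.mean(np.abs(measured - true_values) < sigmas)
print(coverage)  # close to 0.68 for Gaussian errors
```

So no repeated trials are needed: "1-sigma" describes the width of each point's own (assumed) error distribution, not scatter about a common mean.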

Thank you
 