The Sigma Terminology in Data Plots: Confusion and Questions

SUMMARY

The discussion clarifies the use of "1-sigma errors" in data plots, emphasizing that the term refers to the standard deviation of a dataset rather than the deviation from a "true" value. Participants explain that the mean is often used as a stand-in for the true value, since the true value is generally unknown. The conversation highlights the distinction between precision and accuracy: the standard deviation indicates precision, while systematic errors affect accuracy. Confusion arises when the plotted points are not repeated measurements of the same quantity, which raises the question of what 1-sigma errors mean in that context.

PREREQUISITES
  • Understanding of standard deviation and its role in statistical analysis
  • Familiarity with the concepts of precision and accuracy in measurements
  • Basic knowledge of experimental design and data collection methods
  • Awareness of systematic errors and their impact on measurement outcomes
NEXT STEPS
  • Research the implications of systematic errors in experimental measurements
  • Learn about the differences between precision and accuracy in data analysis
  • Explore the application of standard deviation in various scientific fields
  • Investigate alternative statistical methods for analyzing data with unknown true values
USEFUL FOR

Researchers, data analysts, and scientists who present data in plots and seek to understand the implications of using sigma terminology in their analyses.

Lorna
Hi everyone,
When people present their data in plots, they always talk about 1-sigma errors, which I totally don't understand. Don't we use the standard deviation to tell us how far a data point is from the "mean"? Why would I care about the mean when discussing errors? Shouldn't one care about the deviation from the "true" value? Do they still use the "sigma" terminology when referring to the deviation from the true value?

Thank ya
 
Often, when making a measurement, one does not know the "true" value. That's why one has to make measurements in the first place. Therefore, we repeat the experiment many times and take the mean value as an approximation for the "true" value (the idea being, of course, that our measurements are reasonably precise). How large the standard deviation is tells us something about the accuracy of the measurements. If we have not missed any systematic effects (for example, a misaligned detector, or something more subtle, such as the velocity of the Earth with respect to the aether :-p), then this approximation is reasonable.
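
As a quick illustration of that procedure, here is a minimal Python sketch (the measurement values are invented for the example): the sample mean stands in for the unknown true value, and the sample standard deviation is the "1 sigma" quoted as the error.

```python
# Minimal sketch: repeated measurements of a single quantity.
# The readings below are invented for illustration.
import statistics

measurements = [9.81, 9.79, 9.83, 9.80, 9.82]  # hypothetical repeated readings

mean = statistics.mean(measurements)    # approximation of the unknown "true" value
sigma = statistics.stdev(measurements)  # sample standard deviation = the quoted 1 sigma

print(f"mean  = {mean:.3f}")
print(f"sigma = {sigma:.3f}")
```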
 
I think how large the standard deviation is tells us about the precision of the measurements rather than their accuracy.
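
To make that distinction concrete, here is a minimal Python sketch with invented numbers: a biased instrument can be very precise (small sigma) yet inaccurate (mean far from the true value), while a noisier instrument can still be accurate on average.

```python
# Minimal sketch of precision vs. accuracy with invented numbers.
import statistics

true_value = 10.0
biased_but_precise = [10.52, 10.49, 10.51, 10.50, 10.48]  # small spread, offset mean
unbiased_but_noisy = [9.2, 10.9, 10.3, 9.6, 10.1]         # mean near truth, large spread

for label, data in [("biased/precise", biased_but_precise),
                    ("unbiased/noisy", unbiased_but_noisy)]:
    mean = statistics.mean(data)
    sigma = statistics.stdev(data)
    print(f"{label}: mean={mean:.2f}, sigma={sigma:.2f}, bias={mean - true_value:+.2f}")
```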

So if there are a couple of data points with some errors (+/-), and one says that all the errors are 1-sigma errors, what do they mean? The data points are not multiple measurements of the same thing (for example, flux as a function of time), so there is no point in talking about a mean and a deviation from that mean. Is "1-sigma errors" used to mean something else, maybe?

Thank you
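
One common convention, sketched below, is that each plotted point carries its own estimated uncertainty (from instrument precision, counting statistics, or a fit), and the quoted "1-sigma error" is one standard deviation of that per-point uncertainty; under a Gaussian error model, the bar then covers the true value with roughly 68% probability. A minimal matplotlib sketch, with invented data:

```python
# Minimal sketch: points that each carry their own 1-sigma uncertainty.
# All values below are invented for illustration.
import matplotlib.pyplot as plt

time  = [1, 2, 3, 4, 5]
flux  = [10.2, 11.1, 9.8, 10.5, 10.9]
sigma = [0.3, 0.4, 0.2, 0.5, 0.3]  # one standard deviation per point

# yerr=sigma draws a bar extending 1 sigma above and below each point
plt.errorbar(time, flux, yerr=sigma, fmt="o", capsize=3)
plt.xlabel("time")
plt.ylabel("flux")
plt.show()
```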
 
