Please correct me if I am wrong, but it seems to me that you could obtain a very high sigma value (e.g. 6-sigma accuracy) from a very small sample size. How, then, is sigma on its own reliable?

Let me see if I understand sigma.

To determine the standard deviation, I first compute the mean. For each data point I take the difference from the mean and square it; I then average the squared differences and take the square root. Is this one sigma?
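The steps just described can be sketched directly in code. This is a minimal illustration using a small hypothetical data set; it computes the population standard deviation (dividing by n), which is the procedure as stated:

```python
# Hypothetical data set chosen so the arithmetic is easy to follow.
data = [18, 19, 20, 21, 22]

# 1. Compute the mean.
mean = sum(data) / len(data)

# 2. Square each data point's difference from the mean.
squared_diffs = [(x - mean) ** 2 for x in data]

# 3. Average the squares (this is the variance), then take the square root.
variance = sum(squared_diffs) / len(squared_diffs)
sigma = variance ** 0.5

print(mean, sigma)
```

Note that many texts divide by n − 1 rather than n when estimating sigma from a sample (Bessel's correction), which matters precisely in the small-sample case being asked about.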

If it is one sigma, then with a mean of 20 and a standard deviation of 2, do values of 18-22 represent one-sigma accuracy? And do values below 8 or above 32 lie beyond six sigma?
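As a quick check of the arithmetic above (assuming the stated mean of 20 and standard deviation of 2), the k-sigma interval is simply mean ± k·sigma:

```python
mean, sigma = 20.0, 2.0

# The k-sigma interval around the mean is [mean - k*sigma, mean + k*sigma].
for k in (1, 2, 6):
    lo, hi = mean - k * sigma, mean + k * sigma
    print(f"{k} sigma: [{lo}, {hi}]")
```

So one sigma does correspond to 18-22, and six sigma to 8-32; values outside 8-32 lie more than six standard deviations from the mean.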

What additional checks or sample size must accompany the sigma value to make it reliable?
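The concern about small samples can be made concrete with a quick simulation. This sketch (hypothetical numbers: a normal population with mean 20 and sigma 2, using the standard library's `random.gauss`) repeatedly estimates sigma from small and large samples and compares how much the estimates fluctuate:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def sample_sigma(n):
    """Population-style standard deviation of n draws from N(20, 2)."""
    data = [random.gauss(20, 2) for _ in range(n)]
    mean = sum(data) / n
    return (sum((x - mean) ** 2 for x in data) / n) ** 0.5

# 1000 repeated estimates of sigma at each sample size.
small = [sample_sigma(5) for _ in range(1000)]
large = [sample_sigma(500) for _ in range(1000)]

def spread(estimates):
    return max(estimates) - min(estimates)

# Small-sample estimates of sigma scatter far more widely around the
# true value of 2 than large-sample estimates do.
print(spread(small), spread(large))
```

This is why a sigma value is usually reported together with the sample size (or a confidence interval): the same estimator is far noisier at n = 5 than at n = 500.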

**Physics Forums - The Fusion of Science and Community**


# Understanding Sigma and Sample Sizes (High Sigma in Small Samples)

