What error is this formula measuring?

AI Thread Summary
The formula in question measures error, but it behaves badly for data sets centered around zero and scales differently with sample size than the standard deviation or standard error, raising concerns about its reliability. The user asks for its derivation and a fuller description. A link to the Coefficient of Variation article on Wikipedia is suggested as a potentially useful resource. Understanding this formula's limitations is important for accurate data analysis.
crashcat
TL;DR Summary
I came across a formula that supposedly measures the spread of a set of measurements, but it makes no sense to me, and I hope someone can explain it.
I understand variance, standard deviation, and other measures of error, but this formula doesn't behave well for data sets centered around zero, and it has other problems, such as scaling differently with N than the standard deviation or standard error do. Does anyone recognize it, and can you point me to a description or derivation? $$\frac{1}{\bar{x}}\sqrt{\frac{n\sum{x^2}-\left(\sum{x}\right)^2}{n(n-1)}}$$
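For reference, the radical in the posted formula is algebraically identical to the sample standard deviation written in its "computational" (single-pass) form, so the whole expression is the sample standard deviation divided by the mean, i.e. the coefficient of variation. A minimal sketch checking this numerically (the data values here are made up for illustration):

```python
import math
import statistics

# Illustrative data; any sample with a nonzero mean works.
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(x)

# The radical from the posted formula: sqrt((n*sum(x^2) - (sum x)^2) / (n(n-1)))
s_computational = math.sqrt(
    (n * sum(v * v for v in x) - sum(x) ** 2) / (n * (n - 1))
)

# The textbook sample standard deviation (n-1 denominator).
s_textbook = statistics.stdev(x)

# The full posted expression: divide by the sample mean.
cv = s_computational / statistics.mean(x)

print(s_computational, s_textbook, cv)
```

Note how the division by $\bar{x}$ also explains the bad behavior near zero: when the mean approaches zero, the ratio blows up even though the spread of the data is unchanged.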