What error is this formula measuring?

Summary: The formula in question measures relative spread, but it misbehaves for data sets centered around zero and scales differently with sample size than the standard deviation or standard error, raising questions about what it actually represents. The poster asks for its derivation and a more detailed description; a link to the Wikipedia article on the coefficient of variation is offered as a potentially useful resource.
crashcat
TL;DR: I came across this formula that measures the spread of a set of measurements, but it makes no sense to me, and I hope someone can explain it.

I understand variance, standard deviation, and the other common measures of error. This formula doesn't behave well for data sets centered around zero, and it scales differently as N increases than the standard deviation or standard error do. Does anyone recognize it, and can you point me to a description or derivation? $$\frac{1}{\bar{x}}\sqrt{\frac{n\sum{x^2}-\left(\sum{x}\right)^2}{n(n-1)}}$$
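A note on where the expression may come from: expanding the numerator with the usual shortcut identity for the sample variance (a sketch of my own, consistent with the coefficient-of-variation link mentioned in the summary) gives
$$n\sum x_i^2-\left(\sum x_i\right)^2=n\sum\left(x_i-\bar{x}\right)^2,$$
so that
$$\frac{1}{\bar{x}}\sqrt{\frac{n\sum x_i^2-\left(\sum x_i\right)^2}{n(n-1)}}=\frac{1}{\bar{x}}\sqrt{\frac{\sum\left(x_i-\bar{x}\right)^2}{n-1}}=\frac{s}{\bar{x}},$$
the sample standard deviation divided by the sample mean, i.e. the coefficient of variation. If that reading is right, it would explain both observations: the division by $\bar{x}$ blows up when the data are centered near zero, and there is no $1/\sqrt{n}$ factor, so unlike the standard error it does not shrink as $N$ grows.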
 
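If it helps, here is a minimal numerical check of that reading; Python with NumPy is assumed, and the data are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample whose mean is well away from zero
x = rng.normal(loc=10.0, scale=2.0, size=50)
n = len(x)

# The formula as posted: (1/xbar) * sqrt((n*sum(x^2) - (sum x)^2) / (n*(n-1)))
formula = np.sqrt((n * np.sum(x**2) - np.sum(x)**2) / (n * (n - 1))) / np.mean(x)

# Sample standard deviation (ddof=1) divided by the sample mean: the coefficient of variation
cv = np.std(x, ddof=1) / np.mean(x)
print(formula, cv)  # the two agree to floating-point precision

# A sample centered around zero makes the mean tiny and the ratio blow up,
# matching the bad behavior described in the question
y = rng.normal(loc=0.0, scale=2.0, size=50)
print(np.std(y, ddof=1) / np.mean(y))
```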