How is Standard Deviation Derived and What is Chebyshev's Rule?

  • Thread starter: Moose352
  • Tags: Statistics
Moose352
It seems to me that a lot of the concepts in statistics are rather arbitrary and don't seem to be mathematically derived. For example, how is the equation for standard deviation derived? The textbook says that the standard deviation is the mean of all the deviations of the values in the sample, and since the deviations add up to zero, they are squared to get rid of the negatives. I understand that, but why doesn't it just take the absolute value? Why is the square root taken only after everything has been summed? Furthermore, why is it divided by (n-1) and not n?
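For reference, the sample standard deviation formula in question is usually written as

```latex
s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},
\qquad
\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i .
```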

Also, can anyone explain the proof for Chebyshev's rule?
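For reference, Chebyshev's rule (inequality) says that for any distribution with finite mean μ and standard deviation σ,

```latex
P\bigl(|X - \mu| \ge k\sigma\bigr) \le \frac{1}{k^2} \quad \text{for every } k > 0.
```

A standard one-line proof applies Markov's inequality to the nonnegative random variable (X - μ)²:

```latex
P\bigl(|X - \mu| \ge k\sigma\bigr)
= P\bigl((X - \mu)^2 \ge k^2\sigma^2\bigr)
\le \frac{E\bigl[(X - \mu)^2\bigr]}{k^2\sigma^2}
= \frac{\sigma^2}{k^2\sigma^2}
= \frac{1}{k^2}.
```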

Thanks very much.
 
Hello Moose,

The standard deviation formula can be derived; my Maths teacher showed me. Unfortunately I am unable to derive it myself, so I will not be much help there. The reason we sometimes divide by n-1 is to give a more accurate, or unbiased, estimate from sample data. There is also a proof for that, which I am likewise unable to give. Hope this helped.

Regards,

Daniel
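Not from the thread, but the n-1 point is easy to check numerically. A minimal sketch in Python (the distribution, sample size, and number of trials here are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0    # population variance of a N(0, 2^2) distribution
n = 5             # a small sample size makes the bias easy to see
trials = 200_000

# Draw many small samples from the same population.
samples = rng.normal(loc=0.0, scale=2.0, size=(trials, n))

# ddof=0 divides by n; ddof=1 divides by n-1 (Bessel's correction).
mean_divide_by_n = samples.var(axis=1, ddof=0).mean()
mean_divide_by_n_minus_1 = samples.var(axis=1, ddof=1).mean()

print(f"true variance:    {true_var}")
print(f"average with n:   {mean_divide_by_n:.3f}")          # about (n-1)/n * 4 = 3.2
print(f"average with n-1: {mean_divide_by_n_minus_1:.3f}")  # about 4.0
```

Dividing by n systematically underestimates the variance by a factor of (n-1)/n, which is exactly what the n-1 corrects.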
 
Thanks repugno. It's good to know that there is a proof, but I will not be convinced until I see it.
 
Lol... I tried to give you some mathematical proof, but it seems I can't get the LaTeX code right. You're on your own now. :D
 
Ah! Please try again. I can't find any other proof. I think the real problem is that I haven't yet found a concept (granted, I haven't learned much) for which the current definition of SD is the only one that works.
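For what it's worth, the standard textbook argument for the n-1 divisor, sketched in LaTeX for independent observations X_1, ..., X_n with mean μ and variance σ², is the following. Using the identity Σ(X_i - μ)² = Σ(X_i - X̄)² + n(X̄ - μ)² and E[(X̄ - μ)²] = σ²/n,

```latex
E\left[\sum_{i=1}^{n}\bigl(X_i - \bar{X}\bigr)^2\right]
= E\left[\sum_{i=1}^{n}\bigl(X_i - \mu\bigr)^2\right] - n\,E\bigl[(\bar{X} - \mu)^2\bigr]
= n\sigma^2 - n\cdot\frac{\sigma^2}{n}
= (n-1)\sigma^2 .
```

Dividing the sum of squared deviations by n-1 therefore gives an estimator whose expected value is exactly σ², which is what "unbiased" means here.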
 
A large amount of statistics is based on the notion of normal distributions.

For normal distributions it is possible to show, for example, that a certain fraction of the results lies within one standard deviation of the peak.
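For a normal distribution those fractions come out to about 68.3%, 95.5%, and 99.7% within one, two, and three standard deviations of the mean. A short check using only Python's standard library:

```python
import math

# For a normal distribution, P(|X - mu| <= k*sigma) = erf(k / sqrt(2)).
for k in (1, 2, 3):
    fraction = math.erf(k / math.sqrt(2))
    print(f"within {k} standard deviation(s): {fraction:.4f}")
# prints 0.6827, 0.9545, 0.9973
```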
 
I completely understand. But why does it have to be based on that specific definition of the standard deviation? Can't those fractions be recalculated based on another definition of the standard deviation?
 
Very early in the history of statistics they did use absolute values. But the math of those is difficult: absolute values are not "analytic functions". Squares, on the other hand, are polynomials, which are easy to work with. In fact the real quantity here is the variance, the square of the standard deviation (or rather, the standard deviation is the square root of the variance).

Any probability distribution that has moments has the mean as its first moment and the variance as its second (central) moment, essentially and with some technical fiddles. What are moments? Well, in one sense they are the parameters that determine the equation of the probability curve. The normal curve is distinguished because it is determined by just those two; it is a two-parameter curve. You tell me where the mean is and what the standard deviation is, and I will give you the formula for that normal curve and be able to draw it. Any other probability distribution that has moments at all - some don't - will be determined if you specify all of its moments, which may be an infinite number.
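For reference, the raw and central moments being described are

```latex
m_k = E\bigl[X^k\bigr], \qquad \mu_k = E\bigl[(X - \mu)^k\bigr],
```

so the mean is the first raw moment and the variance is the second central moment. For the normal distribution every higher central moment is a fixed function of σ alone (the odd ones vanish and, for example, μ₄ = 3σ⁴), which is one way to see why specifying the mean and the standard deviation pins the curve down completely.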
 
Originally posted by selfAdjoint
Any other probability distribution that has moments at all - some don't - will be determined if you specify all of its moments, which may be an infinite number.
Wow, that sounds interesting. Is this like a Taylor expansion of a function? If the distribution has no moments, is it trivial?
 
