# Standard deviation

Can somebody explain to me why we have to square the deviations when calculating the variance and then take the square root (which is supposed to reverse the squaring)? It doesn't make sense to me.

mathman
It is simply a matter of definition. Let X be a random variable, and let A = E(X) (the average).
Then the variance V is DEFINED by V = E((X-A)^2), and the standard deviation is DEFINED as the square root of the variance.
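As a small worked illustration of that definition (not from the thread), here is the variance and standard deviation of a fair six-sided die, computed directly as E((X-A)^2):

```python
from fractions import Fraction
from math import sqrt

# Illustrative example: X = outcome of a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)  # uniform probability for each outcome

# A = E(X), the average
A = sum(p * x for x in outcomes)             # 7/2

# V = E((X - A)^2), the definition of variance given above
V = sum(p * (x - A) ** 2 for x in outcomes)  # 35/12

# The standard deviation is defined as the square root of the variance
sd = sqrt(V)
```

Exact fractions are used here only so the intermediate values stay clean; any numeric type works the same way.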

Mute
Homework Helper
> Can somebody explain to me why we have to square the deviations when calculating the variance and then take the square root (which is supposed to reverse the squaring)? It doesn't make sense to me.
If you want to find the typical deviation from the average, you can't use the average of x - <x>, because that average is always zero. You need a measure where the deviations from the average don't cancel out. Taking the average of |x - <x>| works, but it isn't nice to work with because the absolute value function isn't differentiable at zero. (x - <x>)^2 also works, and is differentiable. However, if x has units, you can't compare <(x - <x>)^2> directly to x, because the units don't match. So you need to take the square root of <(x - <x>)^2> to get something with the same units as x that you can treat as a deviation from the mean.
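The three candidate measures described above can be checked numerically. This sketch uses a made-up sample (chosen so the arithmetic comes out clean) and sample-as-population averages:

```python
import math

# Hypothetical sample data, for illustration only
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
mean = sum(data) / n

# 1. Average of raw deviations: always zero, so it can't measure spread
avg_deviation = sum(x - mean for x in data) / n

# 2. Mean absolute deviation: works, but |.| is not differentiable at zero
mad = sum(abs(x - mean) for x in data) / n

# 3. Variance: average of squared deviations (units are squared)
variance = sum((x - mean) ** 2 for x in data) / n

# 4. Standard deviation: the square root restores the original units
std = math.sqrt(variance)
```

For this sample the mean is 5, the raw deviations sum to zero, and the squared deviations average to 4, giving a standard deviation of 2.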

chiro