# Why square the differences when calculating variance?

Hi,

I was looking at the way variance is calculated. Let us say V represents variance; then

V = (Ʃ(Xi-μ)²)/N, i.e. the average of the squared differences between each Xi and the mean (μ)

What I do not understand is why we square the difference. By squaring, are we not magnifying the differences manyfold?

If squaring is only meant to prevent the negative and positive differences from canceling, then we could instead average the absolute values (modulus) of the differences between Xi and the mean (μ), which I believe would be an equally accurate representation, i.e.

V = (Ʃ|Xi-μ|)/N
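As a quick illustration of how the two definitions differ (this example is mine, not from the post), here is a small Python comparison of the squared-deviation variance and the mean absolute deviation on some made-up data. Note how the single large deviation (the 9) dominates the variance much more than it dominates the absolute-deviation average:

```python
# Comparing the squared-deviation variance with the mean absolute
# deviation (MAD) on a small, made-up dataset.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
mu = sum(data) / n                               # mean = 5.0

variance = sum((x - mu) ** 2 for x in data) / n  # squared deviations
mad = sum(abs(x - mu) for x in data) / n         # absolute deviations

print(mu, variance, mad)  # → 5.0 4.0 1.5
```

The deviation of 4 (from the value 9) contributes 16 to the variance sum but only 4 to the MAD sum, which is exactly the magnifying effect the question asks about.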

Am I missing something here, or is my understanding of variance wrong?

Thanks,
Santosh

Squaring the deviations emphasizes larger differences. Additionally, for a normal distribution, about 68% of values lie within 1 standard deviation of the mean, about 95% lie within 2 standard deviations, and about 99.7% lie within 3 standard deviations. These percentages hold when the standard deviation is defined as the square root of the squared-deviation variance; they do not hold if the mean absolute deviation is used instead. Squared deviations also have convenient mathematical properties: for example, the variance of a sum of independent random variables is the sum of their variances, a property the mean absolute deviation lacks.
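The 68/95/99.7 figures can be checked empirically. The sketch below (my own illustration, not part of the original reply) draws standard-normal samples, computes the standard deviation as the square root of the squared-deviation variance, and counts the fraction of samples within 1, 2, and 3 standard deviations of the mean:

```python
# Empirical check of the 68-95-99.7 rule for a standard normal
# distribution, using the squared-deviation definition of variance.
import math
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
n = len(samples)

mu = sum(samples) / n
sigma = math.sqrt(sum((x - mu) ** 2 for x in samples) / n)

for k in (1, 2, 3):
    frac = sum(1 for x in samples if abs(x - mu) <= k * sigma) / n
    print(f"within {k} sigma: {frac:.3f}")  # ≈ 0.683, 0.954, 0.997
```

With 100,000 samples the printed fractions land close to the theoretical 0.683, 0.954, and 0.997; repeating the experiment with the mean absolute deviation in place of sigma gives noticeably different coverage fractions.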