
I was looking at the way variance is calculated. Let us say V represents variance; then

V = (Σ(Xi-μ)²)/N, i.e. the average of the squared differences between each Xi and the mean (μ)

What I do not understand is why we square the difference. By squaring, are we not magnifying the differences manyfold?

If squaring is done only to prevent positive and negative differences from canceling, then we could instead average the absolute values of the differences between Xi and the mean (μ), which I believe would be an equally accurate representation, i.e.

V = (Σ|Xi-μ|)/N
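To make the comparison concrete, here is a small sketch in plain Python (the data set is made up for illustration) computing both the squared-difference average (population variance) and the absolute-difference average I describe above:

```python
# Compare population variance with the mean absolute deviation
data = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(data)
mu = sum(data) / n  # mean = 5.0

# Usual variance: average of squared deviations from the mean
variance = sum((x - mu) ** 2 for x in data) / n

# Proposed alternative: average of |Xi - mu|
mad = sum(abs(x - mu) for x in data) / n

print(variance)  # 4.0
print(mad)       # 1.5
```

Note how the squaring gives large deviations (here the 4-away point) much more weight than the absolute value does, which is part of what prompts my question.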

Am I missing something here, or is my understanding of variance wrong?

Thanks,

Santosh