# Why is standard deviation preferred over mean deviation?

1. Dec 1, 2015

### Shehbaj singh

I was studying these two methods for measuring dispersion and variability in data in my math textbook. I was able to understand mean deviation: it's the average of how much each value deviates from the mean. But I was unable to understand standard deviation. Also, since standard deviation squares the deviations, it makes big deviations count for more. So please help me understand why it's preferred over mean deviation.
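For concreteness, here is a small sketch (my own example data, not from any textbook) computing both measures on the same data set:

```python
# Compare mean (absolute) deviation and standard deviation on a small
# example data set, using only the standard library.
import math

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = sum(data) / len(data)  # 5.0

# Mean deviation: average of the absolute deviations |x - mean|
mad = sum(abs(x - mean) for x in data) / len(data)

# Standard deviation: square root of the average squared deviation
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

print(mad)  # 1.5
print(sd)   # 2.0
```

Note that the standard deviation comes out larger here: squaring gives the outlying values (2 and 9) more weight than the absolute value does.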

2. Dec 1, 2015

### BvU

Hello Shehbaj,

Squares are a lot more comfortable to deal with than absolute values. But the real reason is that standard deviations have a much more prominent role in probability and statistics than mean deviations. Your 'squares make big deviations bigger' is in fact an argument in its favor: big deviations are a lot less likely than small deviations.

If you are -- as I guess -- just being introduced to this matter, my advice is to accept it for the moment and exercise patience. It'll become pretty obvious after a while.

3. Dec 1, 2015

### Shehbaj singh

Okay, thanks for your suggestion. I am a somewhat weird student and get excited very often about topics in maths.

4. Dec 1, 2015

### Mentallic

Being excited in maths is not weird in itself, but claiming you're weird because of it is.

5. Dec 9, 2015

### mathexam

Standard deviation should be the same as mean deviation, because the standard deviation is how much a set of values varies from the mean of the values.

Apart from this, I'm not sure.

6. Dec 9, 2015

### BvU

Certainly not!

7. Dec 9, 2015

### mathexam

Can you explain?

What I think is that standard deviation represents the answer differently, but they are similar in concept. I know standard deviation also deals with how much the numbers vary from the mean of the group.

8. Dec 9, 2015

### BvU

The two are clearly defined. There is no reason they should have the same value.

9. Dec 9, 2015

### mathexam

Yeah, I believe it's similar in concept but defined differently as an output. With standard deviation, 1 standard deviation covers about 68% of the data, 2 standard deviations cover about 95%, and 3 cover over 99%. Mean deviation probably holds a different value in terms of this.

10. Dec 10, 2015

### jbriggs444

It is not automatically true that 68% of the points in a data set will lie within one standard deviation of the mean. That result holds for data sets that follow a "normal" distribution. Yes, it is true that for a normal distribution, the mean deviation and the standard deviation will be in proportion. But for a population that is not normally distributed, the mean deviation and the standard deviation need not be in that same proportion.
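A quick sketch (my own, assuming population formulas and random samples with a fixed seed) confirms this: the ratio of mean deviation to standard deviation is approximately sqrt(2/pi) ≈ 0.798 for normally distributed data, but takes a different value for other distributions, e.g. sqrt(3)/2 ≈ 0.866 for uniform data.

```python
# Show that the ratio mean-deviation / standard-deviation depends on
# the distribution: it is only a fixed constant for normal data.
import math
import random

def mad_over_sd(data):
    m = sum(data) / len(data)
    mad = sum(abs(x - m) for x in data) / len(data)
    sd = math.sqrt(sum((x - m) ** 2 for x in data) / len(data))
    return mad / sd

rng = random.Random(0)
normal = [rng.gauss(0, 1) for _ in range(100_000)]
uniform = [rng.uniform(0, 1) for _ in range(100_000)]

print(mad_over_sd(normal))   # close to sqrt(2/pi), about 0.798
print(mad_over_sd(uniform))  # close to sqrt(3)/2, about 0.866
```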

11. Dec 11, 2015

### micromass

A bound is possible, though. For example, for any distribution with finite mean and variance, Chebyshev's inequality guarantees that at least 75% of the data lies within 2 standard deviations of the mean.

Anyway, you will often hear people say that we work with standard deviation instead of mean deviation because it has nicer mathematical properties, such as differentiability. This is only part of the story. The real answer lies in the concepts of correlation and covariance. Covariance is a very natural concept that measures the degree of linear association between two data sets, and it leads automatically to the concept of variance. The mean deviation has no such nice associated "correlation measure".
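The 75% bound can be checked empirically. A sketch (my own, using a deliberately skewed, heavy-tailed sample with a fixed seed):

```python
# Verify Chebyshev's bound on a skewed sample: at least 75% of any
# data set lies within 2 standard deviations of its mean.
import math
import random

rng = random.Random(1)
# Heavy-tailed, skewed data: exponentiated normals (lognormal-shaped)
data = [math.exp(rng.gauss(0, 1)) for _ in range(10_000)]

m = sum(data) / len(data)
sd = math.sqrt(sum((x - m) ** 2 for x in data) / len(data))
within = sum(1 for x in data if abs(x - m) <= 2 * sd) / len(data)

print(within)  # at least 0.75 by Chebyshev; typically much higher
```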

12. Dec 11, 2015

### Hornbein

It is because the standard deviation has nice mathematical properties and the mean deviation does not.

The variance is the square of the standard deviation. The sum of the variances of two independent random variables is equal to the variance of the sum of the variables. This is fundamental.
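This additivity can be seen numerically. A sketch (my own, assuming two independent standard-normal samples with a fixed seed): the variances of the two samples add up to the variance of their sum, while the mean deviations do not.

```python
# Variances of independent variables add; mean deviations do not.
import math
import random

def variance(data):
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / len(data)

def mean_dev(data):
    m = sum(data) / len(data)
    return sum(abs(v - m) for v in data) / len(data)

rng = random.Random(2)
n = 100_000
x = [rng.gauss(0, 1) for _ in range(n)]
y = [rng.gauss(0, 1) for _ in range(n)]
s = [a + b for a, b in zip(x, y)]

print(variance(x) + variance(y), variance(s))  # both close to 2.0
print(mean_dev(x) + mean_dev(y), mean_dev(s))  # about 1.60 vs 1.13
```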