Why is standard deviation preferred over mean deviation?

Standard deviation is also closely related to the concepts of correlation and covariance, which measure the linear association between two data sets. Mean deviation has no comparable measure and is therefore less useful in probability and statistics. Overall, standard deviation is preferred over mean deviation because of its mathematical properties and its wide application in probability and statistics.
  • #1
Shehbaj singh
I was studying these two methods of measuring dispersion and variability in data in my math textbook. I was able to understand mean deviation, which is the average amount by which a quantity deviates from the mean, but I was unable to understand standard deviation. Also, since standard deviation squares the deviations, it makes big deviations bigger. So please help me understand why it is preferred over mean deviation.
 
  • #2
Hello Shehbaj, welcome!

Squares are a lot more comfortable to deal with than absolute values. But the real reason is that standard deviations have a much more prominent role in probability and statistics than mean deviations. Your "squares make big deviations bigger" is in fact an argument in its favor: big deviations are a lot less likely than small deviations, and squaring makes the measure respond to them more strongly.
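
To see the effect of squaring, here is a small sketch (my own example data, not from the thread): two datasets with the same mean deviation but different standard deviations, because one of them concentrates its spread in a few large deviations.

```python
import statistics

# Illustrative sketch (datasets are made up for demonstration):
# two datasets with the same mean but different shapes of spread.
a = [8, 8, 12, 12]    # deviations from the mean: -2, -2, +2, +2
b = [6, 10, 10, 14]   # deviations from the mean: -4,  0,  0, +4

def mean_deviation(xs):
    """Average absolute distance from the mean."""
    mu = statistics.mean(xs)
    return sum(abs(x - mu) for x in xs) / len(xs)

for xs in (a, b):
    print(f"{xs}: mean deviation = {mean_deviation(xs):.2f}, "
          f"standard deviation = {statistics.pstdev(xs):.2f}")

# Both datasets have mean deviation 2, but squaring weights the larger
# deviations in b more heavily, so its standard deviation is bigger
# (2.00 versus about 2.83).
```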

If you are -- as I guess -- just being introduced to this matter, my advice is to accept it for the moment and exercise patience. It'll become pretty obvious after a while.

 
  • #3
Okay, thanks for your suggestion. I am a somewhat weird student and get excited very often about maths topics.
 
  • #4
Being excited about maths is not weird in itself, but claiming you're weird because of it is.
 
  • #5
Standard deviation should be the same as mean deviation, because the standard deviation is how much a set of values varies from the mean of those values.

Apart from this, I'm not sure.
 
  • #6
mathexam said:
Standard deviation should be the same as mean deviation, because the standard deviation is how much a set of values varies from the mean of those values.
Certainly not!
 
  • #7
BvU said:
Certainly not!

Can you explain?

What I think is that standard deviation represents the answer differently, but they are similar in concept. I know standard deviation also deals with how much the numbers vary from the mean of the group.
 
  • #8
The two have distinct, precise definitions. There is no reason they should have the same value.
 
  • #9
Yeah, I believe it's similar in concept but defined differently as an output. With standard deviation, 1 standard deviation covers about 68% of the data, 2 standard deviations cover about 95%, and 3 cover over 99%. Mean deviation presumably corresponds to different percentages.
 
  • #10
mathexam said:
Yeah, I believe it's similar in concept but defined differently as an output. With standard deviation, 1 standard deviation covers about 68% of the data, 2 standard deviations cover about 95%, and 3 cover over 99%. Mean deviation presumably corresponds to different percentages.
It is not automatically true that 68% of the points in a data set will lie within one standard deviation of the mean. That result holds for data sets that follow a "normal" distribution. Yes, it is true that for a normal distribution, the mean deviation and the standard deviation will be in proportion. But for a population that is not normally distributed, the mean deviation and the standard deviation need not be in that same proportion.
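
To make this concrete, here is a quick numerical sketch (mine, not from the thread, with arbitrary sample sizes and seed): for normal data the ratio of mean deviation to standard deviation settles near sqrt(2/pi) ≈ 0.798, while other distributions give different ratios.

```python
import random, statistics

random.seed(0)
n = 100_000

# Draw samples from three different distributions.
samples = {
    "normal":      [random.gauss(0, 1) for _ in range(n)],
    "uniform":     [random.uniform(0, 1) for _ in range(n)],
    "exponential": [random.expovariate(1.0) for _ in range(n)],
}

for name, xs in samples.items():
    mu = statistics.mean(xs)
    md = sum(abs(x - mu) for x in xs) / n          # mean deviation
    sd = statistics.pstdev(xs, mu=mu)              # population standard deviation
    print(f"{name:12s} mean dev / std dev = {md / sd:.3f}")

# Expected roughly: normal ~0.798, uniform ~0.866, exponential ~0.736,
# confirming that the proportion depends on the distribution.
```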
 
  • #11
jbriggs444 said:
It is not automatically true that 68% of the points in a data set will lie within one standard deviation of the mean. That result holds for data sets that follow a "normal" distribution. Yes, it is true that for a normal distribution, the mean deviation and the standard deviation will be in proportion. But for a population that is not normally distributed, the mean deviation and the standard deviation need not be in that same proportion.

A bound is possible, though. For example, for any distribution with a finite mean and variance, Chebyshev's inequality guarantees that at least 75% of the data lies within 2 standard deviations of the mean.

Anyway, you will often hear people say that we work with standard deviation instead of mean deviation because it has nicer mathematical properties, such as differentiability. This is only part of the story. The real answer is the concept of correlation and covariance. This is a very natural concept that measures the degree of linear association between two data sets, and it leads automatically to the concept of a variance. The mean deviation has no comparably nice "correlation measure".
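
A small sketch of that point, with invented data (the numbers are for illustration only; `statistics.covariance` requires Python 3.10+): correlation is just covariance rescaled by the two standard deviations, and the covariance of a variable with itself is its variance.

```python
import statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]   # roughly 2*x, so a nearly perfect linear association

cov = statistics.covariance(x, y)                      # sample covariance
corr = cov / (statistics.stdev(x) * statistics.stdev(y))
print(f"cov(x, y) = {cov:.3f}, corr(x, y) = {corr:.3f}")  # corr close to 1

# The covariance of x with itself is just its variance, i.e. the squared
# standard deviation; the mean deviation has no identity like this.
print(statistics.covariance(x, x), statistics.variance(x))
```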
 
  • #12
Shehbaj singh said:
I was studying these two methods of measuring dispersion and variability in data in my math textbook. I was able to understand mean deviation, which is the average amount by which a quantity deviates from the mean, but I was unable to understand standard deviation. Also, since standard deviation squares the deviations, it makes big deviations bigger. So please help me understand why it is preferred over mean deviation.
It is because the standard deviation has nice mathematical properties and the mean deviation does not.

The variance is the square of the standard deviation. The sum of the variances of two independent random variables is equal to the variance of the sum of the variables. This is fundamental.
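
As a rough check of that additivity, here is a simulation sketch I am adding (parameters are arbitrary): independent normals with variances 4 and 9 should give a sum with variance 13, while the mean deviations do not add.

```python
import random, statistics

random.seed(1)
n = 100_000
x = [random.gauss(0, 2) for _ in range(n)]   # variance 4
y = [random.gauss(0, 3) for _ in range(n)]   # variance 9, independent of x

# Variances of independent variables add: Var(X + Y) = Var(X) + Var(Y).
var_sum = statistics.pvariance([a + b for a, b in zip(x, y)])
print(f"Var(X) + Var(Y) = 13, simulated Var(X+Y) = {var_sum:.2f}")

def md(xs):
    """Mean (absolute) deviation from the mean."""
    mu = statistics.mean(xs)
    return sum(abs(v - mu) for v in xs) / len(xs)

# No such identity for mean deviation: MD(X) + MD(Y) != MD(X + Y) in general.
print(f"MD(X) + MD(Y) = {md(x) + md(y):.2f}, "
      f"MD(X+Y) = {md([a + b for a, b in zip(x, y)]):.2f}")
```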
 

1. Why is standard deviation considered a better measure of variation than mean deviation?

Both measures use the distance of each data point from the mean, but standard deviation squares those distances, giving more weight to large deviations, and it has convenient mathematical properties (for example, the variances of independent variables add). This makes it a more useful summary of how spread out the data points are from the average.

2. How is standard deviation calculated and how does it differ from mean deviation?

Standard deviation is calculated by taking the square root of the average of the squared differences between each data point and the mean. This differs from mean deviation, which is the average of the absolute differences between each data point and the mean.
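
A minimal sketch of both computations (population versions, with a made-up dataset):

```python
import math

data = [4, 8, 6, 5, 9]
n = len(data)
mean = sum(data) / n

variance = sum((x - mean) ** 2 for x in data) / n
std_dev = math.sqrt(variance)                     # root of the *average* squared difference
mean_dev = sum(abs(x - mean) for x in data) / n   # average absolute difference

print(f"mean = {mean}, standard deviation = {std_dev:.3f}, mean deviation = {mean_dev:.3f}")
```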

3. Can standard deviation ever be equal to mean deviation?

Yes, in some rare cases the standard deviation and mean deviation are equal. This happens when every data point lies at the same distance from the mean: if each deviation has magnitude d, the mean deviation is d and the standard deviation is the square root of the average of d², which is also d. For example, the dataset {4, 8} has mean 6 and both measures equal 2. In general, the standard deviation is always at least as large as the mean deviation.

4. Are there any limitations to using standard deviation as a measure of variation?

Standard deviation may not be an appropriate measure of variation for datasets with extreme outliers, as these outliers can greatly influence the value of the standard deviation. In these cases, alternative measures of variation such as median absolute deviation may be more suitable.

5. How does the use of standard deviation over mean deviation impact statistical analyses?

Using standard deviation rather than mean deviation leads to more tractable and precise statistical analyses, because variances of independent variables add and the sampling behavior of the standard deviation is well understood. This is especially important in inferential statistics, where standard deviation is used to calculate confidence intervals and assess the significance of results.
