I ran across an interesting statistic today while doing some research, but it was stated as a matter of fact without explanation, and there seems to be very little material on it. The claim was that the mean absolute deviation ("MAD") of a normal (Gaussian) distribution is 0.7979 times that distribution's standard deviation ("SD"). The simple equation offered was MAD/SD = sqrt(2/pi).

Question 1: Assuming this statement is true, why is it true? That is, what is it about the normal distribution that makes the MAD equal to 0.7979 times the SD?
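For context, the constant appears to come from integrating |x| directly against the normal density. A sketch, taking X ~ N(μ, σ²) and centering at μ = 0 for convenience:

$$
\mathrm{E}\,|X-\mu|
= \int_{-\infty}^{\infty} |x|\,\frac{1}{\sigma\sqrt{2\pi}}\,e^{-x^2/(2\sigma^2)}\,dx
= \frac{2}{\sigma\sqrt{2\pi}} \int_0^{\infty} x\,e^{-x^2/(2\sigma^2)}\,dx
= \frac{2\sigma^2}{\sigma\sqrt{2\pi}}
= \sigma\sqrt{\frac{2}{\pi}} \approx 0.7979\,\sigma.
$$

The middle integral evaluates to σ² by the substitution u = x²/(2σ²), which is where the specific constant comes from.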

Question 2: Again, assuming this statement is true, how would you reconcile two samples, one of which has a more favorable Jarque-Bera test statistic than the other, but a less favorable MAD/SD ratio?
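To make Question 2 concrete, here is a minimal sketch (assuming the textbook Jarque-Bera form JB = n/6 · (S² + (K − 3)²/4), implemented with plain NumPy; the function names are my own) that computes both diagnostics for a sample:

```python
import numpy as np

def mad_sd_ratio(x):
    """Mean absolute deviation about the mean, divided by the (population) SD."""
    x = np.asarray(x, dtype=float)
    mad = np.mean(np.abs(x - x.mean()))
    return mad / x.std(ddof=0)

def jarque_bera(x):
    """Jarque-Bera statistic n/6 * (S^2 + (K-3)^2 / 4) from sample moments."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    s2 = np.mean(z**2)                 # second central moment
    skew = np.mean(z**3) / s2**1.5     # sample skewness S
    kurt = np.mean(z**4) / s2**2       # sample kurtosis K
    return n / 6.0 * (skew**2 + (kurt - 3.0)**2 / 4.0)

rng = np.random.default_rng(0)
sample = rng.standard_normal(100_000)
print(mad_sd_ratio(sample))  # close to sqrt(2/pi) ~ 0.7979 for normal data
print(jarque_bera(sample))   # small for normal data (JB ~ chi-squared with 2 df)
```

One way to think about the reconciliation: the two diagnostics weight departures from normality differently. Jarque-Bera looks only at the third and fourth moments (skewness and kurtosis), while the MAD/SD ratio compares the first absolute moment to the second moment, so a sample can look better on one measure and worse on the other without contradiction.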

Thank you in advance.

Kimberley

**Physics Forums - The Fusion of Science and Community**


# Mean Absolute Deviation/Standard Deviation Ratio


