I'm reading a finance book in which the author proposes to exclude the mean when calculating the variance of returns, because he argues it is difficult to separate the drift of the price series from its variance. So he essentially calculates the sample variance as [itex]\frac{1}{N}\sum_{i=1}^{N} x_i^2[/itex]. I haven't seen variance calculated this way before, so my question is: is this a common way to calculate it, or could it yield problematic results?
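For concreteness, here is a small numerical sketch of the difference between the two estimators (simulated returns and all parameter values are hypothetical, not from the book). It uses the identity that the uncentered second moment equals the usual biased sample variance plus the squared sample mean, so the author's estimator overstates the variance by exactly the squared average return:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulate daily returns with a small positive drift (illustrative values only)
returns = rng.normal(loc=0.0005, scale=0.01, size=2500)

# Usual (biased, 1/N) sample variance: mean is subtracted
centered = np.mean((returns - returns.mean()) ** 2)

# The book's estimator: mean is NOT subtracted (uncentered second moment)
uncentered = np.mean(returns ** 2)

# Identity: uncentered = centered + (sample mean)^2,
# so the gap between the two is the squared drift term
assert np.isclose(uncentered, centered + returns.mean() ** 2)
assert uncentered >= centered
```

For daily returns the mean is tiny relative to the standard deviation, so the two estimates are nearly identical; the gap grows with the drift and the sampling interval.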

**Physics Forums - The Fusion of Science and Community**


# Excluding mean in variance calculation


Similar Threads - Excluding mean variance | Date |
---|---|

I Gradient descent, hessian(E(W^T X)) = cov(X),Why mean=/=0?? | Apr 29, 2017 |

ZF Set Theory and Law of the Excluded Middle | Oct 11, 2014 |

Excluded middle and self-reference | Apr 27, 2013 |

(regression) why would you exclude an explanatory variable | Sep 10, 2010 |

Prove that the law of excluded middle does not hold in some many-valued logic | May 10, 2009 |
