Excluding mean in variance calculation

  • Thread starter Polymath89
  • #1
Polymath89
I'm reading a finance book in which the author proposes to exclude the mean when calculating the variance of returns, because he argues it is difficult to separate the drift of the price series from its variance. So he essentially calculates the sample variance as [itex]\frac{1}{N}\sum_{i=1}^{N} x_i^2[/itex]. I haven't seen variance calculated this way before, so my question is whether this is a common way to calculate it, or whether it could yield problematic results?
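
A quick way to see the difference is to compute both quantities on simulated returns. Here is a minimal Python sketch (the drift and volatility numbers are hypothetical, not from the book):

[code]
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily returns: small positive drift, 1% daily volatility
returns = rng.normal(loc=0.0005, scale=0.01, size=252)

# The "no-mean" estimator: average of squared returns, mean not subtracted
second_moment = np.mean(returns ** 2)

# Ordinary (population-style) sample variance: mean is subtracted
sample_variance = np.var(returns)

print(second_moment, sample_variance)
# The two differ by exactly the squared sample mean
print(second_moment - sample_variance, np.mean(returns) ** 2)
[/code]

For a small daily drift the two numbers come out nearly identical, so the practical impact seems to depend on how large the drift is relative to the volatility.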
 

Answers and Replies

  • #2
This is the second moment. I doubt that it has a useful application in finance without looking at the mean of the values, but I don't know.
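
To make the relationship explicit, the two estimators differ by exactly the squared sample mean:

[tex]\frac{1}{N}\sum_{i=1}^{N} x_i^2 \;=\; \frac{1}{N}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2 \;+\; \bar{x}^2[/tex]

so dropping the mean overstates the variance by [itex]\bar{x}^2[/itex], which is negligible only when the drift is small compared with the volatility.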
 
