Excluding mean in variance calculation

  • Thread starter Polymath89
  • #1

Main Question or Discussion Point

I'm reading a finance book in which the author proposes to exclude the mean when calculating the variance of returns, because he argues it is difficult to separate the drift of the price from the variance of that time series. So he basically calculates the sample variance as [itex]\frac{1}{N}\sum_{i=1}^{N} x_i^2[/itex]. I haven't seen anybody calculate variance like this before, so my question is whether this is a common way to calculate it, or whether it could yield problematic results.
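A quick numerical sketch of the comparison (my own illustration, not from the book, with made-up drift and volatility parameters): on a synthetic return series with a small positive drift, the uncentred estimator from the question sits above the usual mean-centred variance by exactly the squared mean.

```python
import random

# Hypothetical parameters for a synthetic daily return series with drift.
random.seed(0)
drift, vol, N = 0.001, 0.02, 10_000
returns = [random.gauss(drift, vol) for _ in range(N)]

mean = sum(returns) / N
# Usual (population) sample variance, centred on the mean.
centred = sum((r - mean) ** 2 for r in returns) / N
# The book's estimator: (1/N) * sum(x_i^2), i.e. the second moment about zero.
uncentred = sum(r ** 2 for r in returns) / N

# Algebraically, uncentred - centred == mean**2, so the gap grows with the drift.
print(centred, uncentred, uncentred - centred, mean ** 2)
```

With daily-scale numbers the gap is tiny (mean² is second-order small), which is one reason the shortcut is sometimes tolerated for high-frequency returns.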
 

Answers and Replies

  • #2
This is the second (raw) moment about zero, not the variance. I doubt that it has a useful application in finance without also looking at the mean of the values, but I don't know.
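A minimal check of the identity behind this reply (my own example with made-up return figures): the second raw moment equals the variance plus the squared mean, E[X²] = Var(X) + E[X]², so the two estimators agree only when the mean return (the drift) is zero.

```python
# Made-up daily returns, purely for illustration.
data = [0.01, -0.02, 0.015, 0.03, -0.005]
n = len(data)

mean = sum(data) / n
second_moment = sum(x * x for x in data) / n       # (1/N) * sum(x_i^2)
variance = sum((x - mean) ** 2 for x in data) / n  # mean-centred population variance

# The difference between the two is exactly mean**2.
print(second_moment - variance, mean ** 2)
```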
 
