Pooled variance

  • Thread starter adamg
Suppose you are conducting a hypothesis test to compare two sample means from independent samples, with the variance unknown, but you know it is the same for both populations. Then you use the pooled estimate of the variance given by [ (n1 - 1)s1^2 + (n2-1)s2^2 ] / (n1+n2-2)

I was just wondering why we use (n1 - 1) and (n2 - 1) instead of n1 and n2, and then divide by n1 + n2 rather than by n1 + n2 - 2?
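For concreteness (made-up numbers, just to illustrate the formula): with n1 = 5, s1^2 = 2.0 and n2 = 7, s2^2 = 3.0, the pooled estimate is [ 4(2.0) + 6(3.0) ] / 10 = 26/10 = 2.6.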

thanks
 

uart

Science Advisor
adamg said:
Suppose you are conducting a hypothesis test to compare two sample means from independent samples, with the variance unknown, but you know it is the same for both populations. Then you use the pooled estimate of the variance given by [ (n1 - 1)s1^2 + (n2-1)s2^2 ] / (n1+n2-2)

I was just wondering why we use (n1 - 1) and (n2 - 1) instead of n1 and n2, and then divide by n1 + n2 rather than by n1 + n2 - 2?

thanks
When you calculate the sample variance as "(sum of squared differences from the mean)/N", it turns out that this gives a biased estimate of the population variance (and its square root a biased estimate of the population standard deviation). Replacing "N" with "N-1" gives an unbiased estimate of the population variance, so it's usually preferred (its square root is, strictly speaking, still a slightly biased estimate of the standard deviation). Unfortunately there is often a bit of ambiguity whenever sample variance and standard deviation are discussed, as there doesn't seem to be a universal standard for whether to use "N" or "N-1" in the definition.

In your example above, I assume that s1^2 and s2^2 are based on the "N-1" calculation.
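If you want to see the bias numerically, here is a small Python sketch (not from the thread; it just uses NumPy's ddof argument to switch between the "N" and "N-1" definitions):

[code]
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0        # population variance sigma^2
n = 5                 # small sample size makes the bias obvious
trials = 200_000

# draw many independent samples of size n
samples = rng.normal(loc=10.0, scale=true_var ** 0.5, size=(trials, n))

# "N" definition: divide the sum of squared deviations by n (biased)
var_n = samples.var(axis=1, ddof=0)
# "N-1" definition: divide by n - 1 (unbiased)
var_n_minus_1 = samples.var(axis=1, ddof=1)

print("true variance            :", true_var)
print("average of /N estimates  :", var_n.mean())          # about (n-1)/n * 4 = 3.2
print("average of /N-1 estimates:", var_n_minus_1.mean())  # about 4.0
[/code]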
 

uart

Science Advisor
Here is the above in a bit more detail:

[tex]\[ s_n^2 = 1/n \sum (x_i-\bar{x})^2 \][/tex]

[tex]\[= 1/n \sum [ ( (x_i-\mu) - (\bar{x}-\mu) )^2 ]\][/tex]

[tex]\[= 1/n \sum [(x_i-\mu)^2 - 2 (x_i-\mu)(\bar{x}-\mu) + (\bar{x}-\mu)^2 ] \][/tex]

[tex]\[= 1/n \sum [(x_i-\mu)^2] - (\bar{x}-\mu)^2 \][/tex]

(using [tex]\sum (x_i-\mu) = n(\bar{x}-\mu)[/tex] to collapse the cross term)

So,

[tex]\[E[s_n^2] = 1/n \sum E[(x_i-\mu)^2] - E[(\bar{x}-\mu)^2 ] \][/tex]

[tex]\[= E[(x-\mu)^2] - E[(\bar{x}-\mu)^2 ] \][/tex]

[tex]\[= \sigma^2 - \{\rm{term\ greater\ than\ or\ equal\ to\ zero}\} \][/tex]

This shows that the sample variance [tex]s_n^2[/tex] on average under-estimates the population variance [tex]\sigma^2[/tex].

You can further show (assuming all the samples are independent) that [tex]E[(\bar{x}-\mu)^2 ][/tex] is equal to [tex]\sigma^2/n[/tex] and hence,

[tex]\[E[s_n^2] = \sigma^2 - \sigma^2/n = \frac{n-1}{n} \sigma^2 \][/tex].
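As a quick sketch of that step (the cross terms vanish because the samples are independent):

[tex]\[ E[(\bar{x}-\mu)^2] = E\left[ \Big( \frac{1}{n}\sum (x_i-\mu) \Big)^2 \right] = \frac{1}{n^2} \sum E[(x_i-\mu)^2] = \frac{n \sigma^2}{n^2} = \frac{\sigma^2}{n} \][/tex]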

So not only is [tex]s_n^2[/tex] a biased estimator of [tex]\sigma^2[/tex], its expectation is too small by a factor of precisely [tex](n-1)/n[/tex]. Clearly, using [tex](n-1)[/tex] instead of [tex]n[/tex] in the denominator fixes this and makes the expectation of the modified sample variance, [tex]E[s_{n-1}^2][/tex], equal to the population variance [tex]\sigma^2[/tex].
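To tie this back to the original question, here is a minimal Python sketch (the function name is just illustrative) of the pooled estimate, weighting each "N-1" sample variance by its degrees of freedom:

[code]
import numpy as np

def pooled_variance(x, y):
    """Pooled estimate of a common variance from two independent samples."""
    n1, n2 = len(x), len(y)
    s1_sq = np.var(x, ddof=1)   # unbiased ("N-1") sample variances
    s2_sq = np.var(y, ddof=1)
    return ((n1 - 1) * s1_sq + (n2 - 1) * s2_sq) / (n1 + n2 - 2)

rng = np.random.default_rng(1)
x = rng.normal(0.0, 2.0, size=30)   # both populations have sigma = 2
y = rng.normal(5.0, 2.0, size=50)
print(pooled_variance(x, y))        # should come out near sigma^2 = 4
[/code]

Each (n - 1) weight is the number of degrees of freedom left in that sample after estimating its own mean, which is why the denominator is n1 + n2 - 2 rather than n1 + n2.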
 
