# Proving an unbiased estimator

1. Sep 13, 2010

### exitwound

I have a terrible teacher and have to teach myself out of the book and don't understand this.

1. The problem statement, all variables and given/known data

Male verbal IQs
117 103 121 112 120 132 113 117 132
149 125 131 136 107 108 113 136 114

Female Verbal IQs
114 102 113 131 124 117 120 90
114 109 102 114 127 127 103

Denote the male values by X1, X2, ..., Xm and the female values by Y1, Y2, ..., Yn. Suppose that the Xi's constitute a random sample from a distribution with mean mu_1 and standard deviation sigma_1, and that the Yi's form a random sample (independent of the Xi's) from a distribution with mean mu_2 and standard deviation sigma_2.

a.) Use rules of expected value to show that Xbar - Ybar is an unbiased estimator of mu_1 - mu_2.

3. The attempt at a solution

I know that bias is the difference between the Expected value of the estimator and the value of the parameter. However, I am completely lost on how I can figure this out if I don't know the true means of the IQs.

E(Xbar - Ybar) = E(Xbar) - E(Ybar) = (1/m)(X1+X2+...Xm) - (1/n)(Y1+Y2+...Yn)

I have no idea what this means or where to go.

2. Sep 13, 2010

### Staff: Mentor

You are given the means and standard deviations of the two sets of IQ data.

3. Sep 13, 2010

### exitwound

I know that E(Xbar) = E(X) = mu_1 and E(Ybar) = E(Y) = mu_2. E(Xbar-Ybar) = mu_1 - mu_2.

I don't know what it all means.

4. Sep 14, 2010

### Staff: Mentor

You pretty much have it, but are having trouble putting the pieces together.

$$E(\bar{X} - \bar{Y}) = E(\bar{X}) - E(\bar{Y}) = \mu_1 - \mu_2$$

So, on average, the statistic $$\bar{X} - \bar{Y}$$ can be expected to equal $\mu_1 - \mu_2$.

The definition (slightly paraphrased) for "unbiased estimator" in one of my books is this:
Let Y1, Y2, ..., Yn be a random sample from a distribution. An estimator W = h(Y1, Y2, ..., Yn) is said to be unbiased (for $\theta$) if E(W) = $\theta$, for all $\theta$.

Last edited: Sep 14, 2010
5. Sep 14, 2010

### exitwound

So I'm looking for the difference between the expectation of the estimator and what the estimator actually measures...

If Xbar - Ybar is mu_1 - mu_2, and E(Xbar - Ybar) = mu_1 - mu_2, then what exactly is this saying? I don't understand what's going on regarding the samples and distributions. I can't seem to find any good resources with pictures or graphs.

6. Sep 14, 2010

### Staff: Mentor

No, you're calculating the expectation of the statistic $$\bar{X} - \bar{Y}$$.
No, $$\bar{X} - \bar{Y} \neq \mu_1 - \mu_2$$

but if you took a large number of samples from the two populations, the differences of the sample averages --

$$\bar{X} - \bar{Y}$$

-- would cluster around $$\mu_1 - \mu_2$$.

One of my two references, Intro to Mathematical Statistics, 4th Ed., by Hogg & Craig, doesn't have a single picture or graph.
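If it helps to see the clustering numerically, here is a quick simulation sketch in Python. The population parameters (mu_1 = 120, sigma_1 = 12, mu_2 = 113, sigma_2 = 11) and the normal distributions are made-up assumptions for the demo, since the true parameters are unknown; only the sample sizes come from the problem:

```python
import random

random.seed(0)
mu1, sigma1 = 120.0, 12.0   # assumed male population parameters (illustrative)
mu2, sigma2 = 113.0, 11.0   # assumed female population parameters (illustrative)
m, n = 18, 15               # sample sizes from the problem

diffs = []
for _ in range(20000):
    # draw one sample from each population and record Xbar - Ybar
    xbar = sum(random.gauss(mu1, sigma1) for _ in range(m)) / m
    ybar = sum(random.gauss(mu2, sigma2) for _ in range(n)) / n
    diffs.append(xbar - ybar)

# the simulated differences cluster around mu1 - mu2 = 7
print(sum(diffs) / len(diffs))
```

Any single Xbar - Ybar misses mu_1 - mu_2, but the long-run average of the statistic lands on it; that is exactly what "unbiased" means.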

7. Sep 14, 2010

### logarithmic

You've almost got it.

There's a small error though.

Xbar = (1/m)(X1 + ... + Xm), so E(Xbar) = (1/m)[E(X1) + ... + E(Xm)]. This is because of linearity: E(aX + bY) = aE(X) + bE(Y).

Then the line "Suppose that the Xi's constitute a random sample from a distribution with mean mu_1" says the sample is identically distributed.

Therefore E(X1) = E(X2) = ... = E(Xm). So E(Xbar) = (1/m)[m * E(X1)], and E(X1) = mu_1 (in fact the expected value of any of the X's is mu_1 because they have the same distribution).

Use the same argument for the Y's, and you get the result.
That's probably the most hardcore (but best) "introductory" stats textbook in print.

8. Sep 14, 2010

### exitwound

To be completely honest, I don't even know what the 'result' is. I'm lost on what we're actually trying to achieve. Proving that I'm using an unbiased estimator is given in the text I have and above as E(W) = 0, but I don't really understand what that means other than the results from the sample will have the same mean as the population itself.

9. Sep 14, 2010

### Staff: Mentor

E(W) = 0 would be true only if the population means (mu1 and mu2) of the two populations were equal, and this is not given in the problem.

I would advise you to spend more time on learning the definitions. If there is something that you don't understand about the definition, ask your instructor. If you do that, go in with specific questions about what you are having problems with, not vague statements such as "I don't get it" or "I'm lost" etc.

10. Sep 14, 2010

### exitwound

The instructor is Chinese, reads lectures that her former partner wrote (who was Greek), and does examples straight out of the book with no variation. There's absolutely nothing she can offer in English that would help...which is why I asked for pictures or graphs or something other than math lingo. I'm not a math major and can't think in abstracts. I seriously don't understand how the bias of an estimator relates to the estimates being done. Thanks for shoving me aside though. It really helps.

11. Sep 14, 2010

### Staff: Mentor

OK, so she's a lousy teacher.
How do you know that? Have you gone to her office during her office hour (I assume she has regular office hours) to ask for clarification on some questions you have?
Are you assuming that only math types can think in abstracts? Regarding pictures and graphs, the higher you go in mathematics, the less likely you are to see pictures and graphs. This is why I said to focus on definitions, since they are crucial in mathematics.
I certainly appreciate your gratitude for the time I spent posting four responses to your question.

12. Sep 14, 2010

### logarithmic

Definition: The function $$g$$ is an unbiased estimator of $$\theta$$ if $$E(g)=\theta$$.

What this means intuitively is that the estimator is, on average, equal to the true value of what it's trying to estimate. Being unbiased is just one property (amongst many others) that a good estimator should have. This intuitive meaning has no place in a mathematical proof like the one this question asks for, but it's good to know so you have a feeling for what's going on.

An estimator is *any* function of the observed values (that's the definition). So we could say that $$4+\frac{1}{m}\sum_{i=1}^mX_i$$ is an estimator for the mean. But this would be a terrible estimator, because it is biased in the sense that it will on average overestimate the mean by 4. So we would like an unbiased estimator instead.
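Here's a quick numerical sketch of that bias in Python. The true mean of 10, the spread, and the sample size are arbitrary choices for illustration:

```python
import random

random.seed(1)
true_mean = 10.0
m = 25  # arbitrary sample size for the demo

def plain_mean(xs):
    return sum(xs) / len(xs)

def shifted_mean(xs):
    # the deliberately biased estimator from above: 4 + sample mean
    return 4 + plain_mean(xs)

plain_avgs, shifted_avgs = [], []
for _ in range(20000):
    sample = [random.gauss(true_mean, 3.0) for _ in range(m)]
    plain_avgs.append(plain_mean(sample))
    shifted_avgs.append(shifted_mean(sample))

print(sum(plain_avgs) / len(plain_avgs))      # close to 10: unbiased
print(sum(shifted_avgs) / len(shifted_avgs))  # close to 14: overestimates by 4
```

Both estimators bounce around from sample to sample; the difference is that only the plain sample mean is centered on the true value.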

You want to prove that the function $$\bar{X}-\bar{Y}=\frac{1}{m}\sum_{i=1}^mX_i - \frac{1}{n}\sum_{i=1}^nY_i$$ is an unbiased estimator for $$\mu_1-\mu_2$$.

Want to show: $$E(\bar{X}-\bar{Y}) = \mu_1-\mu_2$$

Proof:
$$E(\bar{X}-\bar{Y}) = E(\bar{X})-E(\bar{Y})$$ because E(aX+bY) = aE(X) + bE(Y)

$$=E(\frac{1}{m}\sum_{i=1}^mX_i)-E(\frac{1}{n}\sum_{i=1}^nY_i)$$ (substituting Xbar and Ybar in)

$$=\frac{1}{m}\sum_{i=1}^mE(X_i)-\frac{1}{n}\sum_{i=1}^nE(Y_i)$$ using E(aX+bY) = aE(X) + bE(Y) again

$$=\frac{1}{m}\sum_{i=1}^m\mu_1-\frac{1}{n}\sum_{i=1}^n\mu_2$$ because all the X's are identically distributed (as explained in my last post), so they all have the same expected value: namely mu_1, same with the Y's.

$$=\frac{1}{m}(m\mu_1)-\frac{1}{n}(n\mu_2)$$ (now you're just summing constants)

$$= \mu_1 - \mu_2$$ which is what we wanted to show.
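For the data in the original post, the observed value of this statistic is easy to compute. Keep in mind this is a single point estimate of mu_1 - mu_2, not the parameter itself:

```python
males = [117, 103, 121, 112, 120, 132, 113, 117, 132,
         149, 125, 131, 136, 107, 108, 113, 136, 114]
females = [114, 102, 113, 131, 124, 117, 120, 90,
           114, 109, 102, 114, 127, 127, 103]

xbar = sum(males) / len(males)      # sample mean of the m = 18 male IQs
ybar = sum(females) / len(females)  # sample mean of the n = 15 female IQs
print(xbar - ybar)                  # point estimate of mu_1 - mu_2, about 7.64
```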

Last edited: Sep 15, 2010