# Mean Squared Error of an estimator.

1. Oct 24, 2012

### Ddvon

Hi.

Let X_1 and X_2 be independent random variables with
mean  μ and variance σ^2.

$\Theta = (X_1 + 3X_2)/4$

a) is it unbiased?

b) what is the variance of the estimator?

c) what is the mean squared error of the estimator?

Since the weights 1 and 3 sum to 4 and we divide by 4, E[$\Theta$] = (μ + 3μ)/4 = μ, so it is unbiased.

Then the variance is $\mathrm{Var}(\Theta) = E\left[\left(\frac{X_1 + 3X_2}{4} - \mu\right)^2\right]$

and while expanding this, I got stuck when it was time to "get the stuff out of E" (pull the constants out of the expectation).

Can anyone help me with this? I have been searching for many hours (the book has only a one-paragraph explanation), but to no avail.

Thank you

2. Oct 24, 2012

### mathman

I am a little confused by your question. I get E(Θ) = μ and Var(Θ) = Var(X_1/4) + Var(3X_2/4) = σ^2/16 + 9σ^2/16 = 5σ^2/8.

What is the difference between mean square error and variance - I thought they were the same by definition.
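A short Monte Carlo sketch bears out both numbers. The normal distribution and the values μ = 2, σ = 3 are arbitrary choices for illustration; any distribution with that mean and variance would do:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0           # hypothetical true mean and standard deviation
n_trials = 200_000

# Independent X_1, X_2 with mean mu and variance sigma^2
x1 = rng.normal(mu, sigma, n_trials)
x2 = rng.normal(mu, sigma, n_trials)

theta_hat = (x1 + 3 * x2) / 4  # the estimator from the problem

print(theta_hat.mean())        # close to mu = 2.0, i.e. unbiased
print(theta_hat.var())         # close to 5*sigma^2/8 = 5.625
```

The variance adds because X_1 and X_2 are independent, and the constants come out squared: Var(aX) = a^2 Var(X).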

3. Oct 24, 2012

### Ddvon

I thought that

Mean squared error (MSE)

MSE($\Theta$) = E[($\Theta$ - θ)^2]

so

MSE($\Theta$) = Var($\Theta$) + bias^2

isn't it?
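Expanding the square by adding and subtracting E[$\Theta$] inside it (the cross term vanishes) gives exactly that decomposition:

$$\mathrm{MSE}(\Theta) = E\big[(\Theta - \theta)^2\big] = \mathrm{Var}(\Theta) + \big(E[\Theta] - \theta\big)^2$$

For this problem, with θ = μ and E[$\Theta$] = μ, the bias term is zero, so the MSE equals the variance, $5\sigma^2/8$.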

4. Oct 25, 2012

### mathman

θ, Θ - could you define these symbols precisely? What is the definition of bias?

5. Oct 25, 2012

### chiro

Bias has a standard definition: bias = E[$\hat{\theta}$] - θ, where $\hat{\theta}$ is an estimator (based on a random sample) for θ.
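To make the definition concrete, here is a small sketch using a deliberately biased variant of the thread's estimator. The /5 divisor and the parameter values μ = 2, σ = 3 are hypothetical, chosen only to produce a nonzero bias:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 2.0, 3.0            # hypothetical true mean and standard deviation
n = 200_000

x1 = rng.normal(mu, sigma, n)
x2 = rng.normal(mu, sigma, n)

# theta_hat = (X_1 + 3*X_2)/5 has E[theta_hat] = 4*mu/5,
# so bias = E[theta_hat] - mu = -mu/5
theta_hat = (x1 + 3 * x2) / 5
bias_estimate = theta_hat.mean() - mu
print(bias_estimate)            # close to -mu/5 = -0.4
```

The original estimator $(X_1 + 3X_2)/4$ has bias zero by the same calculation, since its coefficients sum to the divisor.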

6. Oct 25, 2012

### chiro

That's correct.

7. Oct 25, 2012

### chiro

Hey Ddvon and welcome to the forums.

What parameter are you trying to estimate? Is it the mean or variance of the distribution? Something else perhaps?

8. Oct 26, 2012

### mathman

It would be more helpful if you described what you are driving at. I think (but I am not sure) you are computing a statistical average and using it to estimate the mean.

In the problem you are posing, what is the average and what is the random variable? I have trouble distinguishing. Θ = ( X_1 + 3X_2 ) /4 is the only thing defined.