dvvv
Homework Statement
Let [itex]\bar{X}_1[/itex] and [itex]\bar{X}_2[/itex] be the means of two independent samples of sizes n and 2n from an infinite population that has mean μ and variance [itex]\sigma^2 > 0[/itex]. For what value of w is [itex]w\bar{X}_1 + (1 - w)\bar{X}_2[/itex] the minimum variance unbiased estimator of μ?
(a) 0
(b) 1/3
(c) 1/2
(d) 2/3
(e) 1
Homework Equations
If [itex]\tilde{\theta}[/itex] is unbiased for θ and
[itex]\mathrm{Var}(\tilde{\theta}) = \frac{1}{E\left[\left(\frac{d}{d\theta}\log_e f(x)\right)^2\right]} = \frac{1}{E\left[\left(\frac{d\,l(\theta)}{d\theta}\right)^2\right]}[/itex]
then [itex]\tilde{\theta}[/itex] is a minimum variance unbiased estimator of θ.
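As a numerical sanity check of the information term in this bound (a rough sketch only, assuming a normal population with mean μ and known variance σ^2; the values of μ and σ below are arbitrary test choices): for the N(μ, σ^2) density, the score with respect to μ is (x − μ)/σ^2, so the expectation in the denominator should come out to 1/σ^2.

```python
import numpy as np

# Sanity check of E[(d log_e f(x)/dθ)^2] for a normal population with
# θ = μ and known σ; all parameter values are arbitrary test choices.
mu, sigma = 5.0, 2.0
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=500_000)

# Score with respect to mu: d/dmu log_e f(x) = (x - mu) / sigma^2
score = (x - mu) / sigma**2
info = np.mean(score**2)

print(info)  # should be close to 1/sigma^2 = 0.25
```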
The Attempt at a Solution
[itex]E[w\bar{X}_1 + (1 - w)\bar{X}_2] = w\mu + (1 - w)\mu = \mu[/itex]
So it's an unbiased estimator of μ.
I tried calculating the variance, but I think it's wrong:
[itex]\mathrm{Var}[w\bar{X}_1 + \bar{X}_2 - w\bar{X}_2] = \frac{w^2\sigma^2}{n} + \frac{\sigma^2}{n} + \frac{w^2\sigma^2}{n} = \frac{\sigma^2}{n}(2w^2 + 1)[/itex]
I think I have to use the formula above but I don't know how.
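For what it's worth, a candidate variance expression can be sanity-checked numerically before plugging anything into the bound (a sketch only; the parameter values are arbitrary, and the normal population is just a stand-in — the true variance doesn't depend on the distribution's shape):

```python
import numpy as np

# Empirical variance of w*X1bar + (1 - w)*X2bar, to compare against
# any closed-form candidate; all parameter values are arbitrary.
mu, sigma, n, w = 0.0, 2.0, 100, 0.5
rng = np.random.default_rng(1)
reps = 50_000

x1bar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
x2bar = rng.normal(mu, sigma, size=(reps, 2 * n)).mean(axis=1)
est = w * x1bar + (1 - w) * x2bar

emp_var = est.var()
candidate = sigma**2 / n * (2 * w**2 + 1)  # the expression derived above
print(emp_var, candidate)
```

If the empirical value disagrees with the closed-form candidate, that's a sign the expansion dropped a term — note that [itex]\bar{X}_2[/itex] and [itex]-w\bar{X}_2[/itex] are not independent, so their variances don't simply add, and the second sample has size 2n rather than n.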
Thanks.