1. The problem statement, all variables and given/known data

Let [itex]\bar{X}_1[/itex] and [itex]\bar{X}_2[/itex] be the means of two independent samples of sizes n and 2n from an infinite population that has mean μ and variance σ^2 > 0. For what value of w is w[itex]\bar{X}_1[/itex] + (1 - w)[itex]\bar{X}_2[/itex] the minimum variance unbiased estimator of μ?

(a) 0

(b) 1/3

(c) 1/2

(d) 2/3

(e) 1

2. Relevant equations

If [itex]\tilde{\theta}[/itex] is unbiased for θ and its variance attains the Cramér–Rao lower bound,

Var([itex]\tilde{\theta}[/itex]) = 1/E[(d log_e f(x; θ)/dθ)^2] = 1/E[(dl(θ)/dθ)^2],

then [itex]\tilde{\theta}[/itex] is a minimum variance unbiased estimator of θ.
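The denominator of that bound is the Fisher information E[(d log f/dθ)^2]. As a numerical sketch (assuming a normal population N(μ, σ^2); the values μ = 2 and σ^2 = 4 are illustrative, not from the problem), the score for the mean is (x - μ)/σ^2, and its mean square equals 1/σ^2 per observation:

```python
import numpy as np

# Monte Carlo check that E[(d/dmu log f)^2] = 1/sigma^2 for N(mu, sigma^2).
# mu and sigma2 are illustrative values, not from the problem statement.
rng = np.random.default_rng(0)
mu, sigma2 = 2.0, 4.0
x = rng.normal(mu, np.sqrt(sigma2), size=1_000_000)

score = (x - mu) / sigma2          # d/dmu log f(x; mu) for the normal density
fisher_mc = np.mean(score**2)      # Monte Carlo estimate of E[(dl/dmu)^2]
print(fisher_mc, 1 / sigma2)       # the two numbers should nearly agree
```

Since information adds over independent observations, a sample of m observations gives a bound of σ^2/m on the variance of any unbiased estimator of μ.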

3. The attempt at a solution

E[w[itex]\bar{X}_1[/itex] + (1 - w)[itex]\bar{X}_2[/itex]] = wμ + (1 - w)μ = μ

So it's an unbiased estimator of μ.
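A quick simulation confirms the unbiasedness for any w (a sketch; the values of μ, σ, n, and w below are illustrative and not from the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, w = 5.0, 2.0, 20, 0.7   # illustrative values; any w works

reps = 100_000
# Sample means of independent samples of sizes n and 2n, repeated many times.
x1bar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
x2bar = rng.normal(mu, sigma, size=(reps, 2 * n)).mean(axis=1)

est = w * x1bar + (1 - w) * x2bar
print(est.mean())   # close to mu = 5.0, regardless of w
```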

I tried calculating the variance. My first attempt split (1 - w)[itex]\bar{X}_2[/itex] into [itex]\bar{X}_2[/itex] - w[itex]\bar{X}_2[/itex] and added three variance terms, but those two pieces are not independent, and [itex]\bar{X}_2[/itex] comes from a sample of size 2n, not n. Using the independence of the two samples instead:

Var[w[itex]\bar{X}_1[/itex] + (1 - w)[itex]\bar{X}_2[/itex]] = w^2·σ^2/n + (1 - w)^2·σ^2/(2n)

Setting the derivative with respect to w to zero gives 2w = 1 - w, so w = 1/3, answer (b). At w = 1/3 the variance is σ^2/(3n), which (in the normal case) is exactly the bound from the formula above applied to the combined sample of 3n observations, so this choice of w gives the minimum variance unbiased estimator.
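Since the two samples are independent with Var([itex]\bar{X}_1[/itex]) = σ^2/n and Var([itex]\bar{X}_2[/itex]) = σ^2/(2n), the combined variance is w^2·σ^2/n + (1 - w)^2·σ^2/(2n), and the minimizing w can be checked numerically (a sketch; σ^2 and n are arbitrary illustrative values, and the minimizer does not depend on them):

```python
import numpy as np

sigma2, n = 4.0, 10   # illustrative values; the optimal w is the same for any choice

def combined_var(w):
    # Var(w*X1bar + (1 - w)*X2bar) for independent samples of sizes n and 2n
    return w**2 * sigma2 / n + (1 - w)**2 * sigma2 / (2 * n)

w_grid = np.linspace(0.0, 1.0, 100_001)
w_star = w_grid[np.argmin(combined_var(w_grid))]
print(w_star)   # close to 1/3, i.e. answer (b)
```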

Thanks.

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Minimum variance unbiased estimator
