If \bar{X}_1 and \bar{X}_2 are the means of independent random samples of sizes n_1 and n_2 from a normal population with mean \mu and variance \sigma^2, show that the variance of the unbiased estimator \omega\bar{X}_1 + (1-\omega)\bar{X}_2 is a minimum when \omega = n_1/(n_1 + n_2).
My professor gave a hint to find Var(\omega\bar{X}_1 + (1-\omega)\bar{X}_2) and then to minimize with respect to \omega using the first and second derivatives.
I understand that the coefficients get squared because of independence, but I'm not sure where to begin finding the variances of the sample means themselves. I also feel like I should be bringing in the Cramér–Rao inequality to do the minimizing. Maybe I'm reading too deeply into what is being asked?
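For what it's worth, here is a sketch of the calculation the hint seems to point at, assuming (as the problem states) that both samples come from the same N(\mu, \sigma^2) population, so that Var(\bar{X}_i) = \sigma^2/n_i:

Var(\omega\bar{X}_1 + (1-\omega)\bar{X}_2) = \omega^2 \frac{\sigma^2}{n_1} + (1-\omega)^2 \frac{\sigma^2}{n_2}

\frac{d}{d\omega}\left[\omega^2 \frac{\sigma^2}{n_1} + (1-\omega)^2 \frac{\sigma^2}{n_2}\right] = \frac{2\omega\sigma^2}{n_1} - \frac{2(1-\omega)\sigma^2}{n_2} = 0 \;\Longrightarrow\; \omega = \frac{n_1}{n_1 + n_2}

\frac{d^2}{d\omega^2}\left[\omega^2 \frac{\sigma^2}{n_1} + (1-\omega)^2 \frac{\sigma^2}{n_2}\right] = \frac{2\sigma^2}{n_1} + \frac{2\sigma^2}{n_2} > 0,

so the critical point is indeed a minimum; the calculus argument alone appears to be all that is needed, without invoking the Cramér–Rao inequality.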