# Final exam questions: estimators.

1. May 7, 2005

### semidevil

have a final exam on monday, and cannot figure out the stuff on estimators:

1) A random sample of size 2, Y1, Y2, is drawn from the pdf f(y; theta) = 2y(theta^2), 0 < y < 1/theta.

What must c equal if the statistic c(Y1 + 2Y2) is to be an unbiased estimator for 1/theta?

I really don't know how to approach anything that asks about estimators. I know that unbiasedness means E(theta-hat) = theta, but how do I work this problem?

2) Let Y1, Y2, ..., Yn be a random sample of size n from the pdf fY(y; theta) = 2y/theta^2, 0 < y < theta.

Show that W = (3/(2n)) * sum(Yi) is an unbiased estimator of theta.

2. May 7, 2005

### xanthym

SOLUTION HINTS:
For both cases, an Unbiased Estimator $$\hat{\omega}$$ of distribution parameter $\omega$ satisfies:

$$1: \ \ \ \ \ \ \mathbf{E}(\hat{\omega}) \, \ = \, \ \int \hat{\omega} \, f(y; \, \omega) \, dy \, \ = \, \ \omega \ \ \ \ \ \ \mbox{(Unbiased Estimator)}$$

where f(y; ω) is the Probability Density Function.
Thus, each solution involves evaluating the following (for Problems #1 and #2 above, respectively), where the given estimator is shown in blue on the left and the distribution parameter being estimated in red on the right. Complete the necessary steps, and solve for any required parameters:

$$2: \ \ \ \ \ \mathbf{E}\{\color{blue}c(y_{1} \ + \ 2y_{2})\color{black}\} \ \ = \ \ c \left \{ \mathbf{E}(y_{1}) + 2\mathbf{E}(y_{2}) \right \} \ \ = \ \ 3c\mathbf{E}(y) \ \ = \ \ 3c\int_{\displaystyle 0}^{\displaystyle 1/\theta} y \, (2y\theta^{2}) \, dy \ \ \color{red} \ \ \mathbf{??=??} \ \ \ \ 1/\theta \ \ \ \ \ \textsf{(Solve for c)}$$

$$3: \ \ \ \ \mathbf{E}\left(\color{blue} \frac{3}{2n}\sum_{i\,=\,1}^{n}y_{i} \color{black} \right) \ \ = \ \ \frac{3}{2n} \sum_{i\,=\,1}^{n} \mathbf{E}(y_{i}) \ \ = \ \ \frac{3}{2n} \{n\mathbf{E}(y)\} \ \ = \ \ \frac{3}{2}\,\mathbf{E}(y) \ \ = \ \ \frac{3}{2} \int_{\displaystyle 0}^{\displaystyle \theta} y \left(\frac {2y} {\theta^{2}}\right) \, dy \color{red} \ \ \ \ \mathbf{??=??} \ \ \ \ \theta$$
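Not part of the hints above, but a quick Monte Carlo sanity check of both estimators (a sketch in Python; the inverse-CDF sampling formulas and the value c = 1/2 are my own derivation from Eqs. 2 and 3, so verify them by hand):

```python
import math
import random

def sample_p1(theta):
    # Problem 1 pdf: f(y) = 2*y*theta^2 on (0, 1/theta).
    # CDF is F(y) = theta^2 * y^2, so the inverse is F^{-1}(u) = sqrt(u)/theta.
    return math.sqrt(random.random()) / theta

def sample_p2(theta):
    # Problem 2 pdf: f(y) = 2*y/theta^2 on (0, theta).
    # CDF is F(y) = y^2/theta^2, so the inverse is F^{-1}(u) = theta*sqrt(u).
    return theta * math.sqrt(random.random())

random.seed(0)
theta = 0.5
trials = 200_000

# Problem 1: with c = 1/2 (from solving Eq. 2), E[c*(Y1 + 2*Y2)] should be 1/theta = 2.
c = 0.5
est1 = sum(c * (sample_p1(theta) + 2 * sample_p1(theta))
           for _ in range(trials)) / trials
print(est1)  # should land near 2

# Problem 2: the average of W = (3/(2n)) * sum(Yi) should land near theta = 0.5.
n = 10
est2 = sum(3 / (2 * n) * sum(sample_p2(theta) for _ in range(n))
           for _ in range(trials)) / trials
print(est2)  # should land near 0.5
```

If either printed average drifts away from the target as `trials` grows, the estimator (or my algebra) is biased.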


Last edited: May 7, 2005
3. May 7, 2005

### semidevil

Thank you, this helps very much; I was able to understand and solve it better. I have a question though. For the first problem, when solving for c, what do I set the equation equal to? Since it is a probability, do I set it equal to 1?

And I do have a couple more estimator questions, if you don't mind. On the second problem, is the estimator consistent? From the definition, it is consistent if the probability that it is close to theta converges to 1, but I don't see how to prove it. Also, how do I find an unbiased estimator based on Ymax?

4. May 8, 2005

### xanthym

For Problem #1, solve for the "c" that makes the estimator unbiased; in this case, that means setting the Eq #2 integral expression equal to (1/θ). See the Problem #1 statement for other info.

An estimator $$\hat{\omega}$$ of distribution parameter $\omega$ is Consistent if 2 conditions are satisfied:

$$4: \ \ \ \ \textsf{Condition #1:} \ \ \ \ \ \lim_{n \longrightarrow \infty} \textbf{E}(\hat{\omega}) \, \, = \, \, \omega$$

$$5: \ \ \ \ \textsf{Condition #2:} \ \ \ \ \ \lim_{n \longrightarrow \infty} var(\hat{\omega}) \, \, = \, \, 0$$

where "n" is Sample Size. Regarding Problem #2, Condition #1 above is true since the estimator is unbiased for all "n". For Condition #2, compute the estimator variance (using techniques similar to those shown in Msg #2) with the following:

$$6: \ \ \ \ var(\hat{\omega}) \, \ = \, \ \textbf{E}(\hat{\omega}^{2}) \, - \, \textbf{E}^{2}(\hat{\omega})$$
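For what it's worth, here is Eq #6 carried through for Problem #2's W (my own algebra using the Msg #2 techniques and E(y) = 2θ/3 from the Eq #3 integral, so double-check it): since the y_i are independent,

$$7: \ \ \ \ \mathbf{E}(y^{2}) \, = \, \int_{0}^{\theta} y^{2} \left(\frac{2y}{\theta^{2}}\right) dy \, = \, \frac{\theta^{2}}{2}, \ \ \ \ \ \ var(y) \, = \, \frac{\theta^{2}}{2} \, - \, \left(\frac{2\theta}{3}\right)^{2} \, = \, \frac{\theta^{2}}{18}$$

$$8: \ \ \ \ var\left(\frac{3}{2n}\sum_{i\,=\,1}^{n}y_{i}\right) \, = \, \frac{9}{4n^{2}} \, n \, var(y) \, = \, \frac{\theta^{2}}{8n} \, \longrightarrow \, 0 \ \ \ \textsf{as} \ \ n \longrightarrow \infty$$

so Condition #2 holds and W is consistent.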


Last edited: May 8, 2005