Final exam questions: estimators.

AI Thread Summary
The discussion focuses on two problems about finding unbiased estimators. In the first problem, participants determine the value of c for which the statistic c(y1 + 2y2) is an unbiased estimator for 1/theta, emphasizing that this means setting its expected value equal to 1/theta. The second problem requires showing that W = (3/(2n))ΣYi is an unbiased estimator for theta, with discussion of consistency and convergence criteria. Clarifications are sought regarding the conditions for consistency and how to construct an unbiased estimator based on Ymax. The conversation highlights the importance of understanding the definitions and calculations involved in estimating parameters accurately.
semidevil
I have a final exam on Monday, and I cannot figure out the material on estimators:

1) A random sample of size 2, y1, y2, is drawn from the pdf f(y; theta) = 2y·theta^2, 0 < y < 1/theta.

What must c equal if the statistic c(y1 + 2y2) is to be an unbiased estimator for 1/theta?

I really don't know how to approach anything that asks about estimators. I know that unbiasedness means E(theta-hat) = theta, but how do I work this problem?

2) Let y1, y2, ..., yn be a random sample of size n from the pdf f_Y(y; theta) = 2y/theta^2, 0 < y < theta.

Show that W = (3/(2n)) Σ Yi is an unbiased estimator for theta.
 
semidevil said:
I have a final exam on Monday, and I cannot figure out the material on estimators:

1) A random sample of size 2, y1, y2, is drawn from the pdf f(y; theta) = 2y·theta^2, 0 < y < 1/theta.

What must c equal if the statistic c(y1 + 2y2) is to be an unbiased estimator for 1/theta?

I really don't know how to approach anything that asks about estimators. I know that unbiasedness means E(theta-hat) = theta, but how do I work this problem?

2) Let y1, y2, ..., yn be a random sample of size n from the pdf f_Y(y; theta) = 2y/theta^2, 0 < y < theta.

Show that W = (3/(2n)) Σ Yi is an unbiased estimator for theta.
SOLUTION HINTS:
For both cases, an Unbiased Estimator \hat{\omega} of a distribution parameter \omega satisfies:

1: \ \ \ \ \ \ \mathbf{E}(\hat{\omega}) \ = \ \int \hat{\omega} \, f(y;\,\omega) \, dy \ = \ \omega \ \ \ \ \ \ \textsf{(Unbiased Estimator)}

where f(y; ω) is the probability density function.
Thus, the solutions involve evaluating the following (for Problems #1 and #2 above, respectively), where the given estimator appears on the left and the distribution parameter being estimated appears on the right. Complete the necessary steps, and solve for any required parameters:

2: \ \ \ \ \ \mathbf{E}\{c(y_{1} + 2y_{2})\} \ = \ c\left\{\mathbf{E}(y_{1}) + 2\mathbf{E}(y_{2})\right\} \ = \ 3c\,\mathbf{E}(y) \ = \ 3c \int_{0}^{1/\theta} y\,(2y\theta^{2})\,dy \ \ \stackrel{?}{=} \ \ 1/\theta \ \ \ \ \ \textsf{(Solve for c)}

3: \ \ \ \ \mathbf{E}\left(\frac{3}{2n}\sum_{i=1}^{n} y_{i}\right) \ = \ \frac{3}{2n}\sum_{i=1}^{n}\mathbf{E}(y_{i}) \ = \ \frac{3}{2n}\{n\,\mathbf{E}(y)\} \ = \ \frac{3}{2}\,\mathbf{E}(y) \ = \ \frac{3}{2}\int_{0}^{\theta} y\left(\frac{2y}{\theta^{2}}\right) dy \ \ \stackrel{?}{=} \ \ \theta
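
For reference, the two expectations evaluate as follows (a short sketch of the remaining calculus; worth verifying by hand):

\textsf{Problem 1:} \ \ \ \mathbf{E}(y) \ = \ \int_{0}^{1/\theta} 2y^{2}\theta^{2}\,dy \ = \ \frac{2\theta^{2}}{3}\cdot\frac{1}{\theta^{3}} \ = \ \frac{2}{3\theta}

\textsf{Problem 2:} \ \ \ \mathbf{E}(y) \ = \ \int_{0}^{\theta}\frac{2y^{2}}{\theta^{2}}\,dy \ = \ \frac{2}{\theta^{2}}\cdot\frac{\theta^{3}}{3} \ = \ \frac{2\theta}{3}, \ \ \ \textsf{so} \ \ \mathbf{E}(W) \ = \ \frac{3}{2}\cdot\frac{2\theta}{3} \ = \ \theta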


~~
 
xanthym said:
SOLUTION HINTS:
For both cases, an Unbiased Estimator \hat{\omega} of a distribution parameter \omega satisfies:

1: \ \ \ \ \ \ \mathbf{E}(\hat{\omega}) \ = \ \int \hat{\omega} \, f(y;\,\omega) \, dy \ = \ \omega \ \ \ \ \ \ \textsf{(Unbiased Estimator)}

where f(y; ω) is the probability density function.
Thus, the solutions involve evaluating the following (for Problems #1 and #2 above, respectively), where the given estimator appears on the left and the distribution parameter being estimated appears on the right. Complete the necessary steps.

2: \ \ \ \ \ \mathbf{E}\{c(y_{1} + 2y_{2})\} \ = \ c\left\{\mathbf{E}(y_{1}) + 2\mathbf{E}(y_{2})\right\} \ = \ 3c\,\mathbf{E}(y) \ = \ 3c \int_{0}^{1/\theta} y\,(2y\theta^{2})\,dy \ \ \stackrel{?}{=} \ \ 1/\theta \ \ \ \ \ \textsf{(Solve for c)}

3: \ \ \ \ \mathbf{E}\left(\frac{3}{2n}\sum_{i=1}^{n} y_{i}\right) \ = \ \frac{3}{2n}\sum_{i=1}^{n}\mathbf{E}(y_{i}) \ = \ \frac{3}{2n}\{n\,\mathbf{E}(y)\} \ = \ \frac{3}{2}\,\mathbf{E}(y) \ = \ \frac{3}{2}\int_{0}^{\theta} y\left(\frac{2y}{\theta^{2}}\right) dy \ \ \stackrel{?}{=} \ \ \theta


~~

Thank you, this helps very much; I was able to understand and solve it better. I have a question, though: for the first problem, when solving for c, what do I set the equation equal to? Since it is a probability, do I set it equal to 1?

And I do have a couple more estimator questions, if you don't mind. For the second problem, is it consistent? By definition it is consistent if it converges to the parameter, but I don't see how to prove it. Also, how do I find an unbiased estimator based on Ymax?
 
semidevil said:
Thank you, this helps very much; I was able to understand and solve it better. I have a question, though: for the first problem, when solving for c, what do I set the equation equal to? Since it is a probability, do I set it equal to 1?

And I do have a couple more estimator questions, if you don't mind. For the second problem, is it consistent? By definition it is consistent if it converges to the parameter, but I don't see how to prove it. Also, how do I find an unbiased estimator based on Ymax?
For Problem #1, solve for the value of c that makes the estimator unbiased; in this case, that means setting the Eq #2 integral expression equal to 1/θ. (It is not set equal to 1: the pdf itself integrates to 1, but here it is the expected value of the estimator that must equal the target, 1/θ.) See the Problem #1 statement for other info.
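
Concretely, using the value of E(y) computed in the hints above (a sketch):

\ \ \ \ 3c\,\mathbf{E}(y) \ = \ 3c\cdot\frac{2}{3\theta} \ = \ \frac{2c}{\theta} \ = \ \frac{1}{\theta} \ \ \Longrightarrow \ \ c \ = \ \frac{1}{2}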

An estimator \hat{\omega} of a distribution parameter \omega is Consistent (in mean square, which implies convergence in probability to \omega) if two conditions are satisfied:

4: \ \ \ \ \textsf{Condition #1:} \ \ \ \ \ \lim_{n \longrightarrow \infty} \textbf{E}(\hat{\omega}) \, \, = \, \, \omega

5: \ \ \ \ \textsf{Condition #2:} \ \ \ \ \ \lim_{n \longrightarrow \infty} var(\hat{\omega}) \, \, = \, \, 0

where "n" is Sample Size. Regarding Problem #2, Condition #1 above is true since the estimator is unbiased for all "n". For Condition #2, compute the estimator variance (using techniques similar to those shown in Msg #2) with the following:

6: \ \ \ \ var(\hat{\omega}) \, \ = \, \ \textbf{E}(\hat{\omega}^{2}) \, - \, \textbf{E}^{2}(\hat{\omega})
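
For Problem #2, this computation runs as follows (a sketch, using independence of the y_i; verify each step):

\mathbf{E}(y^{2}) \ = \ \int_{0}^{\theta} y^{2}\left(\frac{2y}{\theta^{2}}\right) dy \ = \ \frac{\theta^{2}}{2}, \ \ \ \ \ var(y) \ = \ \frac{\theta^{2}}{2} - \left(\frac{2\theta}{3}\right)^{2} \ = \ \frac{\theta^{2}}{18}

var(W) \ = \ \left(\frac{3}{2n}\right)^{2}\sum_{i=1}^{n} var(y_{i}) \ = \ \frac{9}{4n^{2}}\cdot n\cdot\frac{\theta^{2}}{18} \ = \ \frac{\theta^{2}}{8n} \ \longrightarrow \ 0 \ \ \textsf{as} \ \ n \longrightarrow \infty

so Condition #2 holds and W is a consistent estimator of θ.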


~~
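
If you want a numerical sanity check of both results, here is a minimal Monte Carlo sketch in Python (not part of the coursework; the theta values and sample sizes are arbitrary choices, and each Y is drawn by inverting its CDF):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 200_000

# Problem 1: f(y; theta) = 2*y*theta^2 on (0, 1/theta).
# CDF F(y) = (y*theta)^2, so Y = sqrt(U)/theta with U ~ Uniform(0, 1).
theta = 0.5
y1 = np.sqrt(rng.uniform(size=trials)) / theta
y2 = np.sqrt(rng.uniform(size=trials)) / theta
c = 0.5
print(np.mean(c * (y1 + 2 * y2)), 1 / theta)        # both approximately 2.0

# Problem 2: f(y; theta) = 2*y/theta^2 on (0, theta).
# CDF F(y) = (y/theta)^2, so Y = theta*sqrt(U).
theta, n = 3.0, 10
y = theta * np.sqrt(rng.uniform(size=(trials, n)))
w = (3 / (2 * n)) * y.sum(axis=1)
print(w.mean(), theta)                # mean(W) approximately theta (unbiased)
print(w.var(), theta**2 / (8 * n))   # var(W) matches theta^2/(8n) -> 0
```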
 