Final exam questions: estimators.

In summary: For Ymax, find the CDF, then differentiate to find the PDF. Use the PDF to find the mean (1st moment) and the 2nd moment, which together give the variance. Use the 1st moment to find an unbiased estimator based on Ymax (using the techniques from the Msg #2 hints), then use the 2nd moment to compute the variance of that estimator. (Note: "n" is still the sample size.)
  • #1
semidevil
I have a final exam on Monday and cannot figure out the material on estimators:

1) A random sample of size 2, y1, y2, is drawn from the pdf f(y; theta) = 2y*theta^2, 1 < y < 1/theta.

What must c equal if the statistic c(y1 + 2y2) is to be an unbiased estimator for 1/theta?

I really don't know how to approach anything that asks about estimators. I know that unbiasedness means the estimator's expected value equals theta, but how do I work this problem?

2) Let y1, y2, ..., yn be a random sample of size n from the pdf f(y; theta) = 2y/theta^2, 0 < y < theta.

Show that W = (3/(2n)) * sum(Yi) is an unbiased estimator of theta.
 
  • #2
semidevil said:
1) A random sample of size 2, y1, y2, is drawn from the pdf f(y; theta) = 2y*theta^2, 1 < y < 1/theta. What must c equal if the statistic c(y1 + 2y2) is to be an unbiased estimator for 1/theta?

2) Let y1, y2, ..., yn be a random sample of size n from the pdf f(y; theta) = 2y/theta^2, 0 < y < theta. Show that W = (3/(2n)) * sum(Yi) is an unbiased estimator of theta.
SOLUTION HINTS:
For both cases, an Unbiased Estimator [tex] \hat{\omega} [/tex] of distribution parameter [itex] \omega [/itex] satisfies:

[tex] 1: \ \ \ \ \ \ \mathbf{E}(\hat{\omega}) \, \ = \, \ \int \hat{\omega} \, f(y; \, \omega) \, dy \, \ = \, \ \omega \ \ \ \ \ \ \mbox{(Unbiased Estimator)}[/tex]

where f(y; ω) is the Probability Density Function.
Thus, the problem solutions involve evaluating the following expressions (for Problems #1 & #2 above, respectively), where the given estimator is shown in blue on the left and the distribution parameter being estimated in red on the right. Complete the necessary steps, and solve for any required parameters:

[tex] 2: \ \ \ \ \ \mathbf{E}\{\color{blue}c(y_{1} \ + \ 2y_{2})\color{black}\} \ \ = \ \ c \left \{ \mathbf{E}(y_{1}) + 2\mathbf{E}(y_{2}) \right \} \ \ = \ \ 3c\mathbf{E}(y) \ \ = \ \ 3c\int_{\displaystyle 1}^{\displaystyle 1/\theta} y \, (2y\theta^{2}) \, dy \ \ \color{red} \ \ \mathbf{??=??} \ \ \ \ 1/\theta \ \ \ \ \ \textsf{(Solve for c)}[/tex]

[tex] 3: \ \ \ \ \mathbf{E}\left(\color{blue} \frac{3}{2n}\sum_{i\,=\,1}^{n}y_{i} \color{black} \right) \ \ = \ \ \frac{3}{2n} \sum_{i\,=\,1}^{n} \mathbf{E}(y_{i}) \ \ = \ \ \frac{3}{2n} \{n\mathbf{E}(y)\} \ \ = \ \ \frac{3}{2}\,\mathbf{E}(y) \ \ = \ \ \frac{3}{2} \int_{\displaystyle 0}^{\displaystyle \theta} y \left(\frac {2y} {\theta^{2}}\right) \, dy \color{red} \ \ \ \ \mathbf{??=??} \ \ \ \ \theta [/tex]
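A quick numerical sanity check of Eq #3 (this sketch is my own addition, not part of the course problem): sample from f(y; θ) = 2y/θ² on (0, θ) by inverse-CDF sampling (F(y) = y²/θ², so Y = θ√U with U uniform), compute W = (3/(2n)) Σ Yi over many trials, and confirm the average of W lands near θ. The values θ = 2 and n = 50 are arbitrary choices.

```python
import math
import random

def sample_y(theta, rng):
    # Inverse-CDF sampling: F(y) = y^2 / theta^2 on (0, theta),
    # so Y = theta * sqrt(U) with U ~ Uniform(0, 1).
    return theta * math.sqrt(rng.random())

def w_estimator(sample):
    # The Problem #2 statistic: W = (3 / (2n)) * sum of the Y_i.
    n = len(sample)
    return 3.0 / (2.0 * n) * sum(sample)

rng = random.Random(0)
theta = 2.0   # arbitrary true parameter value
n = 50        # sample size per trial
trials = 20000
mean_w = sum(w_estimator([sample_y(theta, rng) for _ in range(n)])
             for _ in range(trials)) / trials
print(mean_w)  # should land close to theta = 2.0
```

Since E(Y) = 2θ/3 for this density, E(W) = (3/2)(2θ/3) = θ exactly; the simulation just makes the algebra visible.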


~~
 
  • #3
xanthym said:
SOLUTION HINTS:
For both cases, an Unbiased Estimator satisfies E(ω̂) = ω (Eq #1), and the problem solutions follow from evaluating Eqs #2 & #3 above.

Thank you, this helps very much. I was able to understand and solve it better. I have a question, though: for the first problem, when solving for c, what do I set the equation equal to? Since it is a probability, do I set it equal to 1?

And I do have a couple more estimator questions, if you don't mind. On the second problem, is it consistent? From the definition, it is consistent if the probability converges to 1, but I don't see how to prove it. Also, how do I find an unbiased estimator based on Ymax?
 
  • #4
semidevil said:
for the first problem, on solving for c, what do I set the equation equal to? [...] on the second problem, is it consistent? [...] how do I find an unbiased estimator on Ymax?
For Problem #1, solve for the value of "c" that makes the estimator unbiased: set the Eq #2 expression equal to 1/θ, not 1. The pdf already integrates to 1; unbiasedness requires the estimator's expectation to equal the parameter being estimated, which here is 1/θ. See the Problem #1 statement for other info.

An estimator [tex]\hat{\omega}[/tex] of distribution parameter [itex]\omega[/itex] is Consistent if two conditions are satisfied:

[tex] 4: \ \ \ \ \textsf{Condition #1:} \ \ \ \ \ \lim_{n \longrightarrow \infty} \textbf{E}(\hat{\omega}) \, \, = \, \, \omega [/tex]

[tex] 5: \ \ \ \ \textsf{Condition #2:} \ \ \ \ \ \lim_{n \longrightarrow \infty} var(\hat{\omega}) \, \, = \, \, 0 [/tex]

where "n" is Sample Size. Regarding Problem #2, Condition #1 above is true since the estimator is unbiased for all "n". For Condition #2, compute the estimator variance (using techniques similar to those shown in Msg #2) with the following:

[tex] 6: \ \ \ \ var(\hat{\omega}) \, \ = \, \ \textbf{E}(\hat{\omega}^{2}) \, - \, \textbf{E}^{2}(\hat{\omega}) [/tex]
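For the Problem #2 estimator this can be carried through in closed form (my own worked sketch; the θ and n values below are arbitrary): E(Y) = 2θ/3 and E(Y²) = θ²/2 give var(Y) = θ²/18 by Eq #6, and independence of the Yi then gives var(W) = (3/(2n))² · n · var(Y) = θ²/(8n), which tends to 0 as n → ∞, so Condition #2 holds.

```python
def var_w(theta, n):
    # Closed-form variance of W = (3/(2n)) * sum(Y_i) for f(y; theta) = 2y/theta^2:
    # var(Y) = E(Y^2) - E(Y)^2 = theta^2/2 - (2*theta/3)^2 = theta^2/18,
    # and independence gives var(W) = (3/(2n))^2 * n * var(Y) = theta^2 / (8n).
    return theta ** 2 / (8.0 * n)

theta = 2.0  # arbitrary parameter value
for n in (10, 100, 1000, 10000):
    print(n, var_w(theta, n))  # shrinks like 1/n, so Condition #2 is satisfied
```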


~~
 

1. What is an estimator?

An estimator is a statistical tool used to estimate a population parameter based on a sample of data. It provides an estimate of the true value of the parameter that is usually unknown.

2. How is an estimator different from an estimate?

An estimator is a mathematical formula or method used to calculate an estimate, while an estimate is the actual numerical value that is obtained from the estimator. In other words, the estimator is the tool used to calculate the estimate.

3. What are the qualities of a good estimator?

A good estimator should have low bias, meaning its estimates are close on average to the true value; low variance, meaning its estimates are stable from sample to sample; and high efficiency, meaning it produces the most accurate estimates compared to other estimators.
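As a concrete illustration of the bias idea (an assumed example, not tied to the thread's problems): the sample variance computed with divisor n is biased low by the factor (n-1)/n, while divisor n-1 makes it unbiased. A short simulation with Uniform(0, √12) data (chosen so the true variance is 1) shows the gap.

```python
import random

def sample_variances(data):
    # Two estimators of the population variance from one sample:
    # dividing by n is biased low; dividing by n - 1 is unbiased.
    n = len(data)
    mean = sum(data) / n
    ss = sum((x - mean) ** 2 for x in data)
    return ss / n, ss / (n - 1)

rng = random.Random(1)
a = 12 ** 0.5     # Uniform(0, a) has variance a^2 / 12 = 1
n = 5             # small samples make the bias visible
trials = 40000
biased_avg = 0.0
unbiased_avg = 0.0
for _ in range(trials):
    data = [a * rng.random() for _ in range(n)]
    b, u = sample_variances(data)
    biased_avg += b / trials
    unbiased_avg += u / trials
print(biased_avg, unbiased_avg)  # roughly 0.8 and 1.0 for n = 5
```

With n = 5 the biased average sits near (n-1)/n = 0.8 while the unbiased one sits near the true variance 1.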

4. What is the purpose of using estimators in statistical analysis?

Estimators are used to make inferences about a population based on a sample of data. They help us understand and draw conclusions about the true values of population parameters, which are usually unknown.

5. How do you choose the best estimator for a given situation?

The best estimator for a given situation depends on the specific characteristics of the data and the population being studied. Generally, a good estimator should have low bias and variance, and high efficiency. It is also important to consider the assumptions and limitations of each estimator before choosing the most appropriate one.
