Statistics: How to prove the consistent estimator of theta?

In summary: for a uniform density on (0, 3θ], the thread asks for the method-of-moments estimator and the maximum likelihood estimator of θ, and for a proof that θ̂ = (1/3) max{X₁, X₂, ..., Xₙ} is a consistent estimator of θ. The discussion corrects two errors in the original attempt: the first-moment calculation, and the MLE, which is (1/3) max{X₁, ..., Xₙ} rather than (1/3) min{X₁, ..., Xₙ}.
  • #1
sanctifier

Homework Statement



If the probability density function (p.d.f.) of the random variable X is

[itex] f(x \mid \theta) = \begin{cases} \frac{1}{3\theta} & 0 < x \leq 3\theta \\ 0 & \text{otherwise} \end{cases} [/itex]

where [itex] \theta > 0 [/itex] is an unknown parameter and [itex] X_1, X_2, \dots, X_n [/itex] is a sample from [itex] X [/itex] with [itex] n > 2 [/itex].

Question 1: What is the moment estimator (M.E.) of [itex] \theta [/itex]?

Question 2: What is the maximum likelihood estimator (M.L.E.) of [itex] \theta [/itex]?

Question 3: Prove that [itex] \widehat{\theta} = \frac{1}{3} \max\{X_1, X_2, \dots, X_n\} [/itex] is a consistent estimator of [itex] \theta [/itex].

Homework Equations



Nothing special.

The Attempt at a Solution



Answer 1:

Moment generating function (m.g.f.) of X is

[itex] \psi(t) = E(e^{tX}) = \int_0^{3\theta} \frac{e^{tx}}{3\theta}\,dx = \frac{1}{3\theta t} \int_0^{3\theta} d\!\left(e^{tx}\right) = \frac{1}{3\theta t}\,e^{tx}\Big|_{x=0}^{x=3\theta} = \frac{1}{3\theta t}\left(e^{3\theta t} - 1\right) [/itex]

[itex] \begin{cases} \psi'(t) =e^{3 \theta t} \\
\psi''(t) =3 \theta e^{3 \theta t} \end{cases} [/itex]

[itex] \begin{cases} \psi'(0) =1 \\
\psi''(0) =3 \theta \end{cases} [/itex]

Hence, M.E. is

[itex] \widehat{\theta} = \frac{\psi''(0)}{3} = \frac{E(X^2)}{3} [/itex]

Answer 2:

Let [itex] X [/itex] be the vector with components [itex] X_1, X_2, \dots, X_n [/itex]; then the joint density of [itex] X_1, X_2, \dots, X_n [/itex] is

[itex] f(X \mid \theta) = \frac{1}{(3\theta)^n} \quad \text{when } 0 < X_i \leq 3\theta \text{ for } i = 1, 2, \dots, n [/itex]

Because [itex] X_i \leq 3\theta [/itex], when [itex] \widehat{\theta} = \frac{1}{3} min\{X_1,X_2...X_n\} [/itex], [itex] f(X| \theta ) [/itex] is maximized.

Hence, M.L.E of [itex] \theta [/itex] is [itex] \frac{1}{3} min\{X_1,X_2...X_n\} [/itex].

Answer 3:

I have no idea how to even start the proof.
 
  • #2
sanctifier said:

[problem statement and attempted solution quoted in full from post #1]

Your expression for ##EX## is incorrect: it should not be ##\theta/3##. I suggest you avoid moment-generating functions, since you seem to be mis-using them, and they are totally unnecessary in a question of this type. If you want ##EX##, just do the integration, or use familiar elementary results that you should have seen already in a first course but might have forgotten.
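
For reference, the direct integration suggested here gives

[itex] E(X) = \int_0^{3\theta} \frac{x}{3\theta}\,dx = \frac{(3\theta)^2}{2 \cdot 3\theta} = \frac{3\theta}{2} [/itex]

so equating [itex] \bar{X} = \frac{3\theta}{2} [/itex] yields the method-of-moments estimator [itex] \widehat{\theta} = \frac{2\bar{X}}{3} [/itex].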
 
  • #3
Thank you for your reply, Ray.

Actually, the moment estimator denotes the method-of-moments estimator; you can find it here: http://en.wikipedia.org/wiki/Method_of_moments_(statistics)

What I'm asking about is the estimator found using the method of moments.

Answer 2 is wrong.

It should be [itex] \frac{1}{3} \max\{X_1, X_2, \dots, X_n\} [/itex]: to maximize [itex] f(X \mid \theta) = \frac{1}{(3\theta)^n} [/itex], [itex] \theta [/itex] must be as small as possible, and the constraint [itex] X_i \leq 3\theta [/itex] for all [itex] i [/itex] forces [itex] \theta \geq \frac{1}{3} \max\{X_1, X_2, \dots, X_n\} [/itex].

What about answers 1 and 3?
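
A sketch for both, using the density from the problem statement: for answer 1, equating the sample mean with [itex] E(X) = \frac{3\theta}{2} [/itex] gives [itex] \widehat{\theta} = \frac{2\bar{X}}{3} [/itex]. For answer 3, one standard route uses the c.d.f. of the sample maximum. For [itex] 0 < t \leq \theta [/itex],

[itex] P\left( \widehat{\theta} \leq t \right) = P\left( \max_i X_i \leq 3t \right) = \prod_{i=1}^{n} P(X_i \leq 3t) = \left( \frac{3t}{3\theta} \right)^n = \left( \frac{t}{\theta} \right)^n [/itex]

Since [itex] \widehat{\theta} \leq \theta [/itex] always holds, for any [itex] 0 < \varepsilon < \theta [/itex],

[itex] P\left( |\widehat{\theta} - \theta| > \varepsilon \right) = P\left( \widehat{\theta} < \theta - \varepsilon \right) = \left( \frac{\theta - \varepsilon}{\theta} \right)^n \rightarrow 0 \quad \text{as } n \rightarrow \infty [/itex]

so [itex] \widehat{\theta} [/itex] converges in probability to [itex] \theta [/itex], i.e., it is consistent.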
 

1. What is a consistent estimator in statistics?

A consistent estimator is one whose estimate of a population parameter converges to the parameter's true value as the sample size increases. In other words, as more data points are collected, the estimate becomes more accurate and approaches the true value.
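
In symbols, [itex] \widehat{\theta}_n [/itex] is a consistent estimator of [itex] \theta [/itex] if it converges in probability to [itex] \theta [/itex]:

[itex] \lim_{n \to \infty} P\left( |\widehat{\theta}_n - \theta| > \varepsilon \right) = 0 \quad \text{for every } \varepsilon > 0 [/itex]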

2. How is consistency of an estimator determined?

One common way to establish consistency is to examine the estimator's bias and variance: if the bias tends to zero and the variance tends to zero as the sample size increases, then the mean squared error vanishes and the estimator is consistent. (This condition is sufficient but not necessary.)
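
The standard justification is Markov's/Chebyshev's inequality applied to the mean squared error:

[itex] P\left( |\widehat{\theta}_n - \theta| > \varepsilon \right) \leq \frac{E\left[ (\widehat{\theta}_n - \theta)^2 \right]}{\varepsilon^2} = \frac{\mathrm{bias}^2(\widehat{\theta}_n) + \mathrm{Var}(\widehat{\theta}_n)}{\varepsilon^2} [/itex]

If bias and variance both tend to zero, the right-hand side vanishes for every fixed [itex] \varepsilon > 0 [/itex], which is exactly consistency.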

3. What is the importance of a consistent estimator?

A consistent estimator is important because it allows us to make more accurate predictions about the population based on a sample. It also helps us to identify relationships and patterns in the data, and to make informed decisions based on the results.

4. How do you prove the consistency of an estimator?

Consistency is proven mathematically by showing convergence in probability: either directly from the definition, via the law of large numbers, or by showing that the bias and variance of the estimator both approach zero as the sample size increases. Simulations and numerical experiments can also be used to support (though not prove) the consistency of an estimator.
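
As an illustration of the simulation approach, here is a minimal Python/NumPy sketch (the true [itex] \theta [/itex], sample sizes, and replication count are arbitrary choices for the experiment) that checks [itex] \widehat{\theta} = \frac{1}{3} \max\{X_1, \dots, X_n\} [/itex] empirically:

[code]
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # arbitrary "true" parameter for this experiment

# X ~ Uniform(0, 3*theta], so theta_hat = max(sample) / 3
for n in [10, 100, 1000, 10000]:
    reps = 2000  # replications per sample size
    samples = rng.uniform(0.0, 3.0 * theta, size=(reps, n))
    theta_hat = samples.max(axis=1) / 3.0
    # mean absolute error should shrink toward 0 as n grows
    print(n, np.mean(np.abs(theta_hat - theta)))
[/code]

The printed error decreasing toward zero as [itex] n [/itex] grows is consistent with (but does not prove) [itex] \widehat{\theta} \rightarrow \theta [/itex] in probability.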

5. What are some common examples of consistent estimators?

Some common examples of consistent estimators include the sample mean, the sample variance, and (under standard regularity conditions) maximum likelihood estimators.
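
For instance, consistency of the sample mean is the weak law of large numbers:

[itex] \bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i \rightarrow E(X) \;\; \text{in probability, whenever } E|X| < \infty [/itex]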
