Statistics: How to prove that an estimator of theta is consistent?


sanctifier

Homework Statement



If the probability density function (p.d.f.) of the random variable X is

f(x|\theta) = \begin{cases} \frac{1}{3\theta} & 0 < x \leq 3\theta \\ 0 & \text{otherwise} \end{cases}

where \theta > 0 is an unknown parameter, and X_1, X_2, \dots, X_n is a sample from X with n > 2.

Question 1: What is the moment estimator (M.E.) of \theta?

Question 2: What is the maximum likelihood estimator (M.L.E.) of \theta?

Question 3: Prove that \widehat{\theta} = \frac{1}{3}\max\{X_1, X_2, \dots, X_n\} is a consistent estimator of \theta.

Homework Equations



Nothing special.

The Attempt at a Solution



Answer 1:

Moment generating function (m.g.f.) of X is

\psi(t) = E(e^{tX}) = \int_0^{3\theta} \frac{e^{tx}}{3\theta}\,dx = \frac{1}{3\theta t} \int_0^{3\theta} d e^{tx} = \frac{1}{3\theta t} e^{tx} \Big|_{x=0}^{x=3\theta} = \frac{1}{3\theta t}\left(e^{3\theta t} - 1\right)

\begin{cases} \psi'(t) = e^{3\theta t} \\ \psi''(t) = 3\theta e^{3\theta t} \end{cases}

\begin{cases} \psi'(0) = 1 \\ \psi''(0) = 3\theta \end{cases}

Hence, M.E. is

\widehat{\theta} = \frac{\psi''(0)}{3} = \frac{E(X^2)}{3}

Answer 2:

Let X be a vector whose components are X_1, X_2, \dots, X_n; then the joint density of X_1, X_2, \dots, X_n is

f(X|\theta) = \frac{1}{(3\theta)^n} \quad \text{when } 0 < X_i \leq 3\theta \text{ for } i = 1, 2, \dots, n

Because X_i \leq 3\theta, f(X|\theta) is maximized when \widehat{\theta} = \frac{1}{3}\min\{X_1, X_2, \dots, X_n\}.

Hence, the M.L.E. of \theta is \frac{1}{3}\min\{X_1, X_2, \dots, X_n\}.

Answer 3:

I have no idea how to even start the proof.
 
sanctifier said:
(post quoted in full above)

Your expression for ##EX## is incorrect: it should not be ##\theta/3##. I suggest you avoid moment-generating functions, since you seem to be mis-using them, and they are totally unnecessary in a question of this type. If you want ##EX##, just do the integration, or use familiar elementary results that you should have seen already in a first course but might have forgotten.
 
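For reference, the direct integration Ray suggests gives the moments without the m.g.f.:

E(X) = \int_0^{3\theta} \frac{x}{3\theta}\,dx = \frac{3\theta}{2} \qquad E(X^2) = \int_0^{3\theta} \frac{x^2}{3\theta}\,dx = 3\theta^2

Matching the first sample moment \bar{X} to E(X) = \frac{3\theta}{2} then yields the method-of-moments estimator \widehat{\theta} = \frac{2\bar{X}}{3}.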
Thank you for your reply, Ray.

Actually, the moment estimator denotes the method-of-moments estimator; you can find it here: http://en.wikipedia.org/wiki/Method_of_moments_(statistics)

What I'm concerned with is the estimator found by using the method of moments.

Answer 2 is wrong.

It should be \frac{1}{3}\max\{X_1, X_2, \dots, X_n\}: to maximize f(X|\theta) = \frac{1}{(3\theta)^n}, \theta must take its minimal feasible value, which is \frac{1}{3}\max\{X_1, X_2, \dots, X_n\} under the constraint X_i \leq 3\theta.
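Spelled out: for fixed data, the likelihood

L(\theta) = \frac{1}{(3\theta)^n} \quad \text{for } \theta \geq \frac{1}{3}\max\{X_1, X_2, \dots, X_n\}

is strictly decreasing in \theta, so it attains its maximum at the left endpoint of the feasible set, namely \widehat{\theta} = \frac{1}{3}\max\{X_1, X_2, \dots, X_n\}.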

What about answer 1 and answer 3?
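For answer 3, one standard way to begin (a sketch using the distribution of the sample maximum): let M_n = \max\{X_1, X_2, \dots, X_n\} and \widehat{\theta}_n = \frac{M_n}{3}. Since the X_i are independent with P(X_i \leq x) = \frac{x}{3\theta} for 0 < x \leq 3\theta,

P(M_n \leq x) = \prod_{i=1}^{n} P(X_i \leq x) = \left(\frac{x}{3\theta}\right)^n \quad \text{for } 0 < x \leq 3\theta

Because M_n \leq 3\theta always, for any 0 < \varepsilon < \theta,

P\left(|\widehat{\theta}_n - \theta| > \varepsilon\right) = P\left(M_n < 3(\theta - \varepsilon)\right) = \left(\frac{\theta - \varepsilon}{\theta}\right)^n \to 0 \quad \text{as } n \to \infty

so \widehat{\theta}_n \to \theta in probability, i.e. \widehat{\theta}_n is a consistent estimator of \theta.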
 
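A quick numerical sanity check of this convergence (a minimal Python sketch, assuming NumPy is available; the value \theta = 2 and the sample sizes are arbitrary choices for illustration):

Python:
import numpy as np

rng = np.random.default_rng(seed=0)
theta = 2.0  # arbitrary "true" parameter for this demonstration

# X ~ Uniform(0, 3*theta); the estimator under test is max(X_1, ..., X_n) / 3
for n in (10, 100, 1000, 10000, 100000):
    x = rng.uniform(0.0, 3.0 * theta, size=n)
    theta_hat = x.max() / 3.0
    print(f"n = {n:>6}  theta_hat = {theta_hat:.5f}")

The printed estimates should approach 2.0 as n grows, matching the convergence in probability shown above.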
