Checking Estimators for Bias/Consistency

Jmath
Hello, I am trying to check whether the Method of Moments and Maximum Likelihood estimators for the parameter ##\theta## are unbiased and/or consistent, for a sample from the population density
$$f(x;\theta) = \frac{2}{\theta}\, x\, e^{-x^2/\theta}, \qquad x \geq 0,\ \theta > 0,$$
with ##\theta## unknown.

Taking the first moment of this density, I found the Method of Moments estimator to be $$\hat{\theta}_1 = \frac{4\bar X^2}{\pi},$$ and solving for the Maximum Likelihood Estimator I found $$\hat{\theta}_2 = 2\bar Y,$$ where ##Y## is just the square of the sample value ##X_i##, i.e. ##Y = X_i^2##.

Steps in Solving for Method of Moments:
I took the first moment, i.e.

$$M_1 = E[X] = \int_0^\infty \frac{2}{\theta}\, x^2\, e^{-x^2/\theta}\,dx$$

Evaluating this integral by parts, with $$u = x, \quad du = dx, \quad dv = \frac{2}{\theta}\, x\, e^{-x^2/\theta}\,dx, \quad v = -e^{-x^2/\theta},$$

$$\int_0^\infty \frac{2}{\theta}\, x^2\, e^{-x^2/\theta}\,dx = \Big[-x\, e^{-x^2/\theta}\Big]_0^\infty + \int_0^\infty e^{-x^2/\theta}\,dx = 0 + \frac{\sqrt{\pi\theta}}{2} = \frac{\sqrt{\pi}\sqrt{\theta}}{2}$$

Setting the sample mean equal to this first moment, $$\bar{X} = E[X] = \frac{\sqrt{\pi}\sqrt{\theta}}{2},$$ and solving for ##\theta## gives the Method of Moments estimator $$\hat{\theta}_1 = \frac{4\bar{X}^2}{\pi}.$$
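The value used above for the Gaussian tail integral can be double-checked with the substitution ##t = x/\sqrt{\theta}## and the standard result ##\int_{-\infty}^{\infty} e^{-t^2}\,dt = \sqrt{\pi}##:

$$\int_0^\infty e^{-x^2/\theta}\,dx = \sqrt{\theta}\int_0^\infty e^{-t^2}\,dt = \frac{\sqrt{\pi\theta}}{2}.$$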

Steps in Solving for Maximum Likelihood:

$$\ln L(\theta) = \ln\!\left(\prod_{i=1}^n \frac{2}{\theta}\, x_i\, e^{-x_i^2/\theta}\right) = -n \ln(2\theta) + \sum_{i=1}^n x_i - \frac{1}{\theta} \sum_{i=1}^n x_i^2$$

$$\frac{d\ln L(\theta)}{d\theta} = \frac{-n}{2\theta} + \frac{1}{\theta^2} \sum_{i=1}^n x_i^2$$

Setting ##\frac{d\ln L(\theta)}{d\theta} = 0##, I found the Maximum Likelihood Estimator ##\hat{\theta}_2## to be $$\hat{\theta}_2 = \frac{2\sum_{i=1}^n x_i^2}{n},$$ so that if ##Y = X_i^2## then ##\hat{\theta}_2 = 2\bar{Y}##.
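One fact that may be useful for the bias and consistency questions below is the distribution of ##Y = X^2##, which follows directly from the given density: for ##y \ge 0##,

$$P(Y \le y) = P\!\left(X \le \sqrt{y}\right) = \int_0^{\sqrt{y}} \frac{2}{\theta}\, x\, e^{-x^2/\theta}\,dx = 1 - e^{-y/\theta},$$

so ##Y## is exponential with mean ##\theta## and variance ##\theta^2##. The mean and variance of ##\bar Y## (and hence of any estimator built from it) then follow from linearity of expectation and independence of the ##X_i##.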

I am trying to check whether these estimators for ##\theta## are unbiased and/or consistent, but I am lost on how to go about doing so. Any help would be much appreciated.
 
To check for bias, evaluate ##E[\hat\theta_2-\theta]## by integrating. The estimator is unbiased iff this evaluates to zero.
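To illustrate that computation with the other estimator, ##\hat\theta_1 = \frac{4\bar X^2}{\pi}##: using ##E[X] = \frac{\sqrt{\pi\theta}}{2}## from the post and ##E[X^2] = \theta## (the exponential fact noted above), so that ##\operatorname{Var}(X) = \theta\left(1-\frac{\pi}{4}\right)##,

$$E[\hat\theta_1] = \frac{4}{\pi}E[\bar X^2] = \frac{4}{\pi}\left(\operatorname{Var}(\bar X) + (E[\bar X])^2\right) = \frac{4}{\pi}\left(\frac{\theta(1-\pi/4)}{n} + \frac{\pi\theta}{4}\right) = \theta\left(1 + \frac{4-\pi}{\pi n}\right),$$

so the bias of ##\hat\theta_1## is ##\theta\,\frac{4-\pi}{\pi n}##, nonzero for finite ##n## but vanishing as ##n \to \infty##.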

To check for consistency, evaluate ##\mathrm{Prob}(|\hat\theta_2-\theta|>\epsilon)## for a fixed ##\epsilon>0##. If this probability goes to zero as the sample size ##n\to\infty## for every ##\epsilon>0##, the estimator is consistent.
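Before doing the algebra, it can help to see both checks numerically. Here is a rough Monte Carlo sketch: it draws samples using the fact that ##X^2## is exponential with mean ##\theta## for this density, and the "true" ##\theta##, the tolerance ##\epsilon##, the sample sizes and the number of replications are all arbitrary illustrative choices.

[CODE=python]
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0   # arbitrary "true" parameter for the experiment
eps = 0.1     # arbitrary tolerance for the consistency check
reps = 1000   # number of simulated samples per sample size

for n in (10, 100, 1000, 10000):
    # X^2 ~ Exponential(mean theta) for this density, so draw X as the square root of an exponential
    x = np.sqrt(rng.exponential(scale=theta, size=(reps, n)))

    theta1 = 4 * x.mean(axis=1) ** 2 / np.pi   # Method of Moments estimate, one per replication
    theta2 = 2 * (x ** 2).mean(axis=1)         # the 2*Ybar estimate from the post

    print(f"n={n:6d}  mean(theta1)={theta1.mean():.3f}  mean(theta2)={theta2.mean():.3f}  "
          f"P(|theta1-theta|>eps)={np.mean(np.abs(theta1 - theta) > eps):.3f}  "
          f"P(|theta2-theta|>eps)={np.mean(np.abs(theta2 - theta) > eps):.3f}")
[/CODE]

A biased estimator shows up as a mean that settles away from ##\theta##; an inconsistent one shows up as an exceedance probability that does not go to zero as ##n## grows.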

A few points by the by:
  • on physicsforums the $ delimiter for LaTeX in-line maths is not recognised, which is why single-$ maths tends to come out mucked up. Use a double-# instead.
  • your estimates from the two methods are the same, as ##2\bar Y##. Is that what you intended?
  • the statement ##Y=X_i{}^2## occurs twice. This should be ##Y_i=X_i{}^2##, as both sides depend on ##i##.