MHB Show that the maximum likelihood estimator is unbiased

Summary
The maximum likelihood estimator (MLE) for the given density family is calculated as μ₁ = (1/n)Σln(Xᵢ). To establish its unbiasedness, the expectation E(μ₁) is derived, leading to E(μ₁) = (1/n)ΣE(ln(Xᵢ)). A substitution y = ln(x) is suggested to facilitate the calculation of E(ln(X)). The expectation of ln(X) is computed using an integral involving the density function, ultimately confirming that the MLE is unbiased. The discussion emphasizes the importance of recognizing the distribution of the transformed variable Y.
Fermat1
Consider a density family $f(x,{\mu})=c_{\mu}x^{{\mu}-1}\exp\left(-\frac{(\ln x)^2}{2}\right)$, where $c_{\mu}=\frac{1}{\sqrt{2\pi}}\exp(-\mu^2/2)$.
For a sample $(X_{1},...,X_{n})$ find the maximum likelihood estimator of $\mu$ and show it is unbiased. You may find the substitution $y=\ln x$ helpful.

I find the MLE to be ${\mu}_{1}=\frac{1}{n}(\ln(X_{1})+...+\ln(X_{n}))$. For unbiasedness, I'm not sure what to do. If I substitute $y_{i}=\ln(x_{i})$ I get $E({\mu}_{1})=\frac{1}{n}(E(Y_{1})+...+E(Y_{n}))$. Am I meant to recognise the distribution of the $Y_{i}$?
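For a numerical sanity check: expanding $\frac{1}{x\sqrt{2\pi}}e^{-(\ln x-\mu)^2/2}$ shows the given density is exactly that of $X=\exp(Z)$ with $Z\sim N(\mu,1)$, so samples are easy to simulate and the average of the MLE over many trials should land close to the true $\mu$. A minimal sketch (function names are illustrative):

```python
import math
import random

def mle(sample):
    """MLE for mu: the sample mean of ln(X_i)."""
    return sum(math.log(x) for x in sample) / len(sample)

def average_mle(mu, n=50, trials=20000, seed=0):
    """Average the MLE over many simulated samples; an unbiased
    estimator's average should be close to the true mu."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # f(x, mu) is the density of X = exp(Z) with Z ~ N(mu, 1)
        sample = [math.exp(rng.gauss(mu, 1.0)) for _ in range(n)]
        total += mle(sample)
    return total / trials

print(average_mle(0.7))  # close to 0.7
```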
 
Last edited:
I'm not sure what $$f(x,\mu)$$ really is. I suppose it's

$$f(x,\mu)=c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)$$.

Given a sample $$(X_1,X_2,...,X_n)$$, you've got the MLE, $$\mu_1=\frac{1}{n}\sum_{i=1}^{n}\text{ln}X_i$$. For this $$f(x,\mu)$$, that's right.
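As a cross-check on that closed form: the log-likelihood is quadratic (hence concave) in $\mu$, so a numerical maximizer should recover $\frac{1}{n}\sum\ln X_i$. A sketch using ternary search, with samples drawn as $\exp(N(\mu,1))$ (helper names are illustrative):

```python
import math
import random

def log_likelihood(mu, sample):
    # sum of ln f(x_i, mu), with ln c_mu = -0.5*ln(2*pi) - mu**2/2
    return sum(
        -0.5 * math.log(2 * math.pi) - mu**2 / 2
        + (mu - 1) * math.log(x) - math.log(x)**2 / 2
        for x in sample
    )

def argmax_mu(sample, lo=-10.0, hi=10.0, iters=200):
    """Ternary search for the maximizer; valid because the
    log-likelihood is concave in mu."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if log_likelihood(m1, sample) < log_likelihood(m2, sample):
            lo = m1
        else:
            hi = m2
    return (lo + hi) / 2

rng = random.Random(1)
sample = [math.exp(rng.gauss(0.5, 1.0)) for _ in range(200)]
closed_form = sum(math.log(x) for x in sample) / len(sample)
numeric = argmax_mu(sample)
print(abs(numeric - closed_form))  # agreement to high precision
```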

To test unbiasedness, you should calculate the expectation of $$\mu_1$$.

Thus, we have, $$E(\mu_1)=\frac{1}{n}\sum_{i=1}^nE(\text{ln}X_i)$$.

Noting $$E(\text{ln}X)=\int_0^{\infty}\text{ln}x\,f(x,\mu)\,dx=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}\,dt$$, can you take it from there?
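The right-hand side is the first moment of a $N(\mu,1)$ density, which equals $\mu$. A quick pure-Python quadrature check, truncating the (negligible) tails at $\mu\pm10$:

```python
import math

def integrand(t, mu):
    # t times the N(mu, 1) density
    return t * math.exp(-(t - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

def expected_log(mu, half_width=10.0, steps=200000):
    """Trapezoidal approximation of the integral of t * phi(t - mu)
    over [mu - half_width, mu + half_width]."""
    lo, hi = mu - half_width, mu + half_width
    h = (hi - lo) / steps
    total = 0.5 * (integrand(lo, mu) + integrand(hi, mu))
    total += sum(integrand(lo + i * h, mu) for i in range(1, steps))
    return total * h

print(expected_log(1.3))  # approximately 1.3
```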
 
stainburg said:
Noting $$E(\text{ln}X)=\int_0^{\infty}\text{ln}x\,f(x,\mu)\,dx=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}\,dt$$, can you take it from there?

What substitution are you using?
 
Fermat said:
What substitution are you using?
Let $$t=\text{ln}x\in (-\infty, \infty)$$, hence $$x=\exp(t)$$ and $$dx=\exp(t)\,dt$$.

We then have

$$E(\text{ln}X)
=\int_0^{\infty}\text{ln}x\,c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)\,dx\\
=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}\exp(-\mu^2/2)\,t\exp{((\mu-1)t)}\exp{(-t^2/2)}\exp(t)\,dt\\
=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t^2-2\mu t+\mu^2)/2)}\,dt\\
=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}\,dt.$$

The last integral is the mean of a $$N(\mu,1)$$ density, so $$E(\text{ln}X)=\mu$$ and hence $$E(\mu_1)=\frac{1}{n}\sum_{i=1}^{n}\mu=\mu$$: the MLE is unbiased.
 
