MHB Show that the maximum likelihood estimator is unbiased

SUMMARY

The maximum likelihood estimator (MLE) for the given density family \(f(x, \mu) = c_{\mu} x^{\mu - 1} \exp\left(-\frac{(\ln x)^2}{2}\right)\) is determined to be \(\mu_1 = \frac{1}{n} \sum_{i=1}^{n} \ln X_i\). To demonstrate that this estimator is unbiased, the expectation \(E(\mu_1)\) is calculated, leading to \(E(\mu_1) = \frac{1}{n} \sum_{i=1}^{n} E(\ln X_i)\). The expectation of \(\ln X\) is derived using the integral \(E(\ln X) = \int_0^{\infty} \ln x \, f(x, \mu) \, dx\), which, after the substitution \(t = \ln x\), reduces to the mean of an \(N(\mu, 1)\) density and evaluates to \(\mu\), confirming that the MLE is unbiased.

PREREQUISITES
  • Understanding of maximum likelihood estimation (MLE)
  • Familiarity with probability density functions and their properties
  • Knowledge of integration techniques, particularly with respect to expectations
  • Basic understanding of logarithmic transformations in statistics
NEXT STEPS
  • Study the properties of maximum likelihood estimators in statistical inference
  • Learn about the method of moments and its relationship to MLE
  • Explore the concept of unbiased estimators and their significance in statistics
  • Investigate the use of transformations in deriving expectations of random variables
USEFUL FOR

Statisticians, data scientists, and researchers interested in statistical estimation methods, particularly those focusing on maximum likelihood estimation and its properties.

Fermat1
Consider a density family $f(x,{\mu})=c_{{\mu}}x^{{\mu}-1}\exp\left(-\frac{(\ln x)^2}{2}\right)$, where $c_{{\mu}}=\frac{1}{\sqrt{2\pi}}\exp(-{\mu}^2/2)$.
For a sample $(X_{1},...,X_{n})$ find the maximum likelihood estimator and show it is unbiased. You may find the substitution $y=\ln x$ helpful.

I find the MLE to be ${\mu}_{1}=\frac{1}{n}(\ln(X_{1})+...+\ln(X_{n}))$. For unbiasedness, I'm not sure what to do. If I substitute $y_{i}=\ln(x_{i})$ I get $E({\mu}_{1})=\frac{1}{n}(E(Y_{1})+...+E(Y_{n}))$. Am I meant to recognise the distribution of the $Y_{i}$?
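For reference, the stated MLE follows from a standard log-likelihood calculation with the density as written (using $c_{\mu}=\frac{1}{\sqrt{2\pi}}\exp(-\mu^2/2)$):

$$\ell(\mu)=\sum_{i=1}^{n}\left[\ln c_{\mu}+(\mu-1)\ln x_i-\frac{(\ln x_i)^2}{2}\right],\qquad \ln c_{\mu}=-\tfrac{1}{2}\ln(2\pi)-\frac{\mu^2}{2}$$

$$\frac{d\ell}{d\mu}=\sum_{i=1}^{n}\left(\ln x_i-\mu\right)=0\quad\Longrightarrow\quad \hat{\mu}=\frac{1}{n}\sum_{i=1}^{n}\ln x_i$$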
 
I'm not sure what $$f(x,\mu)$$ really is. I suppose it's

$$f(x,\mu)=c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)$$.

$$f(x,\mu)=c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)$$.

Given a sample $$(X_1,X_2,...,X_n)$$, you've got the MLE, $$\mu_1=\frac{1}{n}\sum_{i=1}^{n}\text{ln}X_i$$. For this $$f(x,\mu)$$, that's right.

To test unbiasedness, you should calculate the expectation of $$\mu_1$$.

Thus, we have, $$E(\mu_1)=\frac{1}{n}\sum_{i=1}^nE(\text{ln}X_i)$$.

Noting $$E(\text{ln}X)=\int_0^{\infty}\text{ln}xf(x,\mu)dx=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}dt$$, can you figure this out?
 
stainburg said:
Noting $$E(\text{ln}X)=\int_0^{\infty}\text{ln}xf(x,\mu)dx=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp{(-(t-\mu)^2/2)}dt$$, can you figure this out?

What substitution are you using?
 
Fermat said:
What substitution are you using?
Let $$t=\text{ln}x\in (-\infty, \infty)$$, hence $$x=\exp(t)$$.

We then have

$$\begin{aligned}
E(\text{ln}X)&=\int_0^{\infty}\text{ln}x\,c_{\mu}x^{\mu-1}\exp(-(\text{ln}x)^2/2)\,dx\\
&=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}\exp(-\mu^2/2)\,t\exp((\mu-1)t)\exp(-t^2/2)\,d(\exp(t))\\
&=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp(-(t^2-2\mu t+\mu^2)/2)\,dt\\
&=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp(-(t-\mu)^2/2)\,dt
\end{aligned}$$
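The remaining step can be completed with the shift $s=t-\mu$, splitting the integrand into an odd part (which vanishes) and a normalizing constant:

$$\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}t\exp(-(t-\mu)^2/2)\,dt=\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}(s+\mu)\exp(-s^2/2)\,ds=0+\mu\cdot 1=\mu$$

Hence $$E(\mu_1)=\frac{1}{n}\sum_{i=1}^{n}E(\text{ln}X_i)=\frac{1}{n}\cdot n\mu=\mu$$, so the MLE is unbiased.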
 
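The change of variables above shows that $\text{ln}X\sim N(\mu,1)$ under this density, which makes a quick Monte Carlo sanity check easy. A minimal sketch in Python (the function names here are illustrative, not from the thread):

```python
import math
import random

def sample_x(mu, n, rng):
    # Under f(x, mu) above, ln X ~ N(mu, 1), so X = exp(mu + Z) with Z ~ N(0, 1).
    return [math.exp(mu + rng.gauss(0.0, 1.0)) for _ in range(n)]

def mle(xs):
    # The MLE from the thread: mu_1 = (1/n) * sum of ln X_i
    return sum(math.log(x) for x in xs) / len(xs)

rng = random.Random(0)
mu = 1.5
# Average the estimator over many independent samples to approximate E(mu_1);
# for an unbiased estimator this average should be close to mu itself.
estimates = [mle(sample_x(mu, 50, rng)) for _ in range(2000)]
print(sum(estimates) / len(estimates))  # should be close to mu = 1.5
```

Averaging over 2000 replications of a size-50 sample drives the simulation error well below 0.05, so the printed value landing near 1.5 is consistent with $E(\mu_1)=\mu$.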
