# LogNormal Mean Estimators

## Main Question or Discussion Point

Suppose Y ~ N(mu, sigma^2) and Y = log X, so that X ~ LN(mu, sigma^2). Define the estimator a* = exp{ybar + (1/2)*theta*S^2}, where ybar is the sample mean of y, S^2 is the sample variance of y, and theta is a constant. The target is a = E[X] = exp{mu + (1/2)*sigma^2}.

If theta = 1, then a* is consistent but biased, and the bias can be reduced by choosing a different value of theta. Use a large-n approximation in the expression for E[a*] to find a value of theta that reduces the bias relative to theta = 1.

In my attempt I ended up with E[a*] = E[(geometric mean of x) * (geometric variance of x)], using the fact that y = log x, so ybar = log(x1*x2*...*xn)/n. I suspect this is incorrect and am completely lost. Given the large-n hint in the question, I was thinking of using the CLT or the Weak Law of Large Numbers, but I still don't know where to go from there.
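As a sanity check (not part of the thread itself), the bias at theta = 1 is easy to see in a short Monte Carlo sketch; the parameter values mu = 0, sigma = 1, n = 10 below are arbitrary illustrative choices:

```python
import numpy as np

# Monte Carlo check of the bias of a* = exp(ybar + 0.5*theta*S^2) at theta = 1.
# Illustrative values (my own, not from the question): mu = 0, sigma = 1, n = 10.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 0.0, 1.0, 10, 200_000

a_true = np.exp(mu + 0.5 * sigma**2)        # a = E[X] for X ~ LN(mu, sigma^2)

y = rng.normal(mu, sigma, size=(reps, n))   # y = log x, one row per replication
ybar = y.mean(axis=1)                       # sample mean of y
s2 = y.var(axis=1, ddof=1)                  # sample variance of y

theta = 1.0
a_star = np.exp(ybar + 0.5 * theta * s2)

print(f"true a     = {a_true:.4f}")
print(f"mean of a* = {a_star.mean():.4f}")  # exceeds a for small n: upward bias
```

For small n the sample mean of a* sits noticeably above a, which is the bias the question asks to reduce.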

If Log[geometric mean] = Mean[y] and Log[geometric var] = Var[y], then Log[g.m.*g.v.] = Mean[y] + Var[y], and this is identical to E[Log[a*]] for theta = 2.

I think I made a mistake in my working, so I have taken a different approach using MGFs of normal distributions, but I still get stuck halfway. Some of my working is in the attached file. Any ideas on how to get this different value of theta?

Edit to file: V= Ybar + 0.5*theta*S^2
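For what it's worth, the MGF route can be checked numerically. Since ybar ~ N(mu, sigma^2/n) and (n-1)*S^2/sigma^2 ~ chi^2 with n-1 degrees of freedom, independently, the normal and chi-square MGFs give the exact mean E[a*] = exp(mu + sigma^2/(2n)) * (1 - theta*sigma^2/(n-1))^(-(n-1)/2), valid when theta*sigma^2 < n-1. A sketch scanning theta for the value that kills the bias (the values mu = 0, sigma = 1, n = 10 are my own illustrative choices):

```python
import numpy as np

# Exact E[a*] from the MGFs:
#   ybar ~ N(mu, sigma^2/n),  (n-1)*S^2/sigma^2 ~ chi^2_{n-1}, independent, so
#   E[a*] = exp(mu + sigma^2/(2n)) * (1 - theta*sigma^2/(n-1))^(-(n-1)/2),
# provided theta*sigma^2 < n - 1.
# Illustrative values (not from the thread): mu = 0, sigma = 1, n = 10.
mu, sigma, n = 0.0, 1.0, 10
a_true = np.exp(mu + 0.5 * sigma**2)

def mean_a_star(theta):
    return (np.exp(mu + sigma**2 / (2 * n))
            * (1 - theta * sigma**2 / (n - 1)) ** (-(n - 1) / 2))

# Scan theta and pick the value whose exact bias is smallest.
thetas = np.linspace(0.5, 1.5, 2001)
bias = np.abs(mean_a_star(thetas) - a_true)
theta_best = thetas[np.argmin(bias)]

print(f"bias at theta = 1: {mean_a_star(1.0) - a_true:+.4f}")   # positive
print(f"best theta for n = {n}: {theta_best:.3f}")              # below 1
```

Since E[a*] is increasing in theta and overshoots a at theta = 1, the bias-reducing theta is below 1; expanding the log of the exact mean in powers of 1/n gives the large-n correction the question asks for.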

#### Attachments

• attachment (111.4 KB)
How did you end up getting E[exp(Vt)]?