LogNormal Mean Estimators

  • Thread starter Zazubird
  • #1

Main Question or Discussion Point

Suppose Y ~ N(mu, sigma^2) and Y = log X, so that X ~ LN(mu, sigma^2).
Define the estimator a* = exp{ybar + (1/2)*theta*s^2}, where ybar is the sample mean and s^2 the sample variance of the y_i, theta is a constant, and the target is a = E[X] = exp{mu + sigma^2/2}.

If theta = 1, a* is consistent but biased, and we can reduce the bias by choosing a different value of theta. Use a large-n approximation of E[a*] to find a value of theta that reduces the bias relative to theta = 1.

In my attempt, I ended up with E[a*] = E[geometric mean of x * geometric variance of x]: since y = log x, we have ybar = log(x1*x2*...*xn)/n, which led me to that expression. I suspect this is incorrect and am completely lost. Given the large-n hint in the question, I was thinking of using the CLT or the Weak Law of Large Numbers, but I don't know where to go from there.
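One useful fact: because the y_i are i.i.d. normal, ybar and s^2 are independent, so E[a*] = E[exp(ybar)] * E[exp(theta*s^2/2)], and each factor has a known MGF (normal for ybar, chi-square for (n-1)s^2/sigma^2). Here is a minimal numerical sketch of that exact expression; the candidate theta below, 1 - (2 + sigma^2)/(2n), is my own guess at a large-n correction, not a confirmed answer:

```python
import math

def exact_mean_astar(mu, sigma2, n, theta):
    """Exact E[a*] for a* = exp(ybar + theta*s^2/2), y_i iid N(mu, sigma2)."""
    # E[exp(ybar)] from the normal MGF: ybar ~ N(mu, sigma2/n)
    m1 = math.exp(mu + sigma2 / (2 * n))
    # E[exp(theta*s^2/2)] from the chi-square MGF:
    # (n-1)*s^2/sigma2 ~ chi2_{n-1}, and E[exp(t*chi2_k)] = (1-2t)^(-k/2)
    t = theta * sigma2 / (2 * (n - 1))
    assert 2 * t < 1, "MGF diverges for this theta"
    m2 = (1 - 2 * t) ** (-(n - 1) / 2)
    return m1 * m2

mu, sigma2, n = 0.0, 1.0, 50
a = math.exp(mu + sigma2 / 2)            # true mean a = E[X]
theta_adj = 1 - (2 + sigma2) / (2 * n)   # hypothetical bias-reducing theta
bias1 = exact_mean_astar(mu, sigma2, n, 1.0) - a
bias2 = exact_mean_astar(mu, sigma2, n, theta_adj) - a
print(bias1, bias2)
```

Expanding the log of the exact expression in powers of 1/n and setting it equal to log a is one route to a bias-reducing theta; the script just lets you check any candidate against the exact bias.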
 

Answers and Replies

  • #2
If Log[geometric mean] = Mean[y] and Log[geometric var] = Var[y], then Log[g.m.*g.v.] = Mean[y] + Var[y], which is exactly Log[a*] when theta = 2 (since Log[a*] = ybar + (theta/2)*s^2).
 
  • #3
I think I made a mistake in my working, so I've taken a different approach using the MGFs of normal distributions, but I still get stuck halfway. Some of my working is in the attached file. Any ideas on how to get this different value of theta?

Edit to file: V= Ybar + 0.5*theta*S^2
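For what it's worth, a quick Monte Carlo check of E[exp(V)] with V = Ybar + 0.5*theta*S^2 against the closed form that follows from the independence of Ybar and S^2 (normal MGF for Ybar, chi-square MGF for S^2); the parameter values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, theta = 0.0, 1.0, 30, 1.0
reps = 200_000

# Simulate many samples of size n and average exp(V)
y = rng.normal(mu, sigma, size=(reps, n))
ybar = y.mean(axis=1)
s2 = y.var(axis=1, ddof=1)          # unbiased sample variance
mc = np.exp(ybar + 0.5 * theta * s2).mean()

# Closed form: E[exp(Ybar)] * E[exp(theta*S^2/2)], using
# Ybar ~ N(mu, sigma^2/n) and (n-1)S^2/sigma^2 ~ chi2_{n-1}
cf = (np.exp(mu + sigma**2 / (2 * n))
      * (1 - theta * sigma**2 / (n - 1)) ** (-(n - 1) / 2))
print(mc, cf)
```

If the two numbers agree, the MGF decomposition of E[exp(V)] is on solid ground and the remaining work is only the large-n expansion.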
 

  • #4
How did you end up getting E[exp (Vt)]?
 
