How Can We Reduce Bias in LogNormal Mean Estimators?

  • Context: Graduate
  • Thread starter: Zazubird
  • Tags: Estimators, Mean

Discussion Overview

The discussion focuses on methods to reduce bias in log-normal mean estimators, particularly through the choice of a parameter theta in the context of estimators derived from log-transformed data. Participants explore theoretical implications, mathematical derivations, and potential approaches to refining estimators.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant proposes that the bias in the estimator a* can be reduced by choosing theta appropriately, noting that with theta = 1, a* is consistent but biased.
  • Another participant points out that Log[geometric mean] = Mean[y] and Log[geometric variance] = Var[y], so the product of the geometric mean and geometric variance corresponds to the case theta = 2, suggesting a possible simplification.
  • A different approach using moment generating functions (MGFs) of normal distributions is attempted, but the participant is unsure how to derive the new value of theta from it.
  • A question is raised about how E[exp(Vt)] was obtained, indicating that this step of the working needs clarification.

Areas of Agreement / Disagreement

Participants do not appear to reach a consensus on the best approach to reduce bias or the appropriate value of theta, with multiple competing views and methods being discussed.

Contextual Notes

There are indications of potential errors in calculations and assumptions made by participants, particularly regarding the derivation of expected values and the application of statistical theorems. The discussion reflects uncertainty in the mathematical steps involved.

Who May Find This Useful

Researchers and practitioners interested in statistical estimation methods, particularly in the context of log-normal distributions and bias reduction techniques.

Zazubird:
Let y = log X with X ~ LN(mu, sigma^2), so that y ~ N(mu, sigma^2). Define a* = exp{ybar + (1/2)*theta*S^2}, where ybar is the sample mean of y, S^2 is the sample variance of y, theta is a constant, and a = E[X] = exp{mu + (1/2)*sigma^2}.

If theta = 1, a* is consistent but biased, and the bias can be reduced by choosing a different value of theta. Use a large-n approximation of E[a*] to find a value of theta that reduces the bias compared with theta = 1.

In my attempt, I ended up with E[a*] = E[geometric mean of x * geometric variance of x]: since y = log x, ybar = log(x1*x2*...*xn)/n, which led me to that expression. I suspect this is incorrect and am now completely lost. Given the large-n hint in the question, I was thinking of using the CLT or the Weak Law of Large Numbers, but I don't know where to go from there.
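The bias behaviour described above is easy to check numerically. Below is a minimal Monte Carlo sketch (not from the thread) comparing the relative bias of a* for theta = 1 against an illustrative shrunken choice theta = (n - 1)/n; the shrunken value is only an example of reducing theta slightly, not the exercise's intended answer.

```python
import numpy as np

def relative_bias(theta, mu=0.0, sigma=1.0, n=20, reps=200_000, seed=0):
    """Monte Carlo estimate of the relative bias of
    a* = exp(ybar + theta*S^2/2) as an estimator of a = E[X],
    where y = log X and X ~ LN(mu, sigma^2)."""
    rng = np.random.default_rng(seed)
    y = rng.normal(mu, sigma, size=(reps, n))   # y = log X is N(mu, sigma^2)
    ybar = y.mean(axis=1)                        # sample mean of y
    s2 = y.var(axis=1, ddof=1)                   # sample variance of y
    a_true = np.exp(mu + 0.5 * sigma**2)         # a = E[X]
    a_star = np.exp(ybar + 0.5 * theta * s2)
    return a_star.mean() / a_true - 1.0

print(relative_bias(1.0))            # theta = 1: biased upward
print(relative_bias((20 - 1) / 20))  # shrunken theta: smaller bias here
```

With these settings the theta = 1 estimator overshoots a noticeably, and the shrunken theta cuts the bias substantially, which is consistent with the exercise's claim that a value of theta other than 1 reduces the bias.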
 
If Log[geometric mean] = Mean[y] and Log[geometric var] = Var[y], then Log[g.m.*g.v.] = Mean[y] + Var[y], and this is identical to E[Log[a*]] for theta = 2.
 
I think I made a mistake in my working, so I have taken a different approach using the MGFs of normal distributions, but I still get stuck halfway. Some of my working is in the attached file. Any ideas how to get this different value of theta?

Edit to file: V= Ybar + 0.5*theta*S^2
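One way the MGF route can be completed (an editorial sketch, not taken from the attachment): for a normal sample, ybar and S^2 are independent, and (n-1)S^2/sigma^2 ~ chi-squared with n-1 degrees of freedom, whose MGF is (1-2t)^{-(n-1)/2}. Then

```latex
\mathbb{E}[a^*]
= \mathbb{E}\!\big[e^{\bar y}\big]\,\mathbb{E}\!\big[e^{\theta S^2/2}\big]
= \exp\!\Big(\mu + \frac{\sigma^2}{2n}\Big)
  \Big(1 - \frac{\theta\sigma^2}{n-1}\Big)^{-(n-1)/2},
\qquad \theta\sigma^2 < n-1 .
```

Taking logs and expanding for large n,

```latex
\log \mathbb{E}[a^*]
= \mu + \frac{\theta\sigma^2}{2} + \frac{\sigma^2}{2n}
  + \frac{\theta^2\sigma^4}{4(n-1)} + O(n^{-2}),
```

so the bias relative to log a = mu + sigma^2/2 is (theta - 1)*sigma^2/2 + sigma^2/(2n) + theta^2*sigma^4/(4(n-1)) + O(n^{-2}). Writing theta = 1 + delta/n and requiring the O(1/n) term to vanish gives delta = -(1 + sigma^2/2), i.e. theta ≈ 1 - (2 + sigma^2)/(2n), with sigma^2 replaced by S^2 in practice. This is one candidate reading of the exercise, not a confirmed solution from the thread.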
 

How did you end up getting E[exp (Vt)]?
 
