
LogNormal Mean Estimators

  1. May 21, 2009 #1
    If Y ~ N(mu, sigma^2) and Y = log X, so that X ~ LN(mu, sigma^2), consider the estimator
    a* = exp{ybar + (1/2)*theta*s^2}, where ybar is the sample mean of y, s^2 is the sample variance of y, and theta is a constant. The target is a = E[X] = exp{mu + (1/2)*sigma^2}.

    If theta = 1, a* is consistent but biased, and we can reduce the bias by choosing a different value of theta. Use a large-n approximation in the expression for E[a*] to find a value of theta that reduces the bias relative to theta = 1.

    In my attempt I ended up with E[a*] = E[(geometric mean of x) * (geometric variance of x)]: since y = log x, we have ybar = log(x1*x2*...*xn)/n, which led me to that expression. I have a feeling this is incorrect, and I'm now completely stuck. Given the large-n hint in the question, I was thinking of using the CLT or the Weak Law of Large Numbers, but I don't know where to go from there.
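    As a numerical sanity check of the bias, here is a quick Monte Carlo sketch (my own, not from the attached working; the candidate theta = (n-1)/n below is just one value suggested by a large-n expansion, not an established answer in this thread):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 0.0, 1.0, 20, 200_000
    a_true = np.exp(mu + 0.5 * sigma**2)  # a = E[X] for X ~ LN(mu, sigma^2)

    # y = log X is N(mu, sigma^2); draw reps samples of size n at once
    y = rng.normal(mu, sigma, size=(reps, n))
    ybar = y.mean(axis=1)
    s2 = y.var(axis=1, ddof=1)  # sample variance of y

    def a_star(theta):
        """Estimator a* = exp{ybar + (1/2)*theta*s^2} for each replication."""
        return np.exp(ybar + 0.5 * theta * s2)

    bias_1 = a_star(1.0).mean() - a_true            # bias at theta = 1
    bias_c = a_star((n - 1) / n).mean() - a_true    # candidate theta < 1
    print(bias_1, bias_c)
    ```

    With these settings the theta = 1 bias is clearly positive, and the candidate theta shrinks it noticeably, which is consistent with the question's claim that a different theta reduces the bias.
    
    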
     
  3. May 22, 2009 #2
    If Log[geometric mean] = Mean[y] and Log[geometric var] = Var[y], then Log[g.m.*g.v.] = Mean[y] + Var[y], and this is identical to E[Log[a*]] for theta = 2.
     
  4. May 23, 2009 #3
    I think I made a mistake in my working, so I've taken a different approach involving the MGFs of normal distributions, but I still get stuck halfway through. I've put some of my working in the attached file. Any ideas on how to get this different value of theta?

    Edit to file: V= Ybar + 0.5*theta*S^2
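    For what it's worth, one standard route to E[exp(V)] with V = Ybar + (1/2)*theta*S^2 (a sketch, assuming an i.i.d. normal sample, so that Ybar and S^2 are independent):

        E[exp(V)] = E[exp(Ybar)] * E[exp((theta/2)*S^2)]
                  = exp{mu + sigma^2/(2n)} * (1 - theta*sigma^2/(n-1))^(-(n-1)/2),

    using the normal MGF for Ybar ~ N(mu, sigma^2/n) and the chi-square MGF, since (n-1)S^2/sigma^2 ~ chi^2_(n-1). Expanding the log for large n gives

        log E[exp(V)] ≈ mu + (theta/2)*sigma^2 + sigma^2/(2n) + theta^2*sigma^4/(4n),

    which can then be compared with log a = mu + sigma^2/2.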
     

    Attached Files:

    • Help.pdf (111.4 KB)

    Last edited: May 23, 2009
  5. May 24, 2009 #4
    How did you end up getting E[exp (Vt)]?
     