Finding E(ln x) and Var(ln x): Cramer-Rao Lower Bound

  • Context: Graduate 

Discussion Overview

The discussion focuses on finding the expected value E(ln x) and variance Var(ln x) for a random variable X that follows a specific probability density function. The context includes the application of these calculations in determining the Cramer-Rao lower bound for the variance of an unbiased estimator related to a parameter θ.

Discussion Character

  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant requests assistance in calculating E(ln x) and Var(ln x) for a random sample from the distribution f(x; θ) = θ x^(θ - 1) I_{(0, 1)}(x), where θ > 0, to aid in solving the Cramer-Rao lower bound.
  • Another participant discusses the lognormal distribution, stating that if X = e^Y where Y is normally distributed, then E(X) and Var(X) can be expressed in terms of the parameters μ and σ, but notes that moments do not uniquely define the distribution.
  • A different participant suggests using the inner product definition for E[g(x)] and refers to the continuous case for Var[g(x)], indicating that g(x) = ln x.
  • Another participant provides a formula for Var[g(x)], expressing it in terms of integrals involving g(x) and its expected value, and emphasizes the need for integration over the domain of x.

Areas of Agreement / Disagreement

Participants present various approaches and formulas for calculating E(ln x) and Var(ln x), but there is no consensus on a single method or solution. The discussion remains unresolved regarding the specific calculations and their implications for the Cramer-Rao lower bound.

Contextual Notes

Participants highlight the dependence on the specific distribution of X and the assumptions involved in calculating expected values and variances. There are also references to different methods and definitions that may affect the results.

safina
May I ask how to find E\left(\ln x\right) and Var\left(\ln x\right)?
The X_{i} are a random sample from f\left(x; \theta\right) = \theta x^{\theta - 1}I_{\left(0, 1\right)}\left(x\right), where \theta > 0.

I need this information to solve for the Cramer-Rao lower bound for the variance of an unbiased estimator of a function of \theta, and also to check whether I have an unbiased estimator.
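One route to these quantities (my illustration, not spelled out in the thread): the CDF of this density is F(x) = x^{\theta} on (0, 1), so inverse-CDF sampling gives X = U^{1/\theta} for uniform U, and \ln X = \ln(U)/\theta, meaning -\ln X is exponential with rate \theta. That yields E(\ln X) = -1/\theta and Var(\ln X) = 1/\theta^{2}, which a quick Monte Carlo sketch can confirm:

```python
import math
import random

def sample_lnx(theta, n, seed=0):
    """Draw ln X for X with density theta * x**(theta - 1) on (0, 1).
    Inverse-CDF sampling: F(x) = x**theta, so X = U**(1/theta) and
    ln X = ln(U) / theta (i.e. -ln X is Exponential with rate theta)."""
    rng = random.Random(seed)
    # 1 - rng.random() lies in (0, 1], which avoids log(0)
    return [math.log(1.0 - rng.random()) / theta for _ in range(n)]

theta = 2.0  # illustrative choice; any theta > 0 works
vals = sample_lnx(theta, 500_000)
mean = sum(vals) / len(vals)
var = sum((v - mean) ** 2 for v in vals) / len(vals)
print(mean, var)  # near -1/theta = -0.5 and 1/theta**2 = 0.25
```

The function name and the choice theta = 2 are mine; the estimates should land close to the closed-form values -1/\theta and 1/\theta^{2}.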
 
safina said:
May I ask how to find E\left(\ln x\right) and Var\left(\ln x\right)?

A random variable X has a lognormal distribution if X=e^{Y} where Y is normally distributed with mean \mu and standard deviation \sigma.

Then: E(X)=e^{\mu+\sigma^{2}/2}, Var(X)=\left(e^{\sigma^{2}}-1\right)e^{2\mu+\sigma^{2}}

The moments are: E(X^{n})=e^{n\mu+n^{2}\sigma^{2}/2}

Note that the moments of the lognormal distribution do not uniquely determine the distribution, and the moment generating function E(e^{tX}) is finite only for t in the interval (-\infty,0].

EDIT: You can assign other distributions to Y, but without specification, I don't think there's a single answer to your question. The lognormal is the default via the Central Limit Theorem applied to \ln X.
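The standard lognormal moment formulas, E(X)=e^{\mu+\sigma^{2}/2} and Var(X)=\left(e^{\sigma^{2}}-1\right)e^{2\mu+\sigma^{2}}, can be sanity-checked by simulating X = e^{Y} directly (a sketch I am adding; the parameter values are arbitrary):

```python
import math
import random

# Illustrative parameters for Y ~ Normal(mu, sigma)
mu, sigma, n = 0.0, 0.5, 400_000
rng = random.Random(42)
xs = [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n

exact_mean = math.exp(mu + sigma ** 2 / 2)
exact_var = (math.exp(sigma ** 2) - 1) * math.exp(2 * mu + sigma ** 2)
print(mean, exact_mean)  # simulated vs. e^(mu + sigma^2/2)
print(var, exact_var)    # simulated vs. (e^(sigma^2) - 1) e^(2mu + sigma^2)
```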
 
Let g(x)= \ln x. For E[g(x)], use the inner-product (integral) definition of expectation. For Var[g(x)], use the continuous-case variance formula with g(x) replacing x and \mu interpreted as E[g(x)]; alternatively, see the delta method.
 
That is, Var[g(x)] = \int \left(g(x) - E[g(x)]\right)^{2} f(x)\,dx, taken over the domain of x.

Alternatively,

Var[g(x)] = E\left[g(x)^{2}\right] - \left(E[g(x)]\right)^{2} = \int{g(x)^{2} f(x)\,dx} - \left(\int{g(x)f(x)\,dx}\right)^{2}

where each integral is over the domain of x.
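To connect this back to the original question (my addition, not from the thread): for f(x;\theta)=\theta x^{\theta-1}, the score is \partial_{\theta}\ln f = 1/\theta + \ln x, and since E(\ln X) = -1/\theta, the Fisher information per observation is I(\theta) = Var(\ln X). The Cramer-Rao bound for an unbiased estimator of \theta from n observations is then 1/\left(n\,I(\theta)\right) = \theta^{2}/n. A numerical sketch of the two integrals using the midpoint rule (the choices \theta = 2 and n = 50 are illustrative):

```python
import math

def moment(g, theta, n=200_000):
    """Midpoint-rule approximation of E[g(X)] = integral over (0,1) of
    g(x) * theta * x**(theta - 1) dx. The midpoints avoid the endpoints,
    where ln x is singular; for theta > 1 the integrand vanishes at 0."""
    h = 1.0 / n
    return sum(g((i + 0.5) * h) * theta * ((i + 0.5) * h) ** (theta - 1)
               for i in range(n)) * h

theta = 2.0
m1 = moment(math.log, theta)                    # E[ln X]     -> -1/theta
m2 = moment(lambda x: math.log(x) ** 2, theta)  # E[(ln X)^2] -> 2/theta**2
fisher = m2 - m1 ** 2                           # Var(ln X) = I(theta)
n_obs = 50
crlb = 1.0 / (n_obs * fisher)                   # theta**2 / n_obs
print(m1, fisher, crlb)
```

For \theta = 2 this gives E(\ln X) \approx -0.5, I(\theta) \approx 0.25 = 1/\theta^{2}, and a bound of 4/50 = 0.08.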
 
