thrillhouse86
Hey all - I am trying to solve a problem that should be really easy (at least, every paper I read says this step is easy!).
I'm trying to understand where the Vasicek entropy estimator comes from:
I can write the differential entropy of a system as:
$$H(f) = -\int_{-\infty}^{\infty} f(x)\log\big(f(x)\big)\,dx$$
where f(x) is the probability density function.
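(Just as a sanity check on this definition, here's a quick numerical evaluation for a standard normal, chosen purely as a test case, using scipy; it should reproduce the known closed form 0.5*log(2*pi*e).)

```python
# Sanity check: differential entropy of a standard normal by direct
# numerical integration of -f(x) log f(x); the exact value is 0.5*log(2*pi*e).
import numpy as np
from scipy import integrate
from scipy.stats import norm

f = norm.pdf  # standard normal density, used only as a test case

H_numeric, _ = integrate.quad(lambda x: -f(x) * np.log(f(x)), -np.inf, np.inf)
H_exact = 0.5 * np.log(2 * np.pi * np.e)

print(H_numeric, H_exact)  # both are approximately 1.4189
```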
Apparently it is an easy step that you can re-write this in the form:
$$H(f) = \int_{0}^{1} \log\!\left(\frac{d}{dp}F^{-1}(p)\right) dp$$
where F^{-1} is the inverse of the cumulative distribution function.
I've tried using the rule for the derivative of an inverse function (the same one used to find the derivatives of the inverse trig functions), but all I got was:
$$\frac{d}{dp}F^{-1}(p) = \frac{1}{f\big(F^{-1}(p)\big)}$$
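(Plugging that relation in, the two integrals do seem to agree numerically for the same standard-normal test case; again just a rough scipy sketch, using norm.ppf for F^{-1}.)

```python
# Same entropy via the quantile form, using d/dp F^{-1}(p) = 1/f(F^{-1}(p)),
# so the integrand is log(1/f(F^{-1}(p))) with F^{-1} = norm.ppf.
# The log-type singularities at p=0 and p=1 are integrable.
import numpy as np
from scipy import integrate
from scipy.stats import norm

H_quantile, _ = integrate.quad(lambda p: np.log(1.0 / norm.pdf(norm.ppf(p))), 0, 1)

print(H_quantile)  # approximately 1.4189 again, matching -∫ f log f dx
```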
By the way, just in case I need to point it out: this isn't homework - it's a stupid little problem that's making me feel incredibly stupid for not being able to solve it.
Thanks,
Thrillhouse