Hey All - I am trying to solve a problem that should be really easy (at least, every paper I read says this step is!)

I'm trying to understand where the Vasicek entropy estimator comes from:

I can write the differential entropy of a system as:

[tex]

H(f) = -\int_{-\infty}^{\infty} f(x)\log f(x)\,dx

[/tex]

where f(x) is the probability density function.

Apparently it is an easy step to rewrite this in the form:

[tex]

H(f) = \int_{0}^{1} \log\left(\frac{d}{dp}F^{-1}(p)\right)dp

[/tex]

where [tex] F^{-1} [/tex] is the inverse of the cumulative distribution function.
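As an aside (this is just my own sketch, not from any of the papers): the quantile form is what motivates the estimator itself - replace the derivative of [tex] F^{-1} [/tex] by a finite difference of order statistics, i.e. m-spacings. A quick stdlib-Python check against a standard normal, where the function name and the choice m = sqrt(n) are mine:

```python
import math
import random

def vasicek_entropy(sample, m):
    """m-spacing estimator of differential entropy (Vasicek, 1976):
    H ~= (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clamped at the sample boundaries."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        upper = x[min(i + m, n - 1)]  # x_(i+m), clamped to the largest value
        lower = x[max(i - m, 0)]      # x_(i-m), clamped to the smallest value
        total += math.log(n / (2 * m) * (upper - lower))
    return total / n

# Check against N(0, 1), whose true differential entropy is (1/2) log(2*pi*e).
random.seed(0)
n = 10_000
sample = [random.gauss(0.0, 1.0) for _ in range(n)]
est = vasicek_entropy(sample, m=int(math.sqrt(n)))
true_h = 0.5 * math.log(2 * math.pi * math.e)
```

For a large sample the estimate should sit close to the true value, up to the well-known small negative bias from the clamped boundary terms.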

I've tried using the rule for the derivative of an inverse function (the one I learned when differentiating inverse trig functions), but all I got was:

[tex]

\frac{d F^{-1}(p)}{dp} = \frac{1}{f(F^{-1}(p))}

[/tex]
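Edit: I think that relation is actually the whole bridge. Substituting [tex] p = F(x) [/tex], so that [tex] dp = f(x)dx [/tex] and [tex] x = F^{-1}(p) [/tex] (which maps the real line onto the unit interval), and then using the inverse-derivative relation above:

[tex]
H(f) = -\int_{-\infty}^{\infty} f(x)\log f(x)\,dx = -\int_{0}^{1} \log f(F^{-1}(p))\,dp = \int_{0}^{1} \log\frac{1}{f(F^{-1}(p))}\,dp = \int_{0}^{1} \log\left(\frac{d}{dp}F^{-1}(p)\right)dp
[/tex]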

By the way, just in case I need to point it out: this isn't homework. It's a simple-looking problem that is making me feel incredibly stupid for not being able to solve it.

Thanks,

Thrillhouse

**Physics Forums - The Fusion of Science and Community**


# Derivation of Vasicek Entropy Estimator


