Does anyone know of any analytical expression for the upper bound on the Kullback–Leibler divergence for a discrete random variable?

What I am looking for is the bound expressed as

0 <= S_KL <= f(k)

where k is the number of distinguishable outcomes.
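To make the setup concrete, here is a minimal numerical sketch (assuming the divergence is taken against the uniform reference distribution on k outcomes; against an arbitrary reference the divergence is unbounded, since the reference probability of an outcome can go to zero, so some restriction like this seems needed for a finite f(k)):

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q) in nats; terms with p_i = 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

k = 5
uniform = [1.0 / k] * k
point_mass = [1.0] + [0.0] * (k - 1)

# Against the uniform reference, D(p || u) = log(k) - H(p) <= log(k),
# with equality when p is a point mass.
print(kl_divergence(point_mass, uniform))  # log(5) ≈ 1.609
print(math.log(k))
```

So for the uniform-reference case f(k) = log k exactly, which is the kind of exact k-dependence I mean.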

Ultimately I am also looking for the bound in the case where the probabilities themselves belong to a special discrete set, rather than to R. (This would correspond to the combinatorics of microstates instead of continuum probabilities; edit: this means another variable enters the bound, namely the sample/memory size, but the first question is what the bound is in the limit sample size -> infinity.)
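For the discrete-probability variant, here is a brute-force sketch of what I mean (my own construction, assuming probabilities of the form n_i/N for integer counts n_i summing to a sample size N, and again a uniform reference):

```python
import math
from itertools import product

def kl(p, q):
    """Discrete KL divergence D(p || q) in nats; terms with p_i = 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def max_kl_vs_uniform(k, N):
    """Max of D(p || uniform) over all p with p_i = n_i / N, sum n_i = N (brute force)."""
    uniform = [1.0 / k] * k
    best = 0.0
    for n in product(range(N + 1), repeat=k):
        if sum(n) != N:
            continue
        p = [ni / N for ni in n]
        best = max(best, kl(p, uniform))
    return best

# The point mass n = (N, 0, ..., 0) is always in the discrete set, so the
# maximum stays at log(k) for every N -- the finite-N restriction does not
# tighten the bound when the reference is uniform.
for N in (2, 4, 8):
    print(N, max_kl_vs_uniform(3, N), math.log(3))
```

Whether the N-dependence becomes nontrivial when the reference distribution is also restricted to the discrete set is exactly the kind of thing I would like a worked-out treatment of.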

I think I looked for this a long time ago. If anyone happens to know of a pointer to where this is worked out, I would be interested.

The context of the question is to gain a deeper understanding of the various entropy bounds in physics.

To start with, I am not sure to what extent there are analytical expressions or whether I need to run some computer simulations.

Edit: I'm not looking for approximate (large-k) bounds; I am looking for the actual dependence of the upper bound on k.

/Fredrik

**Physics Forums - The Fusion of Science and Community**


# Upper bound for K-L divergence on discrete prob. space


