In statistics, the likelihood ratio of two probability distributions f(x), g(x) with the same support (for simplicity) is L(x) = f(x)/g(x).

It is often simpler to work with the log likelihood ratio l(x) = ln(f(x)/g(x)) = ln(f(x)) - ln(g(x)).

The Kullback-Leibler information number is defined as E{l(x)} using f(x) as the true distribution, i.e. the expected value of the log likelihood ratio when the true distribution is in the NUMERATOR of the likelihood ratio.

Is there a name for the analogous concept but without taking logs?

That is, is there a name for the expected value of the likelihood function E{L(x)} assuming that the true distribution is f(x)?

E{L(x)} = ∫ over the support of f(x)²/g(x) dx
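For what it's worth, this integral is easy to check numerically. A minimal Python sketch, using two unit-variance normals as an illustrative choice (not from the post): for f = N(μ_f, σ²) and g = N(μ_g, σ²), completing the square shows the integral has the closed form exp((μ_f − μ_g)²/σ²), which the midpoint-rule approximation below reproduces.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def expected_likelihood_ratio(mu_f, mu_g, sigma, lo=-12.0, hi=12.0, n=100_000):
    """Approximate E_f[L(X)] = E_f[f(X)/g(X)] = int f(x)^2 / g(x) dx
    by a midpoint Riemann sum over [lo, hi] (tails are negligible here)."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        fx = normal_pdf(x, mu_f, sigma)
        gx = normal_pdf(x, mu_g, sigma)
        total += fx * fx / gx * h
    return total

# f = N(0,1), g = N(1,1): closed form is exp((0-1)^2 / 1) = e
approx = expected_likelihood_ratio(0.0, 1.0, 1.0)
print(approx)  # close to e ≈ 2.71828
```

Note that unlike E{l(x)}, which is finite for these two normals, E{L(x)} grows exponentially in the squared mean separation, so the quadrature interval must comfortably cover the bump of f²/g (here centered at x = −1).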

A name or a reference to a book or article would be very helpful.

**Physics Forums - The Fusion of Science and Community**


# Statistics Definition


