# Inverse source coding theorem

The source coding theorem tells us that, given a discrete probability distribution, there is an optimal encoding for it. Is it possible to go in the reverse direction? That is, suppose you start with an encoding of a discrete random variable X whose distribution is unknown. Assuming that this encoding is optimal, can one derive the distribution of X?
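To make the forward direction concrete, here is a minimal sketch of Huffman coding, which produces an optimal prefix code for a given distribution. The distribution and symbol names below are made up for illustration; this computes only the codeword lengths, which are what matter for optimality.

```python
import heapq

def huffman_code_lengths(probs):
    """Return optimal (Huffman) codeword lengths for a distribution.

    probs: dict mapping symbol -> probability (probabilities sum to 1).
    """
    # Each heap entry: (total probability, unique tiebreaker, {symbol: depth}).
    # The tiebreaker prevents Python from ever comparing the dicts.
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol in them
        # moves one level deeper, so its codeword grows by one bit.
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Hypothetical dyadic distribution, chosen so the answer is exact
dist = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_code_lengths(dist)
# lengths == {"a": 1, "b": 2, "c": 3, "d": 3}
```

For this dyadic example the resulting lengths are exactly -log2(p) for each symbol, so the code's expected length equals the entropy.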

Suppose that one could determine the distribution of X from the optimal encoding. Then we could read off the a priori probability of any string in the encoding. In other words, that would let us construct a concept of probability out of nothing but a formal language!
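A partial answer for one special case can be sketched directly. If a binary prefix code is optimal for a dyadic source (all probabilities are powers of 1/2), the codeword lengths l_i pin the distribution down uniquely as p_i = 2^(-l_i), since a complete code has Kraft sum exactly 1. This is a sketch under that assumption; for non-dyadic sources the same lengths are optimal for many distributions, so the inversion is underdetermined.

```python
def implied_distribution(code_lengths):
    """Invert an optimal binary prefix code under a dyadic-source assumption.

    code_lengths: dict mapping symbol -> codeword length in bits.
    If the code is complete (Kraft sum == 1), the unique dyadic
    distribution it is optimal for is p_i = 2**(-l_i).
    """
    kraft = sum(2.0 ** -l for l in code_lengths.values())
    if abs(kraft - 1.0) > 1e-9:
        raise ValueError("code is not complete; distribution is underdetermined")
    return {s: 2.0 ** -l for s, l in code_lengths.items()}

# Hypothetical codeword lengths for illustration
lengths = {"a": 1, "b": 2, "c": 3, "d": 3}
dist = implied_distribution(lengths)
# dist == {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
```

Note the design choice of rejecting incomplete codes: if the Kraft sum is strictly below 1, some codeword could be shortened, so the code was not optimal for any distribution in the first place.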

**Physics Forums - The Fusion of Science and Community**




