Hi

The source coding theorem says that one needs at least N*H bits to encode a message of N symbols drawn from a source with per-symbol entropy H. This is supposedly the theoretical limit of data compression.
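Just to make the N*H bound concrete, here is a tiny numeric sketch with a made-up 3-symbol alphabet (the probabilities are assumed for illustration, not from any particular source):

```python
import math

# Toy source: 3 symbols with assumed probabilities 1/2, 1/4, 1/4.
probs = [0.5, 0.25, 0.25]

# Shannon entropy in bits per symbol.
H = -sum(p * math.log2(p) for p in probs)

N = 1000  # message length in symbols
print(H, N * H)  # H = 1.5 bits/symbol, so at least 1500 bits for N = 1000
```

With these probabilities an optimal prefix code (e.g. Huffman: 0, 10, 11) actually achieves the bound exactly.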

But is it? Or does it only apply to situations where only the frequencies (probabilities) of the individual symbols are known?

For example, if I know that the symbol x is always (or with very high probability) followed by the symbol y, couldn't I use this to construct a compression algorithm that needs fewer than N*H bits?
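A quick numeric sketch of exactly this situation, using a hypothetical two-symbol Markov source (the transition probabilities are assumptions chosen so that x is almost always followed by y). It compares the per-symbol entropy H computed from symbol frequencies alone with the conditional entropy rate that accounts for the dependency:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero-safe)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-(nz * np.log2(nz)).sum())

# Hypothetical Markov source over {x, y}: rows are the current symbol,
# columns the next symbol. P[x -> y] = 0.99 models "x is almost always
# followed by y"; after a y, the next symbol is assumed 50/50.
P = np.array([[0.01, 0.99],   # from x
              [0.50, 0.50]])  # from y

# Stationary symbol frequencies pi satisfy pi = pi @ P
# (left eigenvector of P for eigenvalue 1).
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

H_marginal = entropy(pi)  # entropy from frequencies alone
H_rate = sum(pi[i] * entropy(P[i]) for i in range(2))  # conditional entropy rate

print(f"marginal entropy      = {H_marginal:.3f} bits/symbol")
print(f"conditional ent. rate = {H_rate:.3f} bits/symbol")
```

The conditional rate comes out strictly below the marginal entropy (~0.69 vs ~0.92 bits/symbol here), so a coder that exploits the dependency can indeed beat N*H when H is computed from symbol frequencies alone; for sources with memory, the theorem's true limit is N times the entropy *rate*.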

thx

**Physics Forums - The Fusion of Science and Community**


# Limit of compression (source coding theorem)
