Greetings,

I want to ask whether I have understood this subject correctly.

Let's say we have an order-1 (first-order) binary Markov source.

H(a) = -Paa*log2(Paa) - Pab*log2(Pab) bit/symbol, where Paa = P(a|a), Pab = P(b|a), and Paa + Pab = 1.

From what I understand, this is the average information of the symbol generated after an "a", i.e. the entropy of the next symbol (a or b, giving the pairs aa or ab) given that the current symbol is "a".

Is that right?
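As a numerical sanity check, that state entropy can be computed directly from the conditional probabilities. This is a minimal sketch; the transition probabilities Paa = 0.9, Pab = 0.1 are made-up example values, not from the question:

```python
import math

def state_entropy(p_next):
    """Entropy in bits of the next symbol given the current state:
    H(a) = -sum over x of P(x|a) * log2 P(x|a).
    Zero-probability transitions contribute nothing (0 * log 0 = 0)."""
    return -sum(p * math.log2(p) for p in p_next.values() if p > 0)

# Hypothetical transition probabilities out of state 'a';
# they must sum to 1: P(a|a) = 0.9, P(b|a) = 0.1.
p_given_a = {"a": 0.9, "b": 0.1}
print(state_entropy(p_given_a))  # about 0.469 bit/symbol
```

Note that this is the entropy of one state only; the entropy rate of the whole source would average H(a) and H(b) weighted by the stationary probabilities of the two states.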

**Physics Forums - The Fusion of Science and Community**


# About Markov source entropy

