- #1

techmologist


- #2

chiro

Science Advisor


Are you referring to the long-term distribution of symbols? In Markov chain analysis there is something called the stationary distribution, which you can calculate from a valid transition matrix. It gives the long-run probability of observing a particular value as the number of steps of the process grows large (or tends to infinity).
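As a minimal sketch of that calculation, assuming a made-up 2-state transition matrix (the matrix `P` and its entries are purely illustrative): the stationary distribution is the left eigenvector of `P` for eigenvalue 1, and iterating the chain from any starting distribution converges to it.

```python
import numpy as np

# A hypothetical 2-state transition matrix (rows sum to 1);
# P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# The stationary distribution pi satisfies pi @ P = pi, i.e. pi is a
# left eigenvector of P with eigenvalue 1 (a right eigenvector of P.T).
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()                      # normalize to a probability vector

# Iterating the chain from any start distribution converges to pi.
mu = np.array([1.0, 0.0])
for _ in range(100):
    mu = mu @ P

print(pi)   # stationary distribution
print(mu)   # distribution after 100 steps -- essentially equal to pi
```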

- #3

techmologist


Another argument that I find entirely convincing, though it isn't a proof, is this. If L_n is the average codeword length of a Huffman code for blocks of n source symbols, then L_n/n approaches the source entropy H(S) as n increases. Since the entropy of a block of n source symbols is n*H(S), the average entropy per code symbol is n*H(S)/L_n, which tends to 1 bit. That is, the binary code symbols tend toward being equiprobable.

This argument has the same gap as my first one: it assumes that averages of products and quotients equal the products and quotients of the averages. They can't differ by much for the problem at hand, but that fact would still require a proof. It makes me wonder whether there isn't a better, more direct argument.
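The claim that L_n/n approaches H(S) can be checked numerically. The sketch below assumes a Bernoulli(0.9) source (an arbitrary choice) and uses the identity that the average Huffman codeword length equals the sum of the merged-node probabilities formed during the Huffman construction:

```python
import heapq
import itertools
import math

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for `probs`,
    using the identity: average length = sum of merged-node probabilities."""
    c = itertools.count()               # tie-breaker for equal probabilities
    heap = [(q, next(c)) for q in probs]
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        q1, _ = heapq.heappop(heap)
        q2, _ = heapq.heappop(heap)
        total += q1 + q2                # this merge adds one bit under both subtrees
        heapq.heappush(heap, (q1 + q2, next(c)))
    return total

p = 0.9                                 # Bernoulli source, P(0) = 0.9 -- illustrative
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

ratios = {}
for n in (1, 2, 4, 8):
    block_probs = [
        math.prod(p if b == 0 else 1 - p for b in bits)
        for bits in itertools.product((0, 1), repeat=n)
    ]
    ratios[n] = huffman_avg_length(block_probs) / n
    print(n, ratios[n])

print(H)  # L_n / n decreases toward this value as n grows
```

Shannon's bound guarantees H(S) <= L_n/n < H(S) + 1/n, so the printed ratios are squeezed onto H(S).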

- #4

chiro


Remember that if you have independence, entropy is additive and the mutual information is 0. If you have a distribution for the Huffman codewords for each block and can show that it tends to uniform, then you're done.
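A quick numeric check of that additivity, with two made-up independent variables: since the joint distribution factors, H(X, Y) = H(X) + H(Y) and the mutual information comes out to zero.

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a dict of probabilities."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Two made-up independent variables X and Y.
px = {'a': 0.7, 'b': 0.3}
py = {0: 0.5, 1: 0.5}

# Under independence the joint factors: p(x, y) = p(x) * p(y).
pxy = {(x, y): px[x] * py[y] for x in px for y in py}

# Entropy is additive, so the mutual information
# I(X; Y) = H(X) + H(Y) - H(X, Y) is zero.
I = entropy(px) + entropy(py) - entropy(pxy)
print(I)
```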

- #5

techmologist


- #6

chiro


Before attempting a full-fledged analytic proof, try a simulation to generate the distribution of the final Huffman codewords, rather than deriving it analytically.

For the analytic distribution, you could use a computer program to generate an explicit distribution by mapping each n-bit block to its corresponding Huffman codeword, and then calculate the entropy of that.

This handles an exhaustive case for a given n, so it won't be a general proof. But once you have it, you can look at the structure and see whether you can make an inductive argument about either the distribution itself or just the entropy.

What I mean is: have the computer generate an explicit PDF for a given n, then take that and attempt a symbolic extension to a PDF for general n.

So, to verify: first run a simulation. Then try the analytic case for specific n by simply mapping each block to its corresponding Huffman codeword and going from there.

Having explicit distributions for a few values of n on your screen should help you spot the pattern you are after.
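A sketch of that simulation, again assuming a Bernoulli(0.9) source (an arbitrary choice): build the Huffman code for n-symbol blocks, encode a long random sample, and watch the fraction of 1 bits in the output move toward 1/2 as n grows. Which branch gets labelled 0 or 1 depends on tie-breaking, so only the trend toward 1/2 matters, not which side of it a given n lands on.

```python
import heapq
import itertools
import math
import random

def huffman_code(probs):
    """Map symbol index -> binary codeword string for the given probabilities."""
    c = itertools.count()                 # tie-breaker for equal probabilities
    heap = [(q, next(c), i) for i, q in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        q1, _, left = heapq.heappop(heap)
        q2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (q1 + q2, next(c), (left, right)))
    code = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: recurse into both subtrees
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:                             # leaf: record the codeword
            code[node] = prefix or '0'
    walk(heap[0][2], '')
    return code

random.seed(0)
p = 0.9                                   # Bernoulli source, P(0) = 0.9 (illustrative)
fracs = {}
for n in (1, 4, 8):
    symbols = list(itertools.product((0, 1), repeat=n))
    index = {s: i for i, s in enumerate(symbols)}
    probs = [math.prod(p if b == 0 else 1 - p for b in s) for s in symbols]
    code = huffman_code(probs)
    # Encode 20000 random blocks and count the 1s in the concatenated output.
    out = ''.join(
        code[index[tuple(0 if random.random() < p else 1 for _ in range(n))]]
        for _ in range(20000)
    )
    fracs[n] = out.count('1') / len(out)
    print(n, fracs[n])
```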

- #7

techmologist

