# Entropy of a message of N bits (help!)

1. Nov 22, 2007

### T-7

Hi,

This is a re-launch of an earlier question that I have narrowed down to just part a). I have also added my musings on it so far. This is all very new to me (and we haven't been given much in the lectures to go by [nor is there a course text]). I'd appreciate some help asap!

1. The problem statement, all variables and given/known data

Lack of information is defined as

$$S_{info} = - \sum_{i} p_{i}\log_{2}p_{i}$$

($$p_{i}$$ is the probability of the outcome i). Calculate the entropy associated with the reception of N bits.

3. The attempt at a solution

(This is very much blind guess work, but I expect that will become obvious...)

Ok. I take this to be a summation over an ensemble of possible arrangements, each with its own probability $$p_{i}$$.

Well, a bit is either on or off (1 or 0): 2 choices. Suppose I define $$\Omega_{i}$$ to be the number of ways of arranging the bits (microstates) for a case where there are x in state 1 and y = N - x in state 0 (a macrostate).

$$\Omega_{i} = \frac{N!}{x!y!}$$

There are N bits, so there are N+1 possible macrostates. Index them by i, the number of bits in state 1 (running from 0 to N), and rewrite

$$\Omega_{i} = \frac{N!}{(N-i)!i!}$$

(So for $$\Omega_{first}$$ we have all the bits in, say, state 0. For $$\Omega_{last}$$ we have all the bits in the other state - state 1).

We're after the probability of each macrostate, I take it. So

$$p_{i} = \frac{\Omega_{i}}{\text{total no. of microstates}} = \frac{\Omega_{i}}{2^{N}} = \frac{N!}{(N-i)!\,i!\,2^{N}}$$

If that's true, I'd hope to be able to crunch

$$S_{info} = - \sum_{i} p_{i}\log_{2}p_{i}$$ into an expression involving just N.

Am I right so far? Probably not.

As it happens, I crunch it all the way down to

$$-N\log_{2}N + N$$

which is almost certainly wrong. But I need to work through it again... I'd better get this posted first :-)
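For what it's worth, the two candidate sums can be checked numerically. The following is just an illustrative Python sanity-check (not part of the assignment): it evaluates the entropy of the macrostate distribution $$p_{i} = \binom{N}{i}/2^{N}$$ proposed above, and, for comparison, the entropy when all $$2^{N}$$ possible messages are taken to be equally likely, for N = 10.

```python
from math import comb, log2

N = 10

# Macrostate probabilities: p_i = C(N, i) / 2^N, with i = number of bits in state 1
p = [comb(N, i) / 2**N for i in range(N + 1)]

# Entropy of the macrostate distribution (sum over the N+1 macrostates)
H_macro = -sum(pi * log2(pi) for pi in p)

# Entropy when every one of the 2^N messages is equally likely (p = 1/2^N each)
H_full = -sum((1 / 2**N) * log2(1 / 2**N) for _ in range(2**N))

print(sum(p))    # 1.0 -- the p_i do form a valid probability distribution
print(H_macro)   # ~2.71 bits for N = 10: far less than N
print(H_full)    # 10.0 bits: exactly N
```

If the macrostate sum were the intended quantity, one would hope for N bits; the fact that it comes out far below N, while the sum over all equally likely messages gives exactly N, suggests the summation index i should run over the 2^N possible messages rather than the N+1 macrostates.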

Cheers!

Last edited: Nov 22, 2007
2. Nov 22, 2007

### ozymandias

Entropy is a property of the probability distribution describing a system, not of a single instance of that system.
Your N-bit message is described by some probability distribution, and you should specify that distribution first.
If we take a Bernoulli-type distribution, where the bits are independent and each is 1 with probability p and 0 with probability q = 1-p, then

$$H(X_{1},X_{2},\ldots,X_{N}) = H(X_{1}) + \cdots + H(X_{N})$$ (by independence)

and $$H(X_i) = -p \log (p) - (1-p) \log(1-p)$$

so

$$H = N \left[ -p \log (p) - (1-p) \log(1-p) \right]$$

in particular, if p = 1/2, $$H = N \log 2$$ (that is, H = N bits when the logarithm is taken base 2, as in your formula).
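The formula above is easy to check numerically. A minimal Python sketch (the helper name `h` is my own, not from the thread), using base-2 logs so the answer comes out in bits:

```python
from math import log2

def h(p):
    """Binary entropy, in bits, of a single bit that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no entropy
    return -p * log2(p) - (1 - p) * log2(1 - p)

N = 8

# Independent bits: total entropy is N times the per-bit entropy
print(N * h(0.5))  # 8.0 -- a fair bit carries exactly 1 bit, so H = N
print(N * h(0.9))  # less than 8: biased bits carry under 1 bit each
```

The p = 1/2 case reproduces H = N bits, matching the intuition that each fair coin flip contributes exactly one bit of entropy.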

--------
Assaf
http://www.physicallyincorrect.com/

Last edited by a moderator: May 3, 2017
3. Nov 22, 2007

### T-7

Thanks for showing me how you got to the answer. It seems we are meant to use the formula above explicitly, however: working out a function $$p_{i}$$ and summing over it.

I probably need to look at your post again more carefully...

4. Nov 22, 2007

### ozymandias

I don't quite understand what you mean by that, but if you'll elaborate I'll see if I can help.

--------
Assaf
http://www.physicallyincorrect.com/

5. Nov 22, 2007

### T-7

Well, I have tried to work out a function $$p_{i}$$ to fit into $$S_{info} = - \sum_{i} p_{i}\log_{2}p_{i}$$ and crunch the sum into something that is a function of N alone. (Apparently my function is mistaken, though it seemed logical?)
