Entropy of a message of N bits (help)

  • Thread starter: T-7
  • Tags: Bits, Entropy

Homework Help Overview

The discussion revolves around calculating the entropy associated with the reception of N bits, using the formula for information entropy. The original poster expresses uncertainty about their understanding and attempts to derive the entropy based on the arrangement of bits and their probabilities.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to define the entropy using a summation over possible arrangements of bits, questioning their understanding of the probability distribution and the application of the entropy formula. Other participants suggest considering a Bernoulli distribution for independent bits and provide a formula for entropy based on that distribution.

Discussion Status

Participants are exploring different interpretations of the entropy calculation, with some providing guidance on using specific probability distributions. The original poster is seeking clarification on how to apply the suggested formulas and is working through their own understanding of the problem.

Contextual Notes

The original poster notes a lack of resources and guidance from lectures, which may contribute to their uncertainty in applying the concepts discussed.

T-7
Hi,

This is a re-launch of an earlier question that I have narrowed down to just part a). I have also added my musings on it so far. This is all very new to me (and we haven't been given much in the lectures to go by [nor is there a course text]). I'd appreciate some help asap!

Homework Statement



Lack of information is defined as

[tex]S_{\text{info}} = -\sum_{i} p_{i}\log_{2}p_{i}[/tex]

([tex]p_{i}[/tex] is the probability of the outcome i). Calculate the entropy associated with the reception of N bits.

The Attempt at a Solution



(This is very much blind guess work, but I expect that will become obvious...)

Ok. I take this to be a summation over an ensemble of possible arrangements, each with its own probability [tex]p_{i}[/tex].

Well, a bit is either on or off (1 or 0): two choices. Suppose I define [tex]\Omega_{i}[/tex] to be the number of ways of arranging the bits (microstates) for a case where there are x in state 1 and y = N - x in state 0 (a macrostate).

[tex]\Omega_{i} = \frac{N!}{x!y!}[/tex]

There are N bits. The sum runs from the first to the last possible outcome (macrostate), i = 0 to N. So rewrite

[tex]\Omega_{i} = \frac{N!}{(N-i)!i!}[/tex]

(So for [tex]\Omega_{\text{first}}[/tex] we have all the bits in, say, state 0. For [tex]\Omega_{\text{last}}[/tex] we have all the bits in the other state, state 1).

We're after the probability of each macrostate, I take it. So

[tex]p_{i} = \frac{\Omega_{i}}{\text{total no. of microstates}} = \frac{\Omega_{i}}{2^{N}} = \frac{N!}{(N-i)!\,i!\,2^{N}}[/tex]

If that's true, I'd hope to be able to crunch

[tex]S_{\text{info}} = -\sum_{i} p_{i}\log_{2}p_{i}[/tex] into an expression involving just N.

Am I right so far? Probably not.

(As it happens, I crunch it all the way down to

[tex]-N\log_{2}N + N[/tex]

which is almost certainly wrong. But I need to work through it again... I'd better get this posted first :-)
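
For a sanity check on this [tex]p_{i}[/tex], here is a quick numerical sketch (Python; just my own check of the sum above, not part of the derivation):

[code]
from math import comb, log2

def macrostate_entropy(N):
    """Entropy (in bits) of the macrostate distribution p_i = C(N, i) / 2^N."""
    probs = [comb(N, i) / 2**N for i in range(N + 1)]
    return -sum(p * log2(p) for p in probs)

for N in (4, 8, 16):
    print(N, macrostate_entropy(N))
[/code]

For N = 8 this prints about 2.54 bits, which doesn't match [tex]-N\log_{2}N + N[/tex], so the crunching is off somewhere.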

Cheers!
 
Entropy is a property of a system (of its probability distribution), not of a single instance of that system.
Your N-bit message is described by some probability distribution, and you should specify that probability distribution first.
If we take a "Bernoulli"-type distribution, where the bits are independent and each is 1 with a probability of p and 0 with a probability of q=1-p, then

[tex]H(X_1,X_2,\ldots,X_N) = H(X_1) + \cdots + H(X_N)[/tex] (by independence)

and [tex]H(X_i) = -p \log (p) - (1-p) \log(1-p)[/tex]

so

[tex]H = N \left[ -p \log (p) - (1-p) \log(1-p) \right][/tex]

in particular, if p=1/2, [tex]H = N \log 2[/tex], i.e. N bits.
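
If you want to convince yourself numerically, here is a minimal brute-force sketch (Python; the function names are mine) that enumerates all 2^N messages for a small N and compares the direct sum with the closed form:

[code]
from itertools import product
from math import log2

def message_entropy(N, p):
    """Direct entropy (in bits): sum of -prob * log2(prob) over all 2^N messages."""
    H = 0.0
    for msg in product((0, 1), repeat=N):
        k = sum(msg)                    # number of 1-bits in this message
        prob = p**k * (1 - p)**(N - k)  # probability of this exact message
        H -= prob * log2(prob)
    return H

def per_bit_formula(N, p):
    """N * h(p): N times the entropy of a single Bernoulli(p) bit."""
    return N * (-p * log2(p) - (1 - p) * log2(1 - p))

print(message_entropy(6, 0.3), per_bit_formula(6, 0.3))  # these agree
print(message_entropy(6, 0.5))                           # exactly 6 bits
[/code]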

--------
Assaf
http://www.physicallyincorrect.com/
 
ozymandias said:
Your N-bit message is described by some probability distribution, and you should specify that probability distribution first. [...] in particular, if p=1/2, [tex]H = N \log 2[/tex], i.e. N bits.

Thanks for showing me how you got to the answer. It seems, however, that we are expected to use the formula above explicitly: working out a function [tex]p_{i}[/tex] and summing over it.

I probably need to look at your post again more carefully...
 
T-7 said:
It seems, however, that we are expected to use the formula above explicitly: working out a function [tex]p_{i}[/tex] and summing over it.

I don't quite understand what you mean by that, but if you'll elaborate I'll see if I can help.

--------
Assaf
http://www.physicallyincorrect.com/
 
ozymandias said:
I don't quite understand what you mean by that, but if you'll elaborate I'll see if I can help.


Well, I have tried to work out a function [tex]p_{i}[/tex] to fit into [tex]S_{\text{info}} = -\sum_{i} p_{i}\log_{2}p_{i}[/tex], and to crunch the sum into something that is a function of N alone (apparently my function is mistaken, though it seemed logical?).
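
One reading I can see that would make the sum collapse to a function of N alone (if p = 1/2 and each whole N-bit message counts as an outcome i, giving [tex]2^{N}[/tex] equally likely outcomes):

[tex]p_{i} = \frac{1}{2^{N}} \quad\Rightarrow\quad S_{\text{info}} = -\sum_{i=1}^{2^{N}} \frac{1}{2^{N}}\log_{2}\frac{1}{2^{N}} = N[/tex]

That would agree with your [tex]H = N \log 2[/tex]. Is that the intended interpretation, i.e. was my mistake summing over macrostates, which lumps together all the messages with the same number of 1s?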
 
