
Homework Help: Entropy of a message of N bits (help!)

  1. Nov 22, 2007 #1




    This is a relaunch of an earlier question that I have narrowed down to just part (a). I have also added my musings on it so far. This is all very new to me (we haven't been given much in the lectures to go by, nor is there a course text). I'd appreciate some help ASAP!

    1. The problem statement, all variables and given/known data

    Lack of information is defined as

    [tex]S_{\mathrm{info}} = - \sum_{i} p_{i}\log_{2} p_{i} [/tex]

    ([tex]p_{i}[/tex] is the probability of the outcome i). Calculate the entropy associated with the reception of N bits.
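    For concreteness, the definition can be evaluated numerically. A minimal sketch (my assumption, not stated in the problem: all [tex]2^{N}[/tex] possible N-bit messages are equally likely, so each [tex]p_{i} = 2^{-N}[/tex]):

    ```python
    import math

    def shannon_entropy(probs):
        """S_info = -sum_i p_i * log2(p_i); terms with p_i = 0 contribute nothing."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    N = 8
    # Assumption: all 2**N possible N-bit messages are equally likely.
    uniform = [2.0 ** -N] * (2 ** N)
    print(shannon_entropy(uniform))  # 8.0 -> one bit of entropy per received bit
    ```

    Under that (assumed) uniform distribution the entropy comes out to exactly N bits.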

    3. The attempt at a solution

    (This is very much blind guess work, but I expect that will become obvious...)

    OK. I take this to be a summation over an ensemble of possible arrangements, each with its own probability [tex]p_{i}[/tex].

    Well, a bit is either on or off (1 or 0). 2 choices. Suppose I define [tex]\Omega_{i}[/tex] to be the number of ways of arranging the bits (microstates) for a case where there are x in state 1 and y in state 0 (a macrostate).

    [tex]\Omega_{i} = \frac{N!}{x!y!}[/tex]

    There are N bits. The sum runs from the first to the last possible outcome (macrostate). So rewrite

    [tex]\Omega_{i} = \frac{N!}{(N-i)!i!}[/tex]

    (So for [tex]\Omega_{first}[/tex] we have all the bits in, say, state 0. For [tex]\Omega_{last}[/tex] we have all the bits in the other state - state 1).

    We're after the probability of each macrostate, I take it. So

    [tex]p_{i} = \frac{\Omega_{i}}{\text{total no. of microstates}} = \frac{\Omega_{i}}{2^{N}} = \frac{N!}{(N-i)!\,i!\,2^{N}}[/tex]
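    If it helps, a quick numeric check of these [tex]p_{i}[/tex] (just a sketch to verify they sum to 1 and to see how big the resulting sum actually is):

    ```python
    import math
    from math import comb  # comb(N, i) = N! / ((N - i)! * i!), Python 3.8+

    N = 16
    # Macrostate probabilities: p_i = C(N, i) / 2**N for i = 0..N ones.
    p = [comb(N, i) / 2 ** N for i in range(N + 1)]
    print(sum(p))  # 1.0 -> a valid probability distribution

    S = -sum(pi * math.log2(pi) for pi in p)
    print(S, "vs N =", N)  # S comes out far smaller than N
    ```

    So the macrostate sum measures the uncertainty in the *number* of 1s, which is much less than N bits; it is not the entropy of the full N-bit message.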

    If that's true, I'd hope to be able to crunch

    [tex]S_{\mathrm{info}} = - \sum_{i} p_{i}\log_{2} p_{i} [/tex] into an expression involving just N.

    Am I right so far? Probably not.

    ( As it happens, I crunch it all the way down to

    [tex]-N\log_{2}N + N[/tex]

    which is almost certainly wrong. But I need to work through it again... I'd better get this posted first :-)

    Last edited: Nov 22, 2007
  3. Nov 22, 2007 #2
    Entropy is a property of a probability distribution (an ensemble of possible messages), not of a single instance of that system.
    Your N-bit message is described by some probability distribution, and you should specify that distribution first.
    If we take a "Bernoulli"-type distribution, where the bits are independent and each is 1 with a probability of p and 0 with a probability of q=1-p, then

    [tex]H(X_1, X_2, \ldots, X_N) = H(X_1) + \cdots + H(X_N)[/tex] (by independence)

    and [tex]H(X_i) = -p \log (p) - (1-p) \log(1-p) [/tex]


    [tex]H = N \left[ -p \log (p) - (1-p) \log(1-p) \right] [/tex]

    in particular, if p=1/2, [tex]H = N \log \left( 2\right) [/tex] — that is, exactly N bits when the logarithm is taken base 2.
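    The independence argument above can be sketched numerically (h_bit is just an illustrative name for the per-bit entropy):

    ```python
    import math

    def h_bit(p):
        """Entropy (in bits) of a single bit that is 1 with probability p."""
        if p in (0.0, 1.0):
            return 0.0  # a deterministic bit carries no information
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    N = 10
    print(N * h_bit(0.5))  # 10.0 -> H = N bits for fair bits
    print(N * h_bit(0.9))  # less than 10: biased bits carry less information
    ```

    The p=1/2 case gives one bit of entropy per bit received, hence H = N.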

  4. Nov 22, 2007 #3



    Thanks for showing me how you got to the answer. It seems we are meant to use the formula above explicitly, though: work out a function [tex]p_{i}[/tex] and evaluate the sum.

    I probably need to look at your post again more carefully...
  5. Nov 22, 2007 #4
    I don't quite understand what you mean by that, but if you'll elaborate I'll see if I can help.

  6. Nov 22, 2007 #5



    Well, I have tried to work out a function [tex]p_{i}[/tex] to fit into [tex]S_{\mathrm{info}} = - \sum_{i} p_{i}\log_{2} p_{i} [/tex], and to reduce the sum to something that is just a function of N. (Apparently my function is mistaken, though it seemed logical?)