Hi,
This is a relaunch of an earlier question that I have narrowed down to just part (a). I have also added my musings on it so far. This is all very new to me (we haven't been given much in the lectures to go by, nor is there a course text). I'd appreciate some help ASAP!
Homework Statement

Lack of information is defined as
[tex]S_{\text{info}} = - \sum_{i} p_{i}\log_{2}p_{i}[/tex]
([tex]p_{i}[/tex] is the probability of outcome i.) Calculate the entropy associated with the reception of N bits.
The Attempt at a Solution

(This is very much blind guesswork, but I expect that will become obvious...)
Ok. I take this to be a summation over an ensemble of possible arrangements, each with its own probability [tex]p_{i}[/tex].
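Just to convince myself I'm reading the formula correctly, here's a tiny Python sketch of the definition (the helper name s_info is my own, not from the course):

[code]
import math

def s_info(probs):
    # Lack of information in bits: S = -sum_i p_i * log2(p_i)
    # (terms with p_i = 0 contribute nothing, so skip them)
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One fair bit: two equally likely outcomes -> 1 bit of entropy
print(s_info([0.5, 0.5]))  # 1.0
[/code]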
Well, a bit is either on or off (1 or 0): two choices. Suppose I define [tex]\Omega_{i}[/tex] to be the number of ways of arranging the bits (microstates) for a case where there are x in state 1 and y in state 0, with x + y = N (a macrostate):
[tex]\Omega_{i} = \frac{N!}{x!\,y!}[/tex]
There are N bits, and the sum runs over the possible macrostates. If I let i be the number of bits in state 1 (so i runs from 0 to N, and N - i bits are in state 0), I can rewrite this as
[tex]\Omega_{i} = \frac{N!}{(N-i)!\,i!}[/tex]
(So [tex]\Omega_{0}[/tex] has all the bits in state 0, and [tex]\Omega_{N}[/tex] has all the bits in the other state, state 1.)
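As a sanity check on the counting, here's a quick numerical test in Python for an arbitrarily small N:

[code]
import math

N = 4  # small example, chosen arbitrarily
omegas = [math.comb(N, i) for i in range(N + 1)]  # Omega_i = N! / ((N-i)! i!)
print(omegas)       # [1, 4, 6, 4, 1]
print(sum(omegas))  # 16, i.e. 2**N -- every microstate counted exactly once
[/code]

The fact that the [tex]\Omega_{i}[/tex] sum to [tex]2^{N}[/tex] is what I lean on in the next step.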
We're after the probability of each macrostate, I take it. So
[tex]p_{i} = \frac{\Omega_{i}}{\text{total no. of microstates}} = \frac{\Omega_{i}}{2^{N}} = \frac{N!}{(N-i)!\,i!\,2^{N}}[/tex]
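If that's right, the [tex]p_{i}[/tex] should sum to 1. A quick check (same small N as above):

[code]
import math

N = 4
probs = [math.comb(N, i) / 2**N for i in range(N + 1)]  # p_i = Omega_i / 2^N
print(probs)       # [0.0625, 0.25, 0.375, 0.25, 0.0625]
print(sum(probs))  # 1.0 -- the macrostate probabilities are normalised
[/code]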
If that's true, I'd hope to be able to crunch
[tex]S_{\text{info}} = - \sum_{i} p_{i}\log_{2}p_{i}[/tex] into an expression involving just N.
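Before grinding through the algebra, I can at least evaluate the sum numerically for a few values of N and see what shape the answer has (this reuses my s_info sketch from above):

[code]
import math

def s_info(probs):
    # S = -sum_i p_i * log2(p_i), skipping zero-probability terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

for N in (1, 2, 4, 8):
    probs = [math.comb(N, i) / 2**N for i in range(N + 1)]
    print(N, s_info(probs))
# gives 1.0 for N=1, 1.5 for N=2, and then keeps growing, but slowly
[/code]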
Am I right so far? Probably not.
(As it happens, I crunch it all the way down to
[tex]-N\log_{2}N + N[/tex]
which is almost certainly wrong, but I need to work through it again... I'd better get this posted first! :-) )
Cheers!