Hi,

Some questions on information theory -- something rather new to me. I'm not quite sure what to do with these:

1. The problem statement, all variables and given/known data

a) Lack of information is defined as

[tex]S_{info} = - \sum_{i} p_{i}log_{2}p_{i} [/tex]

([tex]p_{i}[/tex] is the probability of the outcome i). Calculate the entropy associated with the reception of N bits.

b) The human brain has approx. [tex]10^{10}[/tex] neurons, each connected to 1000 neighbours. If one assumes that memory consists in activating or deactivating each bond, what is the maximum amount of information that may be stored in the brain?

3. The attempt at a solution

a) Well, a bit is either on or off (1 or 0), so I suppose that each bit received has a probability of 0.5 attached to it (there are two states, and there is no mathematical reason to think one more probable than the other). So I would say all the [tex]p_{i}[/tex] are 0.5. If there are N bits, then that's N 'outcomes', each with probability 0.5.

So does this simply boil down to

[tex]S_{info} = - \sum_{i=1}^{N} 0.5 \times log_{2}(0.5) = N \times 0.5 [/tex] (?)
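As a quick numerical check of the definition (a sketch only -- the function name `s_info` and the sets of outcomes fed to it are my own choices, not part of the problem), here is the sum evaluated for equiprobable outcomes:

```python
import math

def s_info(probs):
    """Lack of information S_info = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A single fair bit: two outcomes, each with probability 0.5.
print(s_info([0.5, 0.5]))  # 1.0

# N independent fair bits, treated as 2**N equally likely outcomes i.
N = 8
print(s_info([2**-N] * 2**N))  # 8.0
```

Note the result depends on what you take the outcomes i to be: summing over the possible N-bit strings (each with probability [tex]2^{-N}[/tex]) is not the same sum as one term of 0.5 per bit.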

b) I guess it's a bit per bond -- that is, each activation/deactivation of a bond corresponds to one bit. But do you think the 'neighbours' it refers to are supposed to be in addition to the [tex]10^{10}[/tex] collection? In that case, is it merely [tex]10^{10} \times 1000[/tex] bits of information? Too easy, I suspect... Hmm.
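For what it's worth, the raw counting can be sketched as follows (with the caveat, my assumption and not the problem's, that if the neighbours belong to the same [tex]10^{10}[/tex] collection, each bond is shared by two neurons and the naive product counts every bond twice):

```python
# Bond counting for the brain-memory estimate.
neurons = 10**10
neighbours_each = 1000

# Naive count: one entry per (neuron, neighbour) pair.
ordered_pairs = neurons * neighbours_each

# If bonds are shared within the same collection, each bond
# appears twice in the ordered count.
shared_bonds = ordered_pairs // 2

print(ordered_pairs)  # 10000000000000
print(shared_bonds)   # 5000000000000
```

At one bit per bond, the two conventions differ only by a factor of 2, which may or may not matter for an order-of-magnitude answer.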

A little guidance would be appreciated!

Cheers.

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Information theory questions
