Information Theory - Shannon's "Self-Information" units


by Mihel
Tags: bits, information theory, shannon
Mihel
#1
Mar27-12, 07:45 AM
P: 1
Hi,

I'm familiar with information and coding theory, and I know that the unit of the Shannon information content, -log_2(P(A)), is the "bit", where a "bit" is a "binary digit", or a storage device that has two stable states.

But can someone rigorously prove that the units are actually "bits"? Or should we simply accept it as a definition and then justify it with coding examples?

Thanks!
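
For concreteness, here is a minimal Python sketch (not part of the original post) that just evaluates the self-information formula quoted above; the helper name self_information_bits is purely illustrative:

import math

def self_information_bits(p):
    """Shannon self-information -log2(p), measured in bits."""
    return -math.log2(p)

# A fair coin flip (P = 1/2) carries exactly 1 bit of self-information,
# matching the intuition that one binary digit records the outcome.
print(self_information_bits(0.5))   # 1.0
print(self_information_bits(0.25))  # 2.0 bits for a 1-in-4 event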
totinen
#2
Apr8-12, 06:44 AM
P: 3
No, you can use whatever logarithmic base you like. With the natural logarithm (base e) the unit is the "nat"; with the base-2 logarithm the unit is the "bit".
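
To illustrate that the choice of base only rescales the value (1 bit = ln 2 nats), here is a small Python sketch under that reading of the answer; the function name is just for illustration:

import math

def self_information(p, base=2):
    """Self-information -log_base(p); base 2 gives bits, base e gives nats."""
    return -math.log(p) / math.log(base)

p = 0.5
bits = self_information(p, base=2)       # 1.0 bit
nats = self_information(p, base=math.e)  # ~0.693 nats

# Changing the base only changes the unit: 1 bit = ln(2) nats.
assert abs(nats - bits * math.log(2)) < 1e-12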

