Information Theory - Shannon's Self-Information units

by Mihel
Tags: bits, information theory, shannon
Mar27-12, 07:45 AM
P: 1

I'm familiar with information and coding theory, and I know that the unit of Shannon self-information, -log_2(P(A)), is the "bit", where a "bit" is a "binary digit", or a storage device with two stable states.

But can someone rigorously prove that the unit actually is the "bit"? Or should we just accept it as a definition and then justify it with coding examples?
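As a sanity check on the definition, the self-information of a fair coin flip should come out to exactly 1 bit, since one binary digit suffices to record the outcome. A minimal sketch (the function name `self_information_bits` is mine, not from any library):

```python
import math

def self_information_bits(p):
    """Shannon self-information -log2(P(A)) in bits."""
    return -math.log2(p)

# A fair coin flip: P = 1/2 -> exactly 1 bit
print(self_information_bits(0.5))   # 1.0
# One face of a fair die: P = 1/6 -> about 2.585 bits
print(self_information_bits(1 / 6))
```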

Apr8-12, 06:44 AM
P: 3
No, you can use whatever logarithmic base you like. With base e the unit is the "nat" (natural unit of information), and with base 2 it is the "bit".
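Changing the base only rescales the unit by a constant factor, since log_b(x) = ln(x)/ln(b); in particular 1 nat = 1/ln(2) ≈ 1.4427 bits. A small sketch of the conversion:

```python
import math

p = 0.25  # probability of some event A

bits = -math.log2(p)   # self-information in bits (base 2)
nats = -math.log(p)    # self-information in nats (base e)

print(bits)                # 2.0
print(nats)                # ~1.386
# The units differ only by a constant factor: 1 nat = 1/ln(2) bits
print(nats / math.log(2))  # converts nats to bits -> 2.0
```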
