Information Theory - Shannon's Self-Information units

  • Thread starter Mihel
I'm familiar with information and coding theory, and I know that the unit of the Shannon information content, -log_2(P(A)), is the "bit", where a "bit" is a "binary digit", or "a storage device that has two stable states".

But can someone rigorously prove that the units are actually "bits"? Or should we just accept it as a definition and then justify it with coding examples?

Re: Information Theory - Shannon's "Self-Information" units

No, you can use whatever logarithmic base you like; the base only fixes the unit. With the natural logarithm (base e) the unit is the "nat", and with base 2 it is the "bit". Changing the base merely rescales the value, since log_b(x) = ln(x)/ln(b).
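A quick sketch of this in Python (the helper name self_information is my own, not a standard library function) shows that the choice of base just rescales the same quantity, with 1 bit = ln 2 nats:

```python
import math

def self_information(p, base=2):
    """Shannon self-information -log_base(p) of an event with probability p."""
    if not 0 < p <= 1:
        raise ValueError("p must be in (0, 1]")
    return -math.log(p, base)

# A fair coin flip: probability 1/2.
bits = self_information(0.5, base=2)        # 1.0 bit
nats = self_information(0.5, base=math.e)   # ln 2 nats

# The two units differ only by the constant factor ln 2.
print(bits, nats, nats / bits)
```

Running it, the ratio nats/bits comes out as ln 2 ≈ 0.693, confirming that "bit" versus "nat" is a unit convention, not a different quantity.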
