|Mar27-12, 07:45 AM||#1|
Information Theory - Shannon's "Self-Information" units
I'm familiar with information and coding theory, and I know that the unit of the Shannon information content, -log_2(P(A)), is the "bit", where a "bit" is a "binary digit", or a "storage device that has two stable states".
But can someone rigorously prove that the units are actually "bits"? Or should we only accept this as a definition and then justify it with coding examples?
|Apr8-12, 06:44 AM||#2|
No, you can use whatever logarithmic base you like. With base e the unit of information is the "nat", and with base 2 it is the "bit"; changing the base just rescales the value by a constant factor.
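To illustrate, here is a minimal sketch (the function name `self_information` is just for this example) showing that switching the logarithm base only rescales the number, converting between bits and nats by the constant factor ln 2:

```python
import math

def self_information(p, base=2):
    """Self-information -log_base(p) of an event with probability p."""
    return -math.log(p, base)

# A fair coin flip (p = 0.5) carries exactly 1 bit of information.
bits = self_information(0.5, base=2)        # 1 bit
nats = self_information(0.5, base=math.e)   # ln 2 ~ 0.693 nat

# Same quantity, different unit: nats = bits * ln 2
print(bits, nats, bits * math.log(2))
```

So "bit" versus "nat" is a choice of measurement unit, analogous to measuring a length in inches versus centimeters.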
|bits, information theory, shannon|