Converting between bits, nats and dits

  • Context: Graduate
  • Thread starter: Rasalhague
  • Tags: Bits
SUMMARY

The discussion focuses on converting information entropy between different bases, specifically bits and nats. Although Wolfram Alpha reports that "nats and bits are not compatible", both units measure the same quantity of information and are related by a simple multiplicative conversion factor: since \log_e(x)=\frac{\log_2(x)}{\log_2(e)}, an entropy of H nats equals \log_2(e)\,H bits. Converting an entropy expressed in nats to bits therefore only requires multiplying by this constant, regardless of how the defining sum or integral is written.

PREREQUISITES
  • Understanding of information theory concepts, particularly entropy
  • Familiarity with logarithmic functions and their properties
  • Knowledge of base conversions in mathematics
  • Basic proficiency in mathematical notation and summation
NEXT STEPS
  • Research the properties of logarithms, specifically the relationship between natural logarithms and base 2 logarithms
  • Explore the concept of entropy in information theory, focusing on its mathematical definitions and applications
  • Learn about the implications of using different bases in entropy calculations
  • Investigate the practical applications of converting between bits and nats in data compression and information transmission
USEFUL FOR

Mathematicians, information theorists, data scientists, and anyone involved in the fields of data compression and information transmission will benefit from this discussion.

Rasalhague
Given a number representing information entropy in some base, is there a well-defined way to convert it to the number that would represent the same entropy in another base? Most of the definitions I've read so far use bits, but Mathematica uses nats, and Wolfram Alpha says "nats and bits are not compatible" in response to the input "convert nats to bits". But don't the two units measure the same quantity? Perhaps, by "not compatible", it means only that there isn't an affine relationship between them, as there is between, say, Celsius and Fahrenheit.

It seems like there would usually be more than one way to express a given number as a sum or integral. Suppose I found one way of expressing an entropy in nats as a sum or integral, then replaced the natural logarithms with base-2 logarithms. Would the result be the same whichever expression I started from?
 
Just a simple conversion factor:

http://en.wikipedia.org/wiki/Bit#Other_information_units

Because

[tex]\log_e(x)=\frac{\log_2(x)}{\log_2(e)},[/tex]

it follows that

[tex]- \sum_{x \in A_X} \log_2(P_X(\left \{ x \right \}))\cdot P_X(\left \{ x \right \})=-\log_2(e)\sum_{x \in A_X}\log_e(P_X(\left \{ x \right \}))\cdot P_X(\left \{ x \right \}).[/tex]
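For a concrete numerical check, here is a short Python sketch (not from the thread; the example distribution is made up for illustration) that computes the same entropy directly in both bases and confirms the \log_2(e) conversion factor:

```python
import math

# A hypothetical probability distribution, chosen so the entropy
# comes out to a round number (1.75 bits).
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy computed directly in each base.
H_bits = -sum(px * math.log2(px) for px in p)  # entropy in bits
H_nats = -sum(px * math.log(px) for px in p)   # entropy in nats

# Converting nats to bits via the constant factor log2(e) = 1/ln(2)
# recovers the value computed directly in base 2.
print(H_bits)                       # ≈ 1.75
print(H_nats * math.log2(math.e))   # ≈ 1.75 as well
```

The same multiplication works for any distribution, since the factor \log_2(e) comes out of the sum term by term.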
 
