# Converting between bits, nats and dits

1. Aug 12, 2011

### Rasalhague

Given a number representing information entropy in some base, is there a well-defined way to convert it to the number representing the same entropy in another base? Most of the definitions I've read so far use bits, but Mathematica uses nats, and Wolfram Alpha says "nats and bits are not compatible" in response to the input "convert nats to bits". But don't the two units measure the same quantity? Perhaps by "not compatible" it means only that there isn't an affine relationship between them, as there is between, say, Celsius and Fahrenheit.

It seems like there would usually be more than one way to express a given number as a sum or integral. Suppose I found one way of expressing an entropy in nats as a sum or integral, then replaced the natural logarithms with base-2 logarithms. Would the result be the same whichever way I had found of expressing the original number of nats as a sum or integral?

2. Aug 12, 2011

### Rasalhague

Just a simple conversion factor: 1 nat = $\log_2(e) \approx 1.4427$ bits.

http://en.wikipedia.org/wiki/Bit#Other_information_units

Because

$$\log_e(x)=\frac{\log_2(x)}{\log_2(e)},$$

so

$$- \sum_{x \in A_X} \log_2(P_X(\left \{ x \right \}))\cdot P_X(\left \{ x \right \})=-\log_2(e)\sum_{x \in A_X}\log_e(P_X(\left \{ x \right \}))\cdot P_X(\left \{ x \right \}).$$
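A quick numerical check of this identity, using a small hypothetical distribution (the probabilities below are just an assumed example, not from the thread):

```python
import math

# A hypothetical 4-outcome distribution (assumed for illustration).
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy in bits (base-2 logs) and in nats (natural logs).
H_bits = -sum(pi * math.log2(pi) for pi in p)
H_nats = -sum(pi * math.log(pi) for pi in p)

# One nat equals log2(e) ≈ 1.4427 bits, so the two measures
# agree up to that constant factor.
print(H_bits)                        # 1.75
print(H_nats * math.log2(math.e))    # 1.75
```

Since the conversion is a single multiplicative constant pulled out of the sum, it does not matter how the entropy was decomposed into a sum or integral; the factor $\log_2(e)$ is the same either way.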

Last edited: Aug 12, 2011