Given a number representing information entropy in some base, is there a well-defined way to convert it to the number that would represent the same entropy in another base? Most definitions I've read use bits, but Mathematica uses nats, and Wolfram Alpha says "nats and bits are not compatible" in response to the input "convert nats to bits". But don't the two units measure the same quantity? Perhaps by "not compatible" it means only that there isn't an affine relationship between them, as there is between, say, Celsius and Fahrenheit.

It seems like there would usually be more than one way to express a given number as a sum or integral. Suppose I found one way of expressing an entropy in nats as a sum or integral, then replaced the natural logarithms with base-2 logarithms. Would the result be the same whichever expression of the original number of nats I started from?
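For what it's worth, the change-of-base identity $\log_2 x = \ln x / \ln 2$ means the conversion is just a constant scale factor: an entropy of $H$ nats equals $H / \ln 2$ bits, regardless of which sum or integral produced $H$. A minimal sketch in Python (the function names here are just for illustration):

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy of a discrete distribution, in the given base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def nats_to_bits(h_nats):
    """Convert an entropy value from nats to bits: divide by ln 2."""
    return h_nats / math.log(2)

probs = [0.5, 0.25, 0.25]
h_bits = entropy(probs, base=2)        # computed directly in bits
h_nats = entropy(probs, base=math.e)   # computed directly in nats

# Both routes agree: converting the nats value gives the bits value.
print(h_bits, nats_to_bits(h_nats))
```

Because the scale factor is the same constant everywhere, it doesn't matter whether you replace every logarithm inside the sum or convert the final number; the results coincide.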

**Physics Forums | Science Articles, Homework Help, Discussion**


# Converting between bits, nats and dits


