
Why Is There No SI Unit For Information?

  1. Jul 13, 2017 #1

    Nim

    Why is the quantity "information" missing from the SI system?

    Also, if they did add it, do you think it would be added as a base unit or dimensionless derived unit?
     
  3. Jul 13, 2017 #2
    I think the unit of information is the Bit.
     
  4. Jul 13, 2017 #3
    I don't think it is "missing"; all scientifically pertinent information is already standardized in SI values.
     
  5. Jul 13, 2017 #4

    jbriggs444

    Science Advisor

    Information is a dimensionless quantity: the negative log of the probability of occurrence. The question of units comes down to the question of what base to use when taking the log. Euler's number is a good choice. Base 2 (i.e. the bit) is another viable choice.
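    A minimal Python sketch of that definition (the probability value below is just an illustrative example): the self-information is -log(p), and the choice of log base only changes the unit.

        import math

        p = 0.5  # example: probability of a fair coin landing heads

        info_bits = -math.log2(p)       # base 2  -> bits (shannons)
        info_nats = -math.log(p)        # base e  -> nats
        info_hartleys = -math.log10(p)  # base 10 -> hartleys

        print(info_bits, info_nats, info_hartleys)
        # 1.0 bit, ~0.693 nat, ~0.301 hartley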
     
  6. Jul 13, 2017 #5

    sophiecentaur

    Science Advisor
    Gold Member

    That's only a binary digit. There's nothing fundamental about binary coding of information. Standard computer memory capacity can be described in numbers of bits, and the bit is useful as a comparison unit, but the possible information content goes well beyond memory capacity.
    This wiki link will give you some background on Information Theory. Interestingly, the original work on information theory did not use 'bits' because binary computers were in their infancy; Shannon started off using Morse code as a model.
     
  7. Jul 13, 2017 #6
    That article links to another on the "hartley" as a unit of information, relevant within the domain of information theory & application of that theory: https://en.m.wikipedia.org/wiki/Hartley_(unit)

    To quote from the article:

    The hartley (symbol Hart), also called a ban, or a dit (short for decimal digit), is a logarithmic unit which measures information or entropy, based on base 10 logarithms and powers of 10, rather than the powers of 2 and base 2 logarithms which define the bit, or shannon. One ban or hartley is the information content of an event if the probability of that event occurring is 1/10.[1] It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value . . .

    Though not an SI unit, the hartley is part of the International System of Quantities, defined by International Standard IEC 80000-13 of the International Electrotechnical Commission. It is named after Ralph Hartley.
    My math knowledge approaches the limit 0, but judging by the mention of certain terms, the above sounds related to what @jbriggs444 has suggested - yes? (EDIT - see @anorlunda's comment, below.) And note also the reference in the first paragraph to the bit, or Shannon, already mentioned.

    More generally, getting back to the OP's question, surely a unit implies a context; and so a question for @Nim might be, did you have in mind any particular context (e.g. something other than information theory, digital communications, etc.)?
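    As a rough numerical check of the quoted definition, here is a minimal Python sketch (the numbers follow directly from the change-of-base rule): one hartley is the information of one equiprobable decimal digit, which comes to log2(10) ≈ 3.32 bits.

        import math

        p = 1 / 10                  # one equiprobable decimal digit
        hartleys = -math.log10(p)   # 1.0 hartley, by definition
        bits = -math.log2(p)        # the same event measured in bits

        print(hartleys)             # 1.0
        print(bits)                 # ~3.3219, i.e. log2(10) bits per hartley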
     
    Last edited: Jul 13, 2017
  8. Jul 13, 2017 #7

    anorlunda

    Science Advisor
    Gold Member

    In addition to the bit, there is the qubit as used in quantum computers. A qubit is a lot more information than a bit, but precisely how much is uncertain :-p

    @jbriggs444 's definition (which relates to the number of microstates) works in thermodynamics. But information is also related to unitarity (quantum physics), Liouville's theorem (classical physics), Shannon's information theory (as @sophiecentaur mentioned), to the rule that the sum of all probabilities in a physical system is identically one, and to the reversibility of physical laws at the micro level that Leonard Susskind calls the minus first law of physics.

    I have yet to hear a definition of information that applies in all contexts.

    Information is closely related to (but not identical to) entropy. Wikipedia's disambiguation page for entropy links to 16 technical definitions of the word. I expect the same difficulty defining information.

    Information is synonymous with knowledge in natural language and in the dictionary, but not in physics. That greatly adds to the confusion because people think knowledge when they hear information.
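    To make the thermodynamics/microstate connection mentioned above concrete, a minimal Python sketch (the microstate count is an arbitrary example): Boltzmann's S = k_B ln Ω is the microstate count measured in nats and scaled by k_B, and dividing out k_B ln 2 re-expresses the same count in bits.

        import math

        k_B = 1.380649e-23     # Boltzmann constant, J/K

        omega = 2**100         # example count of equally likely microstates

        S_thermo = k_B * math.log(omega)   # thermodynamic entropy, J/K
        S_bits = math.log2(omega)          # the same count measured in bits

        print(S_thermo)   # ~9.57e-22 J/K
        print(S_bits)     # 100.0 bits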
     
  9. Jul 13, 2017 #8
    With regard to this, I found a couple of interesting posts; the author is Tom Schneider, a cancer researcher at NIH, and the posts seem to be part of a FAQ for the newsgroup bionet.info-theory, a forum for discussing information theory in biology.

    What's interesting is that rather than entropy, apparently Shannon preferred to speak of "conditional entropy", which relates to information reducing uncertainty - a rather different concept than entropy per se. I apologize for not being able to vet this content myself due to its math & conceptual content, but wonder if it might be of interest nonetheless:

    https://schneider.ncifcrf.gov/information.is.not.uncertainty.html

    https://schneider.ncifcrf.gov/bionet.info-theory.faq.html#Information.Equal.Entropy
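    For anyone who wants to see "information as a reduction in uncertainty" spelled out, here is a minimal Python sketch (the noisy-channel numbers are an arbitrary example, not taken from Schneider's pages): the information gained is the uncertainty before an observation minus the conditional uncertainty remaining afterwards, H(X) - H(X|Y).

        import math

        def entropy_bits(probs):
            # Shannon entropy (in bits) of a discrete distribution
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # Example: a uniform binary source X sent over a channel that flips 10% of bits.
        flip = 0.1
        H_before = entropy_bits([0.5, 0.5])       # H(X): uncertainty before seeing the output Y
        H_after = entropy_bits([flip, 1 - flip])  # H(X|Y): uncertainty left after seeing Y

        info_gained = H_before - H_after          # information gained, H(X) - H(X|Y)
        print(H_before, H_after, info_gained)     # 1.0, ~0.469, ~0.531 bits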
     
    Last edited: Jul 13, 2017
  10. Jul 13, 2017 #9

    anorlunda

    Science Advisor
    Gold Member

    I recently read an account on PF of Shannon's choice of words. He didn't come up with "entropy" himself; a physicist at Bell Labs advised him to use it because of the way it was defined. I apologize for not having a link to the thread or to the actual source of that conversation.

    Uncertainty is a word that carries lots of other baggage.
     
  11. Jul 13, 2017 #10
    It certainly does. Apropos of that, I am slowly making my way through Willful Ignorance, by Herbert Weisberg, which deals in part with the emergence of classical and modern probability & the shedding along the way of much older notions of the nature of uncertainty, including questions of causation, morality, etc.

    Regarding Shannon choosing to use "entropy", here is a link to a short bio of engineer Myron Tribus, in which he is quoted as saying Shannon told him Von Neumann was the one who suggested this: http://www.eoht.info/m/page/Myron+Tribus . Tribus wrote up the encounter in a 1971 Sci.Am. article as follows:

    What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.”’
     
    Last edited: Jul 13, 2017
  12. Jul 17, 2017 at 5:32 PM #11
    I have seen the word "nit" (more commonly "nat") for the natural-log unit.
     