
Are the two definitions of "bit" compatible?

  1. Feb 19, 2015 #1



    First, I am not sure this is the right place for this post; Physics Forums used to break "Computing and Technology" into subheadings such as information science as opposed to hardware, and so forth, but I don't see them anymore. Anyway, here is an elementary question. I see two different definitions of "bit": one, from (say) Wikipedia, saying that the bit is a unit of the quantity of information, and another, say from http://whatis.techtarget.com/definition/bit-binary-digit, saying that a bit is something that can take one of two values, 0 or 1. There is of course a connection between the two definitions, but they are not the same. So which one is the proper definition?
  3. Feb 19, 2015 #2



    Why would you want to restrict the word to a single definition? In the "0 or 1" sense, it is a technical term with a precise meaning. In the "quantity of (general) information" or "small amount" sense, it is a general-usage English term that preceded the technical one.

    EDIT: now that I think about it, I seem to recall that the technical term didn't come from the general term (well, maybe a little bit; pun intended) but rather is a contraction of "Binary digIT".
  4. Feb 19, 2015 #3
    Both definitions are correct. The connection is that the unit of information is expressed as either 0 or 1 by computers, which only recognise binary states.
  5. Feb 19, 2015 #4



    Thanks, phinds and sk1105.

    First, phinds: I did not mean "quantity" in the everyday English sense but rather as in Shannon entropy, where, according to http://en.wikipedia.org/wiki/Entropy_(information_theory), you need "log2(n) bits to represent a variable that can take one of n values". That is, one bit is the uncertainty of an event that is 0 or 1 with equal probability, or the information gained when the outcome of such an event becomes known.
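    That definition is easy to check numerically. Here is a minimal sketch (my own illustration, not from the Wikipedia article) of the log2(n) rule and of Shannon entropy in bits; the function names are mine:

    ```python
    import math

    def bits_to_represent(n):
        """Bits needed to index one of n equally likely values: log2(n)."""
        return math.log2(n)

    def shannon_entropy(probs):
        """Shannon entropy in bits: sum of p * log2(1/p) over outcomes with p > 0."""
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    # A fair binary event carries exactly one bit of information.
    print(bits_to_represent(2))          # 1.0
    print(shannon_entropy([0.5, 0.5]))   # 1.0

    # An 8-valued uniform variable needs log2(8) = 3 bits.
    print(bits_to_represent(8))          # 3.0
    ```

    Note that for a uniform distribution over n values the two computations agree, which is exactly the "log2(n) bits" statement.
    
    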

    sk1105: yes, computers express everything in binary form, and the "an event that is 0 or 1 with equal probability" part of the above definition is indeed a connection, but not an equivalence. So when you say that both definitions are correct, do you mean that you consider them equivalent, or that the choice between the definitions is given by context?

    By the way, is Physics Forums' pun of "every bit helps" intentional?
  6. Feb 19, 2015 #5
    I would initially suggest that it depends on the context. However, whilst you could use the term to describe a general unit of information, the prevalence of computers nowadays means that units of information are generally assumed to be 0 and 1, so the computational definition is almost always the one used. Indeed, the word 'bit' actually derives from 'binary digit', and did not appear before the advent of modern computers.

    In summary, you could extend the use of 'bit' to quantify information outside the realm of computing, but be aware that your reader/listener will likely assume you are talking about the '0 or 1' definition.
  7. Feb 19, 2015 #6



    The choice between the definitions is given by context? Yes. Morse code is a good example: it is a variable-length digital code that packs more Shannon information per binary digit of encoded message, and it shows the difference between 'bits' as units of information in a code and 'bits' as the binary states used to express that code when transmitted.
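    The variable-length idea can be sketched with a toy source (my own hypothetical four-symbol alphabet, not Morse itself): give the most frequent symbol the shortest codeword, as Morse does, and the average number of binary digits per symbol approaches the Shannon information per symbol:

    ```python
    import math

    # Hypothetical skewed source and a matching prefix code (an assumption
    # for illustration): the most frequent symbol gets the shortest codeword.
    probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
    code  = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

    # Shannon information per symbol, in bits.
    entropy = sum(p * math.log2(1 / p) for p in probs.values())
    # Average transmitted binary digits per symbol under this code.
    avg_len = sum(probs[s] * len(code[s]) for s in probs)

    print(entropy)   # 1.75 bits of information per symbol
    print(avg_len)   # 1.75 binary digits per symbol on average
    ```

    Here the code is matched perfectly to the source, so every transmitted binary digit carries a full bit of information; a fixed-length code would need 2 binary digits per symbol for the same 1.75 bits.
    
    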

    When I was a kid (long before modern computers), the word 'bit' was commonly used for money. Two bits were a quarter: "pieces of eight" for a dollar.
  8. Feb 19, 2015 #7



    Thanks, sk1105 and nsaspook.

    sk1105: OK, that sounds reasonable, except that I am not the one who defined 'bit' in terms of an amount of information, and that definition is not outside the realm of computing: Shannon entropy is very much inside it.

    nsaspook: hm, Morse code, with its five symbols, would be working with a nickle, or perhaps a nyckle....
  9. Feb 19, 2015 #8
    Yes you are right. I think I made a poor choice of words, but I was trying to get at the difference between 'bit' in a binary logic sense as used in computation, and 'bit' in a more general sense referring to information.
  10. Feb 20, 2015 #9
    Shannon introduced the term and used it in both manners (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf [Broken]):

    The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.
    So 'bit' is a measure of information and the bit of a modern digital computer has as much information as the bit of information theory.
  11. Feb 21, 2015 #10



    Thank you, ScottSalley. That sums it up nicely. While I have the attention of those-in-the-know about information units, I take the liberty of extending the question. (I'm not sure if this requires a new thread.) Entropy in information theory is defined by Wikipedia as the "average amount of information contained in each message received". The Shannon entropy for a random variable with probability distribution {p_i} is then Σ_i p_i log2(1/p_i). But since information is quantised, why isn't it something like Σ_i p_i int(log2(1/p_i))? ("int" for the floor function.)
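    To make the question concrete, here is a quick comparison of the two formulas (my own sketch; the function names are mine). For a fair coin they agree, since log2(1/0.5) is exactly 1, but for a biased source the floored version discards the fractional information of each outcome:

    ```python
    import math

    def entropy(probs):
        """Standard Shannon entropy in bits."""
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    def floored_entropy(probs):
        """The hypothetical 'quantised' variant from the question above."""
        return sum(p * math.floor(math.log2(1 / p)) for p in probs if p > 0)

    # Fair coin: both give 1.0, because log2(1/0.5) = 1 exactly.
    print(entropy([0.5, 0.5]), floored_entropy([0.5, 0.5]))

    # Biased coin: the floor throws information away.
    p = [0.9, 0.1]
    print(entropy(p))          # ~0.469 bits
    print(floored_entropy(p))  # 0.9*floor(0.152...) + 0.1*floor(3.32...) = 0.3
    ```

    The gap hints at the usual answer: entropy is an average over many messages, and by coding long blocks of outcomes together the fractional bits per symbol really can be realised, so nothing forces the per-outcome information to be rounded to an integer.
    
    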
