Are the two definitions of "bit" compatible?

  • Thread starter: nomadreid
  • Tags: Bit, Definitions
AI Thread Summary
The discussion centers around the definitions of "bit," highlighting two primary interpretations: one as a unit of information and the other as a binary digit representing values of 0 or 1. Both definitions are deemed correct, with the first being a broader, general usage term and the second a precise technical term rooted in computing. The term "bit" is derived from "binary digit," emphasizing its connection to binary systems used by computers. The context in which "bit" is used influences its interpretation, with modern usage often leaning towards the binary definition due to the prevalence of computing. Additionally, the conversation touches on Shannon entropy, linking it to the concept of bits in information theory, and discusses the historical use of "bit" in other contexts, such as currency. The complexity of quantifying information is also mentioned, with a query about the nature of entropy and its mathematical representation in relation to bits.
nomadreid
First, I am not sure that this is the right place for this post, because Physics Forums used to break "computing and technology" into several subheadings, such as information science as opposed to hardware, which I no longer see. Anyway, here is an elementary question: I see two different definitions of "bit". One, from (say) Wikipedia, says that the bit is a unit of the quantity of information; another, say from http://whatis.techtarget.com/definition/bit-binary-digit, says that a bit is something that can take one of two values, 0 or 1. There is of course a connection between the two definitions, but they are not the same. So which one is the proper definition?
 
nomadreid said:
I see two different definitions of "bit". One ... says that the bit is a unit of the quantity of information; another ... says that a bit is something that can take one of two values, 0 or 1. ... So which one is the proper definition?
Why would you want to restrict the word to a single definition? In the "0 or 1" sense it is a technical term with a precise meaning. In the "quantity of (general) information" or "small amount" sense it is a general-usage English term that preceded the technical one.

EDIT: now that I think about it, I seem to recall that the technical term didn't come from the general term (well, maybe a little bit; pun intended) but rather is a contraction of "Binary digIT".
 
Both definitions are correct. The connection is that the unit of information is expressed as either 0 or 1 by computers, which only recognise binary states.
 
Thanks, phinds and sk1105.

First, phinds: I did not mean "quantity" in the everyday English sense but rather the sense used in Shannon entropy, where, according to http://en.wikipedia.org/wiki/Entropy_(information_theory), you need "log2(n) bits to represent a variable that can take one of n values". That is, one bit is the uncertainty of an event that is 0 or 1 with equal probability, or the information that is gained when the outcome of such an event becomes known.

sk1105: yes, computers express everything in binary form, and the "an event that is 0 or 1 with equal probability" of the above definition is indeed a connection, but not an equivalence. So, when you say that both definitions are correct, do you mean that you consider them equivalent, or that the choice between the definitions is given by context?

By the way, is Physics Forums' pun of "every bit helps" intentional?
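
To pin down what I mean by the Shannon sense, here is a minimal sketch in Python (my own illustration, not taken from either of the linked pages): it computes the self-information -log2(p) of an outcome with probability p, so an event with two equally likely outcomes carries exactly one bit.

[CODE=python]
import math

def self_information_bits(p: float) -> float:
    """Information, in bits, gained on learning an outcome that had probability p."""
    return -math.log2(p)

# An event with two equally likely outcomes: exactly 1 bit.
print(self_information_bits(0.5))    # 1.0

# A variable that can take one of n equally likely values needs log2(n) bits.
n = 8
print(self_information_bits(1 / n))  # 3.0
[/CODE]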
 
I would initially suggest that it depends on the context. However, whilst you could use the term to describe a general unit of information, the prevalence of computers nowadays means that a unit of information is generally assumed to take the value 0 or 1, so the computational definition is almost always the one intended. Indeed, the word 'bit' actually derives from 'binary digit', and did not appear before the advent of modern computers.

In summary, you could extend the use of 'bit' to quantify information outside the realm of computing, but be aware that your reader/listener will likely assume you are talking about the '0 or 1' definition.
 
The choice between the definitions is given by context? Yes. This example about Morse code (a variable-length digital code carrying more Shannon information per bit of encoded message) illustrates both 'bits' in digital codes and 'bits' as the binary states used to express those codes when they are transmitted.

http://en.wikipedia.org/wiki/Morse_code#Representation.2C_timing_and_speeds
Morse code is transmitted using just two states (on and off). Historians have called it the first digital code. Strictly speaking it is not binary, as there are five fundamental elements (see quinary). However, this does not mean Morse code cannot be represented as a binary code. In an abstract sense, this is the function that telegraph operators perform when transmitting messages. Working from the above definitions and further defining a "unit" as a bit, we can visualize any Morse code sequence as a combination of the following five elements:

1. short mark, dot or "dit": one unit long
2. longer mark, dash or "dah": three units long
3. intra-character gap (between the dits and dahs within a character): one unit long
4. short gap (between letters): three units long
5. medium gap (between words): seven units long
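
As a small illustration of that binary-representation view (a hypothetical Python sketch of my own, not from the article), each element can be expanded into on/off time units, so a dit becomes 1, a dah 111, and the gaps runs of 0s:

[CODE=python]
# Hypothetical sketch: render a word of Morse code as on (1) / off (0) time units.
MORSE = {"S": "...", "O": "---"}  # tiny sample alphabet, just enough for the example

def letter_to_units(code: str) -> str:
    """Expand a letter's dits/dahs: dit = 1, dah = 111, with 1 unit off between elements."""
    return "0".join("1" if element == "." else "111" for element in code)

def word_to_units(word: str) -> str:
    """Join letters with the 3-unit inter-letter gap (000)."""
    return "000".join(letter_to_units(MORSE[ch]) for ch in word)

print(word_to_units("SOS"))
# -> 10101 000 11101110111 000 10101  (spaces added here only for readability)
[/CODE]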

When I was a kid (long before modern computers) the word 'bit' was commonly used about money. Two bits were a quarter: "pieces of eight" for a dollar.
http://en.wikipedia.org/wiki/Bit_(money)
 
Thanks, sk1105 and nsaspook.

sk1105: OK, that sounds reasonable, except that I am not the one who defined 'bit' in terms of amount of information, and that definition is not outside the realm of computing, as Shannon entropy is very much inside the realm of computing.

nsaspook: hm, Morse code, with its five symbols, would be working with a nickel, or perhaps a "nyckle"...
 
Yes you are right. I think I made a poor choice of words, but I was trying to get at the difference between 'bit' in a binary logic sense as used in computation, and 'bit' in a more general sense referring to information.
 
Shannon introduced the term and used it in both senses (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf):

The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.

So 'bit' is a measure of information and the bit of a modern digital computer has as much information as the bit of information theory.
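
A quick numerical check of that correspondence (my own Python sketch, not from Shannon's paper): the entropy of a two-state source is exactly 1 bit when the states are equally likely, which is the sense in which one relay or flip-flop stores one bit; if the states are not equally likely, each stored binary digit carries less than one bit of information.

[CODE=python]
import math

def binary_entropy_bits(p: float) -> float:
    """Entropy H(p), in bits, of a two-state source with P(state 1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(binary_entropy_bits(0.5))  # 1.0    -> two equally likely states hold exactly one bit
print(binary_entropy_bits(0.9))  # ~0.469 -> a biased source carries less per binary digit
[/CODE]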
 
Thank you, ScottSalley. That sums it up nicely. While I have the attention of those in the know about information units, I'll take the liberty of extending the question. (I'm not sure if this requires a new thread.) Entropy in information theory is defined by Wikipedia as the "average amount of information contained in each message received". The Shannon entropy of a random variable with probability distribution ##\{p_i\}## is then ##\sum_i p_i \log_2(1/p_i)##. But since information is quantised, why isn't it something like ##\sum_i p_i \,\mathrm{int}\!\left(\log_2(1/p_i)\right)##? ("int" for the floor function.)

(for those getting email notifications: note that I edited this, so you may have received a slightly incorrect version in your email: I moved the "int" inwards.)
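
To make the comparison concrete, here is a small Python sketch (my own illustration, not from the Wikipedia article): it evaluates the Shannon sum and the floored variant side by side, showing that they agree when every probability is a power of 1/2 and differ otherwise.

[CODE=python]
import math

def shannon_entropy_bits(probs):
    """H = sum of p_i * log2(1 / p_i), in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def floored_sum_bits(probs):
    """The alternative raised above: sum of p_i * floor(log2(1 / p_i))."""
    return sum(p * math.floor(math.log2(1 / p)) for p in probs if p > 0)

dyadic = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy_bits(dyadic))  # 1.75
print(floored_sum_bits(dyadic))      # 1.75, since every log2(1/p_i) is already an integer

skewed = [0.25, 0.375, 0.375]
print(shannon_entropy_bits(skewed))  # ~1.561
print(floored_sum_bits(skewed))      # 1.25, since the floors discard the fractional parts
[/CODE]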
 