meBigGuy
There are two subjects being discussed here, and they are being confused:
1. How many bits it takes to represent a given quantity of symbols.
2. How to code numbers in binary.
Maybe my wording is not precise, but how many symbols one uses and what people decide those symbols represent are, for the most part, orthogonal concepts.
For example, let's design a 3-bit binary coding system using 8 symbols to represent 0, 1, 2, 3, 4, 5, 6, 7.
Now let's design a 3-bit binary coding system using the same 8 symbols to represent -4, -3, -2, -1, 0, 1, 2, 3.
What people decide to have each symbol represent defines the coding system. (For example, unsigned, sign-magnitude, and 2's complement are commonly accepted methodologies, but one can define anything.)
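To make that concrete, here is a minimal Python sketch of my own (not from the post) that prints every 3-bit pattern next to its unsigned and two's-complement interpretations. The 8 symbols are identical in both columns; only the meaning assigned to each symbol differs.

import math

# Sketch of the two separate ideas:
# (1) how many bits are needed for a given number of symbols, and
# (2) how the same bit patterns can be assigned different meanings.

n_symbols = 8
bits = math.ceil(math.log2(n_symbols))      # 3 bits cover 8 symbols

for pattern in range(n_symbols):
    unsigned = pattern                      # unsigned coding: 0..7
    # Two's-complement coding: patterns with the top bit set map to
    # negatives, giving -4..3 from the very same 8 symbols.
    twos_comp = pattern - n_symbols if pattern >= n_symbols // 2 else pattern
    print(f"{pattern:0{bits}b}  unsigned={unsigned:2d}  two's comp={twos_comp:2d}")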