## Does the first two-digit number have to be 10?

 Quote by arildno I wouldn't remember even half of them. Would you?
Unless there was a method to creating the symbol. Could there be such a method for rational and irrational numbers?

It is just interesting to think about.

 Quote by Diffy Unless there was a method to creating the symbol. Could there be such a method for rational and irrational numbers? It is just interesting to think about.
There is such a system in place now. The numeral "2343678" uniquely determines that number, and, by a clever method, we know exactly where that number sits in relation to all the other natural numbers.
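That clever method is positional notation: a number's digits are just its remainders under repeated division by the base. A minimal sketch (the helper name `digits` is mine, not from the thread):

```python
def digits(n, base=10):
    """Positional notation: peel off digits of n in the given base,
    least-significant first, then reverse into reading order."""
    if n == 0:
        return [0]
    out = []
    while n:
        out.append(n % base)
        n //= base
    return out[::-1]

print(digits(2343678))  # [2, 3, 4, 3, 6, 7, 8]
```

The same routine works in any base, which is why the representation is unique: two different naturals always differ in at least one digit position.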

The rationals have a similar system, but the representation isn't unique (i.e. 1/1 = 2/2 = 3/3...).

Most irrationals can't even be defined (see http://en.wikipedia.org/wiki/Definable_real_number): any naming scheme uses finite strings over a finite alphabet, so it can name only countably many numbers, while the reals are uncountable. So we certainly can't put a method in place to name them all.
 I, II, III, IIII, IIIII... is precisely such a system. In effect, the Roman numeral system is such a system. These are not, of course, place-value systems, which is what you need to get "10" as the "first two-digit number". And it is very awkward, though of course not impossible, to write very large numbers or fractions in them.

 Quote by Diffy Why not a number system where every single number has its own unique symbol?
You would have to print large numbers in a large font or risk running out of symbols. In a 4x6 font you would have only 24 pixels, each on or off, and accordingly could render only 2^24 = 16,777,216 distinct glyphs.
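The arithmetic behind that limit (two choices per pixel, one glyph per bitmap):

```python
# A 4x6 glyph has 24 pixels; each pixel is on or off,
# so the number of distinct glyphs is 2 to the number of pixels.
pixels = 4 * 6
distinct_glyphs = 2 ** pixels
print(distinct_glyphs)  # 16777216
```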

In addition, some of those numerals would be rather difficult to distinguish from one another.

There is a notion of radix "efficiency" in which the cost of each digit is taken as proportional to the number of possible digits at that position. The cost of expressing a number x in radix r is then approximated by r * ln(x)/ln(r). For purposes of comparison, a relevant figure of merit is the efficiency e * ln(r)/r, normalized so that base e scores exactly 1.

By this measure of efficiency, base e is optimal; among integer radices, base 3 is best (about 0.995), bases 2 and 4 tie at about 0.942, and base 10 manages only about 0.626.
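The figure of merit above is easy to tabulate; a short sketch (the function name `efficiency` is mine):

```python
import math

def efficiency(r):
    """Radix economy relative to base e: e * ln(r) / r (1.0 is optimal)."""
    return math.e * math.log(r) / r

for r in (2, 3, 4, 10, 16):
    print(r, round(efficiency(r), 3))
```

Base 3 scores highest of the integer radices, with 2 and 4 tied just behind it (they tie exactly, since ln(4)/4 = 2 ln(2)/4 = ln(2)/2).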

 Quote by Mark44 Not true. Quite a few computers in the early days used octal.
Mark, you have a serious misunderstanding of computer architecture. As jbriggs444 pointed out, octal is just a shorthand for 3 bits, so that people can manipulate things more easily. All normal computers use binary. Some use octal representations and some use hexadecimal representations, but all that is just shorthand for PEOPLE, not for the computer.

There actually HAVE been computers that didn't use binary, specifically ones that used decimal (via what's called BCD --- binary coded decimal) and actually performed decimal, not binary, arithmetic.
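For illustration, BCD packs each decimal digit into its own 4-bit nibble, so the hardware can do digit-by-digit decimal arithmetic. A minimal sketch (the helper `to_bcd` is mine, not taken from any particular machine):

```python
def to_bcd(n):
    """Pack each decimal digit of n into its own 4-bit nibble."""
    bcd = 0
    for shift, digit in enumerate(reversed(str(n))):
        bcd |= int(digit) << (4 * shift)
    return bcd

print(hex(to_bcd(1234)))  # 0x1234
```

Reading the result in hex makes the encoding visible: each hex digit of the output is one decimal digit of the input, which is exactly why BCD wastes 6 of the 16 patterns each nibble could hold.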

 Quote by Benn There is such a system in place now. The character "2343678" uniquely determines that number, and, by a clever method, we know exactly where that number is in relation to all the other natural numbers. The rationals have a similar system, but the representation isn't unique (i.e. 1/1 = 2/2 = 3/3...). Most irrationals can't be defined (see http://en.wikipedia.org/wiki/Definable_real_number). So we certainly can't put a method in place to name them all.
That wiki article is fascinating. I've never really seen anything like that before. Thanks for posting it.

And yes, very clever saying that we have such a system in place now. I guess I should have been more specific in what I actually mean. Which is hard for me because I am not quite sure myself!

To the OP's point, we start repeating symbols in our numeral system. In essence, base 10 starts to repeat symbols for integers once we reach 10: the symbol 1 is reused in both 1 and 10. (I consider 10 here a combination of the two symbols 1 and 0.)

Furthermore, decimals like .2 reuse the symbol for 2. In the system I was trying to imagine, .2 would not have reused the symbol for the number 2; it would have had its own symbol.

I am probably talking gibberish now. So I'll stop.

Thanks again for the article.

 Quote by Diffy Why not a number system where every single number has its own unique symbol?
Sure, if you can find a way of creating an infinite number of symbols that readers can remember (or at the very least deduce) and distinguish between.
I'm sure some primitive civilizations once used a different symbol for each number, but people have since learnt the power of combining symbols.

In fact, doesn't Mandarin (Chinese) use all different characters in its language to represent syllables? It must be a weird feeling to be able to read road signs and such but then you pick up a newspaper and get stuck on the first few characters.

 Quote by Mentallic In fact, doesn't Mandarin (Chinese) use all different characters in its language to represent syllables? It must be a weird feeling to be able to read road signs and such but then you pick up a newspaper and get stuck on the first few characters.
Not quite. Chinese characters are used to represent the digits and some of the powers of 10 -- you have characters for 0 (sort of), 1-9, 10, 100, 1000, 10000... Chinese numbers are grouped by ten-thousands (not thousands).

Then you have a separate set of Chinese characters used by financial institutions, because the standard characters are geometrically simple and are easy to forge. See this Wikipedia article.
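The grouping by ten-thousands can be sketched in a few lines (the name `myriad_groups` is mine, for illustration):

```python
def myriad_groups(n):
    """Split n's decimal digits into groups of four, the way Chinese
    numerals group by ten-thousands rather than thousands."""
    s = str(n)
    groups = []
    while s:
        groups.insert(0, s[-4:])  # peel off the last four digits
        s = s[:-4]
    return ",".join(groups)

print(myriad_groups(123456789))  # 1,2345,6789
```

So where English reads 123,456,789 in groups of three (millions, thousands), the same number falls into groups of four, matching characters for 10^8 and 10^4.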

 Quote by phinds All normal computers use binary. Some use octal representations and some use hexadecimal representations, but all that is just shorthand for PEOPLE, not for the computer.
Sometimes I wonder what it would be like if we had used hexadecimal numbers instead of decimal from the start. I believe that the ancient Chinese used a system of weights based on hexadecimal instead of decimal.

There is a website that advocates the use of hexadecimal, even changing the way we tell time (hexclock). Advocacy for hexadecimal numbers is not new. See this book, published in 1862.

Of course, it's not practical for the world to change the way we count and do math today, but I find it interesting to think about.
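One scheme hexadecimal-time advocates have proposed (the exact convention here is my assumption, not a standard) is to write the time of day as a four-digit hexadecimal fraction of the whole day:

```python
def hex_time(hours, minutes, seconds):
    """Express a time of day as a four-digit hexadecimal fraction of
    the day, one scheme proposed by hexadecimal-time advocates
    (an illustrative assumption, not a standard)."""
    fraction = (hours * 3600 + minutes * 60 + seconds) / 86400
    return f".{int(fraction * 16**4):04X}"

print(hex_time(12, 0, 0))  # .8000 -- noon is halfway through the day
```

Under this scheme the day divides naturally into 16 "hex hours", and each further hex digit refines the time by a factor of 16.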

 Quote by eumyang Not quite. Chinese characters are used to represent the digits and some of the powers of 10 -- you have characters for 0 (sort of), 1-9, 10, 100, 1000, 10000... Chinese numbers are grouped by ten-thousands (not thousands).
Actually I was referring to their writing system, not their numeral system, but I didn't know that, so thanks for sharing.

 Quote by Diffy Why not a number system where every single number has its own unique symbol?
This reminds me of a short story by Jorge Luis Borges, "Funes the Memorious", which can be found on the web. Some excerpts:
 [...] He told me that toward 1886 he had devised a new system of enumeration and that in a very few days he had gone before twenty-four thousand. He had not written it down, for what he once meditated would not be erased. The first stimulus to his work, I believe, had been his discontent with the fact that "thirty-three Uruguayans" required two symbols and three words, rather than a single word and a single symbol. Later he applied his extravagant principle to the other numbers. In place of seven thousand thirteen, he would say (for example) Máximo Perez; in place of seven thousand fourteen, The Train; other numbers were Luis Melián Lafinur, Olimar, Brimstone, Clubs, The Whale, Gas, The Cauldron, Napoleon, Agustín de Vedia. In lieu of five hundred, he would say nine. [...]

 [...] Locke, in the seventeenth century, postulated (and rejected) an impossible idiom in which each individual object, each stone, each bird and branch had an individual name; Funes had once projected an analogous idiom, but he had renounced it as being too general, too ambiguous. In effect, Funes not only remembered every leaf on every tree of every wood, but even every one of the times he had perceived or imagined it. [...]

 [...] It was not only difficult for him to understand that the generic term dog embraced so many unlike specimens of differing sizes and different forms; he was disturbed by the fact that a dog at three-fourteen (seen in profile) should have the same name as the dog at three-fifteen (seen from the front). [...]

 [...] Without effort, he had learned English, French, Portuguese, Latin. I suspect, nevertheless, that he was not very capable of thought. To think is to forget a difference, to generalize, to abstract. In the overly replete world of Funes there were nothing but details, almost contiguous details.