agentredlum
Hurkyl said: So what? This criticism only has relevance if, for some strange reason, I need to use a notation where everything is notated in a unique way.
While such a property is nice and occasionally useful, it is nowhere near as important as you are making it out to be.
As an aside, it is a trivial exercise to tweak decimal notation for real numbers so that every real number really does have a unique numeral form. (the two most common ways are to forbid decimals ending in repeated 0's, or to forbid decimals ending in repeated 9's).
I do know the answer; I did the division and I got the result "2/3". This isn't decimal notation, but you didn't ask for that.
Notating things as arithmetic expressions has the advantage that arithmetic is very, very easy. One practical application is that this notation is of absolutely crucial importance in efficient C++ linear algebra packages -- when you add two vectors v and w, it effectively stores the result as a triple "(plus, v, w)". It doesn't convert the result into an actual vector unless you (or some library routine) ask it to store the result in a vector.
(why is it crucial? Because if you wrote a C++ program to do x = u + v + w in a naive way, you would waste a lot of time and memory creating unnecessary intermediate value vectors)
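Just to make the quoted lazy-evaluation idea concrete, here is a minimal sketch of the technique (expression templates), using toy Vec and Sum types I made up for illustration; real packages such as Eigen are of course far more elaborate:

```cpp
#include <cstddef>
#include <type_traits>
#include <vector>

// A concrete vector that owns its data.
struct Vec {
    std::vector<double> data;
    explicit Vec(std::size_t n) : data(n, 0.0) {}

    double  operator[](std::size_t i) const { return data[i]; }
    double& operator[](std::size_t i)       { return data[i]; }

    // Assigning from any expression evaluates it element by element,
    // so no intermediate Vec is ever created for u + v.
    template <class Expr>
    Vec& operator=(const Expr& e) {
        for (std::size_t i = 0; i < data.size(); ++i) data[i] = e[i];
        return *this;
    }
};

// The "triple (plus, v, w)": it stores only references plus a rule
// for producing element i on demand.
template <class L, class R>
struct Sum {
    const L& l;
    const R& r;
    double operator[](std::size_t i) const { return l[i] + r[i]; }
};

// Only expression types (Vec, Sum) opt in to this operator+.
template <class T> struct is_expr : std::false_type {};
template <> struct is_expr<Vec> : std::true_type {};
template <class L, class R> struct is_expr<Sum<L, R>> : std::true_type {};

template <class L, class R,
          class = std::enable_if_t<is_expr<L>::value && is_expr<R>::value>>
Sum<L, R> operator+(const L& l, const R& r) { return {l, r}; }

int main() {
    Vec u(3), v(3), w(3), x(3);
    u[0] = 1; v[0] = 2; w[0] = 3;
    x = u + v + w;  // builds Sum<Sum<Vec, Vec>, Vec>, evaluated once in operator=
}
```

The whole sum is computed in a single pass inside operator=, so u + v never materialises a temporary vector.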
I am not disputing any of your points about keeping numbers as expressions in computer programs, because writing them as decimals will produce rounding errors. For example, replacing 1/7 with a decimal approximation is unwise for at least two reasons: 1) more memory is needed to store it as a decimal, depending on how much precision you want, and 2) the decimal will produce rounding errors if the program uses it hundreds or even millions of times in the same calculation.
It is much better to keep it as 1/7, and only after all the algebra is done and the final result is ready to be stored and displayed on your screen do you ask the computer to perform the long division on the fractions in that result. This minimizes errors and saves memory. That is a good point, and I agree with you on it.
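Here is a minimal sketch of that strategy, using a toy Fraction type I made up for illustration (a real program might use Boost.Rational or a big-integer type so the numerator and denominator cannot overflow): repeated exact additions of 1/7 land on exactly 100, while the same additions in double drift away from it.

```cpp
#include <cstdint>
#include <iostream>
#include <numeric>   // std::gcd (C++17)

// A toy exact fraction, always kept in lowest terms.
struct Fraction {
    std::int64_t num, den;
    Fraction(std::int64_t n, std::int64_t d) : num(n), den(d) {
        std::int64_t g = std::gcd(num, den);
        num /= g; den /= g;
    }
};

Fraction operator+(Fraction a, Fraction b) {
    return {a.num * b.den + b.num * a.den, a.den * b.den};
}

int main() {
    Fraction exact{0, 1};
    double approx = 0.0;

    // Add 1/7 to a running total 700 times; the exact answer is 100.
    for (int i = 0; i < 700; ++i) {
        exact = exact + Fraction{1, 7};
        approx += 1.0 / 7.0;
    }

    std::cout.precision(17);
    std::cout << exact.num << "/" << exact.den << "\n";  // prints 100/1
    std::cout << approx << "\n";  // close to, but not exactly, 100
}
```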
Our difference of opinion boils down to this: you believe that in writing down 2/3 you have performed division (long division); I believe you have not.
Or maybe you believe fractions are more important than decimals?
I'm not trying to change your beliefs. I made what I thought were clever arguments to support my beliefs about the perceived, and seemingly unimportant, inadequacies of long division.
If you asked me to divide 2 by 3 and I wrote down 2/3, you wouldn't be annoyed?
If I wrote down 2÷3 as my answer, would you accept it?
If I wrote down the problem the way schoolchildren do, with 3 on the outside as divisor and 2 on the inside as dividend, and then stopped without doing a single calculation to get the quotient or remainder, or the long division to get the decimal approximation, would you be happy, or would you think I was a smart-aleck?
When someone writes down 2/3 they haven't done a single calculation, so how can they know the answer without calculating it?
Is the answer 2/3? Absolutely! Then the next question becomes 'what does 2/3 mean?'
You are right, it's not that important, but it is curious to me how one can start with the set of positive integers, where addition and multiplication don't force you to make corrections, yet subtraction and division do force you to make corrections. Subtraction forces you to extend the positive integers to include zero and the negatives, while division forces you (among other things) to accept a very non-intuitive result such as 1 = .9999... = 1.000... = 4.9999.../5.000..., etc. As micromass pointed out, the representations are infinite in number.
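For anyone who finds that identity hard to swallow, the geometric series makes it concrete:

$$0.999\ldots \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}} \;=\; 9\cdot\frac{1/10}{1-1/10} \;=\; 1,$$

and the same argument gives 4.999... = 5, so 4.9999.../5.000... is just another name for 1.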