Hurkyl
Staff Emeritus
Science Advisor
Gold Member
I don't recall having trouble with infinite decimals before taking calculus. If you protest that you want an explanation that does not require infinite series, I would counter that you can't understand non-terminating decimal expansions without using infinite series.

One learns to do arithmetic with terminating decimals early; if you learn the variations that work left to right, those same elementary-school algorithms can be used for non-terminating decimals too.
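As a rough illustration of the left-to-right idea, here is a minimal Python sketch (the function name and stream representation are my own) that adds two fractional digit streams. The key wrinkle is that an output digit can only be emitted once no later carry can reach it, so the generator buffers any run of digit-sum 9s:

```python
from itertools import repeat, islice

def add_streams(a, b):
    """Add two decimal fractions given as infinite digit streams, left to right.

    Yields the units digit first, then fractional digits. A digit is emitted
    only once a later carry can no longer change it, so the generator stalls
    if the digit sums are eventually all 9 -- exactly the 0.999... ambiguity.
    """
    pending, nines = 0, 0              # units digit starts at 0; no 9-run yet
    for x, y in zip(a, b):
        s = x + y                      # each digit sum lies between 0 and 18
        if s == 9:                     # ambiguous: a later carry could flip it
            nines += 1
        elif s < 9:                    # no carry can pass; flush everything
            yield pending
            yield from repeat(9, nines)
            pending, nines = s, 0
        else:                          # s >= 10: carry ripples through the 9s
            yield pending + 1
            yield from repeat(0, nines)
            pending, nines = s - 10, 0

# 0.333... + 0.777... = 1.111...  (that is, 3/9 + 7/9 = 10/9)
digits = list(islice(add_streams(repeat(3), repeat(7)), 6))
# digits == [1, 1, 1, 1, 1, 1], read as 1.11111...
```

Note that feeding it 0.333... and 0.666... makes it stall forever: every digit sum is 9, so no prefix of the input ever determines whether the answer starts "0.9" or "1.0".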
I've even seen people define the real numbers in this manner: numbers are strings of digits (modulo relations like 0.999... = 1.000...), and the arithmetic operations are given as explicit algorithms that operate on strings of digits.
The algorithms for addition and subtraction give yet another reason why 0.999... = 1.000...; for example, when subtracting 1 from 2, you can either have no borrows at all, or a borrow in every place, and the two choices produce 1.000... and 0.999... respectively.
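To make the borrow ambiguity concrete, here is a small Python sketch (names and the explicit borrow-pattern representation are my own) that performs digit-wise subtraction under a chosen borrow pattern, then tries both patterns on 2 − 1:

```python
def subtract(a, b, borrow):
    """Digit-wise a - b under an explicit borrow pattern.

    Digit lists are [units, tenths, hundredths, ...]; borrow has one extra
    entry so the pattern can continue past the truncated tail. borrow[k] == 1
    means place k takes 10 from place k-1, which repays by subtracting 1.
    """
    out = []
    for k in range(len(a)):
        d = a[k] + 10 * borrow[k] - b[k] - borrow[k + 1]
        assert 0 <= d <= 9, "borrow pattern inconsistent at this place"
        out.append(d)
    return out

two, one = [2] + [0] * 8, [1] + [0] * 8

no_borrows  = subtract(two, one, [0] * 10)       # [1, 0, 0, ...] -> 1.000...
all_borrows = subtract(two, one, [0] + [1] * 9)  # [0, 9, 9, ...] -> 0.999...
```

Both borrow patterns are internally consistent, so the same subtraction legitimately yields the two decimal strings that the relation 0.999... = 1.000... identifies.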
Of course, one could resolve the ambiguity by disallowing decimals ending in all 9's. Interestingly, the one text I've seen take this approach did exactly the opposite: it disallowed terminating decimals! So "0.999..." was a legal decimal, but "1" was not.