uart
Science Advisor
ok Robokapp. If you want a really unsophisticated argument, then think about it like this. Let's just say for one moment that 0.999... and 1 are different, ok? Then by what amount do they differ?
The usual punter will think about this for a while and then reply that they differ by 0.000...1
Now let's look at the "number" represented by 0.000...1: that's a decimal point followed by an infinite number of zeros with a one placed right on the end. Really think about what that means. The zeros go on for ever, they never ever end, and then when they end you want to place a one. Can you see the logical fallacy there? The one on the end is pointless: if the zeros never end then you simply *can't* place a one on the end. In other words, 0.000...1 is not a valid representation of a number; there's really no point in even writing it, but if we do write it then we must at least agree that it equals zero exactly.
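If you want that point in symbols (the notation here is mine, not part of the original argument): truncate the string after finitely many zeros and you get

$$0.\underbrace{00\ldots0}_{n\ \text{zeros}}1 = 10^{-(n+1)}, \qquad \lim_{n\to\infty} 10^{-(n+1)} = 0.$$

So the only value that can consistently be assigned to "0.000...1" is exactly zero.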
So 0.999... and 1 are different, and the amount they differ by is precisely zero. OK, well, maybe we should just say they're equal then. You know it makes sense.
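And for anyone who prefers the standard textbook route to the same conclusion: 0.999... is by definition a geometric series, which sums to exactly 1:

$$0.999\ldots = \sum_{k=1}^{\infty} \frac{9}{10^{k}} = 9 \cdot \frac{1/10}{1 - 1/10} = 1.$$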