Fredrik said:
A "proof of 0.999...=1" only tells us that the series ##\sum_{n=1}^\infty 9\cdot 10^{-n}## is convergent and has the sum 1. This is interpreted as a proof of 0.999...=1, because the standard definition of 0.999... says that the string of text "0.999..." represents the sum of that series.
I think that a person who struggles with the equality 0.999...=1 would benefit at least as much from hearing an explanation of why that definition is standard as from seeing a proof of ##\sum_{n=1}^\infty 9\cdot 10^{-n}=1##. To them, the series probably looks like an entirely different problem.
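As a small arithmetic sanity check (my own sketch, not part of Fredrik's argument), the Nth partial sum of the series is exactly ##1-10^{-N}##, which can be verified with exact rational arithmetic. The function name `partial_sum` is just a label chosen here:

```python
from fractions import Fraction

def partial_sum(N):
    """Exact Nth partial sum of sum_{n=1}^N 9 * 10^(-n)."""
    return sum(Fraction(9, 10**n) for n in range(1, N + 1))

# Each partial sum equals 1 - 10^(-N) exactly, so the distance
# to 1 shrinks by a factor of 10 with every additional term.
for N in (1, 2, 3, 10):
    assert partial_sum(N) == 1 - Fraction(1, 10**N)
```

Using `Fraction` avoids the floating-point rounding that would otherwise muddy a discussion about exact equality.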
A few more thoughts along these lines. I think almost everyone will agree that 0.999... should be defined as the sum of that series (if such a sum exists and is unique). And I think almost everyone who's still with us after that will agree that the sum of that series should be defined as the limit of the sequence 0.9, 0.99, 0.999,... (if such a limit exists and is unique). This brings us to the definition of the limit of a sequence. I like to state it in the following way:
A real number L is said to be a limit of a sequence S if every open interval that contains L contains all but a finite number of terms of S.
Since this definition takes a while to understand if you haven't seen it before, I can imagine that some students won't agree that it's a good idea. So let's return to that in a while. First, here's a proof that 1 is a limit of that sequence, and that no other real number is.
Let M be any real number other than 1. Define t=|M-1|/2 and consider the intervals (1-t,1+t) and (M-t,M+t). Note that they are disjoint. The former contains every term ##1-10^{-n}## with ##10^{-n}&lt;t##, i.e. all but a finite number of terms of the sequence, so the latter contains at most a finite number of them. Since every open interval containing 1 contains some interval of the form (1-s,1+s), and each such interval likewise contains all but finitely many terms, this means that 1 is a limit and M is not.
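The disjoint-intervals argument can be illustrated numerically (again a sketch of mine, with `terms_near` a made-up name and M = 9/10 an arbitrary sample choice): for a given M ≠ 1, count which terms land in (M-t, M+t).

```python
from fractions import Fraction

def term(n):
    """nth term of the sequence 0.9, 0.99, 0.999, ...: exactly 1 - 10^(-n)."""
    return 1 - Fraction(1, 10**n)

def terms_near(M, first=1, last=1000):
    """Indices n in [first, last] whose term lies in (M-t, M+t), t = |M-1|/2."""
    t = abs(M - 1) / 2
    return [n for n in range(first, last + 1) if M - t < term(n) < M + t]

# For M = 9/10 we get t = 1/20, so the interval is (17/20, 19/20).
# It catches only the first term 0.9; every later term exceeds 0.99.
print(terms_near(Fraction(9, 10)))  # [1]
```

Only finitely many terms ever land near any M ≠ 1, exactly as the proof predicts, whereas the definition requires all but finitely many.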
Now all that remains is to explain why we're choosing that particular definition of "limit". There are of course many reasons why we're doing it, but the one that's the most relevant here is that we
want the sequence 0.9, 0.99, 0.999,... to have 1 as its unique limit.
This brings us to the point that I think isn't emphasized enough in these discussions. The real reason why 0.999...=1 is that we
want this equality to hold. There was never any chance that it wouldn't, because if our first choice of definitions had given us a different result, we wouldn't have accepted it. We would have changed the definitions.