It seems to me that convergence rounds away the possibility of there being a smallest constituent part of reality. For instance, adding 1/2 + 1/4 + 1/8 + ... would seemingly never reach 1, since after any finite number of terms there is always a small fraction left over (equal, in fact, to the last term added), so the remaining gap is never closed. Unless, of course, there's a smallest possible value that eventually gets added twice at the end. Is that the implication in math, or is the sum just declared to be 1 by convention when dealing with infinity? Or is there some other reason?
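To make the arithmetic I'm describing concrete, here's a small check using exact rational arithmetic (Python's `fractions` module), so no floating-point rounding is involved. It shows that every finite partial sum falls short of 1 by exactly the size of the last term added:

```python
from fractions import Fraction

# Partial sums of 1/2 + 1/4 + 1/8 + ... computed exactly.
total = Fraction(0)
for n in range(1, 11):
    total += Fraction(1, 2**n)
    gap = 1 - total
    print(f"after {n} terms: sum = {total}, gap to 1 = {gap}")

# After n terms the sum is exactly 1 - (1/2)**n: close to 1,
# but no finite partial sum ever equals 1.
assert total == 1 - Fraction(1, 2**10)
```

This is what I mean by the gap never closing at any finite step; my question is about what happens "at infinity".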