gottfried
Reading through a real analysis textbook I noticed that \sum 1/k diverges but \sum 1/k^{1+\epsilon} converges for all \epsilon > 0.
This is confusing because the terms 1/k also shrink to zero, eventually becoming just as small as the terms of \sum 1/k^{1+\epsilon}, so it seems like that series should converge as well. It may take much longer, but surely the terms eventually become so small that adding more of them on is pointless.
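For reference, the classical grouping argument (Oresme's proof) makes concrete why the terms shrinking to zero isn't enough on its own:

\sum 1/k = 1 + 1/2 + (1/3 + 1/4) + (1/5 + ... + 1/8) + ...
        \ge 1 + 1/2 + (1/4 + 1/4) + (1/8 + ... + 1/8) + ...
         = 1 + 1/2 + 1/2 + 1/2 + ...

Each bracketed block of 2^n consecutive terms contributes at least 1/2, so the partial sums exceed any bound even though the individual terms go to 0.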
I also saw that one can prove this with the integral test, but the integral test can't be the ultimate 'decider'; there must be a more fundamental reason.
Essentially, what I'm asking is: who, or what mathematical concept, determined the required rate of decay for convergence? It seems arbitrary to me.
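To make the difference in behaviour visible, here is a quick numerical sketch (a toy Python snippet of my own, not from the textbook; the cutoffs and the exponent 1.1 are arbitrary illustrative choices):

import math

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

for n in (10**2, 10**4, 10**6):
    harmonic = sum(1.0 / k for k in range(1, n + 1))
    p_series = sum(1.0 / k**1.1 for k in range(1, n + 1))
    # The harmonic partial sum tracks ln(n) + gamma, growing without bound
    # but extremely slowly; the p-series partial sum levels off toward zeta(1.1).
    print(f"n = {n:>9}  sum 1/k = {harmonic:8.4f}  "
          f"ln(n)+gamma = {math.log(n) + EULER_GAMMA:8.4f}  "
          f"sum 1/k^1.1 = {p_series:8.4f}")

Even at n = 10^6 the harmonic sum has only reached about 14, yet it will eventually pass any number you name; the sum of 1/k^{1.1}, by contrast, can never exceed its finite limit \zeta(1.1).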