gottfried
$\sum \frac{1}{n}$ diverges but $\sum \frac{1}{n^2}$ converges, yet both corresponding sequences converge to $0$. So at what rate does a sequence have to converge to $0$ for the corresponding series to converge?
Or, asked differently: what is the largest value of $x$ for which $\sum \frac{1}{n^x}$ diverges?
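For reference, the standard integral-test comparison (a well-known argument, added here as a worked sketch rather than part of the original question) makes the boundary explicit:

$$
\int_1^\infty \frac{dt}{t^x} =
\begin{cases}
\dfrac{1}{x-1} & \text{if } x > 1 \quad \text{(converges)},\\[6pt]
\displaystyle\lim_{T \to \infty} \ln T = \infty & \text{if } x = 1 \quad \text{(diverges)},
\end{cases}
$$

and for $x < 1$ each term satisfies $\frac{1}{n^x} \ge \frac{1}{n}$, so the series diverges by comparison with the harmonic series. Under this argument the critical exponent is $x = 1$: $\sum \frac{1}{n^x}$ diverges for every $x \le 1$ and converges for every $x > 1$, so $x = 1$ is the largest value for which it diverges.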