Convergence of Divergent Series Whose Sequence Has a Limit
Homework Statement
Suppose ∑a_{n} is a series whose terms satisfy lim a_{n} = L ≠ 0. This series diverges, since its terms do not tend to 0. Now form the new series ∑(a_{n} - L). My question is this: is there some sufficient condition we could place solely on a_{n} (or perhaps on its partial sums) that guarantees the new series converges, without calculating the limit or referencing it beyond assuming that it exists?
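A quick numeric sketch (my own illustrative example, not part of the original question) suggests that any such condition has to control the *rate* at which a_{n} approaches L: with L = 1, taking a_{n} = 1 + 1/n² makes ∑(a_{n} - L) = ∑1/n² converge, while a_{n} = 1 + 1/n makes it the divergent harmonic series, even though both sequences have the same limit.

```python
import math

L = 1.0

def partial_sum(a, N):
    """Partial sum of (a(n) - L) for n = 1..N."""
    return sum(a(n) - L for n in range(1, N + 1))

fast = lambda n: 1 + 1 / n**2   # approaches L = 1 quickly
slow = lambda n: 1 + 1 / n      # approaches L = 1 slowly

for N in (10**2, 10**4, 10**6):
    print(N, partial_sum(fast, N), partial_sum(slow, N))
# The first column of sums stabilizes near pi^2/6 ~ 1.6449;
# the second keeps growing like log(N).
```

So knowing only that lim a_{n} exists cannot be enough; the two sequences above are indistinguishable at that level of information.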
Homework Equations
I can't think of anything here. I could list all of the convergence tests from Ch. 3 of Baby Rudin, but none of them seems wholly relevant to the way I phrased the question. It feels like this is probably something trivial from a higher-level analysis course or analytic number theory course that I haven't had the chance to take yet.
There's one thing that seems possibly relevant (from Rudin):
Theorem (Cauchy condensation). Suppose a_{1} ≥ a_{2} ≥ a_{3} ≥ ... ≥ 0. Then the series ∑_{n=1}^{∞} a_{n} converges if and only if the series ∑_{k=0}^{∞} 2^{k}a_{2^{k}} converges.
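To make the theorem concrete (my own example, assuming the monotone non-negative hypothesis), apply it to b_{n} = 1/n²: the condensed series is ∑ 2^{k}/4^{k} = ∑ (1/2)^{k} = 2, a convergent geometric series, so ∑ 1/n² converges as well. A numeric check:

```python
import math

# Cauchy condensation applied to b_n = 1/n^2 (decreasing, nonnegative).
# Condensed series: sum_k 2^k * b(2^k) = sum_k 2^k / 4^k = sum_k (1/2)^k = 2.
def b(n):
    return 1 / n**2

condensed = sum(2**k * b(2**k) for k in range(60))   # geometric, close to 2
direct = sum(b(n) for n in range(1, 10**6))          # close to pi^2/6 ~ 1.6449
print(condensed, direct)
```

Both sums are finite, consistent with the "if and only if" in the theorem; note the condensed sum converges to a different value, since the test only transfers convergence, not the sum itself.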
The Attempt at a Solution
I really don't know how to approach this without introducing L into the proof somehow, which is exactly what I want to avoid.
I thought to impose some simple conditions first to reduce the problem, like making the sequence monotonic and non-negative. With the theorem above, I was thinking of introducing some sort of scaling, maybe via the 2^{k} factor, but I don't know.
I thought the root test might help, but since a_{n} → L ≠ 0 we get |a_{n}|^{1/n} → 1 (because log|a_{n}|/n → 0), so the test is inconclusive for ∑a_{n} no matter what L is; and applying it to ∑(a_{n} - L) references the limit anyway.
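The point about the root test can be checked numerically (my own sketch, using the arbitrary sequence a_{n} = L + 1/n as an example): for any limit L > 0, the n-th roots drift toward 1, so the test never decides anything here.

```python
# If a_n -> L > 0, then a_n ** (1/n) -> 1, since log(a_n)/n -> 0.
# The root test applied to sum a_n is therefore inconclusive for every L > 0.
for L in (0.5, 2.0, 10.0):
    a = lambda n: L + 1 / n          # any sequence tending to L works here
    for n in (10, 1000, 100000):
        print(L, n, a(n) ** (1 / n))
# In every case the nth roots approach 1 as n grows.
```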