# Series and Sequence Proof

1. Jun 11, 2009

### JG89

1. The problem statement, all variables and given/known data

Prove that if the infinite series a_1 + a_2 + a_3 + ... converges to a value A and s_n = a_1 + a_2 + a_3 + ... + a_n, then the sequence:

(s_1 + s_2 + ... + s_N)/N also converges, and has the limit A.

2. Relevant equations

3. The attempt at a solution

Since s_n represents the n'th partial sum, then
s_1 + s_2 + s_3 + ... + s_N = a_1 + (a_1 + a_2) + (a_1 + a_2 + a_3) + ... + (a_1 + ... + a_N)

=

N*a_1 + (N-1)*a_2 + (N-2)*a_3 + ... + a_N

So our sequence looks like

(N*a_1 + (N-1)*a_2 + (N-2)*a_3 + ... + a_N)/N

Notice that the coefficients in front of the a_i form a monotonically decreasing sequence which goes to 0: N/N = 1, (N-1)/N, (N-2)/N, ..., 1/N

So now I use Abel's Test: "Let a_1 + a_2 + ... be an infinite series whose partial sums are bounded independently of n. Let p_1, p_2, ... be a sequence of positive numbers decreasing monotonically to 0. Then the infinite series p_1*a_1 + p_2*a_2 + ... converges."

Since my original series converges, its partial sums are bounded, and the coefficients in front of the a_i form a monotonically decreasing sequence going to 0, so by the test the infinite series converges...

I'm just having trouble proving that it converges to the value A. I have no idea where to start.
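As a quick numerical sanity check of the claim (not a proof, and using the geometric series a_n = 2^{-n} as my own choice of example, so A = 1), the averaged partial sums do seem to approach A, just more slowly than the partial sums themselves:

```python
# Sanity check (not a proof): for a_n = 2^(-n) the series sums to A = 1.
# Compare how fast s_N and the average (s_1 + ... + s_N)/N approach A.
A = 1.0
N = 5000

s_n = 0.0      # running partial sum s_n
running = 0.0  # running total s_1 + ... + s_n

for n in range(1, N + 1):
    s_n += 2.0 ** -n
    running += s_n

cesaro_N = running / N  # (s_1 + ... + s_N)/N

print(abs(s_n - A))       # essentially 0: the partial sums converge very fast here
print(abs(cesaro_N - A))  # about 1/N: the averages also converge, just more slowly
```

This only illustrates the statement for one convenient series, of course; it says nothing about why it holds in general.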

2. Jun 11, 2009

### Office_Shredder

Staff Emeritus
That's not right. You don't have a fixed $p_i$ for each $a_i$; instead they change with N, and each coefficient actually increases as N grows. Maybe you can make it work with some fiddling, but right now you don't actually have an infinite series being multiplied by a fixed decreasing sequence of numbers.

You might have better luck noticing that

$$\lim_{N \rightarrow \infty} \left( A - \sum_{n=1}^{N}a_n \right) = 0$$ and then try to compare that to $$A - \frac{s_1 + s_2 + \cdots + s_N}{N}$$

3. Jun 11, 2009

### JG89

I just can't see where to go with that...

I've tried to bound $$A - \frac{s_1 + s_2 + \cdots + s_N}{N}$$ above by
$$A - \sum_{n=1}^{N}a_n$$ but have had no luck. Thinking about it more, whether it's bounded above by that probably depends on whether the series has both positive and negative terms, and on how many of each, so it's better for me to abandon that idea...

4. Jun 11, 2009

### Dick

You know the s_i -> A. So for every epsilon there is an M such that |s_i - A| < epsilon for all i > M, right? Let B = sum(s_i for i=1 to M). Look at S_N = (B + s_{M+1} + ... + s_N)/N (which is just your original sum). Can you show that for N sufficiently large, |S_N - A| < 2*epsilon?

5. Jun 11, 2009

### JG89

I'll get back to this question tomorrow. I've been thinking about it the last two days, I need a break.

6. Jun 13, 2009

### JG89

I've been thinking about it, and this is what I have. For every positive epsilon there exists a positive integer M such that |s_i - A| < epsilon for all i > M. Let epsilon_i be the smallest possible epsilon (I know there is no such thing, but take it small enough that epsilon_i - |s_i - A| = 0.000000001) for |s_i - A|.

Now $$\frac{|s_1 - A| + |s_2 - A| + ... + |s_N - A|}{N} \ge \frac{|s_1 - A + s_2 - A + ... + s_N - A|}{N} = \frac{|s_1 + ... + s_N - NA|}{N} = | \frac{s_1 + ... + s_N}{N} - A|$$.

We also know that |s_1 - A| < epsilon_1, ..., |s_N - A| < epsilon_N, so, $$\frac{|s_1 - A| + |s_2 - A| + ... + |s_N - A|}{N} < \frac{ \epsilon_1 + \epsilon_2 + ... + \epsilon_N}{N}$$ and so we have:

$$| \frac{s_1 + ... + s_N}{N} - A| < \frac{ \epsilon_1 + \epsilon_2 + ... + \epsilon_N}{N}$$.

That's all I could come up with...

7. Jun 13, 2009

### Dick

That is a try, but not a very good one. Look back at my hint in post 4. Don't try to control the value of s_i for i<=M. They are completely out of control. They could be anything. But their sum is B. And you do know |s_i-A|<epsilon for i>M. Control each part separately. Hint: (N-M)/N goes to 1 as N->infinity and B/N goes to zero.
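To see the two pieces of this hint numerically (my own illustration, again assuming the concrete example s_n = 1 - 2^{-n}, so A = 1): fix epsilon, pick an M with |s_i - A| < epsilon for all i > M, and watch the head B/N shrink while the tail average stays epsilon-close to (N-M)/N * A.

```python
# Illustration of the hint (assumed example: s_n = 1 - 2^(-n), so A = 1).
A = 1.0
eps = 0.01
M = 7  # |s_i - A| = 2^(-i) < 0.01 for every i > 7

def s(n):
    return 1.0 - 2.0 ** -n

B = sum(s(i) for i in range(1, M + 1))  # head sum: a fixed number once M is fixed

for N in (10, 100, 1000, 10000):
    head = B / N                                       # -> 0 as N grows
    tail = sum(s(i) for i in range(M + 1, N + 1)) / N  # each term eps-close to A
    S_N = head + tail                                  # = (s_1 + ... + s_N)/N
    print(N, abs(S_N - A) < 2 * eps)  # fails for small N, holds once N is large
```

The point of the split is visible in the loop: the head contributes |B|/N, which dies off, and the tail contributes at most (N-M)/N * eps, which is below eps from the start.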

8. Jun 14, 2009

### JG89

It seems to me like you're saying that M is a fixed number. Surely M has to increase to infinity as epsilon gets smaller and smaller?

9. Jun 14, 2009

### Dick

Absolutely right. But first fix an epsilon and prove the difference between your sum and A can be made less than some multiple of epsilon. Sure, M will depend on epsilon in the end. But it gets really confusing if you take all of the limits at the same time. Do them one by one.

10. Jun 14, 2009

### JG89

Okay, here is what I've got:

Let epsilon be a fixed positive value, and let M be a positive integer such that |s_i - A| < epsilon if i > M.

$$\left| \frac{B + s_{M+1} + ... + s_N - (N-M)A}{N} \right| = \left| \frac{B + (s_{M+1} - A) + (s_{M+2} - A) + ... + (s_N - A)}{N} \right| \le \frac{|B| + |s_{M+1} - A| + |s_{M+2} - A| + ... + |s_N - A|}{N}$$.

Remembering that |s_{M+1} - A| < epsilon, |s_{M+2} - A| < epsilon, ... , |s_N - A| < epsilon, we now have:

$$\frac{|B| + |s_{M+1} - A| + |s_{M+2} - A| + ... + |s_N - A|}{N} < \frac{|B| + \epsilon + ... + \epsilon}{N} = \frac{|B|}{N} + \frac{(N-M) \epsilon}{N}$$.

Now, (N-M)/N = 1 - M/N < 1, so (N-M)(epsilon)/N < epsilon. Also, B is just the finite sum of the s_i up to i = M (remember that M is fixed right now), and so for any positive epsilon*, |B/N - 0| = |B|/N < epsilon* provided N is large enough.

Now we have: $$\frac{|B| + \epsilon + ... + \epsilon}{N} = \frac{|B|}{N} + \frac{(N-M) \epsilon}{N} < \epsilon^* + \epsilon \le \max(2 \epsilon^*, 2 \epsilon)$$

Implying that:

$$\left| \frac{B + s_{M+1} + ... + s_N - (N-M)A}{N} \right| = \left| \frac{B + s_{M+1} + ... + s_N}{N} - \left(1 - \frac{M}{N} \right) A \right| < \max(2 \epsilon^*, 2 \epsilon)$$.

Since 1 - M/N goes to 1 as N goes to infinity, (1 - M/N)A goes to A, and so my original sum can always be brought within the (fixed) 2*epsilon distance of a value that approaches A. But we can do this with any 2*epsilon we please, provided we fix M large enough and let N increase beyond all bounds. And so the limit of the original sum equals the limit of (N-M)A/N = (1 - M/N)A, which is A.

11. Jun 14, 2009

### Dick

You don't want the max of two epsilons there. The idea is to pick an N such that |B|/N < epsilon; since the other term is less than epsilon, your whole sum is then less than 2*epsilon. Since epsilon is arbitrary, you're all done, right?
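For the record, here is one way to compress the whole argument (my own consolidation of the steps above, folding the subtracted A's into the head so that no leftover $(1 - M/N)A$ appears): fix $\epsilon > 0$, choose $M$ with $|s_i - A| < \epsilon$ for all $i > M$, and write

$$\left| \frac{s_1 + \cdots + s_N}{N} - A \right| = \left| \frac{(s_1 - A) + \cdots + (s_M - A)}{N} + \frac{(s_{M+1} - A) + \cdots + (s_N - A)}{N} \right| \le \frac{C}{N} + \frac{(N-M)\epsilon}{N} < \frac{C}{N} + \epsilon,$$

where $C = |s_1 - A| + \cdots + |s_M - A|$ (my notation) is a fixed number once $M$ is fixed. Taking $N$ large enough that $C/N < \epsilon$ gives $\left| \frac{s_1 + \cdots + s_N}{N} - A \right| < 2\epsilon$, and since $\epsilon$ was arbitrary, the averages converge to $A$.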

12. Jun 14, 2009

### JG89

Finally...that was pretty difficult...

Just a question: is it really valid that I said my sum is always within an epsilon distance of (1 - M/N)A, and since that is always within an arbitrarily small distance of A (provided N is large enough), the limit of my sequence is A?

I've never seen that done before, so I was not too confident in my answer

13. Jun 14, 2009

### Dick

Well, yeah. The idea was to show the difference between (s_1 + s_2 + ... + s_N)/N and A can be made arbitrarily small for sufficiently large N, right? Isn't that what convergence means?

14. Jun 14, 2009

### JG89

Yup, that's exactly what it means. Thanks for the help :)