Proving the Divergence of a Sequence

The discussion revolves around proving the divergence of the sequence defined by xn = (-1)^n. The initial approach involved assuming convergence and applying the epsilon definition, but the user encountered difficulties in demonstrating a contradiction. Key points include recognizing that the sequence has two convergent subsequences—one tending to 1 and the other to -1—indicating divergence. The conversation also touches on the formal definition of convergence and provides a method for proving divergence by contradiction, emphasizing the use of specific epsilon values. The thread concludes with a question about proving the limit of the absolute value of a sequence, highlighting the relevance of the triangle inequality in this context.
Claire84
Hey there, one of our homework questions is to prove that a certain sequence is divergent, where xn = (-1)^n for every natural n. I started off by assuming that it was in fact convergent, so I wrote that mod(xn - l) < e, where e is any real number greater than zero, and this holds for any n > n_0. But from the one example we did in class, it seems you can choose your e to make things a little easier for yourself and then choose k and k+1 both greater than n_0 to allow you to obtain the contradiction that you need. I mean, I started to try things like mod(xn - l) < 1 and then chose mod(xk - l) < 1 and mod(xk+2 - l) < 1. Then I had 2 = mod((k+2) - k), which was equal to mod(xk+2 + xk), but this clearly doesn't seem to get me anywhere (well, not from where I'm sitting). I've been trying stuff like this but can't seem to get anything to work. Could someone please put me in the right direction? Thanks!

Btw, when I've written stuff like xk+2 I just mean x subscript k+2 :smile:
 
It has a convergent subsequence: for n even the subsequence tends to 1, and the odd terms converge to -1. Thus the sequence cannot converge, for if it did, any two subsequences would have to converge, and converge to the same thing. No need to use epsilon, but we can do it that way if we must, it's just messy. The start there is to suppose it converges to x; if x is positive, do something with e = 1/2 and the odd terms, and if x is negative, do it with e = 1/2 and the even terms.
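
A minimal sketch of that subsequence argument, written out here for reference; it only uses the standard fact that every subsequence of a convergent sequence has the same limit:

x_{2n} = (-1)^{2n} = 1 \to 1, \qquad x_{2n+1} = (-1)^{2n+1} = -1 \to -1.

If x_n \to L, both subsequences would force L = 1 and L = -1, which is impossible, so (x_n) diverges.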
 
Pfft why do they give us a messy one to prove? We're only analysis babies!

So I've said mod(xn - l) < 1/2 for all n > n_0, where l is what it's converging to (just the notation I'm used to).
Now we take mod(xk - l) < 1/2 and mod(xk+1 - l) < 1/2.

Taking first the situation where k is even (the case where the limit is negative), we have

mod(1 + l) < 1/2 and mod(-1 + l) < 1/2

Then I have 1 = mod((1+l) - l) < mod((1+l) + (-l+1)) < mod(1+l) + mod(-l+1) < 1/2 + 1/2 < 1, which obviously doesn't make sense, but I think this is very wrong. Oops.
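
For comparison, here is one clean way the intended contradiction can be written out (a sketch only, taking the supposed limit l, epsilon = 1/2, and consecutive indices k and k+1 beyond N):

2 = |x_k - x_{k+1}| = |(x_k - l) - (x_{k+1} - l)| \le |x_k - l| + |x_{k+1} - l| < \tfrac{1}{2} + \tfrac{1}{2} = 1,

and 2 < 1 is absurd, so no limit l can exist.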
 
Remember your order of quantifiers.

to show x(n) converges means

there is an x

such that

for all e>0

there is an N(e)

such that

for all n>N(e)

|x(n) - x| <e


To show it does not converge

for all x

there is an e > 0

such that

for all N

there is an n > N

with |x(n) - x| >= e


So to show it does not converge (without using our brains) we say: suppose x is arbitrary.

case 1: x nonnegative. Let e = 1/2; then for any N, take the next odd number greater than N, say it is n. Then |x_n - x| > 1/2 because x_n is -1 and every nonnegative number is more than 1/2 away from -1.

case 2: x negative. Let e = 1/2; then for any N, take the next even number after N, call it n. Then |x_n - x| > 1/2 because every negative number is more than 1/2 away from 1.
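
Assembled into one statement, the two cases give a sketch like this (my paraphrase of the argument just described): fix any candidate limit x and take \epsilon = 1/2.

\text{If } x \ge 0:\ \text{pick odd } n > N,\quad |x_n - x| = |-1 - x| = 1 + x \ge 1 > \tfrac{1}{2}.
\text{If } x < 0:\ \text{pick even } n > N,\quad |x_n - x| = |1 - x| = 1 - x > 1 > \tfrac{1}{2}.

So for every candidate x there is an \epsilon > 0 that defeats the definition of convergence, i.e. the sequence diverges.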
 
If you want to be formal:

Assume (by contradiction) that the series converges to some value s. Then given any \epsilon > 0 there must be some N such that n > N \rightarrow |s - \sum_{i=1}^{n}(-1)^i| < \epsilon. Now if \epsilon < 0.5, consider that |\sum_{i=1}^{N+2}(-1)^i - \sum_{i=1}^{N+1}(-1)^i| = 1, and that \sum_{i=1}^{N+2}(-1)^i = s + e_1 and \sum_{i=1}^{N+1}(-1)^i = s - e_2 where |e_1|, |e_2| < \epsilon, so by the triangle inequality |e_1 + e_2| < 2\epsilon < 1. But by substitution we also have |\sum_{i=1}^{N+2}(-1)^i - \sum_{i=1}^{N+1}(-1)^i| = |s + e_1 - (s - e_2)| = |e_1 + e_2| = 1. This is a contradiction, therefore the initial assumption must be incorrect.

If you understand this, you should have no trouble generalizing it to include all series \sum a_i where \lim_{i \rightarrow \infty} a_i \neq 0.
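
For what it's worth, the generalization mentioned here is the usual n-th term test; a one-line sketch not spelled out in the thread: if \sum a_i converges with partial sums S_n \to s, then

a_n = S_n - S_{n-1} \to s - s = 0,

so any series whose terms do not tend to 0 must diverge.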
 
but the question wasn't on series.
 
Okies, thanks for that! I'm pleasantly surprised by how much simpler it was than I first thought.

Just one more ickle question. If we have xn tending to l as n tends to infinity, how do we prove that mod(xn) tends to mod(l) as n tends to infinity?

So we have mod(xn-l)<e for all n>N

Then I put mod(xn - l) >= mod(xn) - mod(l), although that was a tad pointless I think... Then I put mod(xn - l) >= mod(mod(xn) - mod(l)), but I'm not sure if that's valid or not...
 
Use the following inequality:

||x| - |y|| <= |x - y|
 
Thanks, I didn't know that existed. I'd only ever seen the triangle one before, but never that one. Is there any way of proving it? Sorry, I'm being a pain. Thanks again! :wink:
 
It follows from the triangle inequality:

|x| = |x - y + y| <= |x - y| + |y|

so |x| - |y| <= |x - y|

and by symmetry

|y| - |x| <= |y - x| = |x - y|

so -|x - y| <= |x| - |y| <= |x - y|


i.e. ||x| - |y|| <= |x - y|
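
Applied back to the earlier question (a sketch of the final step, not stated explicitly above): if x_n \to l, then given \epsilon > 0 choose N with |x_n - l| < \epsilon for all n > N, and the inequality just proved gives

||x_n| - |l|| \le |x_n - l| < \epsilon \quad \text{for all } n > N,

so |x_n| \to |l|.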
 
