Is the following criterion sufficient for Convergence?

  • Thread starter: Quantumpencil
  • Tags: Convergence
Quantumpencil

Homework Statement



Say {p_n} is a sequence in R with |p_n - p_{n+1}| -> 0, and {p_n} is bounded. Is it true that {p_n} must converge?



Homework Equations





The Attempt at a Solution



Intuition: yes. In my attempts to find a counterexample, the sequences I found that diverged even though successive terms got closer (e.g. the partial sums of the harmonic series) were not bounded. From diagramming, it seems to me that if the range of p_n is bounded above and below, and the distance between successive terms shrinks, then the bounds should close in from both sides until the sequence converges.

The only trouble is that I can't prove this without assuming the sequence is monotonic (I can't get a contradiction out of the existence of sup p_n or inf p_n).

So I'd like to make sure, before I continue trying to prove it, that this is actually true.

EDIT: I actually think this is false. I don't have a counterexample yet, but I think you might be able to define a sequence which oscillates around two points... I'm just not sure whether I can do that without the oscillation being a convergent one.
 
You are wise to look hard for counterexamples before you start trying to prove. What about things like p_n=sin(sqrt(n))?
 
Ah, so you do in fact need the monotonicity assumption. That sequence is bounded, and as the argument increases, the difference between successive terms decreases (sqrt(10000) and sqrt(10001) are closer together than sqrt(10) and sqrt(11)), yet the sequence still oscillates back and forth between -1 and 1. Even though the argument varies more and more slowly, there is never any convergence.

You can show the divergence by exhibiting a divergent subsequence, and show the closeness of successive terms with a bit of algebra: since |sin x - sin y| <= |x - y|, we get |sin(sqrt(n+1)) - sin(sqrt(n))| <= sqrt(n+1) - sqrt(n) = 1/(sqrt(n+1) + sqrt(n)) -> 0.
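A quick numerical sketch of this counterexample (my illustration in Python, not from the thread): far out in the sequence, the successive differences of sin(sqrt(n)) are tiny, yet the sequence still takes values near both -1 and 1, so it cannot converge.

```python
import math

def p(n):
    # The counterexample discussed above: p_n = sin(sqrt(n))
    return math.sin(math.sqrt(n))

# Successive differences shrink, since
# |sin(sqrt(n+1)) - sin(sqrt(n))| <= sqrt(n+1) - sqrt(n) = 1/(sqrt(n+1) + sqrt(n)).
diffs = [abs(p(n + 1) - p(n)) for n in range(10**6, 10**6 + 100)]
print(max(diffs))  # on the order of 5e-4

# ...but the sequence still swings between values near -1 and near 1,
# so it has subsequences with different limits and cannot converge.
tail = [p(n) for n in range(10**6, 2 * 10**6)]
print(max(tail), min(tail))  # close to 1 and -1
```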

Good stuff.

That explains my difficulty with trying to prove this... lol.
 
Exactly. Oscillate between two points. Just do it slower and slower.
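One hedged sketch of that "slower and slower" idea (my own construction, not one spelled out in the thread): walk back and forth inside [0, 1] with step sizes 1/2, 1/3, 1/4, ..., reversing direction whenever the next step would leave the interval. The successive differences go to 0, the sequence stays bounded, but its turning points approach 0 and 1, so it diverges.

```python
def zigzag(n_terms):
    # Walk inside [0, 1] with step sizes 1/2, 1/3, 1/4, ...,
    # reversing direction whenever the next step would leave the interval.
    seq, x, direction = [0.0], 0.0, +1
    for n in range(1, n_terms):
        step = direction / (n + 1)
        if not 0.0 <= x + step <= 1.0:
            direction = -direction
            step = -step
        x += step
        seq.append(x)
    return seq

seq = zigzag(10**5)
# |seq[n] - seq[n-1]| = 1/(n+1) -> 0, yet the sequence keeps sweeping
# across almost all of [0, 1], so it diverges.
print(max(seq[10000:]) - min(seq[10000:]))  # still close to 1
```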
 
The simplest and most famous example is the sequence of partial sums of the harmonic series,
p_n = \sum_{k=1}^n \frac{1}{k}

The difference between successive terms is
p_{n+1} - p_n = \frac{1}{n+1}
which goes to 0 as n goes to infinity, yet the sequence diverges. (As noted above, though, it is unbounded, so it doesn't settle the bounded version of the question.)
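As a quick numerical check (my illustration in Python): the partial sums p_n = 1 + 1/2 + ... + 1/n, mentioned in the first post, have successive differences 1/(n+1) -> 0 yet grow without bound.

```python
# Partial sums of the harmonic series: differences shrink, sums keep growing.
def harmonic_partial_sums(n):
    sums, s = [], 0.0
    for k in range(1, n + 1):
        s += 1.0 / k
        sums.append(s)
    return sums

sums = harmonic_partial_sums(10**6)
print(sums[-1] - sums[-2])  # successive difference: 1/10**6, i.e. about 1e-06
print(sums[-1])             # ~14.39, roughly ln(10**6) + 0.5772 (Euler-Mascheroni)
```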
 