analysis001
Homework Statement
Consider the sequence [itex]\{a_n\} \subset \mathbb{R}[/itex] which is recursively defined by [itex]a_{n+1} = f(a_n)[/itex]. Prove that if there is some [itex]L \in \mathbb{R}[/itex] and a [itex]0 \leq c < 1[/itex] such that [itex]\left|\frac{a_{n+1}-L}{a_n-L}\right| < c[/itex] for all [itex]n \in \mathbb{N}[/itex], then [itex]\lim_{n\rightarrow\infty} a_n = L[/itex].
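A quick numerical sketch may help make the hypothesis concrete. The map, fixed point, and starting value below are my own assumed example, not part of the problem: [itex]f(x) = x/2 + 1[/itex] has fixed point [itex]L = 2[/itex], and [itex]|f(x) - L| = \frac{1}{2}|x - 2|[/itex], so the ratio [itex]\left|\frac{a_{n+1}-L}{a_n-L}\right|[/itex] equals [itex]1/2 < 1[/itex] at every step.

```python
# Assumed example map with fixed point L = 2; here the contraction
# ratio |a_{n+1} - L| / |a_n - L| is exactly 1/2 for every n.
def f(x):
    return x / 2 + 1

L = 2.0
a = 10.0  # arbitrary starting point a_0
errors = []
for _ in range(20):
    a = f(a)
    errors.append(abs(a - L))

# Each error is half the previous one, so a_n converges to L geometrically.
ratios = [errors[i + 1] / errors[i] for i in range(len(errors) - 1)]
print(errors[-1])   # tiny: the error has shrunk by a factor of 2 each step
print(max(ratios))  # 0.5
```

Running this shows the error [itex]|a_n - L|[/itex] being cut in half at every iteration, which is exactly the behavior the bound [itex]|a_{n+1}-L| < c\,|a_n-L|[/itex] forces in general.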
Homework Equations
Definition of convergence: Suppose [itex](X,d)[/itex] is a metric space and let [itex]\{a_n\} \subseteq X[/itex] denote a sequence in [itex]X[/itex]. We say [itex]\{a_n\}[/itex] converges in [itex]X[/itex] if there is an [itex]L \in X[/itex] such that for every ε>0 there is an [itex]N' \in \mathbb{N}[/itex] so that [itex]n' \geq N'[/itex] implies [itex]d(a_{n'}, L) < \varepsilon[/itex]; in that case [itex]\lim_{n\rightarrow\infty} a_n = L[/itex].
The Attempt at a Solution
Take ε>0. By hypothesis [itex]\left|\frac{a_{n+1}-L}{a_n-L}\right| < c < 1[/itex], which implies [itex]|a_{n+1}-L| < c\,|a_n-L| < |a_n-L|[/itex]. If [itex]n' = n+1[/itex] then [itex]N' = n[/itex]. If [itex]0 < \varepsilon < |a_{n+1}-L|[/itex] then [itex]d(a_n, L) = |a_n - L| < \varepsilon[/itex], which by definition implies [itex]\lim_{n\rightarrow\infty} a_n = L[/itex].
I'm not really sure this is right. If anyone could tell me whether anything is wrong with it, that would be great! Thanks.