We assume that f(x), f'(x) and f''(x) are continuous on [a,b], and that for some \alpha \in (a,b) we have f(\alpha) = 0 and f'(\alpha) \neq 0. We show that if x_{0} is chosen close enough to \alpha, the iterates
x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})}
converge to \alpha.
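Just to see the iteration in action, here is a minimal numerical sketch; the test function f(x) = x^{2} - 2 (so \alpha = \sqrt{2}) and the starting point x_{0} = 1.5 are illustrative choices of mine, not part of the problem.

```python
# Minimal sketch of the Newton iteration above.
# f(x) = x^2 - 2 (so alpha = sqrt(2)) and x0 = 1.5 are
# illustrative choices, not part of the original problem.

def newton(f, fprime, x0, steps=6):
    x = x0
    for n in range(steps):
        x = x - f(x) / fprime(x)  # x_{n+1} = x_n - f(x_n)/f'(x_n)
        print(f"x_{n+1} = {x:.15f}")
    return x

newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5)
```

The number of correct digits roughly doubles at each step, which is exactly the quadratic error behaviour in the Taylor computation below.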
I tried to use Taylor's expansion of f(\alpha) centered at x_{n}, and I got to the identity
\alpha - x_{n+1} = -\frac{1}{2} f''(c_{n}) \frac{(\alpha - x_{n})^{2}}{f'(x_{n})}
where c_{n} lies between \alpha and x_{n} (the identity holds for every n, before taking any limit),
and I guess I want the right-hand side to tend to 0 as n \rightarrow \infty, which would give x_{n} \rightarrow \alpha. But I am not sure how to prove this.
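For completeness, here is the Taylor step written out in full (second-order expansion of f(\alpha) about x_{n}, remainder in Lagrange form, with c_{n} the same intermediate point as above):

0 = f(\alpha) = f(x_{n}) + f'(x_{n})(\alpha - x_{n}) + \frac{1}{2} f''(c_{n})(\alpha - x_{n})^{2}

Dividing by f'(x_{n}) (nonzero near \alpha) and using x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})} gives

\alpha - x_{n+1} = (\alpha - x_{n}) + \frac{f(x_{n})}{f'(x_{n})} = -\frac{1}{2} \frac{f''(c_{n})}{f'(x_{n})} (\alpha - x_{n})^{2}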