mcastillo356

- TL;DR Summary
- I would like to understand this theorem, or at least its statement. The textbook says the proof relies on the Mean-Value Theorem but does not provide it, because it "is of little practical use". My question is simply to understand why, if ##K/(2L)<1##, the theorem shows that ##x_n## converges quickly to ##r## once ##n## becomes large enough that ##|x_n-r|<1##.

Hi, PF

Sometimes it is not easy to find roots of functions. Newton gave a nice clue: the Newton's Method formula ##x_{n+1}=x_n-\dfrac{f(x_n)}{f'(x_n)}##. Now that I have understood and practiced it, my concern is to comprehend what I've sketched in the summary. This is all taken from *Calculus*, 9th edition, by Robert A. Adams and Christopher Essex.
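For concreteness, here is a minimal sketch of the iteration formula above in Python. The function names (`newton`, `f`, `fprime`) and the stopping rule (quit when successive iterates agree to within `tol`) are my own choices for illustration, not anything from the textbook.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until two successive
    iterates agree to within tol (or max_iter steps are taken)."""
    x = x0
    for _ in range(max_iter):
        x_next = x - f(x) / fprime(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: approximate the root r = sqrt(2) of f(x) = x^2 - 2,
# starting from the initial guess x0 = 1.5.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5)
```

With this starting guess the iterates settle on ##\sqrt{2}## after only a handful of steps, which is the fast convergence the theorem below quantifies.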

**Theorem 2 Error bounds for Newton's Method** The following theorem gives sufficient conditions for the Newton approximations to converge to a root ##r## of the equation ##f(x)=0## if the initial guess ##x_0## is sufficiently close to the root ##r##.

Suppose that ##f##, ##f'## and ##f''## are continuous on an interval ##I## containing ##x_n##, ##x_{n+1}##, and a root ##x=r## of ##f(x)=0##. Suppose also that there exist constants ##K>0## and ##L>0## such that for all ##x## in ##I## we have

(i) ##|f''(x)|\leq{K}## and

(ii) ##|f'(x)|\geq{L}##

Then

(a) ##|x_{n+1}-r|\leq{\dfrac{K}{2L}|x_{n+1}-x_n|^2}## and

(b) ##|x_{n+1}-r|\leq{\dfrac{K}{2L}|x_n-r|^2}##

(i) and (ii) assert that near ##r## the slope of ##y=f(x)## is not too small in size and does not change too rapidly. If ##K/(2L)<1##, the theorem shows that ##x_n## converges quickly to ##r## once ##n## becomes large enough that ##|x_n-r|<1##.
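One way to see bound (b) in action is to check it numerically on a concrete example. Take ##f(x)=x^2-2## on ##I=[1,2]##: there ##|f''(x)|=2##, so we may take ##K=2##, and ##|f'(x)|=2x\geq 2##, so ##L=2##, giving ##K/(2L)=1/2<1##. The sketch below (my own choice of example, not from the textbook) verifies that each new error is at most ##\tfrac{1}{2}## times the square of the previous one, which is why the error collapses so quickly once it drops below 1.

```python
import math

# f(x) = x^2 - 2 on I = [1, 2]: |f''(x)| = 2 = K and |f'(x)| = 2x >= 2 = L,
# so K/(2L) = 1/2 and bound (b) reads |x_{n+1} - r| <= (1/2)|x_n - r|^2.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
r = math.sqrt(2)
C = 0.5  # the constant K/(2L)

x = 1.5  # initial guess inside I
for n in range(3):
    x_next = x - f(x) / fp(x)
    err, err_next = abs(x - r), abs(x_next - r)
    # Check bound (b): the new error is at most C times the old error squared.
    assert err_next <= C * err * err
    x = x_next
```

Only three iterations are checked because after that the errors reach the limits of double-precision arithmetic, where rounding can mask the exact inequality.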

Thanks in advance!