eptheta said:
Darn, that's too bad, I'm never too good with graphic approximations...
If you haven't got a graphing calculator handy and have no chance of getting a decent approximation by hand-drawing it, you can use the intermediate value theorem. This is pretty simple. Say you have y=x^2-2 and want to find where y=0, i.e. x=sqrt(2). From your graph you estimate the root is between 1 and 2. To check, you find the value of y at x=1, f(1), and notice that it is less than zero; then you check f(2) and notice it is more than zero. So the root is somewhere between x=1 and x=2. Take x=1.5: f(1.5) is slightly more than zero, so take x=1.25: slightly less... so take halfway between 1.25 and 1.5, and so on. It's a slow process, but it will give you a good head start to kick off your Newton's method approximation.
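To make that halving step concrete, here's a minimal sketch in Python (my own illustration, not from the thread; the `bisect` helper and its argument names are made up), assuming f is negative at the left end of the bracket and positive at the right end:

```python
def bisect(f, lo, hi, steps=20):
    """Repeatedly halve the bracket [lo, hi], assuming f(lo) < 0 < f(hi)."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid   # root is in the upper half of the bracket
        else:
            hi = mid   # root is in the lower half of the bracket
    return (lo + hi) / 2

# f(1) = -1 < 0 and f(2) = 2 > 0, so the root of x^2 - 2 lies between 1 and 2
print(bisect(lambda x: x**2 - 2, 1, 2))   # roughly 1.41421, i.e. sqrt(2)
```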
Is there a threshold range for which Newton's method will return a particular value of x?
That is, if f(x)=sin(x), what is the range of values for x_n for which Newton's method will give me x=0 and not jump to x=+\pi or x=-\pi?
Is there a derivation for this range based on the function f(x)?
Thanks
I don't believe there is. The way Newton's method works is that, starting from your first approximation, it finds the value of the function at that x value, draws the tangent line there, and calculates the x value where that tangent cuts the x-axis. That x value is the next approximation, and it just keeps doing this over and over: x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}.
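A minimal sketch of that iteration in Python (my own illustration; the `newton` helper is a made-up name, and the derivative is passed in by hand rather than computed):

```python
def newton(f, df, x, steps=10):
    """Newton's method: replace x by the point where the tangent at x cuts the x-axis."""
    for _ in range(steps):
        x = x - f(x) / df(x)   # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x

# Same example as the bisection above: the root of x^2 - 2, starting from 1.5
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.5))   # roughly 1.41421356
```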
Say for sin(x), each step is x_{n+1}=x_n-\tan(x_n). If you take x=\pi/4\approx0.78 as your first approximation, then Newton's method brings back the value x=\pi/4-1\approx -0.21, and since that second value is closer to the true root, it will converge to x=0. If you instead took x=\pi/2-0.001, the tangent line from that point is nearly flat, so it travels a very long way before it touches the x-axis again to give the next approximation. Your next value would be x\approx -998, so obviously you're screwed there. But you can even take x=4.5 and get back x\approx -0.1, so yeah, just keeping to the rule of thumb that you need a decent first approximation will come in handy.
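To double-check those numbers, here's a quick sketch (again my own, with a made-up `newton_sin_step` helper) of the sin(x) step x_{n+1}=x_n-\tan(x_n) applied to the three starting values above:

```python
import math

def newton_sin_step(x):
    """One Newton step for f(x) = sin(x): x - sin(x)/cos(x) = x - tan(x)."""
    return x - math.tan(x)

for x0 in (math.pi / 4, math.pi / 2 - 0.001, 4.5):
    print(round(x0, 4), "->", round(newton_sin_step(x0), 4))
# pi/4        -> about -0.215  (heads straight for the root at 0)
# pi/2 - 1e-3 -> about -998.4  (nearly flat tangent shoots far off)
# 4.5         -> about -0.137  (still lands near 0 despite the poor guess)
```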
It really depends on the function too. Say you want to find the root x=0 of y=x^{1/3}. Applying Newton's method: x_{n+1}=x_n-\frac{x_n^{1/3}}{\frac{1}{3}x_n^{-2/3}}=x_n-3x_n=-2x_n. So whatever first value you use, the next one is always minus twice it: the iterates double in size and flip sign at every step, so they never converge to 0.
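A tiny sketch of that blow-up (my own illustration; the `cbrt` helper is only there to keep the cube root real when x goes negative):

```python
import math

def cbrt(x):
    """Real cube root, valid for negative x as well."""
    return math.copysign(abs(x) ** (1 / 3), x)

def newton_cbrt_step(x):
    """One Newton step for f(x) = x^(1/3); algebraically this is just -2x."""
    return x - cbrt(x) / ((1 / 3) * abs(x) ** (-2 / 3))

x = 0.1
for _ in range(5):
    x = newton_cbrt_step(x)
    print(x)   # -0.2, 0.4, -0.8, 1.6, -3.2: doubles and flips sign every step
```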