Why Does the Newton-Raphson Method Fail for Some Functions?


Homework Help Overview

The discussion revolves around the Newton-Raphson method and its potential failures when applied to certain functions. Participants explore various conditions under which the method may not converge or may produce erroneous results, particularly focusing on the implications of the function's derivatives.

Discussion Character

  • Conceptual clarification, Assumption checking, Exploratory

Approaches and Questions Raised

  • The original poster outlines specific scenarios where the Newton-Raphson method may fail, such as when the derivative is zero or when the initial guess is near critical points like maxima, minima, or points of inflection. Participants discuss the implications of these conditions and seek deeper understanding.

Discussion Status

Participants are actively engaging with the original poster's queries, offering insights and personal interpretations regarding the behavior of the method near critical points. Some have provided examples and analogies to illustrate their points, while others express uncertainty about specific scenarios, particularly regarding points of inflection.

Contextual Notes

There is a mention of a specific function where the derivative is not continuous, which raises further questions about convergence. Additionally, some participants reference external resources for further analysis of the Newton-Raphson method.

relinquished™
Hello. I've been asked to explain why the Newton-Raphson method fails for some functions. I came across a Numerical Analysis book (Kellison's) which states that the method may fail if

1.) f'(x)=0
2.) The initial value is taken at a maximum or minimum point,
3.) The initial value is taken at a point of inflection,
4.) The initial value is taken near a maximum point and a minimum point,
5.) The initial value is taken near a point of inflection.

Now I can explain (1). Newton's Method will fail since the iteration is given by

[itex] x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}[/itex]

so if f'(x_n) = 0 the quotient f(x_n)/f'(x_n) is undefined and the iteration cannot continue.

As for (2), since f'(x) = 0 at minima and maxima, the method fails there for the same reason as (1).

As for (3), I'm totally clueless. I don't see how a point of inflection (f''(x) = 0) relates to an iteration that only uses f'.

As for (4), I also know that choosing an initial value near a minimum or maximum might make the method "oscillate", but what I'm looking for are more concrete answers; something that can be related to f'(x), or something that resembles a proof.

As for (5), I have no idea why it may fail for an initial value near a point of inflection.
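The oscillation in (4) can be made completely concrete. Here is a small sketch (my own illustrative choice of function, not from Kellison): for f(x) = x^3 - 2x + 2 with initial value x0 = 0, the Newton iterates cycle between 0 and 1 forever and never reach the real root near x ≈ -1.77.

```python
# Illustrative (hypothetical) example: Newton's method falls into a
# 2-cycle for f(x) = x^3 - 2x + 2 starting at x0 = 0:
#   x1 = 0 - f(0)/f'(0) = 0 - 2/(-2) = 1
#   x2 = 1 - f(1)/f'(1) = 1 - 1/1   = 0
# so the iterates oscillate 0, 1, 0, 1, ... indefinitely.

def newton(f, fprime, x0, steps):
    """Return the list [x0, x1, ..., x_steps] of Newton iterates."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

f = lambda x: x**3 - 2*x + 2
fp = lambda x: 3*x**2 - 2

iterates = newton(f, fp, 0.0, 6)
print(iterates)  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
```

The cycle appears because x0 = 0 sits between the local maximum and local minimum of this cubic, so each tangent line lands the iterate back where the previous one started.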

All help is appreciated,

reli~
 
I'm still in the early stages of calc myself but...

For (5) I try to explain it to myself like this: imagine you're using Newton on a function that behaves a bit like x^2 - C as it heads for the x-axis, but with a couple of inflection points near the area you choose as your guess. If at your first guess the curve is concave up, the tangent to the curve will be steeper than the secant between the root you want and your first guess. So as you follow the tangent you move in the same direction as the secant, just not as fast (along the x-axis), and your guess takes you closer to the root; future guesses under favourable conditions like this get you closer still.

But now suppose you're unlucky on the second guess: the function has changed its concavity between the x value of your first guess and the x value of your second guess. Now it's the secant that is steeper than the tangent (unless the tangent changes direction), so every point on this tangent lies above the curve, and you hit the x-axis on the other side of your root, or further away from it. From this point onwards you might find yourself attracted to a different root, or worse still, sent on a wild goose chase:

http://img132.imageshack.us/img132/24/Newtonspy8.gif
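That "overshoot to the other side" can be pushed to an extreme. In the following sketch (a standard textbook-style example of my own choosing, not from this thread), f(x) = x^(1/3) has its only root at x = 0, which is also a point of inflection, and the Newton step there works out algebraically to x_{n+1} = -2 x_n, so every step doubles the distance to the root while flipping sides.

```python
# Hypothetical illustration: Newton on the real cube root f(x) = x^(1/3).
# Since f'(x) = (1/3) x^(-2/3), the update simplifies to
#   x_{n+1} = x_n - 3*x_n = -2*x_n,
# i.e. the iterates alternate sides of the root and diverge.
import math

def f(x):
    # real cube root (x ** (1/3) would go complex for negative x)
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def fprime(x):
    return (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)

x = 0.1
for _ in range(5):
    x = x - f(x) / fprime(x)
    print(x)  # roughly -0.2, 0.4, -0.8, 1.6, -3.2
```

No matter how close to the root the initial guess is (other than exactly 0), the iterates run away, because the tangent lines at an inflection of this shape always cross the axis on the far side of the root.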
 
Also if the derivative is not continuous at the root, then convergence may fail to occur.

Indeed, let f(0) = 0 and [itex]f(x) = x + x^2 \sin(2/x)[/itex] elsewhere.
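To see why that example is nasty, here is a quick numerical check (my own computation, assuming the function quoted above): for x ≠ 0, the derivative is f'(x) = 1 + 2x sin(2/x) - 2 cos(2/x), which keeps taking values near -1 and near +3 arbitrarily close to 0, so f' has no limit at the root and is not continuous there.

```python
# Sketch: the derivative of f(x) = x + x^2*sin(2/x) oscillates between
# values near -1 and near +3 on every neighbourhood of the root x = 0.
import math

def fprime(x):
    return 1.0 + 2.0 * x * math.sin(2.0 / x) - 2.0 * math.cos(2.0 / x)

# a point close to 0 where cos(2/x) = +1, giving f'(x) near -1 ...
a = 1.0 / (1000 * math.pi)      # 2/a = 2000*pi
# ... and a point just as close where cos(2/x) = -1, giving f'(x) near +3
b = 2.0 / (2001 * math.pi)      # 2/b = 2001*pi

print(fprime(a), fprime(b))  # roughly -1.0 and 3.0
```

Because the slope used in each Newton step can thus be anything between negative and steeply positive no matter how close the iterate gets to the root, the usual convergence argument (which needs f' continuous and nonzero near the root) breaks down.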
 
For 4/, when the initial value x0 is near a minimum or maximum, it may happen that [tex]f'(x_0) \approx 0[/tex]. Hence the division makes x1 considerably large: [tex]x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}[/tex]
Not really sure about 3/, and 5/...
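A quick numerical check of that point (my own example, not from the thread): take f(x) = cos(x), whose maximum sits at x = 0. Starting just beside the maximum, f'(x0) = -sin(x0) is almost 0, so a single Newton step throws the iterate roughly a thousand units away.

```python
# Hypothetical illustration: one Newton step on f(x) = cos(x) from an
# initial guess right next to the maximum at x = 0.  The near-zero
# derivative makes the tangent almost horizontal, so it crosses the
# x-axis very far from the starting point.
import math

x0 = 1e-3                                     # guess beside the maximum
x1 = x0 - math.cos(x0) / (-math.sin(x0))      # Newton step
print(x1)  # about 1000
```

Geometrically, a nearly flat tangent line has to travel a huge horizontal distance before it meets the x-axis, which is exactly the "considerably large x1" above.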
 
http://www.karlscalculus.org/NRbox.html makes a pretty good analysis of the method. There you will find answers to your questions.
 
