Newton–Raphson method - Finite difference method

SUMMARY

The discussion focuses on applying the Newton-Raphson method in conjunction with the finite difference method to solve nonlinear differential equations. It clarifies that the Newton-Raphson method is a tool for solving nonlinear algebraic equations, and that it is needed within implicit finite difference methods, such as the Adams-Moulton family, where a nonlinear equation must be solved at each time step. The conversation highlights the importance of distinguishing between explicit and implicit methods, noting that explicit methods such as Forward Euler and Adams-Bashforth do not require the Newton-Raphson approach. Additionally, one reply describes approximating the derivative in Newton's iteration with a finite difference, which, according to that poster, can even improve convergence.

PREREQUISITES
  • Understanding of finite difference methods
  • Familiarity with nonlinear algebraic equations
  • Knowledge of explicit vs. implicit numerical methods
  • Basic calculus, particularly differentiation
NEXT STEPS
  • Research the implementation of the Newton-Raphson method in implicit finite difference methods
  • Explore the Adams-Moulton method for solving nonlinear differential equations
  • Study the differences between explicit and implicit numerical methods
  • Learn about finite difference approximations for derivatives
USEFUL FOR

Mathematicians, engineers, and computational scientists involved in numerical analysis and solving differential equations, particularly those interested in advanced methods for nonlinear problems.

Excom
Hi

I am trying to solve a nonlinear differential equation using the finite difference method together with the Newton-Raphson method. Does anyone know where I can find some literature on the subject?

I am familiar with using the finite difference method to solve linear differential equations; it is the combination with the Newton-Raphson method that is new to me.

Thanks in advance
 
Hello Excom,

Finite difference methods (simple one-step methods such as Euler, Trapezoid, and Midpoint, more complex multi-step methods like the Adams families, multi-stage methods such as Runge-Kutta, etc. etc.) can ALL be used to solve both linear and non-linear ordinary differential equations (obviously, depending on the kind of differential system, some methods will perform better than others), but they are all used for solving a general IVP of the form:

\left. \begin{array}{l}
\dfrac{dy}{dx} = f(x,y) \\
y(x_0) = y_0
\end{array} \right\} \quad \text{(the IVP)}

(which may be a scalar equation or a system of equations), regardless of whether f is linear or not.
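
For concreteness, here is a minimal sketch in Python of the simplest explicit method, Forward Euler, applied to such an IVP; the right-hand side f(x, y) = -y^2 is just an illustrative non-linear choice, not something from this thread:

# Forward (explicit) Euler for y' = f(x, y), y(x0) = y0.
def forward_euler(f, x0, y0, h, n_steps):
    """March the explicit update y_{k+1} = y_k + h * f(x_k, y_k)."""
    x, y = x0, y0
    for _ in range(n_steps):
        y = y + h * f(x, y)   # explicit: only already-known values on the right
        x = x + h
    return x, y

# Nonlinear example: y' = -y^2, y(0) = 1, whose exact solution is y(x) = 1/(1 + x).
x_end, y_end = forward_euler(lambda x, y: -y**2, 0.0, 1.0, 0.01, 100)
print(x_end, y_end)   # y at x = 1 comes out close to the exact value 0.5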

Newton-Raphson is for solving non-linear algebraic equations, not differential equations. You will have to use Newton-Raphson (or any other technique for solving non-linear equations) within your finite difference method if the method is implicit, that is, if the unknown value at the current time-step appears inside f and must be solved for in terms of the values at previous time-steps. For example, Adams-Moulton methods (the simplest of which are Backward Euler and the Trapezoid rule) are implicit, so you will have to solve a non-linear algebraic equation (or system of equations) at each time-step... but Forward Euler, Adams-Bashforth, and the classical (explicit) Runge-Kutta methods are all explicit, so there is no non-linear equation to solve within the method and no need for Newton-Raphson :)
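
To make the implicit case concrete, here is a minimal sketch of a Newton-Raphson solve inside each time-step, using Backward Euler (the simplest implicit Adams-Moulton method) rather than a higher-order scheme; the test problem is the same illustrative y' = -y^2 as above:

# One Backward Euler step: solve g(y) = y - y_prev - h * f(x_new, y) = 0 with Newton-Raphson.
def backward_euler_step(f, dfdy, x_new, y_prev, h, tol=1e-10, max_iter=25):
    y = y_prev                                # initial guess: the previous value
    for _ in range(max_iter):
        g  = y - y_prev - h * f(x_new, y)     # residual of the implicit equation
        dg = 1.0 - h * dfdy(x_new, y)         # dg/dy, needed for the Newton update
        y_new = y - g / dg                    # Newton-Raphson update
        if abs(y_new - y) < tol:
            return y_new
        y = y_new
    return y

# Same nonlinear example: y' = -y^2 with df/dy = -2y, stepped from x = 0 to x = 1.
x, y, h = 0.0, 1.0, 0.1
for _ in range(10):
    x += h
    y = backward_euler_step(lambda s, v: -v**2, lambda s, v: -2.0 * v, x, y, h)
print(x, y)   # again close to the exact value y(1) = 0.5

The same structure carries over to higher-order Adams-Moulton methods; only the residual g changes.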

If you're still interested in Newton-Raphson, there are loads of resources on the net, just search on Google :) e.g. one link I found:
http://www.math.ubc.ca/~clarkson/Newtonmethod.pdf

Hope I could be of help, good luck with your non-linear differential equation! xD
 
Thanks for your help
 
please give me
 
In my case, I approximate the denominator term, f'(x), with a forward, backward, or central difference.

So, just to elaborate: if you have
f(x) = x^2
then
f(x+h) = (x+h)^2
f(x-h) = (x-h)^2
for some small h (the grid spacing).

Using the central difference, the Newton-Raphson iteration becomes
x[i+1] = x[i] - f(x[i]) / f'(x[i])
       = x[i] - f(x[i]) / ( ( f(x[i]+h) - f(x[i]-h) ) / (2*h) )
       = x[i] - x[i]^2 / ( ( (x[i]+h)^2 - (x[i]-h)^2 ) / (2*h) )
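
Here is a minimal sketch of that iteration in Python, using the same f(x) = x^2 and a small illustrative grid spacing h (the solver name and parameter defaults are my own choices):

# Newton-Raphson with f'(x) replaced by the central difference (f(x+h) - f(x-h)) / (2h).
def newton_central_difference(f, x0, h=1e-5, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        dfdx = (f(x + h) - f(x - h)) / (2.0 * h)   # central-difference estimate of f'(x)
        x_new = x - f(x) / dfdx                    # the usual Newton-Raphson update
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# The example above: f(x) = x^2, whose only root is x = 0.
root = newton_central_difference(lambda x: x**2, x0=1.0)
print(root)   # a value very close to 0

Note that x = 0 is a double root, so this particular iteration only halves the error each step; for a simple root the usual near-quadratic convergence of Newton-Raphson is retained, and for very small h round-off in the difference quotient can start to pollute the derivative estimate.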

For another example, e.g. f(x) = x^4 + x^3 + x + 5, I'm getting faster convergence with the finite-difference version of f'(x) than with the analytical version.

I have not come across a reference for this, but I don't see anything wrong with it.

All the best! :)
 
Take a look at this, I don't know if it can help you:
www.firavia.com/Newton.pdf
 
