# Picard's Iteration

How do you use Picard's iteration to solve a system of two coupled ODEs, given initial conditions?

HallsofIvy
Homework Helper
Treat them as two separate equations, using both values in the calculations.

For a "one equation" Picard method, if you are solving the initial value problem dx/dt= f(x,t), with x(t0)= x0, you start by replacing x in f with $x_0$ and get $x_1(t)= x_0+ \int_{t_0}^t f(x_0,\tau)d\tau$, then $x_2(t)= x_0+ \int_{t_0}^t f(x_1(\tau),\tau)d\tau$, etc.
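As an illustration (my own sketch, not from the thread), the one-equation scheme can be carried out numerically by storing each iterate $x_k$ on a grid and approximating the integral with the cumulative trapezoidal rule. The name `picard_scalar` and all parameter choices here are hypothetical:

```python
import numpy as np

def picard_scalar(f, x0, t0, t1, n_iter=8, n_pts=201):
    """Picard iteration for dx/dt = f(x, t), x(t0) = x0, carried out on a grid.

    Each pass replaces the current iterate x_k by
        x_{k+1}(t) = x0 + integral from t0 to t of f(x_k(tau), tau) dtau,
    with the integral approximated by the cumulative trapezoidal rule.
    """
    t = np.linspace(t0, t1, n_pts)
    x = np.full_like(t, float(x0))        # x_0(t) = x0, the constant starting guess
    for _ in range(n_iter):
        v = f(x, t)                       # integrand evaluated on the grid
        steps = np.diff(t) * (v[:-1] + v[1:]) / 2.0
        x = x0 + np.concatenate(([0.0], np.cumsum(steps)))
    return t, x

# dx/dt = x, x(0) = 1, whose exact solution is e^t
t, x = picard_scalar(lambda x, t: x, x0=1.0, t0=0.0, t1=1.0)
```

After a handful of passes the grid values closely agree with $e^t$, consistent with the convergence guaranteed by the existence and uniqueness theorem.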

With two equations, say dx/dt= f(x,y,t) and dy/dt= g(x,y,t), with x(t0)= x0, y(t0)= y0, start by letting x and y in those functions be x0, y0 and integrate to get $x_1(t)= x_0+ \int_{t_0}^t f(x_0, y_0, \tau) d\tau$ and $y_1(t)= y_0+ \int_{t_0}^t g(x_0, y_0, \tau)d\tau$, then $x_2(t)= x_0+ \int_{t_0}^t f(x_1, y_1, \tau)d\tau$, $y_2(t)= \int_{t_0}^t g(x_1,y_1,\tau)d\tau$, etc.
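A sketch of the two-equation version (again my own illustration; `picard_system` is a made-up name). The tuple assignment in the loop evaluates both right-hand sides before assigning, so x and y are updated simultaneously from the previous pair, as in the scheme above:

```python
import numpy as np

def picard_system(f, g, x0, y0, t0, t1, n_iter=8, n_pts=201):
    """Picard iteration for dx/dt = f(x, y, t), dy/dt = g(x, y, t),
    with x(t0) = x0, y(t0) = y0, updating x and y simultaneously."""
    t = np.linspace(t0, t1, n_pts)
    x = np.full_like(t, float(x0))
    y = np.full_like(t, float(y0))

    def cumtrap(v):
        # cumulative trapezoidal integral of v from t0 up to each grid point
        steps = np.diff(t) * (v[:-1] + v[1:]) / 2.0
        return np.concatenate(([0.0], np.cumsum(steps)))

    for _ in range(n_iter):
        # both right-hand sides use the OLD iterates (simultaneous update)
        x, y = x0 + cumtrap(f(x, y, t)), y0 + cumtrap(g(x, y, t))
    return t, x, y

# dx/dt = y, dy/dt = -x, x(0) = 1, y(0) = 0, whose solution is x = cos t, y = -sin t
t, x, y = picard_system(lambda x, y, t: y, lambda x, y, t: -x, 1.0, 0.0, 0.0, 1.0)
```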

(Note you can also do: $x_1(t)= x_0+ \int_{t_0}^t f(x_0, y_0, \tau) d\tau$ and $y_1(t)= y_0+ \int_{t_0}^t g(x_1, y_0, \tau)d\tau$, $x_2(t)= x_0+ \int_{t_0}^t f(x_1(\tau),y_1(\tau),\tau)d\tau$, $y_2(t)= y_0+ \int_{t_0}^t g(x_2(\tau),y_1(\tau),\tau)d\tau$, etc., using each new value as soon as we have it. That will give a slightly different answer but still a valid approximation to the true solution.)
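The "use each new value as soon as we have it" variant changes only the loop body: x is updated first, and the update of y then reads the new x immediately. (This sequential sweep is analogous to Gauss–Seidel iteration, versus the simultaneous Jacobi-style update; that framing is my own, not from the thread.) A minimal sketch with a hypothetical `picard_seq`:

```python
import numpy as np

def picard_seq(f, g, x0, y0, t0, t1, n_iter=8, n_pts=201):
    """Picard iteration for dx/dt = f(x, y, t), dy/dt = g(x, y, t),
    where each new iterate is used as soon as it is available."""
    t = np.linspace(t0, t1, n_pts)
    x = np.full_like(t, float(x0))
    y = np.full_like(t, float(y0))

    def cumtrap(v):
        # cumulative trapezoidal integral of v from t0 up to each grid point
        steps = np.diff(t) * (v[:-1] + v[1:]) / 2.0
        return np.concatenate(([0.0], np.cumsum(steps)))

    for _ in range(n_iter):
        x = x0 + cumtrap(f(x, y, t))   # new x computed first...
        y = y0 + cumtrap(g(x, y, t))   # ...and already used in the update of y
    return t, x, y

# Same test problem: x' = y, y' = -x, x(0) = 1, y(0) = 0 (x = cos t, y = -sin t)
t, x, y = picard_seq(lambda x, y, t: y, lambda x, y, t: -x, 1.0, 0.0, 0.0, 1.0)
```

Both variants converge to the same true solution; the intermediate iterates simply differ.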

Is there a name for the latter? Why does the latter also work?

Is there a reference for these methods? Most elementary textbooks on ODEs that I know of don't cover Picard's method.

, then $x_2(t)= x_0+ \int_{t_0}^t f(x_1, y_1, \tau)d\tau$, $y_2(t)= \int_{t_0}^t g(x_1,y_1,\tau)d\tau$, etc.

You meant:

, then $x_2(t)= x_0+ \int_{t_0}^t f(x_1, y_1, \tau)d\tau$, $y_2(t)= y_0 + \int_{t_0}^t g(x_1,y_1,\tau)d\tau$, etc.

Right?

HallsofIvy
Homework Helper
Oh, yes! Forgot all about the $y_0$! Thank you.

HallsofIvy
Homework Helper
Is there a name for the latter? Why does the latter also work?

Is there a reference for these methods? Most elementary textbooks on ODEs that I know of don't cover Picard's method.
Actually, most elementary textbooks mention Picard's method (perhaps not by that name) in reference to the "Existence and Uniqueness Theorem" for initial value problems. It is Picard's method that gives the fixed-point formula needed for the proof; indeed, I believe Picard himself developed it for that purpose. They don't "cover" it because it has a very slow convergence rate. There are much better methods for solving differential equations.

For example, to solve the problem x'(t)= x, with x(0)= 1, you take as your "first approximation" $x_0(t)= 1$ so that x'(t)= dx/dt= 1 and, integrating both sides, $x_1(t)= 1+ \int_0^t 1\, d\tau= 1+ t$.

Now, take $x_1(t)= 1+ t$ so that x'= dx/dt= 1+ t and, integrating both sides, $x_2(t)= 1+ \int_0^t (1+ \tau)\,d\tau= 1+ t+ (1/2)t^2$.

Now, take $x_2(t)= 1+ t+ (1/2)t^2$ so that $x'= dx/dt= 1+ t+ (1/2)t^2$ and, integrating both sides,
$$x_3(t)= 1+ \int_0^t \left(1+ \tau+ (1/2)\tau^2\right)d\tau= 1+ t+ (1/2)t^2+ (1/6)t^3.$$

At this point you should be able to see that if you continued this forever, you would get the Maclaurin series expansion of $e^t$, which is, in fact, the solution to this problem, but you are going to take an ungodly long time getting there!
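The worked iteration above can be reproduced exactly by integrating the polynomial iterates term by term; here is a small sketch of my own, using exact rational coefficients:

```python
from fractions import Fraction

# Represent x_k(t) as a coefficient list [c0, c1, ...] meaning c0 + c1*t + c2*t^2 + ...
# For x' = x, x(0) = 1, Picard gives x_{k+1}(t) = 1 + integral_0^t x_k(tau) dtau:
# integrate term by term (t^i -> t^(i+1)/(i+1)) and prepend the constant term 1.
x = [Fraction(1)]                 # x_0(t) = 1
for _ in range(5):
    x = [Fraction(1)] + [c / (i + 1) for i, c in enumerate(x)]

print(x)   # coefficients 1, 1, 1/2, 1/6, 1/24, 1/120: the Maclaurin series of e^t
```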

So if you wanted to find a solution valid for, say, $|x| < 0.5$, how would you do this?