roldy
Homework Statement
Consider dy/dx = x + y, a function of both x and y, subject to the initial condition y(x0) = y0.
Use a Taylor series to determine y(x0 + Δx) to 4th-order accuracy.
Initial condition: x0 = 0, y(x0) = 1
Step size: Δx = 0.1
Show 5 significant digits in the answer.
Homework Equations
ε = O(Δx^5)
Do the calculations for only one step.
The Attempt at a Solution
dy/dx=f(x,y)
Taylor series:
y(x0 + Δx) = y(x0) + Δx·f(x0, y(x0)) + ε
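For reference, I think the expansion written out to the required order should look like this (the higher derivatives would come from differentiating dy/dx = f(x, y) repeatedly, if I understand the method correctly):

```latex
y(x_0 + \Delta x) = y(x_0) + \Delta x\, y'(x_0)
                  + \frac{(\Delta x)^2}{2!}\, y''(x_0)
                  + \frac{(\Delta x)^3}{3!}\, y'''(x_0)
                  + \frac{(\Delta x)^4}{4!}\, y''''(x_0)
                  + O(\Delta x^5)
```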
My solution:
f(x0, y(x0)) = f(0, 1) = 0 + 1 = 1
y(0 + 0.1) = 1 + 0.1(1) + 0.00001 = 1.10001
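A quick check of the arithmetic for the step exactly as I computed it (a minimal sketch; it only includes the first-order term, and it takes the epsilon term to be Δx^5, which is my own assumption):

```python
# Reproduce the single step as written above: first-order term only,
# with the epsilon term taken as dx**5 (my assumption).
x0, y0 = 0.0, 1.0
dx = 0.1

def f(x, y):
    return x + y  # right-hand side of dy/dx = f(x, y)

y_next = y0 + dx * f(x0, y0) + dx**5
print(f"{y_next:.5f}")  # prints 1.10001
```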
Does this seem correct? It feels like I missed something, but on the other hand it makes sense. Did I miss something or mess up a step?
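As a sanity check, I believe the exact solution of this IVP is y(x) = 2e^x − x − 1 (it satisfies y' = x + y and y(0) = 1), so one could compare the one-step result against it. A minimal sketch of that comparison, under those assumptions:

```python
import math

# Compare the first-order Taylor step with what I believe is the exact
# solution, y(x) = 2*exp(x) - x - 1.
x0, y0, dx = 0.0, 1.0, 0.1

def exact(x):
    return 2.0 * math.exp(x) - x - 1.0

y_taylor1 = y0 + dx * (x0 + y0)  # first-order step from above

print(f"exact y(0.1) = {exact(x0 + dx):.5f}")   # about 1.11034
print(f"first-order  = {y_taylor1:.5f}")        # 1.10000
print(f"difference   = {exact(x0 + dx) - y_taylor1:.5f}")
```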