1. The problem statement, all variables and given/known data
Consider dy/dx = x + y, a function of both x and y, subject to the initial condition y(x0) = y0. Use a Taylor series to determine y(x0 + [itex]\Delta x[/itex]) to 4th-order accuracy.
Initial condition: x0 = 0, y(x0) = 1
Step size: [itex]\Delta x = 0.1[/itex]
Show 5 significant digits in the answer.

2. Relevant equations
[itex]\epsilon = O(\Delta x^5)[/itex]
Do the calculations for only one step.

3. The attempt at a solution
dy/dx = f(x, y)
Taylor series: [itex]y(x_0 + \Delta x) = y(x_0) + \Delta x \, f(x_0, y(x_0)) + \epsilon[/itex]
My solution:
f(x0, y(x0)) = f(0, 1) = 0 + 1 = 1
y(0 + 0.1) = 1 + 0.1(1) + 0.00001 = 1.10001
Does this seem correct? It feels like I missed something. On the other hand, it makes sense. Did I miss something or mess up a step?
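For a numerical sanity check, here is a minimal Python sketch (my own addition, not part of the assignment) that carries the expansion through the 4th-order term. It assumes the higher derivatives are obtained by repeatedly differentiating dy/dx = x + y along the solution (so y'' = 1 + y', and y''' = y'', etc.); the closed-form comparison y = 2e^x - x - 1 is likewise an added check, not something given in the problem.

[code]
import math

# One Taylor-series step for dy/dx = x + y with y(0) = 1.
# Derivatives along the solution, from f(x, y) = x + y:
#   y'    = x + y
#   y''   = 1 + y'   (differentiate x + y, using dy/dx = y')
#   y'''  = y''      (differentiate 1 + y')
#   y'''' = y'''
x0, y0, dx = 0.0, 1.0, 0.1

d1 = x0 + y0   # y'(x0)    = 1
d2 = 1.0 + d1  # y''(x0)   = 2
d3 = d2        # y'''(x0)  = 2
d4 = d3        # y''''(x0) = 2

y_taylor = (y0
            + dx * d1
            + dx**2 / 2.0 * d2
            + dx**3 / 6.0 * d3
            + dx**4 / 24.0 * d4)

# Closed-form solution of y' = x + y, y(0) = 1, for comparison:
y_exact = 2.0 * math.exp(x0 + dx) - (x0 + dx) - 1.0

print(f"4th-order Taylor: {y_taylor:.5f}")
print(f"Exact:            {y_exact:.5f}")
[/code]

Running this, both values come out to 1.11034, i.e. the 4th-order step matches the exact solution to the 5 significant digits the problem asks for, consistent with [itex]\epsilon = O(\Delta x^5)[/itex].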