rclakmal
Homework Statement
I want to know how to determine the number of terms required to obtain a given decimal accuracy in a two-variable Taylor series.
In the one-variable case I know there is a remainder term R_n = f^(n+1)(e) * (x - c)^(n+1) / (n+1)!, where 'e' is some point between x and c (c is the center of the expansion). So if we want 3-decimal-place accuracy we can impose the inequality
|R_n| <= 5*10^(-4) and solve it to obtain the particular 'n' that gives that accuracy.
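For what it's worth, here is a minimal sketch of that one-variable procedure in Python. It assumes a concrete, hypothetical example (f(x) = exp(x) expanded about c = 0 and evaluated at x = 1, where every derivative is bounded by e on [0, 1]); the helper name terms_needed is made up for illustration.

```python
import math

def terms_needed(M, x, c, tol):
    """Smallest n with M * |x - c|**(n + 1) / (n + 1)! <= tol,
    where M bounds |f^(n+1)| on the interval between c and x."""
    n = 0
    while M * abs(x - c) ** (n + 1) / math.factorial(n + 1) > tol:
        n += 1
    return n

if __name__ == "__main__":
    # Hypothetical example: f(x) = exp(x), center c = 0, point x = 1.
    # On [0, 1] all derivatives of exp are at most e, so take M = e.
    n = terms_needed(M=math.e, x=1.0, c=0.0, tol=5e-4)
    print(n)  # degree of Taylor polynomial needed for 3-decimal accuracy
```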
But I don't know how to apply this in the two-variable case. Please give some explanation, or point me to a good tutorial where I can learn it.