I was wondering how to approximate definite integrals to within a specific accuracy. For example, how would I go about approximating the integral from 0 to 1 of sin(x^3) dx to within an accuracy of 0.001? I think I'm supposed to use the remainder estimate for the integral test, but I'm confused because that seems to apply to indefinite integrals. Any ideas?
Substitute x^3 for x in the Taylor expansion of sin x. Then integrate each term of the series from 0 to 1. Evaluate each integral to at least 4 decimal places. Since the series is alternating with decreasing terms, you can stop once the next term is less than 0.001: by the alternating series estimate, the truncation error is then smaller than that term. Daniel.
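To make the suggestion above concrete, here is a small sketch. Substituting x^3 into the Maclaurin series for sin gives sin(x^3) = sum over k of (-1)^k x^(3(2k+1)) / (2k+1)!, and integrating term by term from 0 to 1 gives (-1)^k / ((6k+4)(2k+1)!). The function name and tolerance parameter are my own choices, not anything standard:

```python
from math import factorial

def integral_sin_x_cubed(tol=0.001):
    """Approximate the integral of sin(x^3) from 0 to 1 via its Maclaurin series."""
    # Term k after integrating from 0 to 1:
    #   integral of (-1)^k * x^(3(2k+1)) / (2k+1)! dx = (-1)^k / ((6k+4) * (2k+1)!)
    total = 0.0
    k = 0
    while True:
        term = 1.0 / ((6 * k + 4) * factorial(2 * k + 1))
        if term < tol:
            # Alternating series with decreasing terms: the truncation
            # error is bounded by this first omitted term, which is < tol.
            break
        total += (-1) ** k * term
        k += 1
    return total

print(integral_sin_x_cubed())
```

With tol = 0.001 only two terms are needed: 1/4 - 1/60 ≈ 0.2333, and the first omitted term 1/(16 * 120) ≈ 0.00052 bounds the error, as the alternating series estimate guarantees.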