c299792458
I have written a computer program that uses binary search (bisection) to find a root of f(x) = 0, where f is an arbitrary, user-defined function.
If the rounding error [itex]\leq \epsilon[/itex] and the truncation error [itex]\leq \delta[/itex], what is the estimated accuracy of the output?
Would the following reasoning be correct?
Let [itex]r[/itex] be such that [itex]f(r) = 0[/itex]. Then [itex]f(x) = f(r) + (x-r)f'(r) + O((x-r)^2)[/itex], so [itex]f(x) \approx (x-r)f'(r)[/itex] near the root. Since the computed value of [itex]f(x)[/itex] is only known to within [itex]\epsilon + \delta[/itex], the estimated error is [itex]|x-r| \approx \left|\frac{f(x)}{f'(r)}\right| \approx \frac{\epsilon+\delta}{|f'(r)|}[/itex]?
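To check my own formula with a concrete case (the numbers here are just for illustration): with [itex]f(x) = x^2 - 2[/itex] the root is [itex]r = \sqrt{2}[/itex] and [itex]f'(r) = 2\sqrt{2} \approx 2.83[/itex], so if [itex]\epsilon + \delta = 10^{-10}[/itex] the formula predicts an achievable accuracy of about [itex]10^{-10}/2.83 \approx 3.5 \times 10^{-11}[/itex].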
Thanks.
Edited: I meant the Taylor series is expanded up to the first derivative that is NOT 0, so that the divisor is nonzero.
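For concreteness, here is a minimal sketch of the kind of routine I mean (the name bisect_root and the tolerance parameter tol are illustrative, not my actual code):

[code]
def bisect_root(f, a, b, tol):
    """Find an approximate root of f in [a, b] by bisection.

    Assumes f(a) and f(b) have opposite signs, so a root is bracketed.
    """
    fa, fb = f(a), f(b)
    if fa == 0.0:
        return a
    if fb == 0.0:
        return b
    assert fa * fb < 0, "root must be bracketed"
    # Halve the bracket until it is narrower than 2*tol.
    while (b - a) / 2.0 > tol:
        m = (a + b) / 2.0
        fm = f(m)
        if fm == 0.0:
            return m
        if (fa > 0) == (fm > 0):
            a, fa = m, fm   # sign change lies in [m, b]
        else:
            b = m           # sign change lies in [a, m]
    return (a + b) / 2.0

# Example: root of x^2 - 2 in [1, 2], i.e. sqrt(2)
print(bisect_root(lambda x: x * x - 2.0, 1.0, 2.0, 1e-12))
[/code]

The point of my question is the stopping criterion: since the computed f(m) is only reliable to within [itex]\epsilon + \delta[/itex], the sign test becomes meaningless once [itex]|f(m)|[/itex] falls below that, so (if my reasoning is right) shrinking tol below [itex]\frac{\epsilon+\delta}{|f'(r)|}[/itex] should buy no real accuracy.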