# Binary Search Error

1. Aug 9, 2011

### c299792458

I have written a computer program that uses binary search (bisection) to find a root of f(x) = 0, where f is an arbitrary (user-defined) function.

If the rounding error $$\leq \epsilon$$ and the truncation error $$\leq \delta$$, what is the estimated accuracy of the output?
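(The program itself isn't shown; for concreteness, a minimal bisection sketch, where the name `bisect` and the stopping tolerance `tol` are my assumptions, might look like this:)

```python
def bisect(f, a, b, tol=1e-12):
    """Find a root of f in [a, b] by bisection.

    Assumes f(a) and f(b) have opposite signs, so a sign change
    (and hence a root of a continuous f) lies in the interval.
    """
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fm == 0:
            return m  # exact root found
        if fa * fm < 0:
            b, fb = m, fm  # root lies in the left half
        else:
            a, fa = m, fm  # root lies in the right half
    return (a + b) / 2
```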

Would the following reasoning be correct?
Let $$r$$ be s.t. $$f(r) = 0$$. Then $$f(x) = f(r) + (x-r)f'(r) + O((x-r)^2)$$. When the program reports a root $$x$$, the true value of $$f(x)$$ is zero only to within the rounding and truncation errors, i.e. $$|f(x)| \leq \epsilon + \delta$$. So the estimated error is $$|x-r| \approx \left|\frac{f(x)}{f'(r)}\right| \leq \frac{\epsilon+\delta}{|f'(r)|}$$?

Thanks.

Edited: I meant that the Taylor series should be expanded up to the first derivative that is NOT 0, so the divisor is nonzero.

Last edited: Aug 9, 2011