# Finding error of secant method empirically

1. Sep 14, 2014

### jjr

1. The problem statement, all variables and given/known data

I need to find the roots of a given equation using the secant method in MATLAB. I have already found all the roots, but I am also asked to determine the rate of convergence of the method empirically, meaning that they want me to find the order of convergence from the set of errors generated by the program.

2. Relevant equations

3. The attempt at a solution

I know very well that the order of convergence of the secant method is $p ≈ 1.618$ (the golden ratio), but I am not sure how I should go about showing this empirically. In my case the roots are found within the desired error tolerance (1e-10) in about 6-8 iterations. I've tried taking the logarithms of the errors and plotting them against the iteration number n to bring out the exponential behaviour, but I can't get one clear answer... Any suggestions for putting me on the right track? Let me know if I need to clarify anything.
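(For anyone finding this thread later: one standard way to estimate the order from the error sequence alone is $p_n = \log(e_{n+1}/e_n)/\log(e_n/e_{n-1})$, using three consecutive errors. A minimal sketch in Python rather than MATLAB, with $f(x)=x^2-2$ as an assumed toy example; the function names are mine, not from the thread. The early estimates are noisy for exactly the reasons discussed below, but the middle ones hover around 1.6.)

```python
import math

def secant(f, x0, x1, tol=1e-10, maxit=50):
    """Secant iteration on f; returns the full list of iterates."""
    xs = [x0, x1]
    while abs(f(xs[-1])) > tol and len(xs) < maxit:
        a, b = xs[-2], xs[-1]
        xs.append(b - f(b) * (b - a) / (f(b) - f(a)))
    return xs

f = lambda x: x * x - 2.0            # toy example; root is sqrt(2)
root = math.sqrt(2.0)                # exact root, known here for convenience
xs = secant(f, 1.0, 2.0)
errs = [abs(x - root) for x in xs]

# Order estimate from three consecutive errors:
#   p_n = log(e_{n+1}/e_n) / log(e_n/e_{n-1})
ps = []
for i in range(1, len(errs) - 1):
    if errs[i - 1] > 0 and errs[i] > 0 and errs[i + 1] > 0:
        ps.append(math.log(errs[i + 1] / errs[i]) / math.log(errs[i] / errs[i - 1]))
print("order estimates:", [round(p, 3) for p in ps])
```

The first estimates can be wild (the initial errors need not even decrease), which matches the OP's observation that no single clear value comes out of only 6-8 iterations.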

J

2. Sep 14, 2014

### BvU

Check out these notes which Anatolii Grinshpan put on the net.

3. Sep 15, 2014

### jjr

Thank you! I'm still not sure how I would go about finding the exponent given a set of errors, but I think it will be sufficient to show that $\frac{|x_{n+1}-\alpha|}{|x_n-\alpha|^p} \approx C$, making $p \approx 1.618$ a starting assumption.
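(Equivalently, taking logs of that relation gives $\log e_{n+1} = p \log e_n + \log C$, so instead of assuming $p$ up front you can fit the slope of $\log e_{n+1}$ against $\log e_n$ by least squares. A hedged Python sketch, with $f(x)=\cos x - x$ as an assumed test function and the last iterate standing in for the exact $\alpha$, as BvU mentions may be all you have:)

```python
import math

def secant_iterates(f, x0, x1, tol=1e-12, maxit=60):
    """Secant iteration; returns all iterates."""
    xs = [x0, x1]
    while abs(f(xs[-1])) > tol and len(xs) < maxit:
        a, b = xs[-2], xs[-1]
        xs.append(b - f(b) * (b - a) / (f(b) - f(a)))
    return xs

f = lambda x: math.cos(x) - x        # assumed test function; root near 0.739
xs = secant_iterates(f, 0.0, 1.0)
alpha = xs[-1]                       # last iterate stands in for the exact root
errs = [abs(x - alpha) for x in xs[:-1]]
errs = [e for e in errs if e > 1e-14]   # drop errors already at noise level

# Least-squares fit of log e_{n+1} = p * log e_n + log C; the slope is p.
X = [math.log(e) for e in errs[:-1]]
Y = [math.log(e) for e in errs[1:]]
n = len(X)
mx, my = sum(X) / n, sum(Y) / n
p = sum((x - mx) * (y - my) for x, y in zip(X, Y)) / sum((x - mx) ** 2 for x in X)
print(f"fitted order p = {p:.3f}")   # typically somewhere near 1.6 for smooth f
```

Fitting one slope through all the points is more forgiving than the pointwise ratio, since the noise in individual iterations averages out.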

4. Sep 15, 2014

### BvU

I think so. What I wanted to point out is that the analysis is based on ignoring higher-order terms, and even then you need $f''$ and $f'$ at $\alpha$. Depending on your test functions (and on whether you have an exact solution for $\alpha$ or only the last iterate) you end up in the noise very quickly. So with only a few iterations, and with any deviation popping up twice (at point n and again at point n+1), you can't expect a perfectly smooth log-log plot.

So there's no need to be so precise about the 1.61803398874989...

Conceptually I always remembered this as: Newton is quadratic, and secant approaches Newton as the secant slope approaches the derivative. But it's a numerical derivative, so before it's really good enough you are in the noise anyway. So somewhere between 1 and 2.

Last edited: Sep 15, 2014