- #1

erkokite

Thanks.

- #2

Suppose you were to choose a starting point greater than any positive root?
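To illustrate this idea with a made-up cubic (not from the thread): if the function is convex and increasing to the right of its largest root, Newton's method started beyond that root walks down onto it monotonically.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Plain Newton iteration; returns the approximate root."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

# f(x) = (x - 1)(x - 3)(x - 5) has roots 1, 3, 5.
f = lambda x: (x - 1) * (x - 3) * (x - 5)
fp = lambda x: 3 * x**2 - 18 * x + 23

# Started to the right of every root (x0 = 10), the iterates
# decrease monotonically onto the largest root, 5.
root = newton(f, fp, 10.0)
```

This works because for x > 5 the cubic is positive, increasing, and convex, so each Newton step lands between the current point and the root.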

- #3

Borek

Mentor

As far as I know, there is no method that can guarantee anything.

- #4

hotvette

Homework Helper

I agree with both comments already posted. Pure Newton's method, if it finds a solution (not guaranteed to find one), will find the one closest to the starting point, which isn't necessarily the desired one.

Having said that, there are things you can potentially do, depending on the details of your specific problem. One is to try several different starting points and hope that one of them eventually produces the result you want. If the problem has only one variable, it should be straightforward to plot f(x) vs x, find an approximate value of the root you want graphically (or analytically), and use it as the starting point for Newton's method.
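A minimal sketch of the several-starting-points approach, using an invented cubic with roots at 1, 3, and 5:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton iteration; returns the root, or None if this start fails."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d = fprime(x)
        if d == 0:          # flat spot: abandon this starting point
            return None
        x -= fx / d
    return None             # did not converge from this start

# Example function with three roots: 1, 3, 5.
f = lambda x: (x - 1) * (x - 3) * (x - 5)
fp = lambda x: 3 * x**2 - 18 * x + 23

# Sweep a grid of starting points and collect the distinct roots found.
roots = set()
for i in range(13):                   # starts 0.0, 0.5, ..., 6.0
    r = newton(f, fp, 0.5 * i)
    if r is not None:
        roots.add(round(r, 8))
```

After the sweep, `roots` holds approximations of all three roots; you would then pick whichever one you actually wanted.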

I believe there is a fair body of published material on the general topic of optimization with inequality constraints (i.e. minimize f(x) subject to x >= 0). There might be similar material on finding zeros of functions, but I don't know much about it.

A couple of methods I'm aware of are based on a user-defined range for the solution. One is Brent's method:

http://en.wikipedia.org/wiki/Brent's_method

Another is a random search, where f(x) is calculated for randomly generated values of x (within user-specified limits), the values are ranked according to f(x), a new range is defined from the ranking, and the process repeats.

In both cases, the result could be used as a starting point for Newton's method, but this requires you to specify an upper bound on the solution (the lower bound would be zero), which may not be straightforward.
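A sketch of this bracket-then-polish idea; plain bisection stands in for Brent's method here, and the cubic and the range [4, 6] are invented for illustration:

```python
def bisect(f, a, b, n=30):
    """Shrink a sign-change bracket [a, b] by bisection; return the midpoint."""
    fa = f(a)
    for _ in range(n):
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

f = lambda x: (x - 1) * (x - 3) * (x - 5)
fp = lambda x: 3 * x**2 - 18 * x + 23

# The user-supplied range [4, 6] isolates the root at 5;
# bisection gets close, Newton finishes the job.
x0 = bisect(f, 4.0, 6.0)
root = newton(f, fp, x0)
```

Because the bracket contains only one root, the Newton polish cannot wander off to a different solution.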


- #5

Borek

Mentor

Pure Newton's method, if it finds a solution (not guaranteed to find one), will find the one closest to the starting point

I don't think so.

Starting from point x, the solution found is x1, not the nearer root x0. You need additional conditions for the statement to be true.
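Borek's point is easy to reproduce numerically. In this made-up example with f(x) = sin(x), the nearest root to x0 = 1.5 is 0, but cos(1.5) is small, so the first Newton step overshoots by tan(1.5) ≈ 14.1 and the iteration settles on the root at -4π instead:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

# Nearest root to x0 = 1.5 is 0, but the first step is
# x0 - tan(x0) ≈ -12.6, so the iteration converges to -4*pi.
root = newton(math.sin, math.cos, 1.5)
# root ≈ -12.566, nowhere near the nearest root
```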

- #6

hotvette

Homework Helper

I don't think so.

Ah, right you are. I stand corrected.

- #7

Borek

Mentor

You're welcome

- #8

hotvette

Homework Helper

To elaborate on Borek's comment, you can take additional steps to increase the odds of finding the root nearest the starting point. One option is to take partial Newton steps, i.e. use x_{i+1} = x_{i} - mu*f/f', where mu is between 0 and 1. The smaller mu is, the more steps are needed to converge, but the less likely you are to skip past the nearest root.
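A sketch of the partial-step idea; the choice of f(x) = sin(x), x0 = 1.5, and mu = 0.1 is my own illustration of a case where a full step would skip to a distant root:

```python
import math

def damped_newton(f, fprime, x0, mu=0.1, tol=1e-10, max_iter=1000):
    """Newton iteration with a fixed step fraction mu in (0, 1]."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= mu * fx / fprime(x)
    return x

# A full Newton step from x0 = 1.5 on sin(x) jumps to the root at -4*pi;
# with mu = 0.1 the iteration creeps down to the nearest root, 0.
root = damped_newton(math.sin, math.cos, 1.5)
```

The price is speed: near the root the error shrinks only by a factor of about (1 - mu) per step, so small mu means many iterations.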

Another option would be to use a higher-order method such as Halley's method, which uses the second derivative as well as the first:

http://en.wikipedia.org/wiki/Halley's_method
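A minimal sketch of Halley's iteration, x_{n+1} = x_n - 2*f*f' / (2*f'^2 - f*f''), on an invented example f(x) = x^3 - 2 whose real root is 2^(1/3):

```python
def halley(f, fprime, fprime2, x0, tol=1e-12, max_iter=50):
    """Halley's method: cubic convergence using f, f', and f''."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        fp = fprime(x)
        x -= 2 * fx * fp / (2 * fp * fp - fx * fprime2(x))
    return x

# f(x) = x**3 - 2: the real root is 2**(1/3) ≈ 1.2599.
root = halley(lambda x: x**3 - 2,
              lambda x: 3 * x**2,
              lambda x: 6 * x,
              x0=1.0)
```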


- #9

You haven't told us about your function, or even whether it is continuous, and you need derivatives for Newton's method.

If you can get close enough to the root to start with, real-world functions will, in general, converge.

So if the distance between the greatest root and X

I am aware that it is always possible to construct counterexamples, but you should stake out your territory beforehand.
