Golden section, Levenberg-Marquardt

In summary, the conversation is about using optimization methods, specifically the Levenberg-Marquardt algorithm, to estimate unknown parameters in a least squares curve fitting problem. The optimization techniques discussed include binary search, Newton-Raphson, and the secant method, each with its own strengths and weaknesses. The original poster is also looking for help estimating a and b from the equations in an attached image.
  • #1
a.mlw.walker
Hi, I am a third-year mechanical engineer and have been reading around the course. I have never heard of the two methods above, and they have never been mentioned on the course - and are therefore not relevant to it - but I am reading an article that seems to use them. The following is what I have:

It is some form of least squares method for curves, used to estimate a few unknown parameters.

I am trying to find out a and b, although c0 is included in the brackets of S:

The first image is the first equation, with a, b and c0, although the equation for c0 is shown. The second equation, I think, shows how to solve for a?

Anyway, I have five or six values for T0, which apparently means that, using one of the above methods, I can approximate the values for a and b with these equations?

Anyone know how to do this and what they are talking about?

Thanks

Alex
 
  • #2
a.mlw.walker said:
It is some form of least squares method for curves, used to estimate a few unknown parameters.
It looks like you have been reading the Wikipedia articles. These are optimization methods. The Levenberg–Marquardt wiki article says "The primary application of the Levenberg–Marquardt algorithm is in the least squares curve fitting problem." That's true in the sense that least squares curve fitting is one of the most widespread uses of all optimization techniques. It is also true in the sense that Levenberg–Marquardt is fairly well suited to non-linear least squares problems. It is not true in the sense that the Levenberg–Marquardt algorithm cares in the least what motivates the function to be minimized.

Back up to the problem of finding a zero of a scalar function of one variable. One easy way: find a pair of values that bracket the zero. For example, assume you have magically found x1 and x2 such that f(x1)<0 and f(x2)>0. You can find the zero without taking any derivatives of f(x) by looking at the point halfway between x1 and x2. Call this point x3. Either f(x3) is zero, the zero lies between x1 and x3, or the zero lies between x3 and x2. All you need to do is evaluate f(x3). This is binary search (bisection). It's not very fast, but it is very robust, and it doesn't use any derivative information.

If you know the derivative f'(x), you can take advantage of it and zoom to the zero using Newton-Raphson. If you don't know the derivative, you can approximate it with a finite difference; this is what the secant method does. Newton-Raphson can send your search to never-never land if the function is badly behaved. The secant method is more robust but slower than Newton-Raphson, and less robust but faster than binary search.
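As a concrete illustration of the root-finding ideas above (not from the original thread), here is a minimal sketch of bisection and the secant method, applied to the hypothetical example f(x) = x^2 - 2, whose positive zero is sqrt(2):

```python
import math

def bisection(f, lo, hi, tol=1e-10):
    """Bisection: halve the bracket [lo, hi] until it is tighter than tol."""
    assert f(lo) * f(hi) < 0, "lo and hi must bracket a sign change"
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid          # the zero lies in [lo, mid]
        else:
            lo = mid          # the zero lies in [mid, hi]
    return (lo + hi) / 2.0

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: a Newton-Raphson step with the derivative
    replaced by a finite difference through the last two iterates."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # finite-difference Newton step
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Find the zero of f(x) = x^2 - 2 (i.e. sqrt(2)) both ways.
root_b = bisection(lambda x: x * x - 2.0, 1.0, 2.0)
root_s = secant(lambda x: x * x - 2.0, 1.0, 2.0)
```

Note how the secant iteration typically needs far fewer function evaluations than bisection, but relies on the last two iterates behaving sensibly, which is the robustness trade-off described above.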

Now back to the problem at hand: optimization. Think of golden-section search as the equivalent of binary search, Gauss-Newton as the equivalent of Newton-Raphson, and Levenberg–Marquardt as the equivalent of the secant method. Each technique has certain strengths and weaknesses compared to the others. Golden-section search doesn't use derivative information at all and is quite robust, but it converges quite slowly. Gauss-Newton requires that you know the gradient and the Hessian, is very fast to converge (if it works), but is not robust. Levenberg–Marquardt estimates the gradient and the Hessian, is fairly fast to converge (if it works), and is intermediate in terms of robustness.
 
  • #3
OK, thank you DH for the intro to Levenberg–Marquardt. So looking at simage.bmp can you see how it is possible to find an estimation for a and b, if I have a few values for Tk?
 
  • #4
a.mlw.walker said:
OK, thank you DH for the intro to Levenberg–Marquardt. So looking at simage.bmp can you see how it is possible to find an estimation for a and b, if I have a few values for Tk?

We might be of more help if you post the image.

Also, it may not be necessary to use Levenberg-Marquardt if the problem is well behaved. A standard nonlinear least-squares algorithm might work just fine.
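Since the thread's actual equations are in an image that was never posted, here is a hand-rolled sketch of Levenberg-Marquardt for an illustrative two-parameter model T(t) = a·exp(b·t); the model, the synthetic data, and the starting guesses are all assumptions, not the thread's problem. Each step solves the damped normal equations (JᵀJ + λI)δ = −Jᵀr and adapts the damping λ:

```python
import math

def lm_fit(ts, ys, a, b, lam=1e-3, iters=200):
    """Minimal Levenberg-Marquardt fit of y = a*exp(b*t).
    Solves (J^T J + lam*I) delta = -J^T r each step; lam shrinks when a
    step lowers the sum of squares and grows when it does not."""
    def residuals(a, b):
        return [y - a * math.exp(b * t) for t, y in zip(ts, ys)]
    def sse(r):
        return sum(ri * ri for ri in r)

    r = residuals(a, b)
    cost = sse(r)
    for _ in range(iters):
        # Jacobian of the residuals w.r.t. (a, b), one row per data point
        J = [(-math.exp(b * t), -a * t * math.exp(b * t)) for t in ts]
        g11 = sum(j0 * j0 for j0, _ in J)        # entries of J^T J
        g12 = sum(j0 * j1 for j0, j1 in J)
        g22 = sum(j1 * j1 for _, j1 in J)
        h1 = sum(j0 * ri for (j0, _), ri in zip(J, r))   # J^T r
        h2 = sum(j1 * ri for (_, j1), ri in zip(J, r))
        # Damp the diagonal, then solve the 2x2 system by Cramer's rule
        a11, a22 = g11 + lam, g22 + lam
        det = a11 * a22 - g12 * g12
        da = (-h1 * a22 + h2 * g12) / det
        db = (-h2 * a11 + h1 * g12) / det
        r_new = residuals(a + da, b + db)
        cost_new = sse(r_new)
        if cost_new < cost:       # accept: behave more like Gauss-Newton
            a, b, r, cost = a + da, b + db, r_new, cost_new
            lam /= 10.0
        else:                     # reject: behave more like gradient descent
            lam *= 10.0
    return a, b

# Synthetic data generated from a = 2, b = -0.5; fit from a rough guess.
ts = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
ys = [2.0 * math.exp(-0.5 * t) for t in ts]
a_fit, b_fit = lm_fit(ts, ys, a=1.0, b=-1.0)
```

The λ update is the trade-off D H described: large λ makes the step small and gradient-like (robust), small λ makes it a Gauss-Newton step (fast).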
 
  • #5

1. What is the golden section?

The golden section, also known as the golden ratio, is a mathematical concept that describes the relationship between two quantities where the ratio of the sum of the two quantities to the larger quantity is equal to the ratio of the larger quantity to the smaller quantity. It is approximately equal to 1.618 and has been observed in various natural and man-made structures.

2. What is the Levenberg-Marquardt algorithm?

The Levenberg-Marquardt algorithm is a mathematical optimization method used in data fitting and curve fitting. It is commonly used in nonlinear least squares problems to find the optimal parameters that minimize the sum of the squares of the differences between the predicted and observed values.

3. How is the golden section used in the Levenberg-Marquardt algorithm?

Golden-section search is not part of the core Levenberg-Marquardt update, which controls its step through a damping parameter; however, some implementations add a one-dimensional line search along the computed step direction, and golden-section search can serve as that line search. Used this way, it helps balance the trade-off between convergence rate and stability.

4. What are the advantages of using the golden section in the Levenberg-Marquardt algorithm?

Pairing a golden-section line search with Levenberg-Marquardt can have advantages: it can improve the robustness of each iteration and may reduce the number of iterations needed to reach a solution, giving a more stable and reliable optimization process.

5. Are there any limitations to using the golden section in the Levenberg-Marquardt algorithm?

While a golden-section line search can be beneficial in this setting, it also has limitations. It is not needed for every problem, each iteration becomes more expensive because of the extra function evaluations, and the benefit tends to shrink in high-dimensional problems where adapting the damping parameter alone usually suffices.
