Spectral Method for optimization 
#1 — Aug 22, 2011, 05:42 PM — P: 209
Trying to find a global min/max while avoiding saddle points. If we Taylor-expand a function of several variables, we get:

f(x + Δx) = f(x) + Δx·g + ½ Δx H Δxᵀ + O(‖Δx‖³),

where g is the gradient and H is the Hessian. Since H is symmetric, we can collect its eigenvectors into a matrix U such that U Uᵀ = I, and use the eigenvalues to construct a diagonal matrix D, so that H = U D Uᵀ. Then define z = Δx U and d = Uᵀ g, so that the original expansion reduces to

f(x + Δx) = f(x) + z·d + ½ z D zᵀ + O(‖Δx‖³), or equivalently f(x + Δx) ≈ f(x) + z·d + ½ z D zᵀ.

(Got all of this from: http://linneus20.ethz.ch:8080/3_7.html)

Just curious how we are supposed to update the iterates? I get to the step where, if the method detects a saddle point (a negative eigenvalue of H), it avoids the saddle by reversing the step along that eigen-direction and then scaling by a factor h such that the original function f(x + Δx) is decreased as much as possible. In other words, after the N-dimensional problem is put through this decomposition, we find all the "saddle-point directions" and reduce the problem to a 1-D search for h. But I get a little confused by the website's examples, as the numbers don't work out in some of the problems, and I'm also not completely clear on how the update works in this method. Finally, what's the best method to solve for h? A subroutine of 1-D Newton's method? Golden-section search? Bisection? Etc.
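For what it's worth, in the z-coordinates the quadratic model decouples: minimizing f(x) + Σᵢ (zᵢ dᵢ + ½ λᵢ zᵢ²) in each coordinate separately gives zᵢ = −dᵢ/λᵢ when λᵢ > 0, and flipping the sign along negative-curvature directions amounts to taking zᵢ = −dᵢ/|λᵢ|. Here is a minimal Python sketch of one iteration under that reading, with a golden-section line search for h. Caveats: the names `spectral_step` and `golden_section` are mine, not from the linked notes; the bracket [0, h_max] for the line search and the 1e-12 eigenvalue floor are assumptions, not part of the method as stated.

```python
import numpy as np

def golden_section(phi, a, b, tol=1e-8):
    """1-D golden-section search for a minimizer of phi on [a, b]."""
    invphi = (np.sqrt(5.0) - 1.0) / 2.0  # 1/golden ratio, ~0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

def spectral_step(f, grad, hess, x, h_max=2.0):
    """One saddle-avoiding Newton-like update (my reading of the method)."""
    g, H = grad(x), hess(x)
    lam, U = np.linalg.eigh(H)   # H = U @ diag(lam) @ U.T
    d = U.T @ g                  # gradient in eigen-coordinates
    # Per-direction Newton step z_i = -d_i / lam_i; using |lam_i|
    # reverses the step along negative-curvature (saddle) directions.
    # The floor on |lam_i| guards against near-zero curvature (assumption).
    z = -d / np.maximum(np.abs(lam), 1e-12)
    p = U @ z                    # step back in the original coordinates
    h = golden_section(lambda h: f(x + h * p), 0.0, h_max)
    return x + h * p

# Demo on f(x, y) = x^4 - 2x^2 + y^2, which has a saddle at the
# origin and minima at (+-1, 0); the start point (0.5, 0.5) sits in
# a region where the Hessian has a negative eigenvalue.
f = lambda x: x[0]**4 - 2*x[0]**2 + x[1]**2
grad = lambda x: np.array([4*x[0]**3 - 4*x[0], 2*x[1]])
hess = lambda x: np.array([[12*x[0]**2 - 4.0, 0.0], [0.0, 2.0]])

x = np.array([0.5, 0.5])
for _ in range(20):
    x = spectral_step(f, grad, hess, x)
# x should approach one of the minima at (+-1, 0)
```

As for the 1-D solve: Newton on h needs second derivatives along the line and can itself be fooled near zero curvature, while golden-section only needs function values and a unimodal bracket, so it is the more robust default for this scaling step.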

