Gradient descent, anything better than golden section line search

SUMMARY

The discussion focuses on optimizing a third-degree polynomial matrix using gradient descent techniques in MATLAB, specifically addressing the limitations of the golden section line search. While the golden section line search effectively determines step sizes, it fails to prevent convergence to local minima. The user explores alternative methods, including Quasi-Newton and stochastic approaches like simulated annealing, but encounters challenges with memory and convergence criteria. The need for a robust stopping condition in simulated annealing is emphasized to avoid premature termination far from the global minimum.

PREREQUISITES
  • Understanding of gradient descent algorithms
  • Familiarity with MATLAB programming and optimization functions
  • Knowledge of third-degree polynomial equations
  • Experience with simulated annealing techniques
NEXT STEPS
  • Research advanced gradient descent techniques to avoid local minima
  • Learn about MATLAB's fminunc function and its applications
  • Explore the implementation of trust region methods in optimization
  • Investigate convergence criteria and stopping conditions for simulated annealing
USEFUL FOR

Data scientists, machine learning engineers, and optimization specialists seeking to enhance their understanding of polynomial optimization and gradient descent methodologies in MATLAB.

alecrimi
Hi
This is a long story, but I'll make it short:
I am working on a project where I need to find a matrix defined by a third-degree polynomial. The solution can be found iteratively using a gradient descent technique, and I am using the golden-section line search already implemented in MATLAB (with the code described below).
The algorithm looks powerful (it automatically finds the right step size), but unfortunately the golden-section line search does not avoid getting stuck in local minima. How can I implement this more efficiently? The problem is that sometimes it converges and sometimes it doesn't (sometimes it's not science :-).

%Initial third degree polynomial
Cest = initial_guess;
normdev=inf ;
stepsize=inf;

%Stopping condition
stopping_condition = 10^(-5) *norm(X*X'/no_samples,'fro');

while abs(normdev*stepsize) > stopping_condition
%Third degree polynomial
dnew = Cest - 1/no_samples*(X*X' - 2/sigma^2 * (Cest*Cest'*Cest-Cest*B'*Cest));
%Find the best step size as a minimum using the golden-section line search
stepsize = fminbnd( @(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples),-.1,.1);

%Update
Cest = Cest + stepsize*dnew;
normdev = norm(dnew,'fro');
end

function error = step(stepsize,Cest,dnew,XX,B,sigma,no_samples)
Cest = Cest + stepsize*dnew;
error = norm(Cest - 1/no_samples*(XX - 2/sigma^2 * (Cest^3-Cest*B*Cest)),'fro');
end
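On the local-minima question: a simple, often effective remedy is a multi-start scheme, i.e., rerunning the descent loop above from several perturbed initial guesses and keeping the best result. A minimal sketch, assuming a hypothetical helper `run_descent` that wraps the while-loop above and returns the final `Cest`:

```matlab
% Multi-start: repeat the descent from perturbed initial guesses and keep
% the estimate with the smallest residual. run_descent is a hypothetical
% wrapper around the while-loop above; the residual reuses the step error
% evaluated at stepsize 0, i.e., at the returned point itself.
best_err = inf;
for k = 1:10
    C0 = initial_guess + 0.1*randn(size(initial_guess));  % perturbed start
    Ck = run_descent(C0, X, B, sigma, no_samples);
    err = step(0, Ck, zeros(size(Ck)), X*X', B, sigma, no_samples);
    if err < best_err
        best_err = err;
        Cest = Ck;
    end
end
```

The perturbation scale (0.1 here) is a guess; it should be large enough that different starts land in different basins of attraction.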

I tried:
%Quasi-Newton
stepsize = fminunc( @(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples),dnew);
But MATLAB gets stuck (it runs out of heap memory), probably because this function is supposed to be used when we don't have a trust region.
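A likely cause of the blow-up, for reference: fminunc's second argument is the starting point, so passing the matrix dnew asks it to optimize over every entry of dnew instead of over a single scalar step size. A sketch of the scalar call, starting the search from 0:

```matlab
% fminunc minimizes over its second argument (the initial point), so a
% scalar step-size search should start from a scalar, not the matrix dnew.
stepsize = fminunc(@(s) step(s,Cest,dnew,X*X',B,sigma,no_samples), 0);
```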

Any suggestions ?
 
So I moved to a stochastic approach (gradient descent finds only local minima, while stochastic approaches can find global minima), namely simulated annealing. Now the problem is that with my objective function the default convergence criterion is never met.
So I have to force the annealing to stop. BUT WHEN?
I tried stopping after a fixed number of iterations, but the annealing has the defect that at some point it re-randomizes its trust region, so when I stop it, it may be really far from the minimum (I am using too many iterations). I have the same problem with a time limit. So when should I stop the search?

Code using simulated annealing:
Cest = initial_point;

lb = -20*ones(size(Cest));
ub = 20*ones(size(Cest));

options = saoptimset('MaxIter',300) ;

%The anonymous function must pass the candidate point through to the
%objective; fixing the first argument to Cest makes the objective constant,
%so the solver can never meet its convergence criterion
Cest = simulannealbnd(@(C) myfuncobj(C,X*X',B,sigma,no_samples),Cest,lb,ub, options);


function error = myfuncobj( Cest, XX, B, sigma, no_samples)
error = norm(Cest - 1/no_samples*(XX - 2/sigma^2 * (Cest^3-Cest*B*Cest)),'fro');
end
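On the stopping question: instead of a hard MaxIter or wall-clock limit, the annealer can be told to stop when the objective stalls, and its answer can be polished afterwards by a local solver via the HybridFcn option. A sketch using the same variables as above (option names as in the Global Optimization Toolbox's saoptimset; the tolerance and stall limit are placeholder values to tune):

```matlab
% Stop when the objective stops improving rather than at a fixed iteration
% count, then refine the returned point with a local derivative-free search.
options = saoptimset('TolFun', 1e-6, ...         % function-value tolerance
                     'StallIterLimit', 500, ...  % iterations with no improvement
                     'HybridFcn', @fminsearch);  % local polish of the final point
Cest = simulannealbnd(@(C) myfuncobj(C,X*X',B,sigma,no_samples), Cest, lb, ub, options);
```

This sidesteps the "where do I cut it off" problem: the stall criterion stops the annealer when it has stopped making progress, and the hybrid step pulls the final iterate into the nearest minimum even if the last random move wandered away from it.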
 
