When to stop simulated annealing when it doesn't stop by itself

alecrimi
Hi guys,
I posted something similar some time ago.
This is a long story, so I'll make it short:
I am working on a project where I need to find a matrix defined by a third-degree polynomial.
Originally I thought a good idea would be to find the solution iteratively with a gradient descent technique, so I implemented a golden-section line search in MATLAB (code below).
The algorithm looks powerful (it finds the best step size automatically), but unfortunately the golden-section line search does not avoid getting stuck in local minima.
So I moved to a stochastic approach, simulated annealing. The problem now is that with my objective function the default convergence criterion is never met,
so I have to force the annealing to stop. BUT WHEN?
I tried stopping after a fixed number of iterations, but the annealing has the defect that at some point it re-randomizes its search region, and when I stop it, it may be really far from the minimum (I am using too many iterations). I have the same problem with a time limit. So when should I stop the search?
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Code using the golden section:

% Initial estimate of the matrix
Cest = initial_guess;
normdev = inf;
stepsize = inf;

% Stopping condition
stopping_condition = 10^(-5) * norm(X*X'/no_samples, 'fro');

while abs(normdev*stepsize) > stopping_condition
    % Descent direction (third-degree polynomial in Cest)
    dnew = Cest - 1/no_samples*(X*X' - 2/sigma^2 * (Cest*Cest'*Cest - Cest*B'*Cest));
    % Find the best step size as a minimum using the golden-section line search
    stepsize = fminbnd(@(stepsize) step(stepsize, Cest, dnew, X*X', B, sigma, no_samples), -.1, .1);

    % Update
    Cest = Cest + stepsize*dnew;
    normdev = norm(dnew, 'fro');
end

% Objective for the line search: residual norm at the trial point
function error = step(stepsize, Cest, dnew, XX, B, sigma, no_samples)
Cest = Cest + stepsize*dnew;
error = norm(Cest - 1/no_samples*(XX - 2/sigma^2 * (Cest^3 - Cest*B*Cest)), 'fro');
end
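
To make the gradient descent less sensitive to local minima, one thing I could try before giving up on it is a multi-start: rerun the loop above from several random initial guesses and keep the estimate with the smallest residual. This is only a sketch; run_descent is a hypothetical function that wraps the while-loop above and returns its final Cest, and the number of restarts is arbitrary.

% Multi-start wrapper (sketch): run the gradient descent above from several
% random starting points and keep the estimate with the lowest residual.
% run_descent is a hypothetical wrapper around the while-loop above;
% X, B, sigma, no_samples, initial_guess are as in the code above.
n_restarts = 20;                        % arbitrary choice
best_err   = inf;
best_Cest  = [];
for k = 1:n_restarts
    C0   = randn(size(initial_guess));  % random initial guess (assumption)
    Ck   = run_descent(C0, X, B, sigma, no_samples);
    errk = norm(Ck - 1/no_samples*(X*X' - 2/sigma^2*(Ck^3 - Ck*B*Ck)), 'fro');
    if errk < best_err
        best_err  = errk;
        best_Cest = Ck;
    end
end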

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Code using simulated annealing:
Cest = initial_point;

% Box bounds on the entries of Cest
lb = -20*ones(size(Cest));
ub = 20*ones(size(Cest));

options = saoptimset('MaxIter', 300);

% Note: the objective must depend on the solver's variable (C below), not on
% the fixed outer Cest, otherwise the objective is constant during the search
Cest = simulannealbnd(@(C) myfuncobj(C, X*X', B, sigma, no_samples), Cest, lb, ub, options);

% Residual norm of the third-degree matrix equation
function error = myfuncobj(Cest, XX, B, sigma, no_samples)
error = norm(Cest - 1/no_samples*(XX - 2/sigma^2 * (Cest^3 - Cest*B*Cest)), 'fro');
end
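
As for when to stop: instead of a hard 'MaxIter', simulannealbnd can be asked to stop when the best objective value stalls. saoptimset has 'StallIterLimit' and 'TolFun' for this (the run ends once the average change in the best objective over 'StallIterLimit' iterations drops below 'TolFun'), plus 'TimeLimit' as a safety net. A sketch with arbitrary values; exitflag and output report which criterion fired:

% Stall-based stopping (sketch): stop once the best objective value stalls
% instead of after a fixed iteration count. Numeric values are guesses.
options = saoptimset( ...
    'StallIterLimit', 500, ...   % how long the best value may stall
    'TolFun',         1e-6, ...  % average improvement below this = stalled
    'TimeLimit',      600, ...   % hard safety net, in seconds
    'Display',        'iter');

[Cest, fval, exitflag, output] = simulannealbnd( ...
    @(C) myfuncobj(C, X*X', B, sigma, no_samples), Cest, lb, ub, options);
% output.message / exitflag say which stopping criterion was triggered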
 
Have you considered using code from a book or published paper?
 
Hi
Thank you for your reply.
Unfortunately... I can't.
Have you seen my target matrix? Does it look like anything you could find in a book or that has already been published?

The expression for the gradient descent works (but it gets stuck in a local minimum).
I guess it could be solved with a stochastic approach, but all the tutorials I have read assume an explicit error function of a vector of variables; I haven't seen anything like the function I need to minimize.
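The closest I can get to that standard setting is to vectorize the matrix, so the solver sees a plain vector of unknowns and the matrix is rebuilt inside the objective. A sketch, reusing the names from my code above and assuming initial_point is a square n-by-n matrix:

% Vectorize the matrix unknown (sketch) so that generic solvers expecting
% f: R^n -> R can be applied; initial_point is assumed to be square.
n   = size(initial_point, 1);
c0  = reshape(initial_point, [], 1);      % matrix -> vector
obj = @(c) myfuncobj(reshape(c, n, n), X*X', B, sigma, no_samples);

lb = -20*ones(size(c0));
ub =  20*ones(size(c0));

opts  = saoptimset('StallIterLimit', 500, 'TolFun', 1e-6);  % guesses
cbest = simulannealbnd(obj, c0, lb, ub, opts);
Cest  = reshape(cbest, n, n);             % vector -> matrix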
 
Why can't you use code from a book?

Also, if you want me to look at your objective function, you should write it down and clearly label all its variables.
 