Mario Rodriguez wrote:
>> Rescaling and using a sum of squared errors approach allowed lbfgs to
>> converge, so thanks for the suggestions. I'd still like to hear the
>> list's suggestions for additional methods to add to the optimization
>> toolkit in Maxima. I have had one off-list suggestion which I will try
>> when I have a little more time (next week).
>>
>
> What about Simulated Annealing or Genetic Algorithms?
>
> They are stochastic global optimization algorithms. And the function to
> be optimized doesn't need to be differentiable; in fact, they are also
> used in discrete optimization.
>
> Simulated Annealing should be easier to program.
>
Last night I programmed up a very simplified simulated annealing
function using an independent multivariate normal proposal. I noticed
several issues. The first is that I'm not sure of the "right" way to
evaluate an expression within this type of function. I've heard the
debates on ev versus subst and so forth, but it seems to me that I need
to use "ev" because I'm not guaranteed to have a simple expression; the
figure of merit might be a block, a call to a function, or a compound
expression. Currently, when I want to evaluate my figure of merit at
the current location, I do something like this:
nextval : ev(expr, map("=", vars, nextpos))
Any comments?
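For what it's worth, the helper I have in mind would look something
like this (fom_at is just a working name, and float() is only there to
force a numeric result):

fom_at(expr, vars, pos) := float(ev(expr, map("=", vars, pos)))$

so that, for example, fom_at(x^2 + y^2, [x, y], [1.0, 2.0]) returns 5.0.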
Another thing I noticed is that if I iterate my simulated annealing
enough, I get a hard crash of Maxima, with all available RAM used up. I
don't think this is my fault. I'll work a little more on the code and
then post it to the list for comments.
Does anyone have experience with simulated annealing, in particular
with how to deal with different scales for each of the variables, and
how to adjust the temperature as the iterations proceed? Any reference
books or websites?
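One approach I've considered is a per-variable step-size vector for the
proposal, combined with the usual Metropolis acceptance rule and a
geometric cooling schedule T[k+1] = alpha*T[k], though I don't know how
to choose the scales or alpha well. A sketch of what I mean
(random_normal is from the distrib package; sa_step, cool, and
alpha = 0.95 are just placeholders):

load(distrib)$
/* One annealing step: propose a move scaled per variable, accept an
   improvement always, otherwise accept with probability exp(-delta/T). */
sa_step(expr, vars, pos, scales, T) := block(
    [cand, fcur, fcand, delta],
    cand  : map(lambda([p, s], p + s*random_normal(0.0, 1.0)), pos, scales),
    fcur  : float(ev(expr, map("=", vars, pos))),
    fcand : float(ev(expr, map("=", vars, cand))),
    delta : fcand - fcur,
    if delta < 0 or random(1.0) < exp(-delta/T) then cand else pos)$
/* Geometric cooling, applied once per iteration. */
cool(T) := 0.95*T$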
I think it may be beneficial to take the final "best" result from the
simulation, sample randomly in its vicinity, and fit a polynomial
surface to locate the local minimum before returning the final "best"
value.
thanks for any help,
Dan