What about neural networks?
Laurent.
> -----Original Message-----
> From: maxima-bounces at math.utexas.edu [mailto:maxima-bounces at math.utexas.edu] On behalf of
> Daniel Lakeland
> Sent: Saturday, December 15, 2007 19:09
> To: maxima at math.utexas.edu
> Subject: Re: [Maxima] Optimization again
>
> On Sat, Dec 15, 2007 at 08:41:16AM +0100, Mario Rodriguez wrote:
> >
> > > Rescaling and using a sum of squared errors approach allowed lbfgs to
> > > converge, so thanks for the suggestions. I'd still like to hear the
> > > list's suggestions for additional methods to add to the optimization
> > > toolkit in Maxima. I have had one off-list suggestion which I will try
> > > when I have a little more time (next week).
> > >
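[For concreteness, a sum-of-squared-errors fit with Maxima's lbfgs
package looks roughly like the following. The data and the linear
model are invented purely for illustration; only the lbfgs call itself
comes from the package.]

  load(lbfgs)$
  /* hypothetical data points */
  xs: [1, 2, 3, 4]$
  ys: [2.1, 3.9, 6.2, 7.8]$
  /* sum of squared errors for a linear model a + b*x */
  sse: sum((ys[i] - (a + b*xs[i]))^2, i, 1, 4)$
  /* lbfgs(figure of merit, variables, initial guess, tolerance, iprint) */
  lbfgs(sse, [a, b], [0.0, 0.0], 1e-6, [1, 0]);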
> >
> > What about Simulated Annealing or Genetic Algorithms?
> >
> > They are stochastic global optimization algorithms, and the function
> > being optimized doesn't need to be differentiable; in fact, they are
> > also used in discrete optimization.
> >
> > Simulated Annealing should be easier to program.
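[Mario's point that SA is easy to program holds up: a minimal
one-variable version fits in a dozen lines of Maxima. The sketch below
is purely illustrative; sim_anneal is not a built-in, and the uniform
proposal step, starting temperature, and geometric cooling schedule are
all arbitrary choices.]

  /* Minimal simulated annealing for a one-variable function f,
     starting from x0. Hypothetical helper, not a Maxima package. */
  sim_anneal(f, x0, temp0, cooling, nsteps) := block(
    [x: x0, fx: f(x0), xc, fc, temp: temp0],
    for i: 1 thru nsteps do (
      xc: x + random(2.0) - 1.0,   /* uniform proposal near x */
      fc: f(xc),
      /* accept improvements always; accept uphill moves with
         probability exp(-(fc - fx)/temp) */
      if fc < fx or random(1.0) < exp((fx - fc)/temp)
        then (x: xc, fx: fc),
      temp: temp * cooling         /* geometric cooling */
    ),
    [x, fx]
  )$

  /* example: a non-convex polynomial with two local minima */
  sim_anneal(lambda([u], u^4 - 3*u^2 + u), 0.0, 5.0, 0.995, 1000);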
>
> I think simulated annealing would be a good choice for a
> general-purpose, highly robust, strictly numerical optimization
> algorithm. It is also possible to specify inequality constraints in SA
> by rejecting proposals that violate them. However, SA requires the
> user to generate proposals, a step that may be difficult for users to
> understand.
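[Concretely, Daniel's rejection idea only changes the acceptance test
in a loop like the sketch above; "feasible" here is an assumed
user-supplied predicate, not a Maxima function.]

  /* reject any proposal outside the feasible region */
  if feasible(xc) and
     (fc < fx or random(1.0) < exp((fx - fc)/temp))
    then (x: xc, fx: fc)

  /* e.g. feasible(x) := is(0 <= x and x <= 5)$ keeps the search in [0, 5] */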
>
>
> --
> Daniel Lakeland
> dlakelan at street-artists.org
> http://www.street-artists.org/~dlakelan
> _______________________________________________
> Maxima mailing list
> Maxima at math.utexas.edu
> http://www.math.utexas.edu/mailman/listinfo/maxima