RE: crazy run-time fluctuations (mostly super-slow) -- a bit more data

I fully expect large discrepancies between lisps on long-run-time tasks 
based on different implementations of storage allocation (garbage 
collection).  Relatively short runs coupled with relatively large 
physical random-access memory mask the differences between GCs.  For 
people with 4 GB or more of RAM and jobs that complete in an hour of CPU 
time at 2.5 GHz, it may not matter at all.  For jobs that fill up memory 
and run at "disk" speed because of page faults, there may be 
huge discrepancies (1000X) if a better GC avoids those faults.
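A quick first check before blaming paging is to look at how much of a run is GC.  A sketch in Common Lisp (the exact statistics printed are implementation-dependent; the toy loop here is just an allocation-heavy stand-in for a real Maxima workload):

```lisp
;; CL:TIME is standard; SBCL's report includes GC run time and
;; "bytes consed", while ECL's and GCL's reports differ in detail.
(time (let ((acc nil))
        (dotimes (i 1000000)           ; allocation-heavy toy loop
          (push (make-list 10) acc))
        (length acc)))
```

Comparing those figures across SBCL, ECL, and GCL on the same workload is cheap, and separates "slow because of GC" from "slow because of page faults".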

In principle, "conservative" GCs can leave a lot of memory unusable 
(though I don't know if Maxima usage might cause this); GCs that do not 
"copy" can leave memory fragmented, etc.  (There are treatises written 
on GC, e.g. the survey by Paul Wilson).

Just googling for GC and ECL, SBCL, GCL  suggests that ECL is more 
likely to lose than SBCL.  I am unclear on the GCL situation -- its 
documentation describes "stratified" conservative garbage collection.

The commercial Lisp systems treat GC as a serious issue and likely have 
a performance advantage for jobs that might run essentially 
indefinitely.  I personally rarely run Maxima for more than a few CPU 
minutes.

RJF