Memory use, forcing GC, %oXX vars keeping references



On Sat, 2007-01-20 at 10:38 -0800, Daniel Lakeland wrote:
> I've been using the "rk" function with very small step sizes, so I
> tend to get back lists of several thousand to a million entries.
I have a modified version of rk that saves the results of only some
of the steps (for example, steps 0, 100, 200, ...). I use it when I
have to study chaotic systems with very small step sizes, and I will
soon commit a new version of rk with an extra option to do that.

In the meantime, you can also do the following: instead of running
rk once with n steps, run it 10 times with n/10 steps each; for the
second, third, ..., tenth run, use the last value in the list returned
by the previous run as the initial values. After each run you can
select a few points from the whole list with makelist; see the sketch
below. The English translation of my book, which gives more examples
of the use of "dynamics", should become available soon.
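
Here is a rough sketch of that idea. The system (dx/dt = y,
dy/dt = -x), the initial values and the step counts are just
placeholders; adapt them to your problem and decide how many points
per chunk you want to keep:

    load(dynamics)$
    state : [1.0, 0.0]$               /* initial values of [x, y] */
    t0 : 0$  dt : 0.001$  n : 1000$   /* n steps per chunk */
    kept : []$
    for chunk : 1 thru 10 do (
        solns : rk([y, -x], [x, y], state, [t, t0, t0 + n*dt, dt]),
        /* keep only every 100th point of this chunk */
        kept : append(kept, makelist(solns[100*i], i, 1, 10)),
        /* last point is [t, x, y]; drop t to get the next initial values */
        state : rest(last(solns)),
        t0 : t0 + n*dt,
        kill(solns)                   /* let the big list be reclaimed */
    )$

At the end, kept holds 100 points instead of the roughly ten thousand
that a single run would have returned.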

> After a while, gcl crashes. I'm running on an AMD64 machine with
> 2 GB of memory, but I'm not convinced that gcl is able to use the
> entire address space, since it tends to stay below about 500 MB
> before crashing. I do "kill(solns)" (the variable that stores the
> output) at each iteration, but I suspect that the %o1, %o2 type
> variables keep a handle to these big lists, so they aren't gc'ed.
Oh really? That would be a bummer. Are you ending each %i command
with $?
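
For what it's worth: ending a command with $ keeps Maxima from
printing the huge list, and if it is the %o labels that are holding
on to the old results, kill(labels) unbinds every %i, %o and %t label
in one go. A rough sketch (solns and the rk arguments are just the
placeholders from above):

    solns : rk([y, -x], [x, y], [1.0, 0.0], [t, 0, 1, 0.001])$
    kill(solns)$    /* drop your own reference to the list */
    kill(labels)$   /* unbind all %i/%o/%t labels so the list can be gc'ed */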

Regards,
Jaime