how to parameterise local variables?



Hi,

On Fri, Jul 18, 2008 at 10:51:39PM -0600, Robert Dodier wrote:
> On 7/18/08, Oliver Kullmann <O.Kullmann at swansea.ac.uk> wrote:
> 
> >  My first question is whether some helper functionality already
> >  exists in Maxima to handle the common tasks of creating new
> >  variable names, translating between the different forms etc. ?
> 
> Well, to generate a new variable, you can call the Lisp GENSYM function.
> 
> foo : ?gensym ();
>  => g36372
> bar : ?gensym (1234);
>  => g1234
> baz : ?gensym ("quux");
>  => QUUX36373
> 
> The case inversion (quux --> QUUX) is an artifact of Lisp's bizarre
> case sensitivity rules. Oh well. You don't need to know the assigned
> name anyway, and can't really use the name as far as I can tell.
> But you can bind that new symbol to a value via the indirect
> assignment operator "::" .
> 
> baz :: 6789;
> baz;
>  => quux36373
> ''baz;  /* that's 2 single quotes there */
>  => 6789
>

Just to make sure that I understand this right: given

(%i1) foo : ?gensym ();
(%o1) ?g15829

the new name (which can be used clash-free) is "?g15829", not "g15829"?
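
(In any case one never needs to type that name: continuing the session
above, the fresh symbol is reached only through foo, exactly as in your
example; prompt numbers indicative:)

(%i2) foo :: 6789$
(%i3) ''foo;
(%o3) 6789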

> 
> >  I started writing such functions myself (based on sconcat and eval_string),
> 
> If the answer is eval_string, you almost certainly meant
> to ask a different question.
> 

Aha, so "parse_string" is the right function to use!

This actually solves the problem I had: the inequalities, equations etc.
created with the help of parse_string now stay unevaluated, whether or not
e.g. "x1" has a value.
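
For instance (a quick check of my own; this assumes parse_string and
eval_string are both available in your Maxima, prompt numbers indicative):

(%i1) x1 : 5$
(%i2) eval_string ("x1 < 10");   /* parses and evaluates: x1 is substituted */
(%o2) 5 < 10
(%i3) parse_string ("x1 < 10");  /* parses only: x1 stays untouched */
(%o3) x1 < 10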

> 
> >  Alright, so I create my artificial variable names like "x1, x2, ...",
> >  and so on, but the problem is that all breaks down if for example
> >  x1 is already in use.
> >
> >  Now there are several solutions:
> >  1) The trivial solution is to use "xyz" instead and hope the best.
> >  2) Other computer algebra systems offer functions like "new_symbol".
> >  3) For Maxima it seems easiest if one could do the following:
> >   a) Create a list L = [x1,x2, ...]
> >     of the variables concerned.
> >   b) Create a dynamic scope block(L, ...),
> >     and call all functions within this scope.
> >
> >  No. 3 would look nice and natural to me, if only I knew an (easy) way
> >  how to tell the "block"-function to treat a first argument as the list
> >  of local variables. The issue seems to be that at parse-time a special
> >  treatment of "block([x1,x2])" is performed, while in "block(L)" it is
> >  not recognised that the list L is meant as the list of local variables.
> 
> Well, code = data in Maxima as in Lisp ... If I understand the
> problem, you can construct a suitable block (via buildq) and
> then evaluate it.
> 
> L : [x, y, z];
> buildq ([L], block (L, F (splice (L))));
>  => block([x, y, z], F(x, y, z))
> 

Thanks, that works (I didn't think of buildq).
To give an example: if I want a function which takes its argument as the
list of local variables and just returns a, I would use

f(L) := ev(buildq([L],block(L,a)),eval)$
(%i11) f([]);
(%o11) a
(%i12) f('[a:1]);
(%o12) 1
(%i13) a;
(%o13) a
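
A side remark on the quote in f('[a:1]) (a sketch with a hypothetical
pass-through function g): Maxima evaluates function arguments, so without
the quote the binding would already leak at call time:

(%i14) g (L) := L$
(%i15) g ([b : 2]);   /* evaluating the argument performs the assignment */
(%o15) [2]
(%i16) b;
(%o16) 2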

> It sounds like you want to protect some code against accidental
> name collisions --- if Maxima had lexical scope, maybe it wouldn't
> be an issue ... Anyway maybe you are thinking of something like this?
> 

Isn't it just the opposite: *only* dynamic scoping can prevent the trouble?
It seems essential for an interactive computer algebra system that the user
can define global variables like "x:1;", and that these definitions are
then available globally. Given that, only dynamic scoping helps you, and it
does so rather easily (especially with the current code): by just wrapping
the computation in "block([x],...)" you make all function calls inside it
safe, whereas if x were statically scoped, the functions called dynamically
inside the block (over which you typically have no control) would not be
protected.
If a variable definition like "x:1;" entered at the command line were not
global, how would it work?
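
To illustrate what I mean (a small sketch, with a hypothetical function h
that reads the global x):

(%i1) x : 1$
(%i2) h () := x$
(%i3) block ([x], x : 99, h ());
(%o3) 99
(%i4) x;
(%o4) 1

Under dynamic scoping h, called from inside the block, sees the local
binding of x, and the global value is untouched afterwards; under lexical
scoping it would still see the global x = 1.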


> sanitize ([b, c, e], foo(a, b, c, d, e));
>  => foo(a, B36379, C36379, d, E36379)
> 
> sanitize ([x, y], foo (x, y) := y - x);
>  => foo(X36379, Y36379) := Y36379 - X36379
> 
> where "sanitize" is the following bit of obscure macrology or something like it:
> 
> sanitize (L%, e%) ::=
>   (map (?gensym, map (string, L%)),
>    apply (buildq, [map (lambda ([x%, y%], buildq ([x%, y%], x% : y%)),
>                         L%, %%), e%]));
> 

This is interesting.
Though at this time it seems I'm "safe enough".

There is only one place where perhaps I could use it:
in my "unit test" system I write "higher-order test functions",
e.g.,

/* Concept: f(x) is defined for x=1, returns 77. */
okltest_concepttobetested(f) := (
  assert(f(1) = 77),
  true)$

Then called like

okltest_concepttobetested(model1);

The little problem here is that if "f" is also a globally defined function,
then the global definition takes precedence over the parameter.
Thus my test files all contain a "kill(f);".
But one could use "sanitize" instead.
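
Concretely (a sketch of the situation; model1 and the global f are just
placeholders of mine):

/* the concrete object under test: */
model1(x) := 77$
/* if somewhere else a global function f happens to be defined ... */
f(x) := x + 1$
/* ... then, as described above, f(1) inside the test body refers to this
   global f rather than to the parameter, so the test file first does */
kill(f)$
okltest_concepttobetested(model1);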

Thanks for your help!

Oliver