Subject: replacement of short functions by their definitions and other code cleanups
From: Viktor T. Toth
Date: Sat, 5 May 2007 12:01:54 -0400
Dear Andreas,
> These are all standard common lisp functions with clear semantics.
I agree. However, that really is not the issue here. If there were a standard
LISP replacement for concat (i.e., a 2-argument function that does exactly
what "our" concat does), by all means, replace the non-standard
implementation with a call to the standard version. But that is not the case
here.
Instead, there is a standard LISP way of doing what concat does, but it's
not a single function call; it is a more complicated expression. Inserting
that more complicated expression everywhere concat was used previously
is problematic, as it makes the code less readable.
In other words, what it boils down to is not that you chose to replace the
old concat code with new code. It's that you chose to replace all _calls_ to
concat with explicit code, instead of simply rewriting the concat function
so that it does its business the right way.
> There is no need to test format or intern, since they are
> standard functions.
I didn't mean to suggest that format or intern need to be tested. What I did
mean to say is that during debugging, instead of testing an expression that
consists of three symbols (in the form of (concat arg1 arg2)), one now needs
to verify that an expression comprising twice as many symbols and two
nesting levels (that is, (intern (format nil "~a~d" $idummyx n))) is
correct. This means twice as many chances of making a mistake in our code,
i.e., it is an extra hurdle during debugging. When some functionality can be
so evidently encapsulated in a single function call, encapsulating it is the
right thing to do, regardless of whether that function was originally
inherited from Maclisp.
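To illustrate what I mean by encapsulation, here is a minimal sketch (only a
sketch: the real concat presumably has to cope with argument types, case and
package details that this two-liner ignores):

    ;; Sketch only: give concat a standard Common Lisp body once, so the
    ;; call sites can stay as short as they always were.
    (defun concat (a b)
      (intern (format nil "~a~a" a b)))

    ;; A call site then remains a three-symbol form that is easy to check
    ;; in the debugger, presumably something like
    ;;   (concat $idummyx n)
    ;; rather than
    ;;   (intern (format nil "~a~d" $idummyx n))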
> Probably the whole thing should be replaced with a call
> to gentemp, because that is what it is trying to do
> - generate a unique symbol, isn't it?
As in the integration case reported by Raymond earlier today, the symbols
are sequentially generated, and this is documented behavior. Furthermore,
they are most certainly NOT random, and they are not even necessarily
intended to be unique. (For instance, if a tensor expression is a sum, we
may restart the dummy index in each sum element when dummy indices are
renamed.)
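To make the difference concrete, here is an illustrative sketch (the names
are mine, not the actual tensor package internals): a sequential, resettable
counter behaves quite differently from gentemp.

    ;; Illustrative only: a sequential dummy-index generator whose counter
    ;; can be rewound, so the same index names can deliberately be reissued
    ;; for the next element of a sum.
    (defvar *dummy-count* 0)

    (defun next-dummy-index (prefix)
      (intern (format nil "~a~d" prefix (incf *dummy-count*))))

    (defun reset-dummy-count ()
      (setq *dummy-count* 0))

    ;; By contrast, gentemp keeps incrementing its own counter until it
    ;; finds a name that is not yet interned, so it can neither restart
    ;; the sequence nor reproduce an earlier symbol, and its numbering is
    ;; not guaranteed to be consecutive.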
> As this issue is obviously a sensitive one I will refrain
> from making such changes in the future.
Thank you, I appreciate your consideration. Just to be clear about it, the
issue that I am sensitive to is not the idea that the code is being
standardized; I'm all for it! However, I see a clear distinction between
standardizing a function's implementation and replacing all calls to that
function with functionally equivalent, but more complicated, explicit code.
On a more philosophical note, I am also wary of changes that are not
intended to address known functionality or compatibility problems. (This is
the "don't fix it if it ain't broken" argument.) That is not to say that I
don't recognize the importance of standard code usage, but our priority,
first and foremost, should be to ensure that Maxima never produces incorrect
math! That is certainly my priority with the tensor package; these recent
changes help little towards that goal, and may even hinder my efforts.
That's the reason for my sensitivity.
Viktor