Accuracy and error analysis (was Re: [Maxima] primes)

[C Y , Fri, 13 May 2005 13:20:27 -0700 (PDT)]:
> 1.10 and 1.1 imply different things, in such a case.

When the distinction matters, people usually state the uncertainty
explicitly, e.g., 1.100(5) vs. 1.10(5).  You are then also not limited
to the artifacts of decimal notation.

> I think the issue should be invoked only when the user makes it clear,
> by specifying an input with a user supplied error, that the user cares
> about the uncertainty in the outcome.  

I don't think that is desirable: this means that any input would be
checked for whether there is something uncertain in it, and the error
analysis would start to creep into the system as it did in Mma.  Not
good.  Make the user ask for error analysis explicitly:

    with_error_analysis([a : ..., b : ... ], a + b);

This would also make it easier to switch from, say, naive significance
arithmetic to something more sophisticated.  And it would ensure that
those who don't want this functionality are not affected by it at all.
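For concreteness, here is a minimal sketch of what such a function
might do with naive first-order propagation of absolute errors.  The
[value, error] pair representation and the use of "=" equations
instead of the ":" syntax above are illustrative choices only, not a
proposal for the actual interface:

    /* Sketch only: each value is given as  var = [value, absolute_error],
       and the error of expr is propagated naively as the sum over i of
       |d expr / d var_i| * error_i.                                     */
    with_error_analysis(vals, expr) := block(
      [subs : map(lambda([eq], lhs(eq) = rhs(eq)[1]), vals),
       errs : map(lambda([eq], [lhs(eq), rhs(eq)[2]]), vals),
       val, err],
      val : subst(subs, expr),
      err : apply("+", map(lambda([p],
                             abs(subst(subs, diff(expr, p[1]))) * p[2]),
                           errs)),
      [val, err]
    )$

    /* e.g. a = 1.10 +/- 0.05, b = 2.00 +/- 0.01  =>  roughly [3.1, 0.06] */
    with_error_analysis([a = [1.10, 0.05], b = [2.00, 0.01]], a + b);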

With a full set of pre-processing / post-processing hooks analogous to
Mma's $PreRead, $Pre, $PrePrint, $Post, one can then easily arrange
for with_error_analysis() to be wrapped around every expression if
this is desired.

> But for numerical calculations, at least ones that do not take such a
> long time that the user notices, empirical determination of the error
> should be sufficient, and if Maxima decides it needs greater accuracy
> it has the option of either computing it out, or acknowledging the
> uncertainty introduced and keeping track of it. 

If experience with Mma is any indication, users often want to do
things where speed is an issue.

Recomputing something with higher working precision in order to reach
some desired accuracy is apparently hard to get right, as Mma shows.
Furthermore, that strategy is not only costly but also burdens the
user: she has to be even more careful when defining her functions as
soon as side effects are involved, since the whole computation may be
evaluated more than once.
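
To make that concrete, here is a sketch of the
recompute-at-higher-precision strategy in Maxima terms, assuming a
hypothetical adaptive_bfloat helper (not an existing function): the
expression is numerically re-evaluated at successively higher fpprec
until two results agree to the requested number of digits, and that
repeated evaluation is exactly where side effects in user-defined
functions become a problem:

    /* Hypothetical sketch: re-evaluate e at increasing working precision
       until two successive bfloat results agree to `digits' digits, with
       a cap on the number of refinements.                              */
    adaptive_bfloat(e, digits) := block(
      [fpprec : digits + 5, prev, cur],
      cur : bfloat(e),
      for k : 1 thru 10 do (
        prev : cur,
        fpprec : 2 * fpprec,
        cur : bfloat(e),   /* e is numerically re-evaluated at the new fpprec */
        if abs(cur - prev) < abs(cur) * 10^(-digits) then return(true)
      ),
      cur
    )$

    adaptive_bfloat(sin(10^10), 20);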

Albert.