>
> Robert> It turns out that Maxima scans nnnn. as an integer.
>
Some historical background here.
In MacLisp, the Lisp that Macsyma [sic] was written in, the default input
base was typically 8, not 10. A final decimal point forced the decimal
integer interpretation. Common Lisp's default input base is 10, but it still
reads digits followed by a final decimal point as a decimal integer, for
MacLisp compatibility. This is presumably the legacy that Maxima continues.
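
To make the reading rules concrete, here is a small Common Lisp sketch (the
values in the comments are what I would expect from any conforming
implementation; I have not tried this in the Lisp under Maxima specifically):

;; With *read-base* bound to 8 (the MacLisp-style default), plain digits
;; read as octal, but a trailing decimal point still forces a
;; decimal-integer reading.
(let ((*read-base* 8))
  (list (read-from-string "17")      ; => 15  (octal 17)
        (read-from-string "17.")))   ; => 17  (decimal, because of the ".")

;; With the default *read-base* of 10, "1234." is still read as the
;; integer 1234; only "1234.0" is a float.
(read-from-string "1234.")    ; => 1234   (an integer)
(read-from-string "1234.0")   ; => 1234.0 (a float)
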
In Fortran and other numerically-oriented languages, a final decimal point
implies floating-point, as it does in C, I believe. Ada and some other
languages do not allow trailing decimal points at all (on the grounds that
they are error-prone).
I would recommend that Maxima interpret a trailing decimal point as a marker
of floating-point on input, for consistency with the numerical systems that
people are familiar with, but never produce one on output.
-s