From: Stavros Macrakis
In MacLisp, the Lisp that Macsyma [sic] was written in, the default input
base was typically 8, not 10. A final decimal point forced the decimal
integer interpretation. Common Lisp's input base is 10, but it allows a
final decimal point for MacLisp compatibility.
This isn't quite correct. The input base in CL is initially decimal,
but it may be bound or set to any value in the range 2..36. An
integer token ending in a decimal point (the period character) is
always read as decimal, overriding the current radix.
(let ((*read-base* 8)) (read-from-string "(+ 10 10.)")) => (+ 8 10)
I don't know whether Maxima allows the default input radix to be
changed, but regardless, the proposed change to input syntax is not
backward compatible. How can you be sure that some ancient but still
running Maxima program source doesn't contain an integer written with
a trailing point?
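For concreteness, here is the status quo in ANSI CL, which is
presumably what such old sources rely on (assuming Maxima's number
parsing mirrors the CL reader here, which I haven't verified); the
"new rule" in the comment is my reading of the proposal, not a quote
from it:

(read-from-string "10.")   => 10    ; an INTEGER, forced to radix 10
(read-from-string "10.0")  => 10.0  ; a FLOAT
;; If "10." were instead read as the float 10.0, any old source that
;; wrote decimal integers with a trailing point would silently change
;; type (and lose exactness) on re-read.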
I think the problem is with the printing. Format controls are
wonderful for producing human-readable output, but many of them, such
as ~g, do not guarantee print/read consistency. They should _not_ be
used in circumstances where a program may later need to reread the
output and depend on the precise types and numeric values.
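A minimal sketch of the hazard, assuming the initial
*read-default-float-format* of single-float (the exact ~g padding is
implementation-dependent):

(prin1-to-string 1.0d0)               => "1.0d0"   ; d0 marker preserved
(format nil "~g" 1.0d0)               => "1.0    " ; d0 marker dropped
(read-from-string "1.0")              => 1.0       ; now a SINGLE-FLOAT
(eql 1.0d0 (read-from-string "1.0"))  => NIL       ; round trip fails

Printing with prin1 (or write with *print-readably* bound true) is the
reader-consistent alternative.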