strange behaviour with simple decimals



On 4/12/07, Henning Siebel <henning.siebel at gmx.de> wrote:
>
> Michel Van den Bergh <michel.vandenbergh <at> uhasselt.be> writes:
> http://www2.hursley.ibm.com/decimal:
>
> > "binary floating-point arithmetic should not be used for financial,
> > commercial, and user-centric applications"
>

Note that "scientific and mathematical" applications are not mentioned
here.  And indeed, binary floating-point is the standard arithmetic in those
domains.  It's not clear what is meant here by "user-centric"; surely not
simply non-system software.
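
The financial objection is easy to illustrate.  A quick sketch at the
Python prompt (assuming Python's built-in float and the decimal module;
the output is what I'd expect, give or take formatting):

    >>> 0.1 + 0.2
    0.30000000000000004
    >>> from decimal import Decimal
    >>> Decimal("0.10") + Decimal("0.20")
    Decimal("0.30")

Neither 0.1 nor 0.2 is exactly representable in binary, so the binary
sum carries a small rounding error that the decimal sum avoids.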

But your original question did set in motion an interesting discussion.  I
still think that the conclusion for now is very simple: set the default
number of digits to print to 15 (or even 14) rather than 16.
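
To make that concrete, here is roughly what the digit count does to a
value like 0.07, assuming C-style %g formatting is what ends up doing
the printing:

    >>> "%.17g" % 0.07
    '0.070000000000000007'
    >>> "%.16g" % 0.07
    '0.07000000000000001'
    >>> "%.15g" % 0.07
    '0.07'

At 15 digits the artifacts of the binary representation stay hidden for
short decimal inputs, at the price that printing and re-reading a float
is no longer guaranteed to give back exactly the same value.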

          -s