[Stavros:] It is of course silly to define [-1,1]==0; who does that? answer: WRI



  Mathematica arithmetic does this.

u = SetAccuracy[0, -0.1]

     defines u as a number of low "significance": a zero with a slosh
of some amount related to that -0.1  (e.g. Log[0.1, 10] = -1; sort of
one decimal digit).

In Mathematica, u prints as  "0."  but internally it is some kind of interval.


Here are some tests and results:

u ==  0     True
u == -1     True
u == +1     True
u ==  2     False

etc.


So in Mathematica the arithmetic does indeed support an interval
comparing equal to ANY element within its bounds.
This means that equality is interpreted as "possibly equal".
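This "possibly equal" semantics is easy to model outside Mathematica. Below is a small Python sketch (the class FuzzyReal and its radius convention are my own illustration, not WRI's implementation): two numbers compare equal whenever their error balls overlap, with an absolute error radius of 10^(-accuracy).

```python
class FuzzyReal:
    """A center with an absolute error radius: a model of a
    low-significance bignum (illustrative only, not WRI's code)."""
    def __init__(self, center, radius):
        self.center = center
        self.radius = radius

    def __eq__(self, other):
        # "Possibly equal": True when the error balls overlap.
        if not isinstance(other, FuzzyReal):
            other = FuzzyReal(float(other), 0.0)
        return abs(self.center - other.center) <= self.radius + other.radius

# Accuracy a means absolute uncertainty 10^(-a), so accuracy -0.1
# gives a radius of 10^0.1, about 1.26.
u = FuzzyReal(0.0, 10 ** 0.1)
print(u == 0)    # True
print(u == -1)   # True
print(u == 1)    # True
print(u == 2)    # False
```

With that radius the model reproduces the True/True/True/False pattern of the tests above.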

There is another feature of Mathematica, the explicit Interval[{min, max}]
object.  The two are related in some sense.

InputForm[Interval[u]]
is

Interval[{-2.`0.20102999566398058, 2.`0.20102999566398058}]

where the notation -2.`0.2 .....   means the number -2. carries a
Precision of about 0.201; the backtick notation specifies Precision
rather than Accuracy.
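The two measures are linked: for nonzero x, Precision[x] = Accuracy[x] + Log[10, Abs[x]]. So the displayed 0.20102999566398058 on the endpoint 2. is just the accuracy -0.1 of u shifted by Log[10, 2]. A quick arithmetic check (plain Python, only verifying the numbers):

```python
import math

accuracy = -0.1    # the accuracy handed to SetAccuracy[0, -0.1]
endpoint = 2.0     # interval endpoint shown by InputForm

# Precision = Accuracy + Log10[Abs[x]] for nonzero x
precision = accuracy + math.log10(endpoint)
print(precision)   # approximately 0.201029995664
```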

These two Capitalized terms are used in Mathematica as part of its own
(in my opinion misguided) version of bigfloat arithmetic.

The arithmetic involving Interval[]  is not broken in this particular way.
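The contrast can be modeled too: an honest interval comparison answers False when the intervals are disjoint, True only when equality is forced, and declines to commit otherwise. A Python sketch (the three-valued design is mine, not Mathematica's actual Interval semantics):

```python
def interval_equal(a, b):
    """Compare closed intervals given as (lo, hi) pairs.
    Returns False if the intervals are disjoint (equality impossible),
    True only if both are the same single point (equality forced),
    and None ("maybe") when they merely overlap."""
    (alo, ahi), (blo, bhi) = a, b
    if ahi < blo or bhi < alo:
        return False                 # disjoint: cannot be equal
    if alo == ahi == blo == bhi:
        return True                  # both degenerate and identical
    return None                      # overlap: possibly equal, not provably

print(interval_equal((-2.0, 2.0), (0.0, 0.0)))   # None ("maybe")
print(interval_equal((-2.0, 2.0), (3.0, 3.0)))   # False
print(interval_equal((1.0, 1.0), (1.0, 1.0)))    # True
```

Unlike the significance-arithmetic Equal above, this never asserts that [-1,1] == 0 is True.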

RJF