If y = limit(f(t),t,x), then as t approaches x, f(t) approaches y. But if
y is an interval, say [-1,1], then f(t) never approaches y in any uniform
sense, no matter how close t gets to x.
(Typically x is infinite in these cases.)
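
For concreteness, here is a small numerical sketch. The choice f(t) = sin(t)
with x = infinity is my own illustrative assumption, not necessarily the case
under discussion: however far out you look, sin(t) still sweeps through
essentially all of [-1,1], so it never settles toward any single value.

# Assumed example: f(t) = sin(t) as t -> infinity.
# Beyond any cutoff T, f(t) still ranges over essentially all of [-1, 1].
import numpy as np

for T in (1e2, 1e4, 1e6):
    t = T + np.linspace(0.0, 100.0, 100_001)   # sample a window past the cutoff T
    vals = np.sin(t)
    print(f"t > {T:.0e}:  min sin(t) = {vals.min():+.3f}, max sin(t) = {vals.max():+.3f}")
# Every window shows values near -1 and near +1, so there is no single
# number that f(t) is approaching.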
If f(t) is a scalar s between -1 and 1, then the distance between s and
[-1,1], computed by interval subtraction as s - [-1,1] = [s-1, s+1], is an
interval of width 2, and it doesn't get any smaller no matter what s is.
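
To see that width-2 point concretely, here is a minimal sketch of naive
interval subtraction; no particular interval package is assumed, and
interval_sub is just a made-up helper.

# Naive interval subtraction: s - [lo, hi] = [s - hi, s - lo].
def interval_sub(s, lo, hi):
    return (s - hi, s - lo)

for s in (-1.0, -0.5, 0.0, 0.9, 1.0):
    lo, hi = interval_sub(s, -1.0, 1.0)
    print(f"s = {s:+.2f}:  s - [-1,1] = [{lo:+.2f}, {hi:+.2f}]  width = {hi - lo}")
# The width is 2.0 in every case: the interval "distance" between s and
# [-1,1] never shrinks, no matter which point of [-1,1] the value s is.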
Perhaps it is possible to reinterpret the definition of limit to
accommodate this, but I haven't seen it. You'd have to redefine
derivatives, no?
RJF