behavior of 'taylor' when Taylor series is undefined

I wonder what the behavior of 'taylor' should be when no Taylor series
exists. (There is nothing documented for that case, and the existing
implementation doesn't seem to handle it.) I can see a few
possibilities:

 * trigger an error
 * return a 'taylor(...) noun expression
 * throw something, perhaps a noun expression

About triggering an error: that is what 'sum' and 'integrate' do when
the sum or integral is divergent. An error is correct in a sense, but
it is less useful from a programming point of view -- it's not clear
how a caller can recover from it automatically.
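
For what it's worth, the closest thing to automatic recovery today
would be wrapping the call in errcatch. Here is a rough sketch --
abs(x) at 0 is just a stand-in for a function with no Taylor series,
and the error itself is hypothetical, since the current code doesn't
signal one:

    /* guard the call; errcatch returns [] if any error is signaled,
       otherwise a one-element list holding the value */
    result : errcatch (taylor (abs(x), x, 0, 3));
    if emptyp (result)
        then 'failed          /* some error happened, but which one? */
        else first (result);

The trouble is that errcatch swallows every kind of error alike, so
the handler can't easily tell "no Taylor series" apart from any other
failure.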

Returning a noun expression suggests that the Taylor series exists but
'taylor' wasn't able to calculate it. I guess that leaves the door
open to imposing some interpretation on the noun expression after the
fact, but that seems remote -- what different interpretation could
there be?
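
Concretely, that option would look something like the following
sketch, again with abs(x) at 0 standing in for a function with no
Taylor series:

    /* hypothetical: 'taylor' hands back its own noun form */
    result : taylor (abs(x), x, 0, 3);
        /* => 'taylor(abs(x), x, 0, 3) under this option */
    if taylorp (result)
        then result
        else 'failed;   /* got the noun form back, no series exists */

Every caller that cares about the failure still has to remember to
test the result explicitly.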

Throwing something allows the caller (or anybody farther up the call
stack) to handle the problem automatically. Throwing a noun expression
lets the handler extract information about the failed call.
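
To make that concrete, here is a rough sketch of what a handler could
look like, assuming 'taylor' did throw('taylor(...)) when no series
exists (abs(x) at 0 is again just a stand-in):

    /* catch whatever 'taylor' throws; if nothing is thrown, catch
       just returns the value of the call */
    result : catch (taylor (abs(x), x, 0, 3));
    if taylorp (result)
        then result     /* ordinary success */
        else (print ("no Taylor series for", first (result)), 'failed);

As far as I can tell, an uncaught throw just signals an error anyway,
so a caller who doesn't bother with catch would still get a complaint
rather than a silently wrong result.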

At this point, my preference is to throw a 'taylor(...) noun
expression. The only point against it is that it's unfamiliar to
users.

Comments?

best,

Robert Dodier