git blame reports that the code that makes assumptions about the
integration variable has been there since day 1. (The code is credited to
wfs, requiescat in pace.) I guess it's possible the assumptions code
wasn't always called, or something like that; I didn't investigate.
best
Robert Dodier
On 12/19/11, Raymond Toy <toy.raymond at gmail.com> wrote:
> On Mon, Dec 19, 2011 at 12:02 PM, Barton Willis <willisb at unk.edu> wrote:
>
>> I don't understand why defint is so picky about verifying that the limits
>> of integration are real. The code for defint seems to set up its own
>> private context and make various assumptions, but I don't see where the
>> code simplifies the integrand in a context where the variable is assumed
>> to lie between the limits of integration. Actually, even when the limits
>> are non-real, an affine change of variable could map them to 0 and 1 (or
>> 0 and infinity, or ...)
>>
>>
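[A minimal Maxima sketch of the affine substitution Barton describes
above. The integrand g and the limits a, b are placeholder names, and
changevar may ask sign questions about a and b along the way; this is
only the finite-limits case, not 0 to infinity.]

    /* map 'integrate(g(x), x, a, b) onto [0, 1] via x = a + (b - a)*t */
    I : 'integrate(g(x), x, a, b);
    changevar(I, x - (a + (b - a)*t), t, x);
    /* expected, roughly: (b - a)*'integrate(g(a + (b - a)*t), t, 0, 1) */
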
> FWIW, I think in older versions (pre 5.25?) defint was not so picky. It
> was some "recent" change that made it picky. At least that's my vague
> memory. It could be related to an old bug where defint needed to know how
> many periods fell within the range of integration to compute the correct
> value of the integral. I have not had a chance to test this with an old
> version of Maxima.
>
> Ray
>
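
[For what it's worth, a small sketch of the periodicity issue Ray
recalls. Here n is a placeholder symbol, and the exact behavior likely
varies across Maxima versions:]

    /* defint can only return the right answer here if it knows how
       many full periods of sin(x)^2 fit in [0, 2*%pi*n] */
    declare(n, integer);
    assume(n > 0);
    integrate(sin(x)^2, x, 0, 2*%pi*n);
    /* expected: %pi*n, i.e. half the interval length */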