Firstly, let me state that I'm very new to the world of computer
algebra systems. But, I'm very interested in them from the perspective
of a user, as well as from the perspective of a developer.
I came across the following limit the other day in Spivak's chapter on
logarithms and exponential functions:
limit((b**h - 1)/h, h, 0)
He seemed to promise that the limit existed, and that it was
sensible, though he gave no actual value. I found that bizarre, and
very unlike him.
So, I punched it into Maxima and got back "ln(b)" (or "log(b)", as
Maxima put it).
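For what it's worth, Maxima's answer is easy to sanity-check numerically; here's a quick sketch in Python (my own check, not anything to do with how a CAS actually derives the limit):

```python
import math

def difference_quotient(b, h):
    """The quotient (b**h - 1)/h whose h -> 0 limit is in question."""
    return (b**h - 1) / h

# As h shrinks toward 0, the quotient should approach ln(b) for any b > 0.
for b in (2.0, 10.0, math.e):
    print(b, difference_quotient(b, 1e-8), math.log(b))
```

For b = 2 and h = 1e-8 the quotient agrees with math.log(2) to about seven digits, which at least convinced me the symbolic answer was plausible before I went hunting for a proof.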
This puzzled me greatly, so I spent a few days working on the problem,
and eventually pieced together the proof:
http://www.rutski89.com/static/exp_diff.pdf
All but the very last section is taken straight from Spivak and Knuth,
but I did the illustrations; it turned out to be amazingly simple in
the end.
The question I would like to address to the list is the same question
posed at the bottom of the PDF:
"Now, I can do it as a human, but how the heck would a computer
analysis system figure that out!?"
A very curious student,
-Patrick
P.S.
I'm still reading Paul Graham's Common Lisp book, so it'll be a while
before I can start reading Maxima source in order to see for myself
how it works.