Hello,
> Some comments:
>
> (1) While it is good that the interpol routines are available, they seem
> to be much too slow for my purposes.
Specific numeric procedures would indeed speed up computations in your
case, since you are working with a lot of sample points. But the idea
behind the interpolating functions is to provide symbolic capabilities
as well.
> [....]
>
> (2) Back to the "interpol" routines. It seems to me that they should
> be rewritten using search tools. Suppose that the data are given as
> (using latex notation) $$ \{ I_i = (x_i, y_i) \} $$
>
> One should first define the affine functions, say $g_i$ for the
> intervals $I_i$.
>
> If $g$ is the final interpolation function
>
> (a) $$ g(x) = \sum_i charfun(I_i)\, g_i(x) $$
>
> then to evaluate g(x), one should first
>
> (b) search to find which interval $I_j$ contains $x$,
>
> then
>
> (c) evaluate $g_j(x)$.
>
> Ditto for other one dimensional interpolation functions.
>
Another approach is to order the set of pairs with respect to x_i and
search for the interval (x_i, x_{i+1}) containing x; then g_i can be
computed from the pairs (x_i, y_i) and (x_{i+1}, y_{i+1}).
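For reference, that affine piece is just the standard linear
interpolant through those two pairs (nothing specific to the interpol
package here):

$$ g_i(x) = y_i + \frac{y_{i+1} - y_i}{x_{i+1} - x_i}\,(x - x_i),
   \qquad x_i \le x \le x_{i+1}. $$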
> Computing the sum in (a) wastes a lot of time and computing
> resources. I am not sure which search tools are best for the one
> dimensional search routine. Do you have suggestions?
First, you need to get the pairs ordered with respect to x:
(%i1) load(interpol)$
(%i2) z: interpol_check_input([[-3,4],[4,2],[-10,5],[0,6]]);
(%o2) [[- 10, 5], [- 3, 4], [0, 6], [4, 2]]
For searching, a bisection (binary search) method should be fast
enough: compare x with the abscissa of the pair at position
length(z)/2, keep the lower or upper half of the list accordingly, and
repeat until the interval containing x is found.
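A minimal sketch of that bisection in Maxima (find_interval and
eval_linear are hypothetical helper names, not part of the interpol
package; z is assumed to be the sorted list returned by
interpol_check_input above, with at least two pairs):

/* Binary search: return the index i of the interval
   [z[i][1], z[i+1][1]] containing x, clamped to the first or last
   interval when x lies outside the data range. */
find_interval(z, x) := block([lo: 1, hi: length(z), mid],
    while hi - lo > 1 do (
        mid: floor((lo + hi)/2),
        if x < z[mid][1] then hi: mid else lo: mid),
    lo)$

/* Evaluate the affine piece g_i on that interval. */
eval_linear(z, x) := block([i, p, q],
    i: find_interval(z, x),
    p: z[i], q: z[i+1],
    p[2] + (q[2] - p[2]) * (x - p[1]) / (q[1] - p[1]))$

With the z from (%o2) above, eval_linear(z, 1.5) should give 4.5 (the
segment from (0, 6) to (4, 2)); for x outside [-10, 4] it simply
extrapolates the nearest segment.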
Does this help?
--
Mario Rodriguez Riotorto
www.biomates.net