On 2012-10-12, Przemek Klosowski <przemek.klosowski at nist.gov> wrote:
> I tried to get expressions for linear regression. First I defined the
> sum of squares:
>
> sumsq:sum((y[i]-a*x[i]-b)^2,i,1,N);
>
> then tried to solve for the minimum (the derivatives with respect to the
> coefficients must be zero at the solution):
>
> solve([diff(sumsq,a),diff(sumsq,b)],[a,b]);
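(Setting those two derivatives to zero amounts to the usual normal equations;
in Maxima notation,

a*sum(x[i]^2,i,1,N) + b*sum(x[i],i,1,N) = sum(x[i]*y[i],i,1,N)
a*sum(x[i],i,1,N)   + b*N               = sum(y[i],i,1,N)

so that is the linear system I was hoping solve would handle.)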
and Robert Dodier got me going:
> I think you need declare(sum, linear) so that a and b are moved out of
> summations (sum isn't declared linear by default). You might also
> need to apply 'expand' to the derivatives to pull out a and b.
Indeed,
declare(sum, linear);
solve(expand([diff(sumsq,a),diff(sumsq,b)]),[a,b]);
works -- Maxima is awesome! (sorry, over-enthusiastic newbie here)
Shouldn't 'sum' be linear by default BTW?
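(A toy example of what the declaration buys, if I understand it correctly:
once sum is declared linear, coefficients that are free of the summation
index get pulled outside, e.g.

declare(sum, linear);
sum(a*x[i] - b*y[i], i, 1, N);
   /* should give  a*sum(x[i],i,1,N) - b*sum(y[i],i,1,N)  */

which is exactly what lets solve see the equations as linear in a and b.)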
Just to practice my Maxima skills, I tried back-substituting:
sx:sum(x[i],i,1,N); sxy:sum(x[i]*y[i],i,1,N);
sx2:sum(x[i]^2,i,1,N); sy:sum(y[i],i,1,N);
aa:(N*sxy-sx*sy)/(N*sx2-sx*sx); bb:(sx2*sy-sx*sxy)/(N*sx2-sx*sx);
expand(sum((y[i]-aa*x[i]-bb)^2,i,1,N));
and that didn't go well: I got a huge unsimplified expression. I tried
ratsimp on it, but it got even worse. What's the right way to simplify this?
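Maybe the trick is to keep the aggregate sums as opaque symbols and only
substitute the closed forms at the end? A rough, untested sketch (Sx, Sy,
Sxy, Sx2, Sy2 and rss are just placeholder names I made up):

declare(sum, linear);
/* expand the residual sum of squares with symbolic a and b */
rss: expand(sum((y[i]-a*x[i]-b)^2,i,1,N));
/* replace the explicit sums by opaque symbols */
rss: subst([sum(x[i],i,1,N)=Sx, sum(y[i],i,1,N)=Sy,
            sum(x[i]*y[i],i,1,N)=Sxy, sum(x[i]^2,i,1,N)=Sx2,
            sum(y[i]^2,i,1,N)=Sy2, sum(1,i,1,N)=N], rss);
/* now plug in the fitted coefficients and simplify */
ratsimp(subst([a=(N*Sxy-Sx*Sy)/(N*Sx2-Sx^2),
               b=(Sx2*Sy-Sx*Sxy)/(N*Sx2-Sx^2)], rss));

The hope being that ratsimp then only has to deal with the five S symbols
and N, instead of the nested explicit sums.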