What's wrong with this code?

The following gives an error with the CVS version of Maxima:

/* basis: the 2x2 identity plus Pauli-type matrices */
s[0]:matrix([1,0],[0,1]);
s[1]:matrix([1,0],[0,-1]);
s[2]:matrix([0,1],[1,0]);
s[3]:matrix([0,%i],[-%i,0]);

/* general linear combination with scalar coefficients a0..a3 */
aa:a0*s[0]+a1*s[1]+a2*s[2]+a3*s[3];

/* the noncommutative (matrix) product is what triggers the error */
aa.aa;

Running this gives:
(%i1) s[0]:matrix([1,0],[0,1]);
                                 [ 1  0 ]
(%o1)                            [      ]
                                 [ 0  1 ]
(%i2) s[1]:matrix([1,0],[0,-1]);
                                [ 1   0  ]
(%o2)                           [        ]
                                [ 0  - 1 ]
(%i3) s[2]:matrix([0,1],[1,0]);
                                 [ 0  1 ]
(%o3)                            [      ]
                                 [ 1  0 ]
(%i4) s[3]:matrix([0,%i],[-%i,0]);
                               [  0    %i ]
(%o4)                          [          ]
                               [ - %i  0  ]
(%i5) aa:a0*s[0]+a1*s[1]+a2*s[2]+a3*s[3];
                        [  a1 + a0     %i a3 + a2 ]
(%o5)                   [                         ]
                        [ a2 - %i a3    a0 - a1   ]
(%i6) aa.aa;
Maxima encountered a Lisp error:

 SYMBOL-NAME: 1 is not a symbol

Automatically continuing.
To reenable the Lisp debugger set *debugger-hook* to nil.
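
For comparison, here is the same computation with plain, non-subscripted
names, which I would expect to sidestep the problem if the subscripted
variables are what trips things up (just a sketch; I haven't verified
this on the CVS build):

/* same matrices, bound to ordinary symbols instead of s[0]..s[3] */
s0: matrix([1,0],[0,1]);
s1: matrix([1,0],[0,-1]);
s2: matrix([0,1],[1,0]);
s3: matrix([0,%i],[-%i,0]);

aa: a0*s0 + a1*s1 + a2*s2 + a3*s3;
aa . aa;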

There is a more general version of this problem that I'm trying to deal
with.  I have expressions that contain linear combinations of matrices
like

A[i] = a[i,1]*M[1] + a[i,2]*M[2] + ...

where M[i] is a matrix and a[i,j] is a scalar.  Maxima does seem to
create the matrix as expected, but A[i].A[i] behaves as if both the a's
and the M's were some sort of array.  I've tried scalar declarations on
both a and a[i,j] (i.e. declare(a, scalar)), but this doesn't seem to
help.
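
In case it helps, here is a minimal self-contained sketch of that
general setup (the particular matrices, and the declare(a, scalar)
call, are just illustrative placeholders, not my actual code):

/* two fixed basis matrices */
M[1]: matrix([0,1],[1,0]);
M[2]: matrix([1,0],[0,-1]);

/* declare the coefficients scalar -- one of the things I tried */
declare(a, scalar);

/* linear combination with symbolic coefficients a[i,1], a[i,2] */
A[i] := a[i,1]*M[1] + a[i,2]*M[2];

/* this product misbehaves as described above */
A[1] . A[1];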

David