Re: Algebra structure



I wrote:
>(1) a # b is additive in a and additive in b. [snip]
>(2) * has higher precedence than #, so that a # b*c means a # (b*c), and
>    # has higher precedence than +,-.
>(3) We're given an n x n matrix M whose entries are elements of L.
>(4) With biadditivity, if a,b are elements of L then a # b is reducible,
>    using (1),(2), to a sum of terms of the form m1*yi # m2*yj where m1,m2
>    are monomials (maybe with integer coefficients) in x1,...,xm. I need to
>    be able to tell Maxima that this simplifies to m1*m2*M[i,j]. (Thus, the
>    # product of two linear combinations turns out to be another linear
>    combination, so L is a nonassociative algebra over R.)
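
(To make (4) concrete with a small made-up instance: biadditivity reduces,
say, (x1*y1 + y2) # (x2*y2) to x1*y1 # x2*y2 + y2 # x2*y2, which should then
simplify to x1*x2*M[1,2] + x2*M[2,2].)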

After posting this, I realized that since I did say that the algebra L (and
therefore the operator #) should be commutative, I can in effect define the
algebra L by using LET rules for ordinary multiplication * without introducing
a new infix operator #. To deal with nonassociativity, I just have to be
careful never to multiply more than two things at a time, using plenty of
parentheses, so that Maxima never gets a chance to use the associativity of *.
However, the formulation I gave makes perfectly good sense even if # is not
assumed to be commutative: the product is commutative precisely when the
matrix M is symmetric, and not otherwise. So, I'd still be interested in how
to solve the problem without assuming that the operator # is commutative.
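
For the commutative version, here is roughly the kind of LET setup I have in
mind (just a sketch; the basis size, the names y1, y2, x1, x2, and the
particular entries of M are all made up for illustration):

    /* the entry M[i,j] stands for yi*yj; each entry is an element of L,
       i.e. linear in the basis y1, y2 with coefficients in R */
    M : matrix([x1*y1, y2], [y2, x2*y1 + y2]);

    /* LET rules for the products of basis elements; since * is
       commutative, Maxima reorders y2*y1 to y1*y2, so three rules do */
    let(y1*y1, M[1,1]);
    let(y1*y2, M[1,2]);
    let(y2*y2, M[2,2]);

    /* the product of two linear combinations (only two at a time!)
       reduces to another linear combination; this should give
       x2^2*y1 + (x1*x2 + x2)*y2, modulo how Maxima collects terms */
    letsimp(expand((x1*y1 + y2) * (x2*y2)));
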
-- 
Ignorantly,
Allan Adler 
* Disclaimer: I am a guest and *not* a member of the MIT CSAIL. My actions and
* comments do not reflect in any way on MIT. Also, I am nowhere near Boston.