occasional meanderings in physics' brave new world

Name: Marni D. Sheppeard
Location: New Zealand

## Wednesday, January 23, 2008

### Associativity

It is easy enough to invent associative algebraic structures that quickly lead to associated non-associative structures. Unfortunately, ordinary numbers almost always serve as the model, in the sense that addition is assumed to be commutative. Let us consider instead a system with two binary operations, by convention called addition and multiplication, neither of which is commutative. Let us assume that scalar multiplication is associative and distributes over addition, so that

$x(y + z) = xy + xz \neq xz + xy = x(z + y)$

What happens with $2 \times 2$ matrices over these scalars? Matrix multiplication is well defined by the usual rule, but one must be careful about the ordering of terms. In a triple product of matrices $ABC$, associativity is lost, because the $(1,1)$ entry of the product $A(BC)$ is given by

$A_{11} B_{11} C_{11} + A_{11} B_{12} C_{21} + A_{12} B_{21} C_{11} + A_{12} B_{22} C_{21}$

which is distinct from

$A_{11} B_{11} C_{11} + A_{12} B_{21} C_{11} + A_{11} B_{12} C_{21} + A_{12} B_{22} C_{21}$

in $(AB)C$: the two sums contain the same terms in different orders, so they differ precisely because addition is not commutative. Commutativity of addition would restore associativity for all matrices.
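This order-sensitivity can be made concrete in a small Python sketch (the helper names and monomial labels are purely illustrative). Scalars are modelled as ordered tuples of monomial strings, so that addition, implemented as concatenation, remembers the order in which its terms were summed:

```python
# Toy "scalars" with non-commutative addition: a formal sum is an
# ordered tuple of monomials (strings), and addition is concatenation,
# so x + y and y + x are genuinely different objects.

def add(x, y):
    """Non-commutative addition: concatenate the ordered lists of terms."""
    return x + y

def mul(x, y):
    """Multiplication: expand term by term, left-to-right."""
    return tuple(a + b for a in x for b in y)

def matmul(M, N):
    """2x2 matrix product over the toy scalars, summing in the usual order."""
    return [[add(mul(M[i][0], N[0][j]), mul(M[i][1], N[1][j]))
             for j in range(2)] for i in range(2)]

# Symbolic 2x2 matrices: each entry is a one-term formal sum.
A = [[("A11",), ("A12",)], [("A21",), ("A22",)]]
B = [[("B11",), ("B12",)], [("B21",), ("B22",)]]
C = [[("C11",), ("C12",)], [("C21",), ("C22",)]]

left = matmul(matmul(A, B), C)[0][0]   # (1,1) entry of (AB)C
right = matmul(A, matmul(B, C))[0][0]  # (1,1) entry of A(BC)

print(left)   # ('A11B11C11', 'A12B21C11', 'A11B12C21', 'A12B22C21')
print(right)  # ('A11B11C11', 'A11B12C21', 'A12B21C11', 'A12B22C21')

# Same terms, different order: equal only if addition commutes.
assert sorted(left) == sorted(right) and left != right
```

The two tuples are exactly the two four-term sums above: identical as multisets of terms, distinct as ordered sums.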

phil said...

You said: "It is easy enough to invent associative algebraic structures that quickly lead to associated non-associative structures", however you have not actually constructed one and I am not convinced that it is "easy" to do so. I.e. can you construct an algebra whose matrix multiplication is non-associative but whose addition and multiplication are associative and distributive, and prove it?

The reason I think this is difficult is that you can show associativity for matrix multiplication with only minimal additional rules. E.g. if you have a cancellation law for addition,

$x + y = x + z \Rightarrow y = z$ and
$y + x = z + x \Rightarrow y = z$

then expand $(a + b)(c + d)$ in two ways:

$(a + b)(c + d) = (a + b)c + (a + b)d = ac + bc + ad + bd$

$(a + b)(c + d) = a(c + d) + b(c + d) = ac + ad + bc + bd$

therefore

$ac + bc + ad + bd = ac + ad + bc + bd$

and cancelling $ac$ on the left and $bd$ on the right,

$bc + ad = ad + bc$

So addition is then always commutative for products, which is enough to show that your two expressions for the entry of the triple product are equal, and matrix multiplication is associative.

(I used the right distributive law even though you only stated the left distributive law, but you had used the right distributive law in your matrix multiplication, so I think you meant to include it.)
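Phil's tension between the two distributive laws can also be illustrated in a small Python model (the names and labels are purely illustrative). Take formal sums to be ordered tuples of monomial strings, with addition as concatenation: this addition is non-commutative but cancellative, and then, exactly as the argument above predicts, the two distributive expansions of $(a+b)(c+d)$ produce the terms in different orders:

```python
# Formal sums as ordered tuples of monomial strings: addition is
# concatenation (non-commutative, but cancellative), and multiplication
# expands each left term over each right term, left-to-right.

def add(x, y):
    return x + y

def mul(x, y):
    return tuple(a + b for a in x for b in y)

a, b = ("a",), ("b",)
c, d = ("c",), ("d",)

# (a + b)(c + d), expanded via the left distributive law first:
via_right = add(mul(a, add(c, d)), mul(b, add(c, d)))  # a(c+d) + b(c+d)
# ... and via the right distributive law first:
via_left = add(mul(add(a, b), c), mul(add(a, b), d))   # (a+b)c + (a+b)d

print(via_right)  # ('ac', 'ad', 'bc', 'bd')
print(via_left)   # ('ac', 'bc', 'ad', 'bd')

# With non-commutative addition, the two expansions disagree,
# so both distributive laws cannot hold at once in this model.
assert via_left != via_right
assert sorted(via_left) == sorted(via_right)
```

This is the contrapositive of the comment's claim: keeping cancellative, non-commutative addition forces one of the two distributive laws to fail.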

January 23, 2008 11:18 PM
Kea said...

Thanks, Phil! But cancellation laws only apply to monic (or epic) arrows in a category and do not hold in general, so I might want to ditch them.

January 24, 2008 9:21 AM
kneemo said...

"It is easy enough to invent associative algebraic structures that quickly lead to associated non-associative structures"

I think Kea had the Jordan algebra of (complex) $n \times n$ Hermitian matrices in mind. The Jordan product in this case is given by

$A \circ B = \frac{1}{2}(AB + BA)$

for arbitrary Hermitian matrices $A$, $B$, with $AB$ and $BA$ denoting ordinary matrix multiplication. Playing with this Jordan product you'll see it is non-associative, while the full algebra of $n \times n$ complex matrices and the complex numbers are associative and distributive.
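A quick numerical check of this with NumPy, using three $2 \times 2$ real symmetric (hence Hermitian) matrices; the particular matrices are just a convenient choice that makes the associator visibly nonzero:

```python
import numpy as np

def jordan(A, B):
    """Jordan product A o B = (AB + BA) / 2."""
    return (A @ B + B @ A) / 2

A = np.array([[0., 1.], [1., 0.]])
B = np.array([[1., 1.], [1., 1.]])
C = np.array([[1., 0.], [0., -1.]])

left = jordan(jordan(A, B), C)   # (A o B) o C
right = jordan(A, jordan(B, C))  # A o (B o C)

# left works out to [[1, 0], [0, -1]] while right is the zero matrix,
# so the Jordan product is not associative.
print(np.allclose(left, right))  # False
```

Here $B \circ C$ is again $C$ itself, and $A$ anticommutes with $C$, so the right-hand bracketing collapses to zero while the left-hand one does not.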

January 25, 2008 4:07 PM