Grassmann algebra matrix representation

jacobrhcp

Homework Statement



I want to find a matrix representation of the Grassmann algebra {1, x, x*, x*x} (and linear combinations with complex coefficients),

defined by ##[x,x]_+=[x,x^*]_+=[x^*,x^*]_+=0##.

I really don't know how to construct matrix representations of an algebra. Is any set of four matrices that obeys the anticommutation rules OK? Is there a standard procedure, or do you always have to play with matrices until some slick trick yields a representation?

The Attempt at a Solution



I think I need 4x4 matrices, but I'm not sure why; I'd really like to understand where that dimension comes from.

Furthermore, because '1' is in my algebra, I suspect I need the identity matrix. I can also rewrite the defining relations as ##xx = x^*x^* = 0##, which my x and x* matrices need to obey. The notation suggests that x* should be the complex conjugate of x (with matrices we usually take the Hermitian conjugate, but the notation does say * and not dagger). At the same time, x*x is not 0, so my matrices would need both real and complex entries. None of this gives me a decent guess.

Can anyone give me some general hints or explanations? I am determined to adopt the 'way of thinking' for finding matrix representations of an algebra.
 
You can try tensor products of nilpotent matrices, Pauli matrices, and the identity. The nilpotent matrices take care of the fact that the squares have to vanish; the Pauli matrices implement the anticommutativity.
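This hint can be checked numerically. A minimal sketch, assuming one particular Jordan-Wigner-style choice of tensor products (the names `x` and `xstar` are just labels here — with this construction `xstar` is not the entrywise complex conjugate of `x`):

```python
import numpy as np

# One possible construction (an assumption, not the only choice):
# build the two generators from tensor products of a 2x2 nilpotent
# matrix, a Pauli matrix, and the identity.
n  = np.array([[0, 0], [1, 0]])   # 2x2 nilpotent: n @ n = 0
sz = np.array([[1, 0], [0, -1]])  # Pauli sigma_z
I  = np.eye(2)

x     = np.kron(n, I)   # first generator, 4x4
xstar = np.kron(sz, n)  # second generator, 4x4

anti = lambda a, b: a @ b + b @ a  # anticommutator [a,b]_+

print(np.allclose(anti(x, x), 0))          # True: x^2 = 0
print(np.allclose(anti(xstar, xstar), 0))  # True: (x*)^2 = 0
print(np.allclose(anti(x, xstar), 0))      # True: x and x* anticommute
print(np.allclose(xstar @ x, 0))           # False: x*x is nonzero
```

The anticommutation with ##\sigma_z## in the second factor is what makes the two generators anticommute with each other, while the nilpotent factor kills the squares.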
 
that's a great idea! Will do.
 


Is there a representation where ##x^{*}x## is not the zero matrix? In other words, is there a 4x4 2-nilpotent matrix ##x## such that ##y=x^{*}x## is a non-zero 2-nilpotent matrix?
 
I ask, and I answer myself :) There exists such a matrix:
$$\begin{pmatrix}
0 & 0 & i & 0\\
1 & 0 & 0 & -i\\
0 & 0 & 0 & 0\\
0 & 0 & 1 & 0
\end{pmatrix}$$
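The candidate matrix above can be checked numerically, taking ##{}^*## to mean entrywise complex conjugation (no transpose). A sketch:

```python
import numpy as np

# The candidate matrix from the post above.
X = np.array([[0, 0, 1j,   0],
              [1, 0,  0, -1j],
              [0, 0,  0,   0],
              [0, 0,  1,   0]])
Xc = np.conj(X)  # entrywise complex conjugate, playing the role of x*
Y  = Xc @ X      # candidate for x*x

print(np.allclose(X @ X, 0))            # True: X is 2-nilpotent
print(np.allclose(X @ Xc + Xc @ X, 0))  # True: X and X* anticommute
print(not np.allclose(Y, 0))            # True: x*x is nonzero ...
print(np.allclose(Y @ Y, 0))            # True: ... and 2-nilpotent
```

So this matrix does satisfy all the defining relations with * read as ordinary complex conjugation.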

However, if we view this as just a basis transformation of some vector ##A=(a_1,a_2,a_3,a_4)## from ##(1,\phi,\phi^*,\phi^*\phi)## to their matrix representation, we could compute ##\phi A=(0,a_1,0,-a_3)## and from there deduce a matrix form of ##\phi## (and then the others). But then we would have to redefine "complex conjugation" in some other way, as the new basis matrices are real.
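The left-multiplication idea above can be carried out mechanically: in the basis ##(1,\phi,\phi^*,\phi^*\phi)##, record where ##\phi## and ##\phi^*## send each basis element, using ##\phi\phi^* = -\phi^*\phi##. A sketch (note these matrices come out real, which is exactly the conjugation problem raised above):

```python
import numpy as np

# Left-regular representation in the basis (1, phi, phi*, phi*phi).
# phi*1 = phi, phi*phi* = -phi*phi, everything else -> 0,
# so phi maps coordinates (a1,a2,a3,a4) to (0, a1, 0, -a3).
Phi = np.zeros((4, 4))
Phi[1, 0] = 1
Phi[3, 2] = -1

# phi* * 1 = phi*, phi* * phi = phi*phi, everything else -> 0.
Phistar = np.zeros((4, 4))
Phistar[2, 0] = 1
Phistar[3, 1] = 1

anti = lambda a, b: a @ b + b @ a  # anticommutator

print(np.allclose(anti(Phi, Phi), 0))          # True
print(np.allclose(anti(Phistar, Phistar), 0))  # True
print(np.allclose(anti(Phi, Phistar), 0))      # True
print(np.allclose(Phistar @ Phi, 0))           # False: represents phi*phi
```

All the defining relations hold, but since `Phi` and `Phistar` are real, the map ##\phi \mapsto \phi^*## here cannot be entrywise complex conjugation.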

Could you please tell me how to do the "basis transformation" while preserving the usual definition of complex conjugation? Thanks a lot!
 