Show that a group of operators generates a Lie algebra

russel
Hello there! Above is a problem that has to do with Lie Theory. Here it is:
The operators P_{i}, J, T (i,j=1,2) satisfy the following commutation relations:
[J,P_{i}]=\epsilon_{ij}P_{j},\quad [P_{i},P_{j}]=\epsilon_{ij}T,\quad [J,T]=[P_{i},T]=0
Show that these operators generate a Lie algebra. Is that algebra a semisimple one? Also show that
e^{uJ}P_{i}e^{-uJ}=P_{i} \cos{u}+ \epsilon_{ij}P_{j} \sin{u}

Does anyone know how to deal with it?
 
Changing "MATH" to "itex" will probably generate more response.
 
Martin Rattigan said:
Changing "MATH" to "itex" will probably generate more response.
ok! I fixed it! I didn't know the exact code, sorry!
 
You just have to prove the three things:
1) The bracket is bilinear
2) The bracket is anti-commutative
3) The bracket satisfies the Jacobi identity.

Since you're given a proposed list of generators, the bracket is defined to be the bilinear operator satisfying the properties that you're given, so 1 is already done.

Let's say I'm given two arbitrary elements A and B. Do you see how you could prove that in general, [A,B]=-[B,A] using only what you know about the Lie bracket on the generators?
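
As a sketch of the kind of argument intended (writing arbitrary elements in terms of the generators X_i, with summation over repeated indices):

A=a_iX_i,\quad B=b_jX_j \quad\Rightarrow\quad [A,B]=a_ib_j[X_i,X_j]=-a_ib_j[X_j,X_i]=-[B,A]

so antisymmetry on the generators, combined with bilinearity, gives antisymmetry for arbitrary elements.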
 
I think I got it! Thanks a lot! What about the last equation? How do I prove it?
 
russel said:
I think I got it! Thanks a lot! What about the last equation? How do I prove it?

The last one is the group adjoint action. Try taking the derivative of the LHS wrt u.
This should show you how to rewrite the group adjoint as an "exponentiated commutator".
I'll let you do the rest.

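As a sketch of the first step (not the whole computation): define f_i(u)=e^{uJ}P_{i}e^{-uJ}; then

f_i'(u)=e^{uJ}[J,P_i]e^{-uJ}=\epsilon_{ij}e^{uJ}P_{j}e^{-uJ}=\epsilon_{ij}f_j(u),\qquad f_i(0)=P_i

so the problem reduces to a pair of coupled first-order ODEs in u.
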
Also have you proved that it's semi-simple yet?
 
Simon_Tyler said:
The last one is the group adjoint action. Try taking the derivative of the LHS wrt u.
This should show you how to rewrite the group adjoint as an "exponentiated commutator".
I'll let you do the rest. Also have you proved that it's semi-simple yet?
I haven't had enough time, so I haven't really managed to solve it. As far as I know, a Lie algebra is semi-simple if it is a direct sum of simple Lie algebras. Should I follow this path? Or should I think of a different way to prove it?
 
Also, a semi-simple Lie algebra L satisfies [L,L]=L.
 
Office_Shredder said:
Also, a semi-simple Lie algebra L satisfies [L,L]=L.
Oh, I see. So, I'll choose every element and see if this condition is satisfied.
 
  • #10
The L Office Shredder is referring to is the whole algebra, not a single element. The Lie bracket of an element with itself is 0, by definition.

Look at the presentation of the algebra. Then consider [L,L]. Can you get each generator from the Lie bracket of other elements?
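
Concretely (just reading the spanning brackets off the given relations):

[L,L]=\mathrm{span}\{[J,P_1],[J,P_2],[P_1,P_2]\}=\mathrm{span}\{P_2,-P_1,T\}=\mathrm{span}\{P_1,P_2,T\}

so the remaining question is whether J lies in that span.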
 
  • #11
To prove that you have a Lie algebra, you have to show that your bracket is a bilinear operator, and also that the Jacobi identity holds: i.e. given any three arbitrary operators A, B, C,
[[A,B],C]+[[B,C],A]+[[C,A],B]= 0. That would involve simple computations.
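For instance, with A=J, B=P_1, C=P_2 the given relations yield

[[J,P_1],P_2]+[[P_1,P_2],J]+[[P_2,J],P_1]=[P_2,P_2]+[T,J]+[P_1,P_1]=0,

and the remaining triples can be checked similarly.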
As for showing that it is semi-simple, this might actually involve some more work, since you have to show that the nilradical of the algebra is zero.
Vignon S. Oussa
 
  • #12
vigvig said:
that the Jacobi identity holds: i.e. given any three arbitrary operators A, B, C,
[[A,B],C]+[[B,C],A]+[[C,A],B]= 0. That would involve simple computations.

How do we do that in the case with 4 operators (let's say A,B,C and D) ?

[[A,B],C]+[[B,C],D]+[[C,D],A]+[[D,A],B]= 0

Is it right?
 
  • #13
maxverywell said:
How do we do that in the case with 4 operators (let's say A,B,C and D) ?

[[A,B],C]+[[B,C],D]+[[C,D],A]+[[D,A],B]= 0

Is it right?

Oops, it's wrong, and I think I figured it out. It's

[[A,B],C]+[[B,C],A]+[[C,A],B]= 0

[[A,B],D]+[[B,D],A]+[[D,A],B]= 0

[[A,C],D]+[[C,D],A]+[[D,A],C]= 0

[[B,C],D]+[[C,D],B]+[[D,B],C]= 0

So now we have 4 Jacobi identities which must be satisfied to form a Lie algebra.

In the case of N operators (generators of the Lie group) there are \binom{N}{3} Jacobi identities.

Am I right? Please let me know.
 
  • #14
That is correct
 
  • #15
Thanks! So there's a lot of work if we have many generators...
 
  • #16
Is there any other way of doing this? If we have 10 generators then there are 120 Jacobi identities that we need to prove...
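
One way to avoid grinding through them by hand is to check them mechanically: on a basis, the Jacobi identity is equivalent to a condition on the structure constants,

C_{ij}^{\,m}C_{mk}^{\,l}+C_{jk}^{\,m}C_{mi}^{\,l}+C_{ki}^{\,m}C_{mj}^{\,l}=0.

A minimal sketch of such a check (assuming numpy; the array layout, the function name and the basis order P_1, P_2, J, T are my own conventions, nothing standard):

import numpy as np

def jacobi_holds(C, tol=1e-12):
    # C[i, j, k] encodes [X_i, X_j] = C_{ij}^k X_k.
    # Sum the three cyclic terms of the Jacobi identity over the contracted index m
    # and check that every component of the result vanishes.
    t1 = np.einsum('ijm,mkl->ijkl', C, C)
    t2 = np.einsum('jkm,mil->ijkl', C, C)
    t3 = np.einsum('kim,mjl->ijkl', C, C)
    return np.max(np.abs(t1 + t2 + t3)) < tol

# Structure constants of the algebra in this problem, basis order (P_1, P_2, J, T):
n = 4
C = np.zeros((n, n, n))
C[2, 0, 1], C[0, 2, 1] = 1.0, -1.0   # [J, P_1] =  P_2
C[2, 1, 0], C[1, 2, 0] = -1.0, 1.0   # [J, P_2] = -P_1
C[0, 1, 3], C[1, 0, 3] = 1.0, -1.0   # [P_1, P_2] = T
print(jacobi_holds(C))               # expect True

This checks all the basis Jacobi identities at once; by bilinearity, that is enough to cover arbitrary elements.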
 
  • #17
Just an idea...I haven't thought about the details. How about finding a Lie group that has a Lie algebra that satisfies the other requirements? Then the Jacobi identity will be automatic.
 
  • #18
Are you saying that if the following two properties are satisfied then the third is satisfied automatically?

Office_Shredder said:
You just have to prove the three things:
1) The bracket is bilinear
2) The bracket is anti-commutative

3) The bracket satisfies the Jacobi identity.
 
  • #19
Ok let me show you one example:

I want to prove that the algebra of octonions with generators X_a, a=1,\dots,7, which satisfy the relation X_a X_b=\epsilon_{abc}X_c, is a Lie algebra.

My proof is the following:

1) [X_a, X_b]=X_a X_b-X_b X_a=2\epsilon_{abc}X_c

2) \epsilon_{abc}=-\epsilon_{bac}

3) [[X_a, X_b],X_c]=[X_a X_b, X_c]-[X_b X_a, X_c]=[X_a, X_c]X_b+X_a [X_b,X_c]-[X_b, X_c]X_a-X_b[X_a, X_c]=X_a X_c X_b - X_c X_a X_b + ...=0

because X_i X_j X_k = 0 for i\neq j \neq k

Is it correct?
 
  • #20
maxverywell said:
Are you saying that if the following two properties are satisfied then the third is satisfied automatically?
No, that would be a false claim. I'm thinking along these lines: Suppose that they're all satisfied. Then the bracket defines a Lie algebra, and we can try to find a group that has that Lie algebra as the tangent space at the identity element. If you find such a group, the claim you're trying to prove must be true.

I haven't tried to solve your problem with this method or any other. It's just the first idea that occurred to me.
 
  • #21
Bi-linearity doesn't mean you can pull out elements of the Lie algebra:

[X_a X_b, X_c] \neq X_a[X_b,X_c]
 
  • #22
Bilinearity does however mean that if you expand arbitrary vectors in a basis, you can pull the coefficients out of the brackets:

[X,[Y,Z]]=[X_iE_i,[Y_jE_j,Z_kE_k]]=X_iY_jZ_k[E_i,[E_j,E_k]]

If we have already proved that the Jacobi identity holds when only basis vectors are involved, we can rewrite this as

=X_iY_jZ_k(-[E_j,[E_k,E_i]]-[E_k,[E_i,E_j]])=-[Y,[Z,X]]-[Z,[X,Y]]

So if the Jacobi identity holds when only basis vectors are involved, it holds for three arbitrary vectors.

Edit: I don't think he was "pulling things out". Looks like he was using an identity that holds for commutators.
 
  • #23
Office_Shredder said:
Bi-linearity doesn't mean you can pull out elements of the Lie algebra:

[X_a X_b, X_c] \neq X_a[X_b,X_c]

No, I'm not pulling elements out of the brackets like that.

[X_a X_b, X_c] = X_a[X_b,X_c]+[X_a, X_c] X_b
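
Indeed, expanding both sides in terms of operator products confirms this identity:

X_a[X_b,X_c]+[X_a,X_c]X_b=X_aX_bX_c-X_aX_cX_b+X_aX_cX_b-X_cX_aX_b=X_aX_bX_c-X_cX_aX_b=[X_aX_b,X_c]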
 
  • #24
Referring to the initial problem, could someone thoroughly explain how to prove that this Lie algebra is a semi-simple one?
 
  • #25
russel said:
Referring to the initial problem, could someone thoroughly explain how to prove that this Lie algebra is a semi-simple one?

Use the Cartan criterion (i.e. check whether the Killing form is nondegenerate).
 
  • #26
First of all, substitute X_1=P_1, X_2=P_2, X_3=J and X_4=T.
Find the structure constants of the Lie algebra C_{ij}^k, e.g. C_{12}^4=1, C_{13}^2=-1, etc.
Then use them to construct the Cartan metric g_{ij} and check whether det(g)\neq 0.
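
If it helps, here is a minimal sketch of that computation (assuming numpy; the basis order P_1, P_2, J, T and the variable names are just my own choices):

import numpy as np

n = 4
C = np.zeros((n, n, n))               # C[i, j, k]: [X_i, X_j] = C_{ij}^k X_k
C[2, 0, 1], C[0, 2, 1] = 1.0, -1.0    # [J, P_1] =  P_2
C[2, 1, 0], C[1, 2, 0] = -1.0, 1.0    # [J, P_2] = -P_1
C[0, 1, 3], C[1, 0, 3] = 1.0, -1.0    # [P_1, P_2] = T

# Adjoint matrices: (ad X_i)_{kj} = C_{ij}^k
ad = np.array([C[i].T for i in range(n)])

# Cartan (Killing) metric g_{ij} = C_{ik}^l C_{jl}^k = tr(ad X_i ad X_j)
g = np.array([[np.trace(ad[i] @ ad[j]) for j in range(n)] for i in range(n)])

print(g)
print(np.linalg.det(g))   # by Cartan's criterion, semisimple iff this is nonzero

The determinant answers the semisimplicity question directly.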

P.S. SEMFE?
 
  • #27
russel said:
Also show that
e^{uJ}P_{i}e^{-uJ}=P_{i} \cos{u}+ \epsilon_{ij}P_{j} \sin{u}

For this one, define:

f_1(u)=e^{uJ}P_{1}e^{-uJ}

Take the Maclaurin series expansion of f_1(u):

f_1(u)=P_1+[J,P_1]u+\frac{1}{2!}[J,[J,P_1]]u^2+\cdots

f_1(u)=P_1+P_2 u-\frac{1}{2!}P_1 u^2+\cdots=P_1-\frac{1}{2!}P_1 u^2+\cdots+P_2 u-\frac{1}{3!}P_2 u^3+\cdots

f_1(u)=P_1 \cos u+P_2\sin u

Do the same for f_2(u).
 
