# Commutator between Casimirs and generators for Lorentz group

1. Apr 7, 2016

### julian

The generators $\{ L^1, L^2 , L^3 , K^1 , K^2 , K^3 \}$ of the Lorentz group satisfy the Lie algebra:

\begin{array}{l}
[L^i , L^j] = \epsilon^{ij}_{\;\; k} L^k \\
[L^i , K^j] = \epsilon^{ij}_{\;\; k} K^k \\
[K^i , K^j] = \epsilon^{ij}_{\;\; k} L^k
\end{array}

It has the Casimirs

$$C_1 = \sum_i (K^i K^i - L^i L^i) , \qquad C_2 = \sum_i K^i L^i$$

I wish to prove that the Casimirs commute with all of the generators of the Lie algebra. It is easy to prove that $[C_1 , L^j] = 0$, $[C_2 , L^j] = 0$ and $[C_2 , K^j] = 0$. However, I'm having more trouble proving $[C_1 , K^j] = 0$. What I obtain, for example for $j=1$, is:

$$[C_1 , K^1] = 2 [L^2 , K^3]_+ - 2 [L^3 , K^2]_+$$

where $[\cdot , \cdot]_+$ is the anti-commutator. I'm not sure how to prove directly that this vanishes. However, there may be an indirect way of proving $[C_1 , K^1] = 0$. First note:

\begin{array}{l}
[C_1 , C_2] = \sum_i [K^i K^i - L^i L^i , C_2] \\
= \sum_i (K^i [K^i , C_2] + [K^i , C_2] K^i - L^i [L^i , C_2] - [L^i , C_2] L^i) \\
= 0
\end{array}

where we have used $[C_2 , L^j] = 0$ and $[C_2 , K^j] = 0$. Next write

\begin{array}{l}
0 = [C_1 , C_2] \\
= \sum_i [C_1 , K^i L^i] \\
= \sum_i ([C_1 , K^i] L^i + K^i [C_1 , L^i]) \\
= \sum_i [C_1 , K^i] L^i .
\end{array}

where we have used $[C_1 , L^j] = 0$.

Is it possible to use this to prove $[C_1 , K^1] = 0$?

I would prefer to first prove that the Casimirs commute with all the generators and then conclude that the two Casimirs commute, but if this is what I have to resort to...
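As a numerical sanity check on the anticommutator expression above, the brackets exactly as written can be realized by explicit $4 \times 4$ matrices; this is just one concrete, hand-picked realization for checking the algebraic manipulation, not the unitary representation itself:

```python
import numpy as np

# Levi-Civita symbol
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

# One concrete 4x4 matrix realization of the brackets as written above
# (a hypothetical choice made only to test the algebra)
L = [np.zeros((4, 4)) for _ in range(3)]
K = [np.zeros((4, 4)) for _ in range(3)]
for i in range(3):
    L[i][1:, 1:] = -eps[i]                        # (L^i)_{jk} = -eps_{ijk} on the spatial block
    K[i][0, i + 1], K[i][i + 1, 0] = 1.0, -1.0    # antisymmetric "boost" matrices

comm = lambda A, B: A @ B - B @ A
acomm = lambda A, B: A @ B + B @ A

# verify the three stated brackets hold for these matrices
for i in range(3):
    for j in range(3):
        LT = sum(eps[i, j, k] * L[k] for k in range(3))
        KT = sum(eps[i, j, k] * K[k] for k in range(3))
        assert np.allclose(comm(L[i], L[j]), LT)
        assert np.allclose(comm(L[i], K[j]), KT)
        assert np.allclose(comm(K[i], K[j]), LT)

# verify [C_1, K^1] = 2 [L^2, K^3]_+ - 2 [L^3, K^2]_+  (indices shifted to 0-based)
C1 = sum(K[i] @ K[i] - L[i] @ L[i] for i in range(3))
assert np.allclose(comm(C1, K[0]), 2 * acomm(L[1], K[2]) - 2 * acomm(L[2], K[1]))
```

Since the anticommutator expression was derived purely from the brackets, it must hold in any realization of them, and this check confirms the manipulation.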

Last edited: Apr 7, 2016
2. Apr 12, 2016

### JorisL

The indirect method certainly isn't wrong; you basically prove that $C_1$ commutes with the $K^i$ and $C_2$ at the same time.
From the final expression you know that $\sum_i[C_1,K^i]L^i=0$, but the $L^i$ are linearly independent, so the commutators all vanish.

I haven't looked at it in detail but maybe you can use the Jacobi identity in some way to directly prove that $[C_1,K^j]=0$.

3. Apr 12, 2016

### julian

Thanks JorisL, I was hoping to try and make some linear-independence type argument. Now, it would be easy if the "coefficients" in front of the $L^i$'s were numbers: then we would have an equation like

$\alpha_1 L^1 + \alpha_2 L^2 + \alpha_3 L^3 = 0$

then I could do

\begin{array}{l}
0 = \alpha_1 [L^1 , L^2] + \alpha_2 [L^2 , L^2] + \alpha_3 [L^3 , L^2] \\
= \alpha_1 L^3 - \alpha_3 L^1
\end{array}

and then do

\begin{array}{l}
0 = \alpha_1 [L^3 , L^1] - \alpha_3 [L^1 , L^1] \\
= \alpha_1 L^2
\end{array}

implying $\alpha_1 = 0$.
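For scalar coefficients the same conclusion can also be seen numerically: the three rotation matrices are linearly independent, so $\alpha_1 L^1 + \alpha_2 L^2 + \alpha_3 L^3 = 0$ forces $\alpha_i = 0$. A quick check with the defining $3 \times 3$ matrices $(L^i)_{jk} = -\epsilon_{ijk}$ (a standard realization of $[L^i,L^j]=\epsilon^{ij}_{\;\;k}L^k$, used here just for illustration):

```python
import numpy as np

# Levi-Civita symbol
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

L = [-eps[i] for i in range(3)]  # (L^i)_{jk} = -eps_{ijk}

# Stack the flattened matrices; full rank means the only scalar solution of
# alpha_1 L^1 + alpha_2 L^2 + alpha_3 L^3 = 0 is alpha_i = 0.
M = np.vstack([Li.ravel() for Li in L])
assert np.linalg.matrix_rank(M) == 3
```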

However, an issue in the sum $\sum_i[C_1,K^i]L^i=0$ is that the "coefficients" aren't numbers but combinations of generators. I'm not sure how to proceed in the above way.

I have also been trying to use the Jacobi identity to prove the result (i.e. $[C_1,K^1]=0$) directly. I think the relevant identities to consider would be

\begin{array}{l}
[[L^i,L^j],K^k] + [[L^j,K^k],L^i] + [[K^k,L^i],L^j] = 0 \\
[[K^i,K^j],K^k] + [[K^j,K^k],K^i] + [[K^k,K^i],K^j] = 0
\end{array}

But the only interesting result I seem to be getting from this is the identity obtained by setting $k=2,i=1,j=2$ in the first of these Jacobi identities; it gives

$L^3 K^2 - K^2 L^3 - K^3 L^2 + L^2 K^3 = 0$

(actually this identity follows easily from $[K^2 , L^3] = - \epsilon^{32}_{\;\; 1} K^1 = \epsilon^{23}_{\;\; 1} K^1 = - [K^3 , L^2]$). Using this identity allows the simplification:

\begin{array}{l}
[C_1 , K^1] = 2 [L^2 , K^3]_+ - 2 [L^3 , K^2]_+ \\
= 4 (K^3 L^2 - L^3 K^2)
\end{array}

but I'm not sure how to proceed from here.
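Both the Jacobi-identity consequence and the simplification above can be checked numerically in a concrete matrix realization of the brackets as written (a hand-picked $4 \times 4$ realization, used only to test the algebraic steps):

```python
import numpy as np

# Levi-Civita symbol
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

# One concrete 4x4 realization of the brackets as written in this thread
L = [np.zeros((4, 4)) for _ in range(3)]
K = [np.zeros((4, 4)) for _ in range(3)]
for i in range(3):
    L[i][1:, 1:] = -eps[i]                      # (L^i)_{jk} = -eps_{ijk} on the spatial block
    K[i][0, i + 1], K[i][i + 1, 0] = 1.0, -1.0  # antisymmetric "boost" matrices

comm = lambda A, B: A @ B - B @ A

# the Jacobi-identity consequence: L^3 K^2 - K^2 L^3 - K^3 L^2 + L^2 K^3 = 0
assert np.allclose(L[2] @ K[1] - K[1] @ L[2] - K[2] @ L[1] + L[1] @ K[2], 0)

# and the simplification [C_1, K^1] = 4 (K^3 L^2 - L^3 K^2)
C1 = sum(K[i] @ K[i] - L[i] @ L[i] for i in range(3))
assert np.allclose(comm(C1, K[0]), 4 * (K[2] @ L[1] - L[2] @ K[1]))
```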

Last edited: Apr 12, 2016
4. Apr 12, 2016

### JorisL

In this case I would think of an algebra as a vector space (addition) equipped with a bilinear multiplication.
If I'm not mistaken you could think of $\sum_i[C_1,K^i]L^i$ as an "algebra-valued vector", that is, a vector whose components are elements of the algebra itself.
If that's correct, the conclusion follows immediately.

Hailing @Samy_A, @lavinia and @micromass to ensure I'm not giving any wrong information. (I hope not because I used this kind of argument before)

5. Apr 13, 2016

### Samy_A

In general, that doesn't seem to work.
Let's take the algebra of $2 \times 2$ real matrices.
The matrices $A=\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ and $B=\begin{pmatrix} 0& 1 \\ 1& 0 \end{pmatrix}$ are linearly independent, yet it is easy to find non-zero matrices $C, D$ such that $CA+DB=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$.
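A tiny numerical illustration of this point, with $C$ and $D$ chosen by hand:

```python
import numpy as np

A = np.eye(2)                        # identity
B = np.array([[0., 1.], [1., 0.]])   # A and B are linearly independent
D = np.array([[1., 0.], [0., 0.]])   # non-zero
C = -D @ B                           # non-zero: [[0, -1], [0, 0]]

# A, B independent as vectors, yet C A + D B = 0 with C, D both non-zero
assert np.linalg.matrix_rank(np.vstack([A.ravel(), B.ravel()])) == 2
assert np.allclose(C @ A + D @ B, 0)
```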

6. Apr 15, 2016

### julian

By the way, these are the Casimirs for an infinite-dimensional unitary representation of the Lorentz group (in particular the unitary irreducible representations of the principal series, if that means anything to anyone). Unitarity here implies that the generators are anti-hermitian:

$$(L^i)^\dagger = - L^i , \qquad (K^i)^\dagger = - K^i .$$

This doesn't seem to help as it implies $[C_1 , K^1]^\dagger = [C_1 , K^1]$ and $\big( 2 [L^2,K^3]_+ - 2 [L^3 , K^2]_+ \big)^\dagger = 2 [L^2,K^3]_+ - 2 [L^3 , K^2]_+$.

Could it have to do with the particular matrix representation itself? I'm guessing the representation itself could be constructed starting from the fact that the Casimirs commute with each other, commute with all the $L^i$'s, and that $\sum_i L^i L^i$ and $L^3$ commute with each other - and using the fact that commuting operators have simultaneous eigenstates.

Let me explain how I understand Lie algebra matrix representations.

A representation of a Lie algebra is defined as a mapping of the algebra to linear operators on a vector space, i.e. operators (matrices) $\hat{D} (\hat{L}_i)$ are assigned to the elements of the Lie algebra $\hat{L}_i$ (generators of the Lie group).

These operators have to satisfy linearity,

$$\hat{D} (\alpha \hat{L}_i + \beta \hat{L}_j) = \alpha \hat{D} (\hat{L}_i) + \beta \hat{D} (\hat{L}_j) \qquad Eq. 1$$

and must be homomorphic to the Lie algebra

$$\hat{D} ([\hat{L}_i , \hat{L}_j]) = [\hat{D} (\hat{L}_i) , \hat{D} (\hat{L}_j)] \qquad Eq. 2$$

In general, a representation in a vector space with basis $\{ |\phi_k> \}$ is obtained by assigning a matrix to every operator $\hat{L}_i$ by means of

$$\hat{L}_i |\phi_n> = \sum_m |\phi_m> D (\hat{L}_i)_{mn} .$$

From this it follows that

\begin{array}{l}
\hat{L}_i \hat{L}_j |\phi_n> = \hat{L}_i \sum_m |\phi_m> D (\hat{L}_j)_{mn} \\
= \sum_m \big( \hat{L}_i |\phi_m> \big) D (\hat{L}_j)_{mn} \\
= \sum_m \big( \sum_p |\phi_p> D (\hat{L}_i)_{pm} \big) D (\hat{L}_j)_{mn} \\
= \sum_p |\phi_p> \Big( \sum_m D (\hat{L}_i)_{pm} D (\hat{L}_j)_{mn} \Big) \\
= \sum_p |\phi_p> D (\hat{L}_i \hat{L}_j)_{pn}
\end{array}

From which it follows that

$$\sum_m D (\hat{L}_i)_{pm} D (\hat{L}_j)_{mn} = D (\hat{L}_i \hat{L}_j)_{pn} \qquad Eq. 3$$

Hence, the matrix obtained by simple matrix multiplication of $D (\hat{L}_i)$ and $D (\hat{L}_j)$ is equal to the matrix $D (\hat{L}_i \hat{L}_j)$, assigned to the operator $\hat{L}_i \hat{L}_j$. If the basis is orthonormalised then the matrix representation is given directly by the scalar product

\begin{array}{l}
<\phi_m| \hat{L}_i |\phi_n> = \sum_p <\phi_m | \phi_p> D (\hat{L}_i)_{pn} \\
= D (\hat{L}_i)_{mn} .
\end{array}

Eq. 1 and Eq. 2 are automatically satisfied:

\begin{array}{l}
D (\alpha \hat{L}_i + \beta \hat{L}_j)_{mn} = <\phi_m | \alpha \hat{L}_i + \beta \hat{L}_j |\phi_n> \\
= \alpha <\phi_m | \hat{L}_i |\phi_n> + \beta <\phi_m| \hat{L}_j |\phi_n> \\
= \alpha D (\hat{L}_i)_{mn} + \beta D (\hat{L}_j)_{mn}
\end{array}

which is Eq. 1. Using this linearity we obtain:

\begin{array}{l}
D ([\hat{L}_i , \hat{L}_j]) = D (\hat{L}_i \hat{L}_j - \hat{L}_j \hat{L}_i) \\
= D (\hat{L}_i \hat{L}_j) - D (\hat{L}_j \hat{L}_i) \\
\end{array}

and using Eq 3,

\begin{array}{l}
D ([\hat{L}_i , \hat{L}_j]) = D (\hat{L}_i) D (\hat{L}_j) - D (\hat{L}_j) D (\hat{L}_i) \\
= [ D(\hat{L}_i) , D(\hat{L}_j)] .
\end{array}

Thereby obtaining a representation.
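To make Eq. 2 and the anti-hermiticity requirement concrete, here is a small check using the spin-1/2 matrices $D(L^i) = -\tfrac{i}{2}\sigma_i$, which realize the rotation subalgebra $[L^i , L^j] = \epsilon^{ij}_{\;\; k} L^k$. This is just an illustrative finite-dimensional example, not the principal-series representation discussed above:

```python
import numpy as np

# Pauli matrices
s = [np.array([[0, 1], [1, 0]], dtype=complex),
     np.array([[0, -1j], [1j, 0]]),
     np.array([[1, 0], [0, -1]], dtype=complex)]

# anti-hermitian spin-1/2 representation: D(L^i) = -(i/2) sigma_i
D = [-0.5j * si for si in s]

# Levi-Civita symbol
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

comm = lambda A, B: A @ B - B @ A
for i in range(3):
    for j in range(3):
        target = sum(eps[i, j, k] * D[k] for k in range(3))
        assert np.allclose(comm(D[i], D[j]), target)  # Eq. 2 for these brackets
        assert np.allclose(D[i].conj().T, -D[i])      # anti-hermitian generators
```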

Any thoughts on this?

Last edited: Apr 15, 2016