3 questions about matrix Lie groups

Bobhawke
1. The exponential map is a map from the Lie algebra to a matrix representation of the group. For abelian groups, the group operation of matrix multiplication for the matrix rep clearly corresponds to the operation of addition in the Lie algebra:

$$\sum_a \Lambda_a t_a \rightarrow \exp\left(\sum_a \Lambda_a T_a\right)$$
$$\sum_b \Lambda'_b t_b \rightarrow \exp\left(\sum_b \Lambda'_b T_b\right)$$

$$\sum_a \Lambda_a t_a + \sum_b \Lambda'_b t_b \rightarrow \exp\left(\sum_a \Lambda_a T_a + \sum_b \Lambda'_b T_b\right) = \exp\left(\sum_a \Lambda_a T_a\right)\exp\left(\sum_b \Lambda'_b T_b\right)$$

where ##t_a##, ##t_b## are elements of the Lie algebra and ##T_a##, ##T_b## are their matrix representations.

However, for non-abelian groups this doesn't work, because combining exponentials of non-commuting matrices requires the Baker-Campbell-Hausdorff (BCH) formula. The last equality is not true:


$$\exp\left(\sum_a \Lambda_a T_a + \sum_b \Lambda'_b T_b\right) \neq \exp\left(\sum_a \Lambda_a T_a\right)\exp\left(\sum_b \Lambda'_b T_b\right)$$

Addition cannot be the operation on the Lie algebra that corresponds to matrix multiplication for non-abelian groups. My question then is: what is?
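For concreteness, here is a minimal numerical sketch of exactly this failure (assuming Python with numpy and scipy.linalg.expm; the particular matrices, ##i## times Pauli matrices, are just an illustrative choice):

```python
import numpy as np
from scipy.linalg import expm

# Commuting case: for diagonal matrices, exp(A + B) = exp(A) exp(B) holds.
A = np.diag([0.3, -0.1])
B = np.diag([0.5, 0.2])
print(np.allclose(expm(A + B), expm(A) @ expm(B)))  # True

# Non-commuting case: i*sigma_x and i*sigma_y in su(2); the identity fails,
# and the missing correction terms are exactly what the BCH formula supplies.
X = 1j * np.array([[0, 1], [1, 0]], dtype=complex)
Y = 1j * np.array([[0, -1j], [1j, 0]], dtype=complex)
print(np.allclose(expm(X + Y), expm(X) @ expm(Y)))  # False
```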



2. The tensor product of two groups is defined by

$$G \otimes H = \{\, (g,h) \mid g \in G,\ h \in H \,\}$$

The tensor product of two matrices is defined by
$$A \otimes B = \begin{pmatrix} a_{11}B & a_{12}B \\ a_{21}B & a_{22}B \end{pmatrix}$$

My question is: why does the tensor product of two matrix reps of two groups ##A## and ##B## form a rep of the group ##A \times B##? It is not obvious to me that this should be so. If anyone knows where I could find a proof of this statement, it would be greatly appreciated. Also, does this hold only for Lie groups, or for any groups that have a matrix representation?
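For what it's worth, the key algebraic fact behind the answer is the mixed-product property of the Kronecker product, ##(A \otimes B)(C \otimes D) = (AC) \otimes (BD)##. A minimal numerical sketch (assuming Python with numpy; the random unitaries and the helper `random_unitary` are just illustrative stand-ins for arbitrary group elements):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    # Random n x n unitary via QR; a stand-in for an arbitrary group element.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    return q

g1, g2 = random_unitary(3), random_unitary(3)  # elements of G
h1, h2 = random_unitary(2), random_unitary(2)  # elements of H

# Mixed-product property: (g1 ⊗ h1)(g2 ⊗ h2) = (g1 g2) ⊗ (h1 h2),
# i.e. (g, h) -> g ⊗ h is a homomorphism from G x H into GL(V ⊗ W).
lhs = np.kron(g1, h1) @ np.kron(g2, h2)
rhs = np.kron(g1 @ g2, h1 @ h2)
print(np.allclose(lhs, rhs))  # True

# Action on simple tensors: (g ⊗ h)(v ⊗ w) = (g v) ⊗ (h w).
v = rng.standard_normal(3)
w = rng.standard_normal(2)
print(np.allclose(np.kron(g1, h1) @ np.kron(v, w), np.kron(g1 @ v, h1 @ w)))  # True
```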




3. This is related to the last question. In physics you often see equations stating how tensor products of matrix reps of groups decompose into irreducible representations, e.g. ##3 \otimes \bar{3} = 8 \oplus 1##, where ##3## and ##\bar{3}## denote 3-dimensional matrix reps of SU(3), and ##8## and ##1## are irreducible reps. But it seems to me that if you wanted to form a matrix rep of, for example, SU(3)xSU(3), the easiest thing to do would be to make a block-diagonal 6x6 matrix with a copy of SU(3) in each of the blocks. Such an object would respect the group structure. And this isn't the only option: you could easily make a rep of SU(3)xSU(3) with as many dimensions as you want. My question is, when we write down a matrix rep of a tensor product of two groups, why is its dimension always the product of the dimensions of the matrix reps of the groups we multiplied together? Why not choose some other number of dimensions?
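For concreteness, here is a small numerical sketch of the two constructions side by side (assuming Python with numpy/scipy; random unitaries stand in for SU(3) elements, and the helper names are just illustrative). The block-diagonal construction is the direct sum, a 6-dimensional rep of the product group, while the Kronecker product gives a 9-dimensional one; both respect the group law, they are simply different reps:

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)

def random_unitary(n):
    # Random n x n unitary via QR; a stand-in for an SU(n)-like element.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    return q

a1, a2 = random_unitary(3), random_unitary(3)  # elements of the first factor
b1, b2 = random_unitary(3), random_unitary(3)  # elements of the second factor

def direct_sum_rep(a, b):
    # 6-dimensional rep of the product group: block diagonal (direct sum).
    return block_diag(a, b)

def tensor_rep(a, b):
    # 9-dimensional rep of the product group: Kronecker (tensor) product.
    return np.kron(a, b)

# Both respect the group law (a1, b1)(a2, b2) = (a1 a2, b1 b2):
print(np.allclose(direct_sum_rep(a1, b1) @ direct_sum_rep(a2, b2),
                  direct_sum_rep(a1 @ a2, b1 @ b2)))  # True, dimension 3 + 3 = 6
print(np.allclose(tensor_rep(a1, b1) @ tensor_rep(a2, b2),
                  tensor_rep(a1 @ a2, b1 @ b2)))      # True, dimension 3 * 3 = 9
```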
 
1. Your notion of "correspond" here is not the appropriate one. You should instead be thinking in terms of differential geometry: the Lie algebra ##\mathfrak{g}## of a Lie group ##G## is the tangent space to ##G## at the identity; the differential of the multiplication map ##G \times G \to G## is the addition map ##\mathfrak{g} \times \mathfrak{g} \to \mathfrak{g}##. It's in this sense that addition corresponds to multiplication (see the finite-difference sketch after this reply).

2. Actually, you generally don't take the tensor product of two (nonabelian) groups. What you defined is the direct product (or direct sum) of ##G## and ##H##. I suspect what you're trying to ask is: "if ##V## and ##W## are representations of two Lie groups ##G## and ##H##, then why (or, more precisely, how) is ##V \otimes W## a representation of ##G \times H##?" It is a representation in a very natural way: ##(g,h) \cdot (v \otimes w) = gv \otimes hw## for ##g## in ##G##, ##h## in ##H##, ##v## in ##V## and ##w## in ##W##. The first thing we ought to do is check that what we wrote down makes sense: since ##v## is in ##V## and ##V## is a rep of ##G##, ##gv## is in ##V##; similarly ##hw## is in ##W##. So the expression ##gv \otimes hw## makes sense. I'll leave it to you to check that this definition satisfies all the axioms of a representation.

3. I don't really understand what you're asking. If you decompose a vector space ##V## into a direct sum of subspaces, then the dimensions of these subspaces had better add up to the dimension of ##V##!
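Regarding point 1, a minimal finite-difference sketch (assuming Python with numpy/scipy; the su(2) generators and the step size are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

# c(t) = exp(tX) exp(tY) is a curve through the identity of the group;
# its derivative at t = 0 is X + Y, illustrating that the differential of
# group multiplication at the identity is addition on the Lie algebra.
X = 1j * np.array([[0, 1], [1, 0]], dtype=complex)     # i * sigma_x in su(2)
Y = 1j * np.array([[0, -1j], [1j, 0]], dtype=complex)  # i * sigma_y in su(2)

t = 1e-6
derivative = (expm(t * X) @ expm(t * Y) - np.eye(2)) / t
print(np.allclose(derivative, X + Y, atol=1e-5))  # True
```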
 
Ah, it took me a while to figure it all out, but I understand now. Sorry for my imprecise language, and thank you very much for taking the time to reply, morphism.
 