
Products of representations

  1. Mar 11, 2009 #1
    Hi,

    I have a problem. Consider the representation of SU(2) which maps every [tex] U \in SU(2)[/tex] into itself, i.e. [tex] U \mapsto U [/tex], and the vector space is given by [tex] \mathbb{C}^{2} [/tex] with the basis vectors [tex] e_{1} = (1,0) [/tex] and [tex]e_{2} = (0,1) [/tex]

    How do I show that the tensor product (Kronecker) of the representation with itself on [tex] V \otimes V [/tex] is reducible?

    Unfortunately, I don't know how to do that. Does anyone have an idea?
     
  2. Mar 11, 2009 #2

    matt grime

    Science Advisor
    Homework Helper

    By showing it is not simple, which is, um, simple...
     
  3. Mar 11, 2009 #3
    I forgot to write that I need to know the invariant subspaces explicitly. And I have only had (very) basic representation theory, so I don't know very much about it.

    I thought that I have to find a similarity transformation that puts every product matrix into block-diagonal form. But I absolutely don't know how to find such a transformation. Maybe it has something to do with the Pauli matrices, which are the generators of the su(2) algebra. And I know that every matrix in SU(2) can be expressed as [tex] \exp(i \vec{\sigma} \cdot \vec{\alpha}/2) [/tex], where [tex] \vec{\alpha} = (\alpha_{1}, \alpha_{2}, \alpha_{3}) [/tex] is a set of parameters. But I don't think that this is useful for solving the problem.
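    For what it's worth, that exponential form is easy to play with numerically. Here is a small sketch (assuming numpy and scipy are available; the parameter values are arbitrary) that builds an SU(2) element from the Pauli matrices and checks that it is unitary with determinant 1:

[code]
import numpy as np
from scipy.linalg import expm

# Pauli matrices, the generators of su(2)
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]

alpha = np.array([0.3, -1.2, 0.7])   # arbitrary real parameters

# U = exp(i * alpha . sigma / 2)
U = expm(1j * sum(a * s for a, s in zip(alpha, sigma)) / 2)

print(np.allclose(U.conj().T @ U, np.eye(2)))   # unitary: True
print(np.isclose(np.linalg.det(U), 1.0))        # det U = 1: True
[/code]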
     
  4. Mar 11, 2009 #4

    matt grime

    Science Advisor
    Homework Helper

    Write down an invariant subspace and appeal to complete reducibility. Just play with it and think for a while.
     
  5. Mar 11, 2009 #5
    Ohh, I first used the wrong definition of the product of representations, which is why I was completely unable to find an invariant subspace.

    Ok, now I've found one: [tex] W = span \left( e_{1} \otimes e_{2} - e_{2} \otimes e_{1} \right) = span \left( \begin{pmatrix} 0 \\ 1 \\ -1 \\ 0 \end{pmatrix} \right) [/tex].

    Ok, but now I don't know what to do next. The complementary space seems not to be invariant, so I don't know how to proceed.
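    That W really is invariant, and it can be checked numerically. A sketch (assuming numpy; generating a random SU(2) element via QR is just one convenient choice):

[code]
import numpy as np

rng = np.random.default_rng(0)

def random_su2():
    # random 2x2 unitary via QR of a complex Gaussian matrix,
    # rescaled so that its determinant is exactly 1
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, _ = np.linalg.qr(z)
    return q / np.sqrt(np.linalg.det(q))

w = np.array([0, 1, -1, 0], dtype=complex)   # e1⊗e2 - e2⊗e1

for _ in range(5):
    U = random_su2()
    # (U⊗U)w = det(U)·w = w, so span(w) is invariant
    print(np.allclose(np.kron(U, U) @ w, w))   # True each time
[/code]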
     
  6. Mar 12, 2009 #6

    matt grime

    Science Advisor
    Homework Helper

    "The" complementary space? Any non-trivial proper subspace of a vector space has infinitely many complements. Why do you need to find one explicitly? You know the theorem of complete reducibility holds, right? If not, try again at playing around with things. What you've written down is called many things, such as the second exterior power or the antisymmetric part of the tensor space. What might the symmetric part be?
     
  7. Mar 12, 2009 #7
    I don't know the theorem of complete reducibility. But I hope I can solve it without it :-)

    I think the symmetric part would be:

    [tex] span \left( e_{1} \otimes e_{2} + e_{2} \otimes e_{1} \right) = span \left( \begin{pmatrix} 0 \\ 1 \\ 1 \\ 0 \end{pmatrix} \right) [/tex].

    I just thought that if W is an invariant subspace then the orthogonal complement
    [tex] W^{\bot} = span \left( \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 1 \\ 0 \end{pmatrix} \right) [/tex]
    is invariant too, but here it seems not to be. And I don't see how to find other invariant subspaces explicitly.

    Maybe my approach isn't good. I took a matrix from SU(2) and wrote it as follows:

    [tex] U = \begin{pmatrix} a & b \\ - b^* & a^* \end{pmatrix} [/tex]

    Then I calculated the product [tex] U \otimes U [/tex] and obtained a 4x4 matrix. There I saw that [tex] e_{1} \otimes e_{2} - e_{2} \otimes e_{1} [/tex] is an eigenvector of this matrix. But now I've got stuck and don't know how to go on and "play around" with things.
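    A small numerical check may help here (a sketch, assuming numpy; a and b are arbitrary values with |a|^2 + |b|^2 = 1): the single line spanned by e1⊗e2 + e2⊗e1 is not preserved on its own, but the three-dimensional space spanned by e1⊗e1, e2⊗e2 and e1⊗e2 + e2⊗e1 is, and e1⊗e2 - e2⊗e1 is indeed an eigenvector with eigenvalue det(U) = 1.

[code]
import numpy as np

a, b = 0.6 + 0.0j, 0.8j                          # arbitrary, |a|^2 + |b|^2 = 1
U = np.array([[a, b], [-np.conj(b), np.conj(a)]])
UU = np.kron(U, U)

sym_plus = np.array([0, 1, 1, 0], dtype=complex)    # e1⊗e2 + e2⊗e1
antisym  = np.array([0, 1, -1, 0], dtype=complex)   # e1⊗e2 - e2⊗e1
e11, e22 = np.eye(4, dtype=complex)[[0, 3]]         # e1⊗e1, e2⊗e2

# the antisymmetric vector is an eigenvector, with eigenvalue det(U) = 1
print(np.allclose(UU @ antisym, antisym))           # True

# the line through e1⊗e2 + e2⊗e1 alone is NOT invariant ...
image = UU @ sym_plus
print(np.linalg.matrix_rank(np.column_stack([sym_plus, image])))   # 2: the image leaves the line

# ... but the 3-dim space spanned by e1⊗e1, e2⊗e2 and e1⊗e2 + e2⊗e1 is invariant
S = np.column_stack([e11, e22, sym_plus])
for v in S.T:
    aug = np.column_stack([S, UU @ v])
    print(np.linalg.matrix_rank(aug))   # stays 3: UU @ v lies in the span of S
[/code]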
     
  8. Mar 12, 2009 #8

    matt grime

    Science Advisor
    Homework Helper

    You have found part of the symmetric component. Now try to find some more things.

    Why are you stuck? Why can't you just think what happens when you apply group elements to some elementary vectors? What happens if you apply a generic element of SU_2 to, say, v\otimes v for any vector v?


    Just as a vector subspace does not have a unique complement, it does not have a canonical orthogonal complement. Orthogonality is not an inherent characteristic of the vector space; it is the product of a choice of inner product, and there are infinitely many of those. If your inner product were SU_2 invariant then you would have something interesting...
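    As it happens, the standard inner product on C^2 (and the induced one on C^2 ⊗ C^2) is SU(2)-invariant, because the matrices are unitary, so orthogonal complements of invariant subspaces are again invariant here. A quick check of that invariance (a sketch, assuming numpy; the vectors are arbitrary):

[code]
import numpy as np

a, b = 0.6, 0.8j                                   # arbitrary with |a|^2 + |b|^2 = 1
U = np.array([[a, b], [-np.conj(b), np.conj(a)]])
x = np.array([1.0, 2.0j])
y = np.array([3.0j, -1.0])

# <Ux, Uy> = <x, y> because U†U = I  (np.vdot conjugates its first argument)
print(np.allclose(np.vdot(U @ x, U @ y), np.vdot(x, y)))   # True
[/code]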
     
  9. Mar 12, 2009 #9
    Ok, I took a general matrix of SU(2):

    [tex] U = \begin{pmatrix} a & b \\ - b^* & a^* \end{pmatrix} [/tex]

    and calculated [tex] U \otimes U [/tex] which is:

    [tex] \begin{pmatrix} a^{2} & ab & ba & b^{2} \\ - ab^* & \vert a \vert^{2} & - \vert b \vert^{2} & ba^* \\ - b^* a & - \vert b \vert^{2} & \vert a \vert^{2} & a^* b \\ (b^*)^{2} & -b^* a^* & - a^* b^* & (a^*)^{2} \end{pmatrix} [/tex].

    And I applied it to some vectors, for example [tex] e_{1} \otimes e_{1}, e_{1} \otimes e_{2}, ... [/tex], but I don't "see" anything special.

    Now I took a general vector [tex] v = \begin{pmatrix} x \\ y \end{pmatrix} [/tex] and calculated: [tex] v \otimes v = \begin{pmatrix} x^{2} \\ xy \\ xy \\ y^{2} \end{pmatrix} [/tex]

    Now I apply [tex] U \otimes U [/tex] to [tex] v \otimes v [/tex] with the U defined above.

    [tex] (U \otimes U) (v \otimes v) = \begin{pmatrix} a^{2} x^{2} + 2 abxy + b^{2} y^{2} \\ - ab^* \, x^{2} + xy ( \vert a \vert^{2} - \vert b \vert^{2} ) + b a^* \, y^{2} \\ - ab^* \, x^{2} + xy ( \vert a \vert^{2} - \vert b \vert^{2} ) + b a^* \, y^{2} \\ (b^*)^{2} x^{2} - 2 a^*b^* \, xy + (a^*)^{2} y^{2} \end{pmatrix} [/tex].

    But what should I do with that???
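    For what it's worth, that long expression is just (Uv)⊗(Uv) written out component by component, i.e. (U⊗U)(v⊗v) = (Uv)⊗(Uv): a "diagonal" vector is mapped to another "diagonal" vector. A numerical check (a sketch, assuming numpy; the numbers are arbitrary):

[code]
import numpy as np

a, b = 0.6, 0.8j                                   # arbitrary with |a|^2 + |b|^2 = 1
U = np.array([[a, b], [-np.conj(b), np.conj(a)]])
v = np.array([1.0 + 2.0j, -0.5j])                  # arbitrary vector in C^2

lhs = np.kron(U, U) @ np.kron(v, v)   # (U⊗U)(v⊗v)
rhs = np.kron(U @ v, U @ v)           # (Uv)⊗(Uv)
print(np.allclose(lhs, rhs))          # True: the mixed-product property of ⊗
[/code]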
     
  10. Mar 12, 2009 #10

    matt grime

    Science Advisor
    Homework Helper

    Boy, that's complicated.

    I'm just going to use (v,v) rather than tensor, cos I can't be bothered with latex for this one.

    What is g(v,v)? It is (gv,gv). You don't see anything? What if I write w = gv, so that I have (w,w)? I've taken something "diagonal" and gotten something "diagonal". What's the smallest subspace that contains the pairs (v,v)? What's a basis for it?

    Have you looked for references to symmetric powers anywhere?
     
  11. Mar 12, 2009 #11
    Sorry, but I'm quite confused. What do you mean when you say that you have taken something "diagonal" and gotten something "diagonal"?

    Maybe I'm too silly to understand that, but I still don't "see" anything. Could you please be more explicit? Maybe then I'll understand it better (at least I hope so).
     
  12. Mar 12, 2009 #12

    matt grime

    Science Advisor
    Homework Helper

    I can't be more specific: something of the form v\otimes v is "diagonal". What is the smallest vector subspace containing this clearly SU_2 invariant set?
     
  13. Mar 12, 2009 #13
    If I take a vector v, let's say

    [tex] v = \begin{pmatrix} a \\ b \end{pmatrix} [/tex]

    and I consider the product:

    [tex] v \otimes v = \begin{pmatrix} a^{2} \\ ab \\ ab \\ b^{2} \end{pmatrix}[/tex]

    I only see that there are two entries which are identical, namely ab. The first and last entries are the squares of a and b, respectively. But I don't understand what is "diagonal" here or what the smallest vector space is (maybe [tex] V \otimes V = \mathbb{C}^{4}[/tex]?).
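    One concrete way to check the "smallest subspace" question: stack a few vectors of the form v⊗v and compute the rank of their span. It comes out 3, not 4, and every v⊗v is orthogonal to e1⊗e2 - e2⊗e1, so the span is the three-dimensional symmetric part rather than all of C^4. A sketch (assuming numpy; the sample vectors are random):

[code]
import numpy as np

rng = np.random.default_rng(1)
samples = [rng.normal(size=2) + 1j * rng.normal(size=2) for _ in range(6)]
diag_vecs = np.column_stack([np.kron(v, v) for v in samples])   # columns of the form v⊗v

print(np.linalg.matrix_rank(diag_vecs))   # 3, not 4

# every v⊗v is orthogonal to e1⊗e2 - e2⊗e1, so the span of the "diagonal"
# vectors is the 3-dimensional symmetric subspace
w = np.array([0, 1, -1, 0], dtype=complex)
print(np.allclose(diag_vecs.conj().T @ w, 0))   # True
[/code]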
     
  14. Mar 12, 2009 #14

    matt grime

    Science Advisor
    Homework Helper

    Here's a different method.

    Consider the involution S(u \otimes v) = v \otimes u. Extend it to a linear map on V \otimes V. What is its minimal polynomial? What are its eigenspaces? Hint: -1 is an eigenvalue, and you've already written down the eigenspace for it.

    Equivalently, you've already found the space A such that S(a) = -a for all a in A. You're trying to find the subspace B such that S(b) = b for all b in B. And then you must show that these are subreps and that they span all of V \otimes V.
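    This route is also easy to check numerically. A sketch (assuming numpy; a and b are arbitrary with |a|^2 + |b|^2 = 1): build the swap operator S, take its -1 and +1 eigenvectors as a new basis, and watch U⊗U become block diagonal (a 1x1 block and a 3x3 block) in that basis.

[code]
import numpy as np

# swap operator S(u⊗v) = v⊗u on C^2 ⊗ C^2, in the basis e1⊗e1, e1⊗e2, e2⊗e1, e2⊗e2
S = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]], dtype=complex)

print(np.allclose(S @ S, np.eye(4)))   # S^2 = I, so the minimal polynomial divides t^2 - 1

# -1 eigenspace (antisymmetric) and +1 eigenspace (symmetric), stored as columns
antisym = np.array([[0, 1, -1, 0]], dtype=complex).T / np.sqrt(2)
sym = np.array([[1, 0, 0, 0],
                [0, 1, 1, 0],
                [0, 0, 0, 1]], dtype=complex).T
sym[:, 1] /= np.sqrt(2)

P = np.hstack([antisym, sym])   # change of basis: antisymmetric vector first, then the symmetric ones

a, b = 0.6, 0.8j                                   # arbitrary with |a|^2 + |b|^2 = 1
U = np.array([[a, b], [-np.conj(b), np.conj(a)]])

B = np.linalg.inv(P) @ np.kron(U, U) @ P
print(np.round(B, 3))   # block diagonal: a 1x1 block (equal to 1) and a 3x3 block
[/code]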
     
  15. Mar 14, 2009 #15
    I'm currently learning representation theory and Lie algebras.

    From what I have understood, the finite-dimensional representations of su(2) are labelled by their spin. Since the representation here is the defining representation of su(2), the label is spin one-half. The basis vectors can be specified by
    [tex] J^{2} \vert j\, m \rangle = j(j+1) \vert j\, m \rangle [/tex]
    [tex] J_{3} \vert j\, m \rangle = m \vert j\, m \rangle [/tex]

    Am I doing something silly here?

    The direct (tensor) product of two irreducible spin representations j1 and j2 decomposes into irreducibles of spins |j1 - j2|, ..., j1 + j2, so it is reducible whenever both spins are nonzero; in particular, spin 1/2 with spin 1/2 gives spin 0 plus spin 1.
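    That split can also be seen directly from the total angular momentum operators: on C^2 ⊗ C^2 the eigenvalues of J^2 are {0, 2, 2, 2}, i.e. j(j+1) for j = 0 once and for j = 1 three times, which is exactly the spin-0 plus spin-1 decomposition. A sketch (assuming numpy):

[code]
import numpy as np

# spin-1/2 generators J_i = sigma_i / 2
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
I2 = np.eye(2, dtype=complex)

# total angular momentum on the tensor product: J_i -> J_i ⊗ 1 + 1 ⊗ J_i
J = [np.kron(s, I2) + np.kron(I2, s) for s in (sx, sy, sz)]
J2 = sum(Ji @ Ji for Ji in J)

print(np.linalg.eigvalsh(J2))   # eigenvalues 0, 2, 2, 2  ->  j = 0 and j = 1
[/code]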
     