
Justify matrices form basis for SO(4)

  1. Dec 29, 2016 #1
    I am given the following set of 4x4 matrices. How can I justify that they form a basis for the Lie algebra of the group SO(4)? I know that they must be real matrices with [itex]AA^{T}=\mathbb{I}[/itex] and [itex]\det A = \pm 1[/itex]. Do I show that the matrices are linearly independent and verify these properties, and so conclude they are a basis? Why are there 6 elements?

    [tex]
    A_1=\begin{pmatrix}
    0 & 0 & 0 & 0 \\
    0 & 0 & 1 & 0 \\
    0 & -1 & 0 & 0 \\
    0 & 0 & 0 & 0
    \end{pmatrix}
    ,\quad
    A_2=\begin{pmatrix}
    0 & 0 & -1 & 0 \\
    0 & 0 & 0 & 0 \\
    1 & 0 & 0 & 0 \\
    0 & 0 & 0 & 0
    \end{pmatrix}
    ,\quad
    A_3=\begin{pmatrix}
    0 & -1 & 0 & 0 \\
    1 & 0 & 0 & 0 \\
    0 & 0 & 0 & 0 \\
    0 & 0 & 0 & 0
    \end{pmatrix}
    ,\\
    B_1=\begin{pmatrix}
    0 & 0 & 0 & -1 \\
    0 & 0 & 0 & 0 \\
    0 & 0 & 0 & 0 \\
    1 & 0 & 0 & 0
    \end{pmatrix}
    ,\quad
    B_2=\begin{pmatrix}
    0 & 0 & 0 & 0 \\
    0 & 0 & 0 & -1 \\
    0 & 0 & 0 & 0 \\
    0 & 1 & 0 & 0
    \end{pmatrix}
    ,\quad
    B_3=\begin{pmatrix}
    0 & 0 & 0 & 0 \\
    0 & 0 & 0 & 0 \\
    0 & 0 & 0 & 1 \\
    0 & 0 & -1 & 0
    \end{pmatrix}
    [/tex]
     
  3. Dec 29, 2016 #2

    fresh_42

    Staff: Mentor

    I get the impression you are confusing the special orthogonal group ##SO(4)## with its Lie algebra ##\mathfrak{so}(4)##, at least on the level of their defining conditions. Your matrices are elements of a Lie algebra: none of them is invertible (so ##AA^T=\mathbb{1}## fails), and none has determinant ##\pm 1##, which are the conditions for the group. In the Lie algebra ##\mathfrak{so}(4)## the defining conditions are ##A+A^T=0## and ##\operatorname{tr}(A)=0##.

    And, yes, to form a basis they have to be linearly independent, which therefore has to be shown. I think this will be easier if you write them in the form ##\delta_{ij}-\delta_{ji}##. To show they form a basis, you also have to show that they span the entire vector space ##\mathfrak{so}(4)##, which is most easily done by counting dimensions: how many entries does a matrix in ##\mathbb{M}_\mathbb{R}(4)## have, and how many linear conditions (equations) does antisymmetry impose on them?
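    (Not needed for the proof, but a quick numerical sanity check of both defining conditions and of linear independence is easy. The following is only a sketch in Python/numpy; the names gen, basis and coords are ad hoc, not from the thread.)

    [code]
    import numpy as np

    def gen(i, j, n=4):
        """E_ij - E_ji: +1 in row i, column j and -1 in row j, column i (0-based)."""
        M = np.zeros((n, n))
        M[i, j], M[j, i] = 1.0, -1.0
        return M

    # The six matrices from post #1, with indices translated to 0-based.
    basis = [gen(1, 2),   # A_1
             gen(2, 0),   # A_2
             gen(1, 0),   # A_3
             gen(3, 0),   # B_1
             gen(3, 1),   # B_2
             gen(2, 3)]   # B_3

    # Defining conditions of so(4): A + A^T = 0 (which already forces tr A = 0).
    for M in basis:
        assert np.allclose(M + M.T, 0) and np.isclose(np.trace(M), 0)

    # Linear independence: flatten each matrix into a vector of R^16 and check
    # that the resulting 6 x 16 coordinate matrix has full rank 6.
    coords = np.array([M.flatten() for M in basis])
    print(np.linalg.matrix_rank(coords))   # prints 6 = dim so(4) = 4*3/2
    [/code]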
     
  4. Dec 30, 2016 #3
    Thank you, fresh_42. I'm new to the subject, things are still a little confusing.
    So, we know that a matrix in ##\mathbb{M}_\mathbb{R}(4)## has 16 entries, but since ##A=-A^T##, the diagonal entries must vanish and the upper triangular part is determined by the lower one: hence we subtract ##N(N+1)/2## from ##N^2##, so we need ##n=N^2-N(N+1)/2=N(N-1)/2## elements for this basis (##6## for ##N=4##), which is exactly what I wanted to understand.
    To show that they are linearly independent, I simply write a linear combination of them with constants a, b, c, d, e, f, respectively, and prove ##a=b=c=d=e=f=0## (written out below).
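    Writing that combination out explicitly (my own bookkeeping, using the matrices from post #1):
    [tex]
    aA_1+bA_2+cA_3+dB_1+eB_2+fB_3=
    \begin{pmatrix}
    0 & -c & -b & -d \\
    c & 0 & a & -e \\
    b & -a & 0 & f \\
    d & e & -f & 0
    \end{pmatrix},
    [/tex]
    and setting this equal to the zero matrix forces ##a=b=c=d=e=f=0##.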

    What do you mean by writing them as ##\delta_{ij}-\delta_{ji}##?
     
  5. Dec 30, 2016 #4

    fresh_42

    Staff: Mentor

    The matrices ##A,B## above are all of the form ##\delta_{ij}-\delta_{ji}##, where ##\delta_{ij}## is the matrix with exactly one ##1## in the ##i##-th row and ##j##-th column and ##0## elsewhere. These matrices are also written as ##E_{ij}, e_{ij}## or ##\mathfrak{e}_{ij}## or similar, as they are the standard basis vectors, like ##(0,\ldots,1,\ldots,0) \in \mathbb{R}^n##. To choose ##E_{ij}## is probably preferable over the ##\delta## notation, because it saves the ##\delta## for pairs of integers as usual (my fault, sorry).

    It is almost obvious by itself that the matrices above are linearly independent, since their non-zero entries are all in different positions, so they cannot cancel each other out. The advantage of the notation ##E_{ij}-E_{ji}## becomes important once you start multiplying them: ##E_{ij}\cdot E_{mn}=\delta_{jm}E_{in}##, i.e. match the inner indices and take the outer ones as the new ones; the product is zero if the inner indices don't match.
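    As a quick illustration of that rule (a check with the matrices from post #1): with ##A_3=E_{21}-E_{12}## and ##A_1=E_{23}-E_{32}##,
    [tex]
    A_3A_1=(E_{21}-E_{12})(E_{23}-E_{32})=-E_{13},\qquad
    A_1A_3=(E_{23}-E_{32})(E_{21}-E_{12})=-E_{31},
    [/tex]
    so the commutator is ##[A_3,A_1]=A_3A_1-A_1A_3=E_{31}-E_{13}=A_2##.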
     
  6. Dec 31, 2016 #5
    A set B of vectors (in this case matrices) is a basis of a vector space V if and only if any element of V can be written as a linear combination of the elements of B (i.e. B spans V) and the elements of B are linearly independent.

    In order to show that B spans V=##\mathfrak{so}(4)##, you can simply find the general expression for an arbitrary element A of ##\mathfrak{so}(4)## using the "defining conditions" that fresh_42 referred to (A is traceless and ##A=-A^T##). If you do this, you will be able to write this matrix A as a linear combination of the matrices you wrote, so they span V.
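    Concretely (a sketch of that computation, writing ##a_{ij}## for the entries of ##A##): comparing a general antisymmetric ##A## with the linear combination written out in post #3 gives
    [tex]
    A = a_{23}\,A_1 + a_{31}\,A_2 + a_{21}\,A_3 + a_{41}\,B_1 + a_{42}\,B_2 + a_{34}\,B_3 ,
    [/tex]
    so every element of ##\mathfrak{so}(4)## is indeed a linear combination of the six matrices.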

    In order to show that the elements of B are linearly independent, you can write a general linear combination of them, set it equal to zero, and solve the resulting equation: you will see that all coefficients must be zero, and this by definition tells you that B is linearly independent.
     
  7. Jan 6, 2017 #6
    Thank you guys, I got it :)
     