Justify that matrices form a basis for the Lie algebra of SO(4)

OhNoYaDidn't
I am given the following set of ##4\times 4## matrices. How can I justify that they form a basis for the Lie algebra of the group ##SO(4)##? I know that they must be real matrices with ##AA^{T}=\mathbb{I}## and ##\det A = \pm 1##. Do I show that the matrices are linearly independent, verify these properties, and conclude that they are a basis? Why are there 6 elements?

$$
A_1=\begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},\quad
A_2=\begin{pmatrix} 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},\quad
A_3=\begin{pmatrix} 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},
$$
$$
B_1=\begin{pmatrix} 0 & 0 & 0 & -1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{pmatrix},\quad
B_2=\begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix},\quad
B_3=\begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{pmatrix}.
$$
 
OhNoYaDidn't said:
I am given the following set of ##4\times 4## matrices. How can I justify that they form a basis for the Lie algebra of the group ##SO(4)##? I know that they must be real matrices with ##AA^{T}=\mathbb{I}## and ##\det A = \pm 1##. Do I show that the matrices are linearly independent, verify these properties, and conclude that they are a basis? Why are there 6 elements?

I get the impression you are confusing the special orthogonal group ##SO(4)## with its Lie algebra ##\mathfrak{so}(4)##, at least on the level of their defining conditions. Your matrices are elements of a Lie algebra; none of them is invertible (##AA^T=\mathbb{1}##), nor does any have determinant ##\pm 1##, which are the conditions for the group. In the Lie algebra ##\mathfrak{so}(4)## the defining conditions are ##A+A^T=0## and ##\operatorname{tr}(A)=0##.

And yes, to form a basis they have to be linearly independent, which therefore has to be shown. I think this will be easier if you write them as ##\delta_{ij}-\delta_{ji}##. To show they form a basis, you also have to show that they span the entire vector space ##\mathfrak{so}(4)##, which is easiest done by counting dimensions: how many matrix entries are there for matrices in ##\mathbb{M}_\mathbb{R}(4)##, and how many linear conditions do the defining equations impose?
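If you just want a concrete sanity check of both points (antisymmetry and linear independence), a minimal numerical sketch is enough, assuming NumPy is available; the labels ##A_1,\ldots,B_3## follow the first post. It flattens each matrix into a 16-component vector and checks the rank:

```python
import numpy as np

# The six matrices from the first post (labels as given there).
A1 = np.array([[0, 0, 0, 0], [0, 0, 1, 0], [0, -1, 0, 0], [0, 0, 0, 0]], dtype=float)
A2 = np.array([[0, 0, -1, 0], [0, 0, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0]], dtype=float)
A3 = np.array([[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]], dtype=float)
B1 = np.array([[0, 0, 0, -1], [0, 0, 0, 0], [0, 0, 0, 0], [1, 0, 0, 0]], dtype=float)
B2 = np.array([[0, 0, 0, 0], [0, 0, 0, -1], [0, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
B3 = np.array([[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 1], [0, 0, -1, 0]], dtype=float)

basis = [A1, A2, A3, B1, B2, B3]

# Defining condition of so(4): A + A^T = 0 (which already forces tr(A) = 0).
assert all(np.allclose(M + M.T, 0) for M in basis)

# Linear independence: flatten each 4x4 matrix into a 16-vector and check
# that the resulting 6x16 matrix has rank 6.
stacked = np.stack([M.ravel() for M in basis])
print(np.linalg.matrix_rank(stacked))  # prints 6 -> the six matrices are independent
```

This is only a check, of course, not a replacement for the dimension-counting argument.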
 
fresh_42 said:
I get the impression you are confusing the special orthogonal group ##SO(4)## with its Lie algebra ##\mathfrak{so}(4)##, at least on the level of their defining conditions. Your matrices are elements of a Lie algebra; none of them is invertible (##AA^T=\mathbb{1}##), nor does any have determinant ##\pm 1##, which are the conditions for the group. In the Lie algebra ##\mathfrak{so}(4)## the defining conditions are ##A+A^T=0## and ##\operatorname{tr}(A)=0##.

And yes, to form a basis they have to be linearly independent, which therefore has to be shown. I think this will be easier if you write them as ##\delta_{ij}-\delta_{ji}##. To show they form a basis, you also have to show that they span the entire vector space ##\mathfrak{so}(4)##, which is easiest done by counting dimensions: how many matrix entries are there for matrices in ##\mathbb{M}_\mathbb{R}(4)##, and how many linear conditions do the defining equations impose?
Thank you, fresh_42. I'm new to the subject, so things are still a little confusing.
So, we know that a matrix in ##\mathbb{M}_\mathbb{R}(4)## has 16 entries, but since ##A=-A^T##, the diagonal entries vanish and the lower triangular part is determined by the upper one; that is ##N(N+1)/2## independent linear conditions. Subtracting them from ##N^2##, we need ##n=N^2-N(N+1)/2=N(N-1)/2## elements for this basis, which is exactly what I wanted to understand.
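For concreteness, with ##N=4## this count gives
$$
n=\frac{N(N-1)}{2}\bigg|_{N=4}=\frac{4\cdot 3}{2}=6,
$$
which matches the six matrices ##A_1, A_2, A_3, B_1, B_2, B_3##.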
To show that they are linearly independent, I simply set a linear combination of them, with coefficients a, b, c, d, e, f respectively, equal to zero and prove ##a=b=c=d=e=f=0##.

What do you mean by writing them as ##\delta_{ij}-\delta_{ji}##?
 
OhNoYaDidn't said:
What do you mean by writing them as ##\delta_{ij}-\delta_{ji}##?
The matrices ##A, B## above are all of the form ##\delta_{ij}-\delta_{ji}##, where ##\delta_{ij}## is the matrix with exactly one ##1##, in the ##i##-th row and ##j##-th column, and ##0## elsewhere. They are also written as ##E_{ij}##, ##e_{ij}##, ##\mathfrak{e}_{ij}## or similar, as they are the standard basis vectors, like ##(0,\ldots,1,\ldots,0) \in \mathbb{R}^n##. Choosing ##E_{ij}## is probably preferable over the ##\delta## notation, because it reserves ##\delta## for the usual Kronecker delta on pairs of integers (my fault, sorry).

It is almost obvious that the matrices above are linearly independent, since their non-zero entries all sit in different positions and therefore cannot cancel each other out. The advantage of the notation ##E_{ij}-E_{ji}## becomes important once you start multiplying them: ##E_{ij}\cdot E_{mn}=\delta_{jm}E_{in}##, i.e. match the inner indices and take the outer ones as the new ones; the product is zero if the inner indices don't match.
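As a worked instance of this rule, note that in this notation the matrices from the first post read ##A_1=E_{23}-E_{32}##, ##A_2=E_{31}-E_{13}##, ##A_3=E_{21}-E_{12}##. Then
$$
A_1A_2=(E_{23}-E_{32})(E_{31}-E_{13})=E_{21},\qquad
A_2A_1=(E_{31}-E_{13})(E_{23}-E_{32})=E_{12},
$$
so the commutator is ##[A_1,A_2]=E_{21}-E_{12}=A_3##, computed entirely by matching inner indices, without writing out any ##4\times 4## products.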
 
A set B of vectors (in this case matrices) is a basis of a vector space V if and only if any element of V can be written as a linear combination of the elements of B (i.e. B spans V) and the elements of B are linearly independent.

In order to show that B spans ##V=\mathfrak{so}(4)##, you can simply find the general expression for an arbitrary element ##A## of ##\mathfrak{so}(4)## using the "defining conditions" that fresh_42 referred to (##A## is traceless and ##A=-A^T##). If you do this, you will be able to write this matrix ##A## as a linear combination of the matrices you wrote, so they span V.

In order to show that the elements of B are linearly independent, you can write a general linear combination of them, set it equal to zero, and solve the resulting equation: you will see that all coefficients must be zero, and this by definition tells you that B is linearly independent.
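To illustrate both points at once: a general antisymmetric ##4\times 4## matrix can be written directly in terms of the six matrices from the first post (the coefficient names ##a_1,\ldots,b_3## are just labels),
$$
\begin{pmatrix} 0 & -a_3 & -a_2 & -b_1 \\ a_3 & 0 & a_1 & -b_2 \\ a_2 & -a_1 & 0 & b_3 \\ b_1 & b_2 & -b_3 & 0 \end{pmatrix}
= a_1A_1+a_2A_2+a_3A_3+b_1B_1+b_2B_2+b_3B_3 ,
$$
which shows that the six matrices span ##\mathfrak{so}(4)##. Read the other way, setting the right-hand side to zero forces every entry on the left to vanish, i.e. ##a_1=\dots=b_3=0##, which gives the linear independence as well.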
 
Thank you guys, I got it :)
 
