Group Theory Basics: Where Can I Learn More?

Group Theory is gaining interest among learners due to its widespread applications, particularly in physics and mathematics. Recommended resources for beginners include "Groups and Symmetry" by M.A. Armstrong, "An Introduction to the Theory of Groups" by J. Rotman, and Schaum's Outline of Group Theory, which is noted for its solved examples. There is a suggestion for a collaborative workshop to explore classical groups like SO(3) and SU(2), focusing on their relevance to physics. Online resources are also available, including a link to a free textbook by Marsden that covers Lie groups. The discussion emphasizes the importance of accessible learning materials and community engagement in mastering Group Theory.
  • #31
Originally posted by marcus
I just happened to notice that Greg, Tom, and chroot are browsing the math forum and any of them could give a rigorous def
of a Lie group (I have not given the definition so far) and its Lie algebra.

Actually, I can't. They did not talk about this stuff in my Abstract Algebra course. I only know about it through my QM courses, which is why I talk in terms of examples. We need people such as Hurkyl, Lethe, SelfAdjoint, etc... to provide the rigorous generalities.

and that would be a step in the right direction (of collectivizing and getting several persons' approaches)

I posted mine a few seconds before yours (see above).
 
  • #32
It is rarely a mistake to look at examples before studying
abstract definitions, and my favorite example of a Lie group/algebra
is rotations/skew-symmetric matrices.

Everybody has had linear algebra, so probably knows
A^t, the transpose of a (real) matrix, and
A^(-1), the matrix inverse,
and may also know that an orthogonal matrix
(one that doesn't change the length of vectors,
obviously a very valuable and interesting kind, and it
also does not change their inner product when you apply
it to two vectors)
this very nice kind of matrix is described by
A^t = A^(-1)

Now those things form a group, because if A and B don't
change lengths or inner products then AB will not either,
and you can also check the A^t = A^(-1)
condition for AB.

But they aren't a vector space, because the sum A + B
is usually not that kind of matrix any more.

Everybody knows the determinant, and that det A^t = det A
and that det A^(-1) = 1/det A.
So if you look at the A^t = A^(-1) condition in that light, you will see that there is no possible value of det A except +1 or -1. The matrices with det = +1 form a subgroup.
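As a quick numerical sanity check (a minimal numpy sketch with a hand-picked 2D rotation; the values are arbitrary choices of mine):

Code:
import numpy as np

# a hand-picked orthogonal matrix: rotation of the plane by 0.7 radians
t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# the defining condition A^t = A^(-1)
print(np.allclose(A.T, np.linalg.inv(A)))       # True

# the determinant can only be +1 or -1
print(np.isclose(abs(np.linalg.det(A)), 1.0))   # True

# group closure: the product of orthogonal matrices is orthogonal
B = A @ A
print(np.allclose(B.T, np.linalg.inv(B)))       # True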

These are very nice, simple, useful Lie groups, and the question is: what is the Lie algebra? What does the tangent space at the
identity matrix look like?

So you, Lonewolf, ask "what is the Lie algebra?" and I am temporarily turning this question into a very concrete one: "what is the Lie algebra of this particular group of matrices, the orthogonal ones, or of the subgroup of them which are simple rotations?" We can try to answer that in either 2D or 3D.
Are there any questions so far?

Anybody who wishes is invited to take over explaining and discussing at this point.
 
  • #33
Originally posted by Tom
...know about it through my QM courses, which is why I talk in terms of examples.


I posted mine a few seconds before yours (see above).

Glad to see you here.
We may need examples far more than rigor.
Would invite and encourage examples.
 
  • #34
Originally posted by Tom
We need people such as Hurkyl, Lethe, SelfAdjoint, etc... to provide the rigorous generalities.

Eep!


Paraphrased from my abstract algebra text:

A Lie algebra is simply a vector space A over a field F equipped with a bilinear operator [,] on A that satisfies [x, x] = 0 and the Jacobi identity:

[[x, y], z] + [[y, z], x] + [[z, x], y] = 0


(If F does not have characteristic 2, [x, x] = 0 is equivalent to [x, y] = -[y, x])


I would like to point out that [x, y] is not defined by:

[x, y] = xy - yx

(or various similar definitions); it is merely a bilinear operation that satisfies the Jacobi identity and [x, x] = 0.


However, for any associative algebra A, one may define the Lie algebra A^- by defining the Lie bracket as the commutator.


An example where [,] is not a commutator is (if I've done my arithmetic correctly) the real vector space R^3 where [x, y] = x * y, where * is the vector cross product.
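A quick numerical check of this example (a sketch using numpy with random vectors; not a proof, just the routine verification):

Code:
import numpy as np

rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 3))

def bracket(u, v):
    # on R^3 the bracket is the vector cross product
    return np.cross(u, v)

# [x, x] = 0
print(np.allclose(bracket(x, x), 0))   # True

# Jacobi identity: [[x,y],z] + [[y,z],x] + [[z,x],y] = 0
jac = (bracket(bracket(x, y), z)
       + bracket(bracket(y, z), x)
       + bracket(bracket(z, x), y))
print(np.allclose(jac, 0))             # True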


edit: fixed an omission
 
  • #35
Quoting from John Baez' "Gauge Fields, Knots, and Gravity,"

"Lie algebras are a very powerful tool for studying Lie groups. Recall that a Lie group is a manifold that is also a group, such that the group operations are smooth. It turns out that the group structure is almost completely determined by its behavior near the identity. This, in turn, can be described in terms of an operation on the tangent space of the Lie group, called the 'Lie bracket.'

"To be more precise, suppose that G is a Lie group. We define the Lie algebra of G, often written g, to be the tangent space of the identity element of G. This is a vector space with the same dimension of G. A good way to think of Lie algebra elements is as tangent vectors to path in G that start at the identity. An example of this is the physicists' notion of an 'infinitesimal rotation.' If we let [gamma] be the path in SO(3) such that [gamma](t) corresponds to a rotation by the angle t (counterclockwise) about the z axis:
Code:
γ(t) =

  cos t   -sin t   0
  sin t   cos t    0
   0        0      1
"Then the tangent vector to [gamma] as it passes through the identity can be calculated by differentiating the components of [gamma](t) and setting t = 0:
Code:
γ'(0) =

0  -1   0
1   0   0
0   0   0

This is an element of so(3), the Lie algebra of SO(3). Any such matrix, which is the tangent vector to a path through the identity of SO(3), is a member of so(3).
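One can reproduce this tangent-vector calculation numerically (a quick sketch; the finite-difference step h is an arbitrary choice of mine):

Code:
import numpy as np

def gamma(t):
    # rotation by angle t about the z axis, an element of SO(3)
    return np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0,          0,         1]])

# finite-difference approximation to gamma'(0)
h = 1e-6
tangent = (gamma(h) - gamma(0)) / h
print(np.round(tangent, 3))
# close to [[0, -1, 0], [1, 0, 0], [0, 0, 0]], as computed above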

- Warren
 
  • #36
Originally posted by Hurkyl

A Lie algebra is simply a vector space A over a field F equipped with a bilinear operator [,] on A that satisfies [x, x] = 0 and the Jacobi identity:

[[x, y], z] + [[y, z], x] + [[z, x], y] = 0


(If F does not have characteristic 2, [x, x] = 0 is a consequence of the Jacobi identity and may be dropped as an axiom)

whoa! is that true? i'm not so sure. i think what you want to say here is:
[x,y]=-[y,x] is a consequence of [x,x]=0, and if the field does not have characteristic 2, then [x,y]=-[y,x] implies [x,x]=0, but not in fields with characteristic 2, so we drop that as an axiom.


I would like to point out that [x, y] is not defined by:

[x, y] = xy - yx

(or various similar definitions); it is merely a bilinear operation that satisfies the Jacobi identity and [x, x] = 0.


However, for any associative algebra A, one may define the Lie algebra A^- by defining the Lie bracket as the commutator.


An example where [,] is not a commutator is (if I've done my arithmetic correctly) the real vector space R^3 where [x, y] = x * y, where * is the vector cross product.

other examples include the poisson bracket and the lie bracket (well, the lie bracket does turn out to be a commutator, but it is certainly not defined that way).
 
  • #37
You want a complete synthetic definition of a Lie algebra? Here it is: a Lie algebra L is a pair (V,t) formed by an F-module V (F being a commutative ring) and an alternating tensor t of mixed type (2,1) satisfying the Jacobi identity.

A special case is F a field.
 
  • #38
Originally posted by Hurkyl
I would like to point out that [x, y] is not defined by:
[x, y] = xy - yx

For the so-called abstract Lie algebras only the bracket [,] by itself has a meaning, but it can be proven that for any (finite dimensional) Lie algebra L we can find a vector space V such that the elements of L are linear transformations of V, so that the formula above holds (that is, we can always find a faithful linear representation of L). For those interested in details, this is known as the theorem of Ado (1945).
 
  • #39
Originally posted by Tom
A Lie algebra is a nonAbelian algebra whose elements a_i satisfy the following properties:

1. [a_i,a_i] = 0 (a_i commutes with itself.)
2. [a_j+a_k,a_i] = [a_j,a_i] + [a_k,a_i] (Linearity of commutator.)
3. [a_i,[a_j,a_k]] + [a_j,[a_k,a_i]] + [a_k,[a_i,a_j]] = 0 (Jacobi identity.)

Just a comment: you can drop the word nonAbelian, since any vector space is an abelian Lie algebra if you simply take the zero bracket. Indeed, abelian algebras play a fundamental role in the theory (see for example the Cartan subalgebras). You also have to require bilinearity, otherwise the result is not necessarily a Lie algebra. The (local!) relation with Lie groups is expressed by the Campbell-Hausdorff formula (exponentiation of elements). But this is delicate, since not every element of a Lie group need be the exp of some element of the Lie algebra (the example is well known to you!).
 
  • #40
Finally, you can also use operators to construct Lie algebras. If you take Hermitian conjugate operators B, B* (in an infinite dimensional space) with the rule [B,B*] = BB* - B*B, you obtain the Heisenberg Lie algebra, which is the basis of all classical analysis of harmonic oscillators and gave rise to the boson formalism used by Schwinger, Holstein, and Primakoff in the '40s to analyze angular momentum.
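For a concrete feel, here is a sketch with truncated N x N matrix representations of B and B* (the truncation dimension is my own illustrative choice; the commutator can only approximate the identity, since this algebra has no finite dimensional representation of that kind):

Code:
import numpy as np

N = 6  # truncation dimension, an arbitrary illustrative choice

# truncated annihilation operator B: B|n> = sqrt(n)|n-1>
B = np.diag(np.sqrt(np.arange(1.0, N)), k=1)
Bdag = B.conj().T   # creation operator B*

comm = B @ Bdag - Bdag @ B
print(np.round(comm, 10))
# the identity matrix, except the last diagonal entry, -(N-1),
# which is an artifact of the truncation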
 
  • #41
Originally posted by rutwig
But this is delicate, since not every element of a Lie group need be the exp of some element of the Lie algebra (the example is well known to you!).

what? this is not known to me, i thought that any element of the Lie Group could indeed be obtained by exponentiation of the Lie Algebra. what is the example?
 
  • #42
Originally posted by lethe
what? this is not known to me, i thought that any element of the Lie Group could indeed be obtained by exponentiation of the Lie Algebra. what is the example?

If you have worked only with compact groups, then you will not have observed this; there, any element is the exponentiation of some element in the Lie algebra. But for noncompact groups this is no longer true, and we may have to multiply a finite number of exponentials of Lie algebra elements to recover an element of the group.

Example: show that the element

Code:
( -a    0   )
(  0  -1/a  )

of SL(2,R) cannot be expressed as the exponential of a single element X of the Lie algebra sl(2,R) if a is different from 1.
 
  • #43
Originally posted by rutwig
If you have worked only with compact groups, then you will not have observed this; there, any element is the exponentiation of some element in the Lie algebra. But for noncompact groups this is no longer true, and we may have to multiply a finite number of exponentials of Lie algebra elements to recover an element of the group.

Example: show that the element

Code:
( -a    0   )
(  0  -1/a  )

of SL(2,R) cannot be expressed as the exponential of a single element X of the Lie algebra sl(2,R) if a is different from 1.

OK, so let me see. the lie algebra of SL(2,R) is just the set of real 2x2 traceless matrices right? a basis for this algebra is:

Code:
[ 1  0 ]    [ 0  1 ]    [ 0  0 ]
[ 0 -1 ],   [ 0  0 ],   [ 1  0 ]

right? obviously the matrix you mentioned has to be constructed from the first basis element.

what about
Code:
exp( ln(a) * [ 1  0 ] )
             [ 0 -1 ]

no wait, that will give me only

Code:
[ a    0  ]
[ 0   1/a ]

obviously, i'll never be able to get negative numbers by exponentiating these matrices, so it's impossible, as you say. why is that? what does this have to do with compactness?
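As a numerical footnote (a sketch using scipy's matrix logarithm; scipy.linalg.logm returns a principal logarithm, which here comes out complex with nonzero trace, so no candidate inside sl(2,R) shows up):

Code:
import numpy as np
from scipy.linalg import logm, expm

a = 2.0
M = np.array([[-a,   0.0],
              [0.0, -1.0/a]])    # in SL(2,R): det = 1

L = logm(M)         # principal matrix logarithm
print(L)            # complex entries: M has no real logarithm
print(np.trace(L))  # trace 2*pi*i, so L is not in sl(2,R) either

# by contrast, diag(a, 1/a) IS the exp of a real traceless matrix:
X = np.log(a) * np.array([[1.0,  0.0],
                          [0.0, -1.0]])
print(np.allclose(expm(X), np.diag([a, 1.0/a])))   # True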
 
  • #44
The result is not entirely obvious, but compactness ensures some properties (like the existence of an invariant integration)
that are not given otherwise. For this case, the key is that for compact (connected) groups any element is conjugate to
an element in a maximal torus (an analytic subgroup corresponding to a maximal abelian subalgebra of the Lie algebra).
 
  • #45
These are notes taken from Marsden. Some of the proofs have been omitted, but are available in Marsden's text.

The Real General Linear Group

GL(n,R) is defined as GL(n,R) = {A in R^(n×n) : det(A) ≠ 0}
GL+(n,R) is defined as GL+(n,R) = {A in R^(n×n) : det(A) > 0}
GL-(n,R) is defined as GL-(n,R) = {A in R^(n×n) : det(A) < 0}

where R is the set of real numbers, and R^(n×n) is the set of real n×n matrices.

GL+(n,R) is the connected component of the identity in GL(n,R), and GL(n,R) has exactly two connected components. Marsden proves this using the real polar decomposition theorem. Following the proof, the conclusion below is reached.

The real general linear group is a non-compact, disconnected, n^2-dimensional Lie group whose Lie algebra consists of the set of all n×n matrices with the bracket [A,B] = AB - BA.


The Special Linear Group

SL(n,R) is defined as SL(n,R) = {A in GL(n,R): det(A)=1}

R\{0} is a group under multiplication, and det:GL(n,R)->R\{0} is a Lie group homomorphism since det(AB) = det(A)det(B).

The Lie algebra of SL(n,R) consists of the set of n×n matrices with trace 0 and bracket [A,B] = AB - BA.

Since trace(A) = 0 imposes one linear condition, dim[sl(n,R)] = n^2 - 1, where sl(n,R) is the Lie algebra of SL(n,R).

It is useful to introduce the inner product <A,B> = trace(AB^T) on the Lie algebra gl(n,R) of GL(n,R). Note that ||A||^2 = Σ_{i,j=1}^n a_ij^2, which shows that the norm on gl(n,R) coincides with the Euclidean norm on R^(n^2). This norm can be used to show that SL(n,R) is not compact.

Let v_1 = (1,0,...,0), v_2 = (0,1,...,0), ..., v_(n-1) = (0,...,1,0), and v_n = (t,0,...,1), where all v_i are members of R^n.

Let A = (v_1, v_2,...,v_n) be the matrix in R^(n×n) with these columns. All matrices of this form are elements of SL(n,R), and the norm of such a matrix equals √(n+t^2) for all t in R. So SL(n,R) is not a bounded subset of gl(n,R), and hence SL(n,R) is not compact. SL(n,R) is also connected, but the proof has been left to Marsden due to space constraints.
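A quick numerical illustration of this family (a sketch; n = 4, the t values, and the helper name are arbitrary choices of mine):

Code:
import numpy as np

def marsden_family(n, t):
    # columns v_1 ... v_(n-1) are standard basis vectors;
    # the last column is (t, 0, ..., 0, 1), keeping det = 1
    A = np.eye(n)
    A[0, n - 1] = t
    return A

n = 4
for t in [0.0, 10.0, 1000.0]:
    A = marsden_family(n, t)
    print(np.linalg.det(A), np.linalg.norm(A))
# det stays 1 while the (Frobenius) norm sqrt(n + t^2) blows up,
# so SL(n,R) is unbounded, hence not compact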

The section concludes with the following propostition:

The Lie group SL(n,R) is a non-compact, connected, (n^2-1)-dimensional Lie group whose Lie algebra sl(n,R) consists of the n×n matrices with trace 0 and bracket [A,B] = AB - BA.

Apologies for any typos/inaccuracies. I'm pretty new to this stuff.
 
  • #46
I could not find any typos/inaccuracies
except for the one typo of an unintended extra t in "proposition"
"The section concludes with the following propostition:"
And I could not even find any instance of lack of clarity.
This is great. We might even have a Lie groups workshop
going, if the others, like chroot and Hurkyl, keep in touch.

I am wondering what the others think would be good to do now.

One could look at what you just said and try to
say intuitively why those things are, in fact, the Lie algebra
of GL(n,R). How gl(n,R) really does correspond to infinitesimal transformations around the identity----and how it is the tangent space at the point on GL(n, R) which is the identity matrix.
Or one could do the same kind of concrete case investigation for SL(n, R) and its Lie algebra. I mean, try out and verify a few special cases and get our hands dirty.

Or, alternatively, we could move on to some more matrix groups like O(n) and SO(n), or begin looking at their complex cousins.

Or, if enough people were involved, we could go in both directions at once. Some could proceed to describe the other classic Lie groups and their algebras while others, like me, cogitate about the very simplest examples.

Let's see what happens. I hope something does.
 
  • #47
exp(A) the exponential function of a matrix

In a previous post I was going thru a section of Marsden,
the pages 283-292 part of chapter 9, and it mentioned
the exponential function defined by the power series
exp(x) = 1 + x + x^2/2! + ...
and gave a case where you plug a matrix A in for x
and get a matrix exp(A)

this has always seemed to me like a cool thing to do
and I see it as illustrating a kind of umbilical connection between Lie algebra and Lie group.

The algebra element A is what gets plugged into exp() to give exp(A) which is in the group.

Or in more cagey differential geometry style----exp(tA) for t running from 0 to 1 gives you a curve down in the Lie group (the manifold) which starts out at the identity point and travels along in the manifold and reaches exp(A) as its destination. Indeed exp(tA) for t running forever gives a one-dimensional subgroup--but this is a bit too abstract for this time of morning.

What I always think is so great is that if A is a 3x3 skew-symmetric
matrix, meaning A^T = -A,
then plugging A into that good old exp() power series gives a rotation matrix, an element of the SO(3) Lie group.

More wonderful still, exp(A) is the rotation by exactly |v| radians about the vector v = (v1, v2, v3) as axis, where A is
given by

+0 -v3 +v2
+v3 +0 -v1
-v2 +v1 +0

any skew-symmetric matrix has such a form for some
v1, v2, v3

And we may be able to convince ourselves of this, or prove it a bit, without much effort, just by looking at the power series in A.

If I stands for the identity matrix,

B = exp(A) = I + A + A^2/2! + ...

Now consider that since A^T = -A, we can take the transpose of this whole power series and it will be as if we put a minus sign in front of A.

B^T = exp(A)^T = exp(-A)

But multiplying exp(A) and exp(-A) always gives the identity (A commutes with -A, so the usual exponent rule applies). When you multiply the two power series there is a bunch of cancellation and it boils down to the identity. So exp(-A) is the matrix INVERSE of exp(A).

B^T = exp(A)^T = exp(-A) = exp(A)^(-1) = B^(-1)

B^T = B^(-1) means that B is orthogonal
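This argument is easy to check numerically (a sketch using scipy's expm on a random skew-symmetric matrix; the seed is arbitrary):

Code:
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = M - M.T            # skew-symmetric: A^T = -A

B = expm(A)
print(np.allclose(B.T, np.linalg.inv(B)))   # True: B is orthogonal
print(np.isclose(np.linalg.det(B), 1.0))    # True: B lands in SO(3)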


BTW one reason to think about the path exp(tA) from the identity to the endpoint exp(A) is to see clearly that exp(A) is in the same connected component as the identity. O(3) is split into two pieces, one with det = 1 and one with det = -1.

The latter kind turn your shirt inside out as well as rotating it, so they are bad mothers and it is generally safer to work with the det = 1 kind which are called "special" or SO(3).

this curve going t = 0 to 1 shows that exp(A) is in the same connected component as the identity, because how could the curve ever leap the chasm between the two components?
So it shows det exp(A) = 1. But that is just mathematical monkeyshines, of course the determinant is one! :wink:

All this stuff can be written with an n sitting in for 3, but
as an inveterate skeptic I often suspect that
dimensions higher than 3 do not exist and prefer to write 3 instead of n. It looks, somehow, more definite and specific that way.

We should check out the elementary fact that [A,B] works with
skew sym matrices A and B! Why not! Maybe later today, unless someone else has already done it.

I will bring along this earlier post with an extract from pages 289-291 of the book
**************************************
SO(3) is a compact Lie group of dimension 3.

Its Lie algebra so(3) is the space of real skew-symmetric 3x3 matrices
with bracket [A,B] = AB - BA.

The Lie algebra so(3) can be identified with R^3,
the 3-tuples of real numbers, by a vector space isomorphism
called the "hat map":

v = (v1,v2,v3) goes to v-hat, which is a skew-symmetric matrix
meaning its transpose is its NEGATIVE, and you just stash the three numbers into such a matrix like:

+0 -v3 +v2
+v3 +0 -v1
-v2 +v1 +0

v-hat is a matrix, and if you apply it to any vector w
you get v x w.

Everybody in freshman year got to play with v x w,
the cross product of real 3D vectors,
and R^3 with ordinary vector addition and the cross product v x w is kind of the ancestral Lie algebra from whence all the others came.

And the hat-map is a Lie algebra isomorphism

EULER'S THEOREM

Every element A in SO(3) not equal to the identity is a rotation
thru an angle φ about an axis w.

SO SO(3) IS JUST THE WAYS YOU CAN TURN A BALL---it is the group of rotations

THE EIGENVALUE LEMMA is that if A is in SO(3) one of its
eigenvalues has to be equal to 1.
The proof is just to look at the characteristic polynomial which is of degree three and consider cases.

Proof of Euler is just to look at the eigenvector with eigenvalue one----pssst! it is the axis of the rotation. Marsden takes three sentences to prove it.

A CANONICAL MATRIX FORM to write elements of SO(3) in
is

+1 +000 +000
+0 +cosφ -sinφ
+0 +sinφ +cosφ

For typography I have to write 0 as +000
to leave space for the cosine and sine under it
maybe someone knows how to write handsomer matrices?

EXPONENTIAL MAP
Let t be a number and w be a vector in R^3
Let |w| be the norm of w (sqrt sum of squares)
Let w^ be w-hat, the hat-map image of w in so(3), the Lie algebra. Then:

exp(tw^) is a rotation about axis w by angle t|w|

It is just a recipe to cook up a matrix giving any amount of rotation around any axis you want.
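And a numerical check of the recipe (a sketch; the axis w and the value of t are arbitrary choices of mine):

Code:
import numpy as np
from scipy.linalg import expm

def hat(v):
    # the hat map R^3 -> so(3), exactly the matrix shown above
    return np.array([[ 0.0,  -v[2],  v[1]],
                     [ v[2],  0.0,  -v[0]],
                     [-v[1],  v[0],  0.0 ]])

w = np.array([0.0, 0.0, 2.0])   # axis along z, |w| = 2
t = 0.5
R = expm(t * hat(w))            # should rotate by t|w| = 1 radian about z

expected = np.array([[np.cos(1.0), -np.sin(1.0), 0.0],
                     [np.sin(1.0),  np.cos(1.0), 0.0],
                     [0.0,          0.0,         1.0]])
print(np.allclose(R, expected))   # True
print(np.allclose(R @ w, w))      # True: the axis itself is fixed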
 
  • #48
routine checks

sometimes just doing the routine checks is a good way to
get used to something. In the last post I was talking about
so(3), the skew-symmetric matrices that are the Lie algebra of SO(3), the rotations, and I said


"We should check out the elementary fact that [A,B] works with
skew sym matrices A and B! Why not! Maybe later today, unless someone else has already done it."

What I mean is: just verify the extremely simple fact that
if you have skew-symmetric A, B then the bracket [A,B] is also skew-symmetric!

And there is also the dreaded Jacobi identity to verify, namely

[[A,B], C] + [[B,C], A] + [[C,A], B] = 0

this terrible formula can only be verified by those who have memorized the alphabet, at least up to C, and
in our culture very young children are made to recite the alphabet to ensure that when they reach maturity they will be able to
verify the Jacobi identity.

It is, you may have noticed, the main axiom of abstract Lie algebras.

There are sort of two wrong approaches to anything, (1) purely axiomatic and (2) bloody-minded practical----really you have to do both; if one is learning about concrete examples one should occasionally look around and verify that they satisfy the axioms too.
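For the impatient, here is the routine check done numerically first (a sketch with random 3x3 skew-symmetric matrices; no substitute for doing it by hand):

Code:
import numpy as np

rng = np.random.default_rng(2)

def random_skew():
    M = rng.standard_normal((3, 3))
    return M - M.T          # guarantees A^T = -A

def bracket(X, Y):
    return X @ Y - Y @ X

A, B, C = random_skew(), random_skew(), random_skew()

# the bracket of two skew-symmetric matrices is skew-symmetric
print(np.allclose(bracket(A, B).T, -bracket(A, B)))   # True

# the Jacobi identity
jac = (bracket(bracket(A, B), C)
       + bracket(bracket(B, C), A)
       + bracket(bracket(C, A), B))
print(np.allclose(jac, 0))                            # True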
 
  • #49
Originally posted by lethe
whoa! is that true? i'm not so sure. i think what you want to say here is:
[x,y]=-[y,x] is a consequence of [x,x]=0, and if the field does not have characteristic 2, then [x,y]=-[y,x] implies [x,x]=0, but not in fields with characteristic 2, so we drop that as an axiom.

Yes, that's essentially what I meant to say.
 
  • #50
I've got a proof that the bracket of two three-dimensional skew-symmetric matrices produces another three-dimensional skew-symmetric matrix, since that's our focus, but it's pretty simplistic and inelegant. Maybe someone has a better one.

Let A be defined as
Code:
(0 -a -b)
(a  0 -c)
(b  c  0)
Let B be defined as
Code:
(0 -d -e)
(d  0 -f)
(e  f  0)
Let g=-(ad+be), h=-(ad+cf), i=-(be+cf)

Then AB is
Code:
( g  -bf af)
(-ce  h -ae)
( cd -bd  i)
And BA is
Code:
( g  -ce cd)
(-bf  h -bd)
( af -ae  i)
So, [A,B]=AB-BA=
Code:
(0      ce-bf af-cd)
(bf-ce  0     bd-ae)
(cd-af  ae-bd 0    )
Which is again a skew-symmetric matrix.
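A symbolic cross-check of the computation above (a sketch using sympy; my addition, not part of the original working):

Code:
import sympy as sp

a, b, c, d, e, f = sp.symbols('a b c d e f')

A = sp.Matrix([[0, -a, -b],
               [a,  0, -c],
               [b,  c,  0]])
B = sp.Matrix([[0, -d, -e],
               [d,  0, -f],
               [e,  f,  0]])

comm = (A*B - B*A).expand()
print(comm)                              # matches the bracket above
print(comm + comm.T == sp.zeros(3, 3))   # True: skew-symmetric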

EDIT: Took chroot's advice.
 
  • #51
The best way to render matrices here is to put them in a [ code ][ /code ] container, which preserves spacing:
Code:
( 0       ce-bf    af-cd )
( bf-ce     0      bd-ae )
( cd-af   ae-bd      0   )
- Warren
 
  • #52
For A and B skew symmetric matrices:

(AB - BA)^t = (AB)^t - (BA)^t
= B^t A^t - A^t B^t
= (-B)(-A) - (-A)(-B)
= BA - AB
= -(AB - BA)

So the commutator of any two skew symmetric matrices is again skew symmetric.

In general, for any involution *:

[A, B]* = (AB-BA)*
= (AB)* - (BA)*
= B*A* - A*B*
= [B*, A*]

where [,] is the commutator

edit: fixed a formatting error
 
  • #53
two good things just happened.
Lonewolf, who is new to groups (background = one course in linear algebra), tackled it and proved it down-in-the-mud,
and then Hurkyl proved it elegantly as a special
case of a more general fact that would include
the complex case of skew-Hermitian matrices, where you
take the transpose and then the complex conjugate of the matrix entries.
I cannot restrain a broad grin,
because both the dirty-hands approach and the elegant one
are indispensable.
great


Originally posted by Hurkyl
For A and B skew symmetric matrices:

(AB - BA)^t = (AB)^t - (BA)^t
= B^t A^t - A^t B^t
= (-B)(-A) - (-A)(-B)
= BA - AB
= -(AB - BA)

So the commutator of any two skew symmetric matrices is again skew symmetric.

In general, for any involution *:

[A, B]* = (AB-BA)*
= (AB)* - (BA)*
= B*A* - A*B*
= [B*, A*]

where [,] is the commutator
 
  • #54
you know, this thread is turning into a pretty nice lie group/lie algebra thread. there is the differential forms thread. now all we need is for someone to start a representation theory thread, and we'll have all the maths we need to do modern particle physics.

who wants to volunteer?
 
  • #55
I would absolutely love a rep theory thread -- especially if we could include both the down-n-dirty and the high-level approaches. I'm reasonably competent to talk about Lie groups, but I am lost on representations.

- Warren
 
  • #56
Originally posted by chroot
I would absolutely love a rep theory thread -- especially if we could include both the down-n-dirty and the high-level approaches. I'm reasonably competent to talk about Lie groups, but I am lost on representations.

- Warren

i'm down for the high level part.
 
  • #57
Sure, I'll have a go at representation theory. Even if I don't understand it all, I'm sure I'll get something out of it.
 
  • #58
Originally posted by Lonewolf
Sure, I'll have a go at representation theory. Even if I don't understand it all, I'm sure I'll get something out of it.

lonewolf-

how much maths do you know? i don't think representation theory is all that hard. hang in there, i'm sure we can get through it.
 
  • #59
I've covered the basics of group theory, and completed the first part of a course in linear algebra, to be concluded next academic year. I'm pretty comfortable with the prerequisites you listed in the other thread. I'm willing to learn, and I've got four months to fill, so I'm prepared to put some time in.
 
  • #60
Originally posted by Lonewolf
I've covered the basics of group theory, and completed the first part of a course in linear algebra, to be concluded next academic year. I'm pretty comfortable with the prerequisites you listed in the other thread. I'm willing to learn, and I've got four months to fill, so I'm prepared to put some time in.
dat's good to hear!
 
