Questions about solvable Lie algebras

  • #1
HDB1
TL;DR Summary
About solvable algebra
In the book Introduction to Lie Algebras and Representation Theory by J. E. Humphreys, p. 11, I have a question:

Proposition.
Let ##L## be a Lie algebra.
(a) If ##L## is solvable, then so are all subalgebras and homomorphic images of ##L##.
(b) If ##I## is a solvable ideal of ##L## such that ##L / I## is solvable, then ##L## itself is solvable.
(c) If ##I, J## are solvable ideals of ##L##, then so is ##I+J##.

In the proof of (b), how do we get:
##\left(L^{(i)}\right)^{(j)}=L^{(i+j)} \text { implies that } L^{(n+m)}=0##

Thanks in advance,
 
  • #2
Dear @fresh_42, if you could help, I would appreciate it. :heart:
 
  • #3
HDB1 said:
TL;DR Summary: About solvable algebra

In the book Introduction to Lie Algebras and Representation Theory by J. E. Humphreys, p. 11, I have a question:

Proposition.
Let ##L## be a Lie algebra.
(a) If ##L## is solvable, then so are all subalgebras and homomorphic images of ##L##.
(b) If ##I## is a solvable ideal of ##L## such that ##L / I## is solvable, then ##L## itself is solvable.
(c) If ##I, J## are solvable ideals of ##L##, then so is ##I+J##.

In the proof of (b), how do we get:
##\left(L^{(i)}\right)^{(j)}=L^{(i+j)} \text { implies that } L^{(n+m)}=0##

Thanks in advance,
You can formally prove ##\left(L^{(i)}\right)^{(j)}=L^{(i+j)}## by induction. However, the heuristic should be sufficient to convince you. What is ##L^{(i)}##? It is ##L## multiplied by itself, then the result multiplied with itself, then this result multiplied with itself and so on, ## i ## times. The same is then done with that result another ## j ## times. But this is the same as starting with ##L## and proceeding ## i+j ## times.
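
For completeness, a short induction sketch (on ##j##, using the definition ##M^{(k+1)}=\left[M^{(k)},M^{(k)}\right]## of the derived series for any Lie algebra ##M##):
$$
\left(L^{(i)}\right)^{(0)}=L^{(i)}=L^{(i+0)},\qquad
\left(L^{(i)}\right)^{(j+1)}=\left[\left(L^{(i)}\right)^{(j)},\left(L^{(i)}\right)^{(j)}\right]
=\left[L^{(i+j)},L^{(i+j)}\right]=L^{(i+j+1)}.
$$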

In part (b) of the proof we have that ##I## is solvable and ##L/I## is solvable. The first means ##I^{(n)}=0## for some ##n##, and the second means ##\left(L/I\right)^{(m)}=0## for some ##m.##

Now we must look at the definition of the quotient (see the other thread) and remember what the zero in ##L/I## is. It is the equivalence class ##0 + I## which is ##I##. Hence solvability of ##L/I## means that ##\left(L/I\right)^{(m)} \subseteq 0+I=I.## The multiplication in ##L/I## goes
$$
[a+I\, , \,b+I]=[a\, , \,b]\, +\,I
$$
Combining all that we have ##\underbrace{\left(\underbrace{\left(L/I\right)^{(m)}}_{\subseteq I}\right)^{(n)} }_{\subseteq I^{(n)}=\{0\}},## i.e. ##L^{(m)}\subseteq I## and therefore ##L^{(m+n)}=\left(L^{(m)}\right)^{(n)}\subseteq I^{(n)}=\{0\}.##
 
  • #4
Thank you so much, @fresh_42! :heart:

But how do we then prove that ##L## is solvable?
 
  • #5
HDB1 said:
Thank you so much, @fresh_42! :heart:

But how do we then prove that ##L## is solvable?
##L/I## is solvable. That means multiplying it with itself and so on will end up as ##\bar 0.##
But the ##\bar 0## class in the quotient ##L/I## is ##0+I=I.## So we end up in ##I## after sufficiently many steps. Now ##I## is solvable, too. So we keep multiplying everything with itself, and after another ##n## steps we arrive in the actual ##\{0\}## subset of ##L,## i.e. at ##0\in L.## That makes ##L## solvable.
 
  • #6
HDB1 said:
TL;DR Summary: About solvable algebra

In the book Introduction to Lie Algebras and Representation Theory by J. E. Humphreys, p. 11, I have a question:

Proposition.
Let ##L## be a Lie algebra.
(a) If ##L## is solvable, then so are all subalgebras and homomorphic images of ##L##.
(b) If ##I## is a solvable ideal of ##L## such that ##L / I## is solvable, then ##L## itself is solvable.
(c) If ##I, J## are solvable ideals of ##L##, then so is ##I+J##.

In the proof of (b), how do we get:
##\left(L^{(i)}\right)^{(j)}=L^{(i+j)} \text { implies that } L^{(n+m)}=0##

Thanks in advance,
Dear @fresh_42, I am sorry, but I have a question here:

About (b): I need an example of it, and I found one: the upper triangular matrices, call this algebra ##A##. If we bracket ##A## with itself, we get the strictly upper triangular matrices, which are nilpotent and therefore form a solvable ideal. But what about the quotient of the upper triangular matrices by the strictly upper triangular ones? What will the outcome be, please?

Also, please, do you have an example of (a) or (c)?

Thanks in advance, :heart: :heart: :heart:
 
  • #7
HDB1 said:
Dear @fresh_42, I am sorry, but I have a question here:

About (b): I need an example of it, and I found one: the upper triangular matrices, call this algebra ##A##. If we bracket ##A## with itself, we get the strictly upper triangular matrices, which are nilpotent and therefore form a solvable ideal. But what about the quotient of the upper triangular matrices by the strictly upper triangular ones? What will the outcome be, please?

Also, please, do you have an example of (a) or (c)?

Thanks in advance, :heart: :heart: :heart:
Let's take ##n=4## in your example, i.e. the Lie algebra ##L## of upper triangular ##4\times 4## matrices, and its ideal ##I## of strictly upper triangular ##4\times 4## matrices.
$$
\left\{\begin{pmatrix}0&x_{12}&x_{13}&x_{14}\\0&0&x_{23}&x_{24}\\0&0&0&x_{34}\\0&0&0&0\end{pmatrix}\right\} = I \trianglelefteq L = \left\{\begin{pmatrix}x_{11}&x_{12}&x_{13}&x_{14}\\0&x_{22}&x_{23}&x_{24}\\0&0&x_{33}&x_{34}\\0&0&0&x_{44}\end{pmatrix}\right\}
$$
Every element of ##L## can be written as ##X=D+S## where ##D## is a diagonal matrix and ##S\in I.## You correctly noted that ##I## is a nilpotent, and hence solvable, ideal in the solvable algebra ##L.##

a) ##I\trianglelefteq L\;:##
$$
[X,T]=[D+S,T]=\underbrace{[D,T]}_{\in I}+\underbrace{[S,T]}_{\in [I,I]\subseteq I} \text{ for any } X=D+S\in L\, , \,S,T\in I
$$
b) ##I^3=[I,[I,[I,I]]]=\{0\}\,:##

Let ##e_{pq}## be the matrix with a ##1## in position ##(p,q),## i.e. ##p##-th row and ##q##-th column, and zeros elsewhere. Then ##[e_{12},[e_{12},[e_{12}+e_{23},e_{23}+e_{34}]]] = [e_{12},[e_{12},e_{13}+e_{24}]]=[e_{12},e_{14}]=0## is the longest expression we can get.
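
If you want to double-check such bracket computations by machine, here is a minimal NumPy sketch (not from Humphreys; the helper names `E` and `comm` are only for this illustration) that reproduces the chain above:

```python
import numpy as np

def E(p, q, n=4):
    """Elementary matrix e_{pq}: 1 in row p, column q (1-based), zeros elsewhere."""
    M = np.zeros((n, n))
    M[p - 1, q - 1] = 1.0
    return M

def comm(A, B):
    """Matrix commutator [A, B] = AB - BA."""
    return A @ B - B @ A

# Reproduce [e12, [e12, [e12 + e23, e23 + e34]]] step by step.
step1 = comm(E(1, 2) + E(2, 3), E(2, 3) + E(3, 4))   # = e13 + e24
step2 = comm(E(1, 2), step1)                         # = e14
step3 = comm(E(1, 2), step2)                         # = 0

print(np.array_equal(step1, E(1, 3) + E(2, 4)))  # True
print(np.array_equal(step2, E(1, 4)))            # True
print(np.allclose(step3, 0))                     # True
```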

c) ##I^{(2)}=\{0\}\,:##
$$
\left[\begin{pmatrix}0&x_{12}&x_{13}&x_{14}\\0&0&x_{23}&x_{24}\\0&0&0&x_{34}\\0&0&0&0\end{pmatrix},\begin{pmatrix}0&y_{12}&y_{13}&y_{14}\\0&0&y_{23}&y_{24}\\0&0&0&y_{34}\\0&0&0&0\end{pmatrix}\right]=\begin{pmatrix}0&0&z_{13}&z_{14}\\0&0&0&z_{24}\\0&0&0&0\\0&0&0&0\end{pmatrix}
$$
which is abelian, so ##[[I,I],[I,I]]=\{0\}.## I haven't calculated the values of the ##z_{ij}## since we are only interested in the shape of the matrix, not its entries.

d) ##L/I \cong \left\{\begin{pmatrix}x_{11}&0&0&0\\0&x_{22}&0&0\\0&0&x_{33}&0\\0&0&0&x_{44}\end{pmatrix}\right\}## is abelian because ##[L,L]\ni [D+S,D'+S']=\underbrace{[D,D']}_{=0}+\underbrace{[D,S'] +[S,D']+[S,S']}_{\in I}.##
Note that ##I## is the zero in ##L/I## so ##L/I## is abelian, and therefore nilpotent, and therefore solvable.

e) Putting all these together we have:
$$
[L,L] \subseteq I \,\Longrightarrow\, \underbrace{[\underbrace{[\underbrace{[L,L]}_{\subseteq I},\underbrace{[L,L]}_{\subseteq I}]}_{\subseteq [I,I]},\underbrace{[\underbrace{[L,L]}_{\subseteq I},\underbrace{[L,L]}_{\subseteq I}]}_{\subseteq [I,I]}]}_{\subseteq [[I,I],[I,I]]=\{0\}}
$$
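
As a numerical sanity check of these containments, here is a small NumPy sketch (the names `comm`, `random_upper`, `bracket_all` are ad hoc for this illustration; checking finitely many random commutators only illustrates the inclusions, it does not prove them):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def comm(A, B):
    """Matrix commutator [A, B] = AB - BA."""
    return A @ B - B @ A

def random_upper():
    """A random upper triangular 4x4 matrix, i.e. an element of L."""
    return np.triu(rng.normal(size=(n, n)))

def bracket_all(mats):
    """All pairwise commutators; these are sample elements of the next derived algebra."""
    return [comm(A, B) for A in mats for B in mats]

L0 = [random_upper() for _ in range(3)]
L1 = bracket_all(L0)   # sample elements of L^(1) = [L, L]
L2 = bracket_all(L1)   # sample elements of L^(2)
L3 = bracket_all(L2)   # sample elements of L^(3)

# [L, L] has zero diagonal, i.e. it lands in the strictly upper triangular ideal I.
print(all(np.allclose(np.diag(M), 0) for M in L1))   # True

# L^(2) only has entries in positions (1,3), (1,4), (2,4).
allowed = np.zeros((n, n), dtype=bool)
allowed[0, 2] = allowed[0, 3] = allowed[1, 3] = True
print(all(np.allclose(M[~allowed], 0) for M in L2))  # True

# L^(3) vanishes, so L is solvable.
print(all(np.allclose(M, 0) for M in L3))            # True
```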

The idea behind (b) of the theorem is the following: we can write an element ##X\in L## as the sum of an element ##\bar X\in L/I## and an element ##S\in I##; here ##\bar X## plays the role of the diagonal matrix ##D## I began with. Thus
$$
[L,L]=[L/I + I\, , \,L/I +I]=\underbrace{[L/I,L/I]}_{\subseteq L/I} + \underbrace{[L/I,I]+[I,I]}_{\subseteq I}
$$
Since ##L/I ## is solvable, continued multiplication by itself will end up in zero, which is ##I##. But then we are left with an expression that is completely in ##I.## However, ##I## is solvable, too, so continued multiplication by itself will end up in ##\{0\}.##

I'm not quite sure if this answers your question. If we take a higher value of ##n## then only the chains get longer, but the result will be the same. If we take ##n=2## or ##n=3## then the chains are shorter.
 
  • #8
(a) is given by the definition of a Lie algebra homomorphism. Say we have ##\varphi \, : \,L\longrightarrow \varphi (L).## Then
$$
[\varphi (L),\varphi (L)]=\varphi ([L,L]) \; , \;[[\varphi (L),\varphi (L)],[\varphi (L),\varphi (L)]]=\varphi \left([[L,L],[L,L]]\right)
$$
and so on. If the chain of ##L##'s becomes zero, so will the chain of ##\varphi (L)##'s.

Note that the converse is not true. If a homomorphic image is solvable, i.e. ##(\varphi(L))^{(n)}=0,## then we can only conclude that ##L^{(n)} \subseteq \ker \varphi,## which is in general not zero. In that case we need the additional condition that ##\ker \varphi =\{0\},## i.e. that ##\varphi## is injective, or at least that ##\ker \varphi## is solvable, too.

You can take ##\varphi =\operatorname{ad}## as an important example and check it on the two-dimensional Lie algebra with ##[H,E]=2E## as multiplication.
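
If it helps, here is a small NumPy sketch of that suggestion (the matrices `adH`, `adE` are the adjoint maps written in the basis ##(H,E)##; the names are only for this illustration):

```python
import numpy as np

def comm(A, B):
    """Matrix commutator [A, B] = AB - BA."""
    return A @ B - B @ A

# Two-dimensional Lie algebra with basis (H, E) and [H, E] = 2E.
# In this basis the adjoint maps are:
#   ad(H): H -> 0,  E -> 2E        ad(E): H -> -2E,  E -> 0
adH = np.array([[0.0, 0.0],
                [0.0, 2.0]])
adE = np.array([[0.0, 0.0],
                [-2.0, 0.0]])

# Homomorphism property: [ad(H), ad(E)] = ad([H, E]) = ad(2E) = 2 ad(E).
print(np.allclose(comm(adH, adE), 2 * adE))   # True

# Derived series of the image ad(L): the first derived algebra is spanned by ad(E),
# and [ad(E), ad(E)] = 0, so ad(L)^(2) = 0 and ad(L) is solvable, just like L.
D1 = comm(adH, adE)
print(np.allclose(comm(D1, D1), 0))           # True

# Here ker(ad) is the center of L, which is zero, so ad is injective.
```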

Subalgebras are even easier. If ##U\subseteq L##, then ##[U,U]\subseteq [L,L]## and so on. If the chain of ##L##'s becomes zero, so does the corresponding chain of any subalgebra ##U \subseteq L.##

An example for (c) is the same as we used for (b). Take the diagonal matrices as ##J## and the strictly upper triangular matrices as ##I.## Both are solvable and so is their sum ##L.##

Edit: The diagonal matrices are only a subalgebra. If you insist on ideals, then take only those diagonal matrices that have the same value ##x_{11}=x_{22}=x_{33}=x_{44}.##
 

1. What is a solvable Lie algebra?

A Lie algebra is a vector space equipped with a bilinear operation, the Lie bracket, which is antisymmetric and satisfies the Jacobi identity. A Lie algebra is called solvable when its derived series, obtained by repeatedly bracketing the algebra with itself, reaches zero after finitely many steps.
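
In symbols, using the derived series notation from the thread above:
$$
L^{(0)}=L,\qquad L^{(i+1)}=\left[L^{(i)},L^{(i)}\right],\qquad L \text{ is solvable } \iff L^{(n)}=0 \text{ for some } n.
$$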

2. How is a solvable Lie algebra different from a general Lie algebra?

A solvable Lie algebra is a special kind of Lie algebra. The difference lies in the derived series: in a solvable Lie algebra, repeatedly bracketing the algebra with itself eventually yields zero, whereas in a general Lie algebra this chain need not shrink at all, since ##[L,L]## can be equal to ##L## itself.

3. What are some examples of solvable Lie algebras?

Some examples of solvable Lie algebras are the Heisenberg algebra, the algebra of upper triangular matrices, and more generally any nilpotent Lie algebra, since every nilpotent Lie algebra is solvable. In each case the iterated brackets vanish after finitely many steps.
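
For instance, the three-dimensional Heisenberg algebra has a basis ##X,Y,Z## with ##[X,Y]=Z## and all other basis brackets zero, so
$$
L^{(1)}=[L,L]=\operatorname{span}\{Z\},\qquad L^{(2)}=\left[L^{(1)},L^{(1)}\right]=0,
$$
and the algebra is indeed solvable (in fact nilpotent).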

4. How are solvable Lie algebras used in science?

Solvable Lie algebras have many applications in science, particularly in physics and engineering. They are used to study the symmetry and conservation laws of physical systems, as well as in the development of mathematical models and algorithms for solving complex problems.

5. What are the implications of a Lie algebra being solvable?

Solvability has important structural consequences. The derived series provides a chain of ideals whose successive quotients are abelian, so the algebra is built up from abelian pieces and becomes much easier to study and understand. In applications, solvability of a symmetry algebra also gives insight into the behavior of the corresponding physical system.
