Mistake in Schaum's Group Theory?

  • Thread starter jstrunk
  • Start date

WWGD

Science Advisor
Gold Member
4,470
1,924
Well the 4-,5- and 9-Lemmata should work in ##\mathbf{Top}## as well, but I'm still not convinced.
?? I've never heard of those names. Can you give a reference, please?
Do you agree with my argument using matrices?
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
?? I've never heard of those names. Can you give a reference, please?
Do you agree with my argument using matrices?
There is no problem with a finite basis and linearity. I only said that the element count doesn't work for infinite sets. The interesting case would be uncountable dimension: can we embed such a vector space in itself without being surjective? Or will Hamel bases, i.e. AC, save the day?
 

WWGD

Science Advisor
Gold Member
4,470
1,924
There is no problem with a finite basis and linearity. I only said that the element count doesn't work for infinite sets. The interesting case would be uncountable dimension: can we embed such a vector space in itself without being surjective? Or will Hamel bases, i.e. AC, save the day?
I never thought cardinality/count was enough; that wasn't part of my argument. I thought a key issue was that we are mapping a space to itself, and that matrices are not one-sidedly invertible, unlike some functions. EDIT: I guess, regarding the quotients ##X/S_1, X/S_2##, we need some condition on the maps passing to the quotient. Thanks for the 5-lemma reference; I will see how it fits in.
 

Math_QED

Science Advisor
Homework Helper
1,199
400
Why that? What would a counterexample be? A quick look at the 4-lemma suggests there are inclusions ##G_1/H_1 \rightarrowtail G_2/H_2 \rightarrowtail G_1/H_1##. I'm not sure whether there is also an epimorphism, but the two inclusions are a strong condition.
Exactly the point I was trying to make! It seems so true! Take ##G = \mathbb{Z}_4 \times \mathbb{Z}_2## and consider the cyclic subgroups generated by ##(2,0)## and ##(0,1)##. They are both cyclic of order 2 and hence isomorphic. Quotienting them out gives the two different groups of order ##4##: ##\mathbb{Z}_2 \times \mathbb{Z}_2## and ##\mathbb{Z}_4##, respectively.
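A quick sanity check of this counterexample (a minimal Python sketch; the group is modelled as pairs under componentwise modular addition, and the helper names are mine):

```python
from itertools import product

G = list(product(range(4), range(2)))      # Z_4 x Z_2 as pairs

def add(x, y):
    """Componentwise addition mod (4, 2)."""
    return ((x[0] + y[0]) % 4, (x[1] + y[1]) % 2)

def cyclic(gen):
    """Cyclic subgroup generated by gen."""
    H, x = set(), (0, 0)
    while x not in H:
        H.add(x)
        x = add(x, gen)
    return frozenset(H)

def coset(g, H):
    return frozenset(add(g, h) for h in H)

def quotient_element_orders(H):
    """Sorted list of the element orders in G/H."""
    orders = []
    for C in {coset(g, H) for g in G}:
        g = next(iter(C))                  # any representative of the coset
        x, k = g, 1
        while coset(x, H) != H:
            x, k = add(x, g), k + 1
        orders.append(k)
    return sorted(orders)

H1, H2 = cyclic((2, 0)), cyclic((0, 1))
print(len(H1), len(H2))                    # 2 2 -> both subgroups have order 2
print(quotient_element_orders(H1))         # [1, 2, 2, 2] -> G/H1 is Z_2 x Z_2
print(quotient_element_orders(H2))         # [1, 2, 4, 4] -> G/H2 is Z_4
```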
 
Last edited:

WWGD

Science Advisor
Gold Member
4,470
1,924
Maybe one way of doing the proof for a finite-dimensional vector space ##V## is to use the fact that, given an ordered basis ##\{ v_1, v_2,\dots,v_n \}##, there is an isomorphism between ##\operatorname{L}(V,V)## and ##\operatorname{Mat}_{n\times n}##. And then you can show a matrix can only have two-sided inverses, meaning it has to describe a map that is 1-1 and onto.

##AB=I \rightarrow B=A^{-1} \rightarrow BA =A^{-1}A=I##

So ##B## is both a right and a left inverse, and we have a two-sided inverse.
BAH. My argument here is incorrect; it does not automatically follow. It is an interesting exercise: show that for square ##A, B## of the same size, ##AB=I## implies ##BA=I##. It is not trivial that matrix inverses are necessarily two-sided. EDIT: I don't know if/how this can be generalized to non-commutative rings with identity: if ##rr'=1##, when does it follow that ##r'r=1##?
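For what it's worth, here is a small numerical illustration of the exercise (a numpy sketch, not a proof; the random matrices are just hypothetical examples): for square matrices the one-sided inverse turns out to be two-sided, while a rectangular matrix can have a right inverse that is not a left inverse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Square case: a random 4x4 matrix is (almost surely) invertible.
A = rng.standard_normal((4, 4))
B = np.linalg.solve(A, np.eye(4))        # B chosen so that A @ B = I
print(np.allclose(A @ B, np.eye(4)))     # True
print(np.allclose(B @ A, np.eye(4)))     # True: the right inverse is also a left inverse

# Rectangular case: a 2x3 matrix has a right inverse (the pseudoinverse) ...
A = rng.standard_normal((2, 3))
B = np.linalg.pinv(A)
print(np.allclose(A @ B, np.eye(2)))     # True:  A @ B = I_2
print(np.allclose(B @ A, np.eye(3)))     # False: B @ A is only a rank-2 projection
```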
 
Last edited:

StoneTemplePython

Science Advisor
Gold Member
1,105
527
BAH. My argument here is incorrect; it does not automatically follow. It is an interesting exercise: show that for square ##A, B## of the same size, ##AB=I## implies ##BA=I##. It is not trivial that matrix inverses are necessarily two-sided.
An easy way to do this for ##n \times n## matrices is to write the left inverse of ##B## as a linear combination of powers of ##B## (i.e. a polynomial in ##B##). Now right-multiply by ##B##: each side is then equal to the identity matrix, but your polynomial commutes with ##B## (evaluate term by term), which means your left inverse commutes with ##B## as well and hence is an actual inverse.

- - - -
The "easy" polynomial for the above uses Cayley–Hamilton. The much more basic result that is actually useful here is that these matrices live in a vector space of dimension ##n^2##, and hence there must be some monic polynomial of degree (at most) ##n^2## that annihilates them... once you have your polynomial in ##B## equal to zero, multiply on the left by the left inverse of ##B## a suitable number of times, move the lowest order term (the left inverse of ##B## itself) to the right hand side, and rescale as needed. (This is where the first paragraph starts.)
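Here is a sketch of the "easy" Cayley–Hamilton version in sympy (the 3×3 matrix ##B## is just a hypothetical example with nonzero determinant): the candidate inverse is built purely as a polynomial in ##B##, so it commutes with ##B## and is therefore a two-sided inverse.

```python
import sympy as sp

B = sp.Matrix([[2, 1, 0],
               [0, 1, 3],
               [1, 0, 1]])                    # det(B) = 5 != 0

x = sp.symbols('x')
a3, a2, a1, a0 = B.charpoly(x).all_coeffs()   # monic: a3 = 1, and a0 = -det(B) != 0

# Cayley-Hamilton: B^3 + a2*B^2 + a1*B + a0*I = 0, hence
# B * ( -(B^2 + a2*B + a1*I) / a0 ) = I, and the bracket is a polynomial in B.
C = -(B**2 + a2 * B + a1 * sp.eye(3)) / a0

print(C * B - sp.eye(3))    # zero matrix: C is a left inverse of B
print(B * C - sp.eye(3))    # zero matrix: C commutes with B, so it is a right inverse too
```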
 

Math_QED

Science Advisor
Homework Helper
1,199
400
BAH. My argument here is incorrect; it does not automatically follow. It is an interesting exercise: Show that for A,B same size, AB=I , then BA=I. Not trivial that matrix inverses are necessarily two-sided. EDIT: I don't know if/how this can be generalized to non-commutative rings with identity: if rr'=1 , when does it follow that r'r=1?
##AB= I\implies BA = B(AB)B^{-1} = BB^{-1}= I##
An inverse exists by a determinant argument: ##AB=I## means that ##\det B \neq 0 \neq \det A##

In general in non-commutative rings ##rr'=1## does not imply ##r'r=1##. See e.g. here https://math.stackexchange.com/q/70777/661543
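A standard counterexample of this kind is the pair of shift operators on the space of sequences, which sit in a non-commutative ring and satisfy ##LR = 1## but ##RL \neq 1##. A minimal Python sketch (sequences modelled as tuples, function names mine):

```python
def right_shift(seq):
    """R: (a0, a1, a2, ...) -> (0, a0, a1, a2, ...)"""
    return (0,) + tuple(seq)

def left_shift(seq):
    """L: (a0, a1, a2, ...) -> (a1, a2, ...)"""
    return tuple(seq)[1:]

v = (1, 2, 3)
print(left_shift(right_shift(v)))   # (1, 2, 3): L o R is the identity
print(right_shift(left_shift(v)))   # (0, 2, 3): R o L is not the identity
```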
 
Last edited:

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
##AB= I\implies BA = B(AB)B^{-1} = BB^{-1}= I##
An inverse exists by a determinant argument: ##AB=I## means that ##\det B \neq 0 \neq \det A##

In general in non-commutative rings ##rr'=1## does not imply ##r'r=1##. See e.g. here https://math.stackexchange.com/q/70777/661543
Yes, but you cheated a bit: the multiplicativity of the determinant, and the existence of ##B^{-1}## as a left and right inverse, which is what we wanted to show! The condition does not hold in arbitrary rings, but it does in groups. So the essential question behind the task was: can we show that ##GL(V)## is a group, and thus ##AB=I \Longrightarrow BA=I##, without using linear algebra? It's easy if there exists a left inverse, but the existence will probably require facts from the definition of the group. Can this be proven, or can we conclude the existence of a ##C## with ##CA=I## from ##AB=I## alone?
 

WWGD

Science Advisor
Gold Member
4,470
1,924
##BABB^{-1} =I ## implies ##AB=AB^{-1} ## , not necessarily the identity. Maybe we can argue:
##BA=I ## , then ##AB= ABAB=I \rightarrow (AB)^n =I ## , has only identity as solution ( since this is true for all natural ##n## )?
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
You said earlier that ##AB=I## is given. I think you confused the order now. This is very confusing, as the order is all we talk about. But given ##AB=I## the following doesn't make much sense:
##BABB^{-1} =I ## implies ##AB=AB^{-1} ## , not necessarily the identity. Maybe we can argue:
##BA=I ## , then ##AB= ABAB=I \rightarrow (AB)^n =I ## , has only identity as solution ( since this is true for all natural ##n## )?
If we allow a left inverse ##CA=I## then we immediately have ##B=IB=(CA)B=C(AB)=CI=C##. The existence of ##C## is the problem. To write it as ##B^{-1}## is cheating.
 

WWGD

Science Advisor
Gold Member
4,470
1,924
You said earlier that ##AB=I## is given. I think you confused the order now. This is very confusing, as the order is all we talk about. But given ##AB=I## the following doesn't make much sense:

If we allow a left inverse ##CA=I## then we immediately have ##B=IB=(CA)B=C(AB)=CI=C##. The existence of ##C## is the problem. To write it as ##B^{-1}## is cheating.
Yes, my bad, I mixed both things up. Let me go for another doppio, a 2-sided doppio. I intended the post to contain both left and right inverses, but I somehow got logged off a few times and lost track of what I was doing :(.
 

WWGD

Science Advisor
Gold Member
4,470
1,924
Ok, this is the argument I intended:
Assume ##AB=I ##
Then ##BA=B(AB)A=(BA)^2=I ## ( just need associativity) and we can extend to ##(BA)^n =I ## for all n
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
Ok, this is the argument I intended:
Assume ##AB=I ##
Then ##BA=B(AB)A=(BA)^2=I ## ( just need associativity) and we can extend to ##(BA)^n =I ## for all n
That's wrong. If ##BA=C## then all we have is ##C=BA=B(AB)A = (BA)^2=C^2##, and by recursion ##C=C^n## for all ##n##. Since there are idempotent matrices which are not the identity, I don't see how ##C=I## should follow.
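A quick numeric illustration (a numpy sketch; the projection below is just a hypothetical example): an idempotent matrix satisfies ##C=C^n## for every ##n## without being the identity. Of course its determinant is ##0##, which is exactly what the determinant remark below rules out.

```python
import numpy as np

C = np.array([[1, 0],
              [0, 0]])                                   # projection onto the first axis

print(np.array_equal(np.linalg.matrix_power(C, 5), C))   # True: C^5 = C (idempotent)
print(np.array_equal(C, np.eye(2, dtype=int)))           # False: C is not the identity
```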
 

WWGD

Science Advisor
Gold Member
4,470
1,924
That's wrong. If ##BA=C## then all we have is ##C=BA=B(AB)A = (BA)^2=C^2##, and by recursion ##C=C^n## for all ##n##. Since there are idempotent matrices which are not the identity, I don't see how ##C=I## should follow.
Yes, use ##BA## or ##C##, either way. And no, not ##C^n =C## for _some_ ##n##, but ##C^n =C## _for all_ ##n##. Do you have a non-identity ##C## with ##C^2=C^3=\dots=C^n =C##? Maybe; I don't know of one. Also ##\det C =1## here; let's assume real entries. If not, the determinant would have to be a 2nd, 3rd, ..., nth, ... root of unity. Is there such a number?
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
Yes, use ##BA## or ##C##, either way. And no, not ##C^n =C## for _some_ ##n##, but ##C^n =C## _for all_ ##n##. Do you have a non-identity ##C## with ##C^2=C^3=\dots=C^n =C##?
That's what I said, for all ##n##. And, no, I don't have such a ##C##, but this isn't a formal proof, especially if we do not use linear algebra.
 

WWGD

Science Advisor
Gold Member
4,470
1,924
That's what I said, for all ##n##. And, no, I don't have such a ##C##, but this isn't a formal proof, especially if we do not use linear algebra.
Yes, I know, I never said it was a proof. But this matrix ##BA## satisfies infinitely many polynomial equations ##C^n-C =0##, which is also strange, but, I admit, not a proof (yet?).
 

WWGD

Science Advisor
Gold Member
4,470
1,924
Just experimenting before aiming for a formal proof.
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
Well, ##C=0## is a solution, too. And theoretically we can have zero divisors from the left and not from the right, and similar nasty things. But the question "can we prove ##G=GL(V)## is a group" using only ##AB=I## and no linear algebra is probably doomed to fail, since, as @Math_QED correctly mentioned, it doesn't work in rings. But if it's a group, then we are done. If we do not allow this, we have to use something from the definition of ##G##, and that is linear algebra.
 

WWGD

Science Advisor
Gold Member
4,470
1,924
Well, ##C=0## is a solution, too. And theoretically we can have zero divisors from the left and not from the right, and similar nasty things. But the question "can we prove ##G=GL(V)## is a group" using only ##AB=I## and no linear algebra is probably doomed to fail, since, as @Math_QED correctly mentioned, it doesn't work in rings. But if it's a group, then we are done. If we do not allow this, we have to use something from the definition of ##G##, and that is linear algebra.
##C=0## is not a solution: ##AB=I##, so ##\det(AB)=\det A \,\det B=1##, and ##\det C=\det(BA) =1##, but ##\det 0=0##. EDIT: ##\det C =1## here over the reals; or, if you allow complex entries and ##\det C \neq 1##, then ##\det C## would have to be a 2nd, 3rd, ..., nth, ... root of unity.
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
##C=0## is not a solution: ##AB=I##, so ##\det(AB)=\det A\,\det B=1##, and ##\det 0=0##.
We are still on the necessity part. The damn existence of ##C## is the problem. And ##0## is a solution to ##C=C^n## for all ##n##. If you use the determinant, you have to mention the field! What I'm saying is: we will need some argument from LA. In the end we could simply write down the formula for the inverse in terms of ##A## and be done with it.
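Presumably the "formula for the inverse in terms of ##A##" is the adjugate/cofactor formula ##A^{-1}=\operatorname{adj}(A)/\det(A)##; a minimal sympy sketch (the 2×2 matrix is just an example):

```python
import sympy as sp

A = sp.Matrix([[1, 2],
               [3, 5]])          # any matrix with det(A) != 0

C = A.adjugate() / A.det()       # explicit candidate inverse, written in terms of A alone

print(C * A)    # identity matrix: left inverse
print(A * C)    # identity matrix: right inverse
```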
 

WWGD

Science Advisor
Gold Member
4,470
1,924
We are still on the necessity part. The damn existence of ##C## is the problem. And ##0## is a solution to ##C=C^n## for all ##n##. If you use the determinant, you have to mention the field! What I'm saying is: we will need some argument from LA.
Well, yes: no finite fields here, so ##C^n=C##, and I had specified the complexes or the reals. Anyway, I don't see how this relates to Los Angeles, LA ;), but at any rate, I think we may be able to relax the condition of linearity. All we need is the scaling property and to show that the image is not meager, i.e. that it contains a ball about the origin; then, if we consider a line segment ##y=cx## from the origin, the multiples of that segment show that every point is hit.
 

fresh_42

Mentor
Insights Author
2018 Award
11,561
8,015
We still have the difficulty that zero divisors are possible as long as we don't have a group. The (unique) solvability of ##x A = B## is equivalent to the group axioms, so we keep circling around the main topic. Guess we have to visit La La Land.
 

mathwonk

Science Advisor
Homework Helper
10,730
908
The integers are a group in which all nonzero subgroups are normal and isomorphic to each other, but the quotients can be any finite cyclic group.
 
32,716
4,459
##(0,1) \stackrel{\operatorname{id}}{\hookrightarrow} \mathbb{R}-\{\,0\,\}\stackrel{\varphi}{\rightarrowtail} (-\frac{1}{4},\frac{1}{4}) -\{\,0\,\} \stackrel{+\frac{1}{3}}{\hookrightarrow} (0,1)##
where I only need a bijection ##\varphi## between the real line and an open interval, both without zero to make it easier.
If I understand what your notation conveys, the last bit should be ## \stackrel{+\frac{1}{4}}{\hookrightarrow} (0,1)##, since you're adding 1/4 to each element of the punctured interval (-1/4, 1/4).
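For concreteness, here is a sketch of the composite injection with one hypothetical choice of ##\varphi##, namely ##\varphi(x)=\frac{x}{4(1+|x|)}##, and the ##+\frac{1}{4}## shift suggested above:

```python
def phi(x):
    """A bijection from the nonzero reals onto (-1/4, 1/4) minus {0}."""
    return x / (4 * (1 + abs(x)))

def embed(x):
    """(0,1) --id--> nonzero reals --phi--> (-1/4,1/4) minus {0} --(+1/4)--> (0,1)."""
    return phi(x) + 0.25

samples = [0.001, 0.25, 0.5, 0.75, 0.999]
images = [embed(x) for x in samples]
print(images)
print(all(0 < y < 1 for y in images))   # True: lands in (0,1), but misses e.g. 0.9,
                                        # so the embedding is injective and not surjective
```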
 
