# Linear algebra, prove matrix inverse proof flawed

by karnten07
Tags: algebra, flawed, inverse, linear, matrix, proof, prove
 P: 208 1. The problem statement, all variables and given/known data

Theorem: Every square matrix which has a right inverse is invertible. More precisely: let $$A \in M_{n\times n}(\mathbb{R})$$ and suppose there is a matrix $$B \in M_{n\times n}(\mathbb{R})$$ such that $$AB = I_n$$; then we have $$BA = I_n$$ as well.

The object of this exercise is to explain why the following proof is flawed:

Proof: Let G be the set of all matrices in $$M_{n\times n}(\mathbb{R})$$ which have a right inverse in $$M_{n\times n}(\mathbb{R})$$. Then G together with matrix multiplication is a group. Now proposition 1.3(b) implies the theorem.

Proposition 1.3b: Let G be a group, i.e. a set with an operation $$*: G \times G \to G,\ (a,b) \mapsto a*b$$ such that
$$\forall a,b,c \in G:\ (a*b)*c = a*(b*c)$$
$$\exists e \in G\ \forall a \in G:\ e*a = a = a*e$$
$$\forall a \in G\ \exists a' \in G:\ a*a' = e$$
Then for any $$a \in G$$ there exists precisely one right inverse $$a'$$, and it is also a left inverse of a. We write $$a^{-1}$$ for the inverse of a.

Proof of proposition 1.3b: Let $$a'$$ be a right inverse of a. Then
$$(a'*a)*(a'*a) = a'*(a*(a'*a))$$ by associativity
$$= a'*((a*a')*a)$$ by associativity
$$= a'*(e*a)$$ because $$a'$$ is a right inverse of a
$$= a'*a$$ because e is an identity element
Let b be a right inverse of $$c := a'*a$$. Then
$$c = c*e = c*(c*b)$$ since b is a right inverse of c
$$= (c*c)*b$$ by associativity
$$= c*b$$ by the computation above
$$= e$$ because b is a right inverse of c
Hence $$a'$$ is a left inverse of a.

Note: proposition 1.3b is what is given in the lecture notes.

2. Relevant equations

3. The attempt at a solution

Does this have something to do with matrix multiplication being associative and distributive but not always commutative?
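[Editor's aside: as a numerical sanity check of the theorem itself (not of the disputed proof), one can verify with numpy that a right inverse of a generic square matrix also acts as a left inverse. This is only an illustration; the variable names and the use of numpy are my own.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # a generic (hence invertible) square matrix

# Find B with A @ B = I_n, i.e. a right inverse of A.
B = np.linalg.solve(A, np.eye(n))

# The theorem asserts that B is then automatically a left inverse as well.
print(np.allclose(A @ B, np.eye(n)))  # True: AB = I
print(np.allclose(B @ A, np.eye(n)))  # True: BA = I
```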
 P: 208 Sorry, I can't seem to get LaTeX to work here; it is making things superscript when they should be subscript, and the arrows should be "maps to" arrows.
 Emeritus Sci Advisor PF Gold P: 16,091 (1) It works better amongst regular text if you use [ itex ] instead of [ tex ]. (2) It works better still if you put an entire expression inside one pair of tags. (instead of putting a single symbol)
 P: 208 Linear algebra, prove matrix inverse proof flawed Is it that proposition 1.3b doesn't explicitly describe an abelian group, and that for the theorem to hold we need commutativity of the matrix and its inverse, i.e. that the right and left inverses are the same?
P: 208
 Quote by karnten07 (full problem statement quoted from the original post above)
Hi guys, I'm still having real trouble with this question, and I have tried to think through it systematically. I have an idea as to why the proof is flawed. Is it because proposition 1.3b says:

$$*: G \times G \rightarrow G$$

So applying proposition 1.3b here means taking G to be the set of all square matrices which have right inverses. When we multiply two elements of our G, does the product always have a right inverse itself, i.e. is the product again an element of the original set G? If not, this might be the flaw?
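[Editor's aside: closure does in fact hold here, since if $$A'$$ and $$B'$$ are right inverses of A and B then $$(AB)(B'A') = A(BB')A' = AA' = I$$, so AB is again in G. A small numpy sketch of that identity, with variable names of my own choosing:]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Right inverses: A @ Ar = I and B @ Br = I.
Ar = np.linalg.solve(A, np.eye(n))
Br = np.linalg.solve(B, np.eye(n))

# (A B)(Br Ar) = A (B Br) Ar = A Ar = I, so A @ B has a right inverse too.
print(np.allclose((A @ B) @ (Br @ Ar), np.eye(n)))  # True
```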

Thoughts would be very appreciated, thanks.
 P: 208 On second thoughts, I think the proposition takes care of this fact, because the next line says $$(a,b) \mapsto a*b$$. I think the ¦ sign is actually a comma that I copied wrong from the board, so the sentence means that * assigns the element a*b to the ordered pair (a, b); in particular the product of two elements of G lands back in G. There is another part of proposition 1.3b which I didn't reproduce, because the proof that $$a^{-1}$$ is also a left inverse of a had already been shown. Here it is anyway, since I can't think of anything else that could be the flaw: suppose $$a'' \in G$$ is another right inverse of a. Then $$a' = a'*e = a'*(a*a'') = (a'*a)*a'' = e*a'' = a''$$ by associativity, using that $$a'$$ is also a left inverse of a. I'm really out of ideas, anyone have any?
 P: 208 Ah, could this be the problem? The proof of 1.3b says: "Let b be a right inverse of $$c := a'*a$$" and then uses b to conclude that $$a'$$ is a left inverse of a. But b is taken to be a right inverse of c, where $$c = a'*a$$, and $$a'*a$$ is not strictly guaranteed to have a right inverse residing within our G?
Math
Emeritus
Thanks
PF Gold
P: 39,552
I haven't read through the entire thing but there is an obvious error right at the start:

 Let G be the set of all matrices in $$M_{n\times n}(\mathbb{R})$$ which have a right inverse in $$M_{n\times n}(\mathbb{R})$$. Then G together with matrix multiplication is a group.
You are trying to prove that each such matrix has an inverse, but asserting that G is a group is the same as asserting that each element has an inverse.
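[Editor's aside: to see why the group assertion genuinely needs proof, note that the analogous statement fails for non-square matrices: there a right inverse need not be a left inverse, so in the square case something special (e.g. a rank argument) must be doing the work. A minimal illustration, with matrices chosen by me for the example:]

```python
import numpy as np

# A is 2x3 and B is 3x2 with A @ B = I_2, yet B @ A != I_3.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

print(np.allclose(A @ B, np.eye(2)))  # True: B is a right inverse of A
print(np.allclose(B @ A, np.eye(3)))  # False: B is not a left inverse of A
```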
P: 208
 Quote by HallsofIvy I haven't read through the entire thing but there is an obvious error right at the start: You are trying to prove that each such matrix has an inverse but asserting that G is a group is the same as asserting each has an inverse.
I see what you are saying, but I thought the main point of the theorem was to show that the right inverse of an element of G is also its left inverse, i.e. $$AA^{-1} = I$$ and $$A^{-1}A = I$$.

Also, it says that each square matrix in the set has a right inverse, and so the set is claimed to be a group because there is an inverse for each element.

I hope I am missing a point there and that you are right, because I just want to move on from this question lol, thanks again.
P: 208
 Quote by HallsofIvy I haven't read through the entire thing but there is an obvious error right at the start: You are trying to prove that each such matrix has an inverse but asserting that G is a group is the same as asserting each has an inverse.
I see it now, I just had to reread the definition of a group. The proof is flawed because, in asserting that the set G with matrix multiplication is a group, it assumes the theorem to be true. For G to be a group it must meet the condition that:

if $$x \in G$$, then $$y \in G$$ is an inverse element of x if $$x*y = e$$ and $$y*x = e$$, where e is an identity element of G,

which is exactly what we want to prove, so we can't make this assumption in proving it.

This seems good to me? Thanks HallsofIvy, you guys are clever!!
