Linear algebra: explaining why a matrix-inverse proof is flawed


Homework Help Overview

The discussion revolves around a theorem in linear algebra concerning the invertibility of square matrices with right inverses. The original poster presents a proof that claims every square matrix with a right inverse is invertible, specifically addressing the implications of a group structure formed by matrices with right inverses.

Discussion Character

  • Conceptual clarification, Assumption checking, Problem interpretation

Approaches and Questions Raised

  • Participants explore the implications of matrix multiplication being associative but not necessarily commutative, questioning whether the proof's assumptions about group properties hold.
  • Some participants consider whether the proof's reliance on the group structure of matrices with right inverses is valid, particularly regarding the definition of the set G.
  • Others raise concerns about the existence of right inverses for products of matrices within the set G.
  • There is a discussion about the necessity of commutativity for the theorem to hold true, with some suggesting that the proof does not adequately address this aspect.

Discussion Status

The discussion is ongoing, with participants actively questioning the assumptions made in the proof and exploring various interpretations of the theorem. Some have provided insights into potential flaws in the proof, particularly regarding the definition of the group and the implications of matrix multiplication. There is no explicit consensus yet, but several productive lines of inquiry have emerged.

Contextual Notes

Participants note that the proof assumes the theorem's validity in defining the set G, which may not be justified. The discussion also highlights the challenges of using LaTeX formatting for mathematical expressions, which has affected clarity in communication.

karnten07
Linear algebra, flawed proof

Homework Statement


Theorem: Every square matrix which has a right inverse is invertible. More precisely: let A ∈ M_{n×n}(ℝ) and suppose there is a matrix B ∈ M_{n×n}(ℝ) such that AB = I_n; then we have BA = I_n as well.

The object of this exercise is to explain why the following proof is flawed:

Proof: Let G be the set of all matrices in M_{n×n}(ℝ) which have a right inverse in M_{n×n}(ℝ). Then G together with matrix multiplication is a group. Now proposition 1.3(b) implies the theorem:

Proposition 1.3b:

Let G be a group, i.e. a set G with an operation

*: G × G → G
(a, b) ↦ a*b

satisfying

∀ a, b, c ∈ G: (a*b)*c = a*(b*c)
∃ e ∈ G ∀ a ∈ G: e*a = a = a*e
∀ a ∈ G ∃ a' ∈ G: a*a' = e

Then for any a ∈ G there exists precisely one right inverse a', and this is also a left inverse of a (that is, a'*a = e). We write a^{-1} for the inverse of a.

Proof of proposition 1.3b:
Let a' be a right inverse of a. Then
(a'*a)*(a'*a) = a'*(a*(a'*a)) by associativity
= a'*((a*a')*a) by associativity
= a'*(e*a) because a' is a right inverse of a
= a'*a because e is an identity element,
so c := a'*a satisfies c*c = c.
Now let b be a right inverse of c. Then
e = c*b because b is a right inverse of c
= (c*c)*b since c*c = c
= c*(c*b) by associativity
= c*e because b is a right inverse of c
= c because e is an identity element.
Hence a'*a = e, i.e. a' is a left inverse of a.

Note: proposition 1.3b is what is given in the lecture notes.


Homework Equations





The Attempt at a Solution



Does this have something to do with matrix multiplication being associative and distributive but not always commutative?
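For what it's worth, the theorem itself can be checked numerically. The sketch below (my own illustration using numpy, not a proof) constructs a right inverse of a generic square matrix and observes that it is also a left inverse:

```python
import numpy as np

# Illustration only, not a proof: for a generic square matrix A,
# a matrix B with A @ B = I also satisfies B @ A = I.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))       # generic 4x4 matrix, almost surely invertible

# Solve A @ B = I for B, i.e. construct a right inverse of A.
B = np.linalg.solve(A, np.eye(4))

print(np.allclose(A @ B, np.eye(4)))  # True: right inverse by construction
print(np.allclose(B @ A, np.eye(4)))  # True: also a left inverse, as the theorem asserts
```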
 
Sorry, I can't seem to get LaTeX to work here; it is making things superscript when they should be subscript, and the arrows should be "maps to".
 
(1) It works better amongst regular text if you use [ itex ] instead of [ tex ].
(2) It works better still if you put an entire expression inside one pair of tags. (instead of putting a single symbol)
 
Is it that proposition 1.3b doesn't explicitly describe a group that is abelian, and that for the theorem to be true it requires commutativity of the matrix and its inverse, i.e. that the right and left inverses are the same?
 
karnten07 said:

Does this have something to do with matrix multiplication being associative and distributive but not always commutative?

Hi guys, I'm still having real trouble with this question, and I have tried to think through it systematically. I have an idea as to why the proof might be flawed. Is it because proposition 1.3b says:

*: G × G → G

Applying proposition 1.3b here means taking G to be the set of all square matrices which have right inverses. But when we apply matrix multiplication to two elements of G, does the product always give another square matrix that itself has a right inverse, i.e. one that belongs to the original set G? If not, this might be the flaw?

Thoughts would be very appreciated, thanks.
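As it happens, closure can be checked without assuming the theorem: if AB = I and CD = I, then (AC)(DB) = A(CD)B = AB = I, so the product AC again has a right inverse. A small numpy check of this identity (my own illustrative sketch, not part of the thread's proof):

```python
import numpy as np

# If B is a right inverse of A and D is a right inverse of C, then
# D @ B is a right inverse of A @ C, since (AC)(DB) = A(CD)B = AB = I.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))
B = np.linalg.solve(A, np.eye(3))  # right inverse of A
D = np.linalg.solve(C, np.eye(3))  # right inverse of C

print(np.allclose((A @ C) @ (D @ B), np.eye(3)))  # True: the product stays in G
```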
 
On second thought, I think the proposition takes care of this fact, because on the next line it says:

(a, b) ↦ a*b

I think the ¦ sign might actually be a comma; I probably copied it wrong from the board.

So I think this line is meant to say that the operation * assigns the element a*b to the ordered pair (a, b).

There is another part to proposition 1.3b which I thought I didn't need to reproduce, because the proof that a^{-1} is also a left inverse of a had been shown. But here it is anyway, because I can't think of anything else that could be the flaw:

Suppose a'' ∈ G is another right inverse of a. Then by associativity

a'*(a*a'') = (a'*a)*a'',

where a'*(a*a'') = a'*e = a' since a'' is a right inverse of a, and (a'*a)*a'' = e*a'' = a'' since a' is a left inverse of a. Hence a' = a''.

I'm really out of ideas, anyone have any?
 
Ah, could this be the problem?

Let b be a right inverse of c := a'*a. Then
e = c*b = (c*c)*b since c*c = c
= c*(c*b) by associativity
= c*e since b is a right inverse of c
= c because e is an identity element.
Hence a' is a left inverse of a.

It says b is a right inverse of c, where c = a'*a; but is a'*a actually guaranteed to have a right inverse, let alone one residing within our G?
 
I haven't read through the entire thing but there is an obvious error right at the start:

Let G be the set of all matrices in M_{n×n}(ℝ) which have a right inverse in M_{n×n}(ℝ). Then G together with matrix multiplication is a group.
You are trying to prove that each such matrix has an inverse but asserting that G is a group is the same as asserting each has an inverse.
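As a side note on why the square case genuinely needs proof: for non-square matrices a right inverse need not be a left inverse. A minimal numpy example (my own illustration, not taken from the thread):

```python
import numpy as np

# A is 1x2 and B is 2x1: B is a right inverse of A but not a left inverse.
A = np.array([[1.0, 0.0]])
B = np.array([[1.0], [0.0]])

print(A @ B)  # [[1.]] = I_1, so AB = I
print(B @ A)  # [[1., 0.], [0., 0.]] != I_2, so BA != I
```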
 
HallsofIvy said:
I haven't read through the entire thing but there is an obvious error right at the start:


You are trying to prove that each such matrix has an inverse but asserting that G is a group is the same as asserting each has an inverse.

I see what you are saying, but I thought the main point of the theorem was to show that the right inverse of an element of G is also its left inverse, i.e. that A·A^{-1} = I and A^{-1}·A = I.

Also, it says that each square matrix in the set has a right inverse, so I thought G was a group because there is an inverse for each element.

I hope I am missing a point there and that you are right, because I just want to move on from this question lol. Thanks again.
 
HallsofIvy said:
You are trying to prove that each such matrix has an inverse but asserting that G is a group is the same as asserting each has an inverse.

I see it now; I just had to reread the definition of a group. The proof is flawed because it assumes the theorem to be true when it asserts that the set G with matrix multiplication is a group. For G to be a group, it must satisfy the condition that:

if x ∈ G, then y ∈ G is an inverse element of x if x*y = e and y*x = e, where e is an identity element of G,

which is exactly what we want to prove, so we cannot assume it in the proof.

This seems good to me? Thanks HallsofIvy, you guys are clever!
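Put compactly, the inverse axiom that makes G a group is exactly the statement being proved:

```latex
\text{$G$ is a group} \;\Longrightarrow\; \forall A \in G \ \exists B \in G:\ AB = I_n \ \text{and}\ BA = I_n,
```

which is the theorem itself, so asserting that G is a group is circular.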
 
