karnten07
Linear algebra, flawed proof
Homework Statement
Theorem: Every square matrix which has a right inverse is invertible. More precisely: let $A \in M_{n \times n}(\mathbb{R})$ and suppose there is a matrix $B \in M_{n \times n}(\mathbb{R})$ such that $AB = I_n$; then we have $BA = I_n$ as well.
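(As a quick numerical sanity check of the statement, not part of the exercise: the sketch below, using numpy, illustrates that for a square matrix a right inverse also works from the left, and that this fails for non-square matrices, which is why the theorem insists on squareness. The particular matrices are my own examples.)

import numpy as np

# Square case: A has a right inverse B, and B turns out to be a left inverse too.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(A)                   # here A @ B = I_2
print(np.allclose(A @ B, np.eye(2)))   # True
print(np.allclose(B @ A, np.eye(2)))   # True: also a left inverse

# Non-square contrast: A @ B = I_2 but B @ A != I_3.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])        # 2 x 3
B = A.T                                # 3 x 2, a right inverse of A
print(np.allclose(A @ B, np.eye(2)))   # True
print(np.allclose(B @ A, np.eye(3)))   # False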
The object of this exercise is to explain why the following proof is flawed:
Proof: Let $G$ be the set of all matrices in $M_{n \times n}(\mathbb{R})$ which have a right inverse in $M_{n \times n}(\mathbb{R})$. Then $G$ together with matrix multiplication is a group. Now Proposition 1.3(b) implies the theorem:
Proposition 1.3(b):
Let $G$ be a group, i.e. a set $G$ with an operation
$*: G \times G \to G$, $(a, b) \mapsto a * b$
satisfying
$\forall a, b, c \in G:\ (a*b)*c = a*(b*c)$ (associativity),
$\exists e \in G\ \forall a \in G:\ e*a = a = a*e$ (identity element),
$\forall a \in G\ \exists a' \in G:\ a*a' = e$ (right inverses).
Then for any $a \in G$ there exists precisely one right inverse $a'$, and it also satisfies $a'*a = e$, i.e. it is a left inverse of $a$. We write $a^{-1}$ for the inverse of $a$.
Proof of Proposition 1.3(b):
Let $a'$ be a right inverse of $a$. Then
$(a'*a)*(a'*a) = a'*(a*(a'*a))$ by associativity
$= a'*((a*a')*a)$ by associativity
$= a'*(e*a)$ because $a'$ is a right inverse of $a$
$= a'*a$ because $e$ is an identity element,
so $c := a'*a$ satisfies $c*c = c$.
Now let $b$ be a right inverse of $c := a'*a$. Then
$e = c*b$ because $b$ is a right inverse of $c$
$= (c*c)*b$ since $c*c = c$
$= c*(c*b)$ by associativity
$= c*e$ since $b$ is a right inverse of $c$
$= c$ because $e$ is an identity element.
Hence $a'*a = c = e$, so $a'$ is a left inverse of $a$.
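(To see Proposition 1.3(b) in action on a concrete group, here is a small Python check, my own illustration, using $S_3$, the permutations of $\{0,1,2\}$ under composition: every element has precisely one right inverse, and that right inverse is also a left inverse.)

from itertools import permutations

# S_3: the six permutations of (0, 1, 2), with composition as the operation.
G = list(permutations(range(3)))
e = (0, 1, 2)  # identity permutation

def op(p, q):
    # Composition: (p * q)(i) = p(q(i)).
    return tuple(p[q[i]] for i in range(3))

for a in G:
    right_invs = [x for x in G if op(a, x) == e]
    assert len(right_invs) == 1    # precisely one right inverse a'
    a_prime = right_invs[0]
    assert op(a_prime, a) == e     # ...and a' is also a left inverse
print("checked: right inverses in S_3 are unique and two-sided")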
Note: Proposition 1.3(b) is stated as given in the lecture notes.
Homework Equations
The Attempt at a Solution
Does this have something to do with matrix multiplication being associative and distributive but not always commutative?