A question on invertible matrices

  • Context: Undergrad 
  • Thread starter: kostas230

Discussion Overview

The discussion centers around the properties of invertible matrices in linear algebra, specifically exploring the relationship between the definitions of invertibility and the implications of one matrix product equaling the identity matrix. The scope includes theoretical aspects and mathematical reasoning.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant presents a definition of an invertible matrix and asks how to prove that if BA = I, then AB = I using matrix theory.
  • Another participant interprets the question as proving AB = I given BA = I and suggests a method involving the inverse matrices.
  • A third participant challenges the assumption that proving AB = I is straightforward, emphasizing that it requires A and B to be square matrices and providing a counterexample with non-square matrices.
  • This participant outlines a proof strategy involving homogeneous and inhomogeneous systems to show that if BA = I, then it leads to the conclusion that AB = I, assuming square matrices.
  • A final participant reiterates that the conclusion does not hold unless the matrices involved are square.

Areas of Agreement / Disagreement

Participants disagree about the validity of the first proof attempt, which assumes the existence of the inverse it sets out to establish. They converge on the point that the implication from BA = I to AB = I holds only when the matrices involved are square.

Contextual Notes

The discussion highlights the importance of matrix dimensions in the context of invertibility and the implications of matrix products equaling the identity matrix. The proof provided relies on assumptions that may not hold in all cases.

kostas230
I'm making notes for linear algebra, and I'm using the weaker definition for the invertible matrix:

"A matrix A is called invertible it there exists a matrix B such that BA = I"

How do we prove that AB = I using only matrix theory?
 
You mean how one would prove that [itex]AB = I[/itex] provided that [itex]BA = I[/itex]?

If you assume that [itex]BA = I[/itex] and thus that [itex]A^{-1} = B[/itex] and [itex]A = B^{-1}[/itex] you could try multiplying from the left with [itex]B^{-1}[/itex] and from the right with [itex]A^{-1}[/itex], which gives you

[itex]B^{-1}BAA^{-1} = B^{-1}IA^{-1} \ .[/itex]
 
Square, you are assuming what should be proved: you assume that there is a matrix ##A^{-1}## such that ##AA^{-1}=A^{-1}A=I##, but the existence of such a two-sided inverse is exactly what is to be shown.

To prove that ##BA=I## implies ##AB=I## is not entirely easy. First of all, we must assume that ##A## and/or ##B## are square matrices. Otherwise, this is false. For example, if

##B=[1\quad 1]## and ##A=[2\quad -1]^{T}##, then ##BA=[1]=I_1## and ##AB\neq I_2##.
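This counterexample is easy to check numerically; a minimal sketch using NumPy, with the matrices taken straight from the post:

```python
import numpy as np

# Counterexample with non-square matrices:
# B is 1x2 and A is 2x1, so BA is 1x1 but AB is 2x2.
B = np.array([[1, 1]])          # 1x2 row vector
A = np.array([[2], [-1]])       # 2x1 column vector

BA = B @ A                      # [[1]], i.e. I_1
AB = A @ B                      # [[2, 2], [-1, -1]], clearly not I_2

print(BA)   # [[1]]
print(AB)   # [[ 2  2]
            #  [-1 -1]]
```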

So, assume that ##A## and ##B## are ##n\times n##-matrices, for some ##n\ge 1##, such that ##BA=I##.

Now, consider the homogeneous system

##A{\bf x} = {\bf 0}##, (1).

If ##\bf x## is a solution of (1), then

##{\bf x}=I{\bf x}=(BA){\bf x}=B(A{\bf x})=B{\bf 0}={\bf 0}##.

This means that the homogeneous system (1) has only the trivial solution ##{\bf x} ={\bf 0}##. Thus, if we solve (1) by elimination, transforming ##A## to reduced echelon form by a sequence of elementary row operations, the resulting reduced echelon form must be ##I## (otherwise, there would be non-pivot columns which correspond to free variables).
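This step can be illustrated numerically: for a square matrix with a left inverse, the kernel is trivial, which shows up as full rank (equivalently, an identity reduced echelon form). A quick sketch with NumPy; the specific matrix is a made-up example:

```python
import numpy as np

# A made-up invertible 3x3 matrix standing in for A.
A = np.array([[1., 2., 0.],
              [0., 1., 0.],
              [3., 0., 1.]])
n = A.shape[0]

# Full rank means Ax = 0 has only the trivial solution,
# i.e. the reduced echelon form of A is the identity.
assert np.linalg.matrix_rank(A) == n
```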

Now, consider the inhomogeneous system

##A{\bf x} ={\bf b}##, (2),

where ##\bf b## is an arbitrary vector in ##R^n##.

If we perform the same sequence of row operations as in the solution of (1), the resulting coefficient matrix is again ##I##, since we started with the same coefficient matrix ##A##.
This means that (2) has a unique solution.

This holds for all vectors ##{\bf b}\in R^n##. In particular, it holds for the standard basis vectors ##{\bf e_1}, {\bf e_2}\dots, {\bf e_n}##. Let us denote the solutions for these vectors by ##{\bf x_1}, {\bf x_2},\dots,{\bf x_n}##, respectively.

Let ##X## be the ##n\times n##-matrix with the vectors ##{\bf x_1}, {\bf x_2},\dots,{\bf x_n}## as columns, that is ##X=[{\bf x_1}\, {\bf x_2}\dots{\bf x_n}]##. Also, notice that ##[{\bf e_1}\,{\bf e_2}\dots{\bf e_n}]=I##.

It follows that ##AX=I##.

Next, ##B=BI=B(AX)=(BA)X=IX=X##, that is, ##B=X##.
Hence, ##AB=I##.
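The construction above can be mirrored numerically: solve ##A{\bf x_i}={\bf e_i}## for each standard basis vector, stack the solutions into ##X##, and check that ##X## is a two-sided inverse. A minimal sketch in NumPy, using a hypothetical 3×3 matrix in the role of ##A##:

```python
import numpy as np

# A hypothetical invertible 3x3 matrix standing in for A.
A = np.array([[2., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
n = A.shape[0]

# Solve A x_i = e_i for each standard basis vector e_i;
# passing the identity as the right-hand side stacks the
# solutions x_1, ..., x_n as the columns of X.
X = np.linalg.solve(A, np.eye(n))

# AX = I by construction ...
assert np.allclose(A @ X, np.eye(n))
# ... and XA = I as well, matching the conclusion B = X.
assert np.allclose(X @ A, np.eye(n))
```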
 
It is not true unless the matrices are square.
 
