A question on invertible matrices

  • Thread starter kostas230
In summary, in order to prove that AB = I using only matrix theory, we must assume that A and B are both square matrices. Then, by considering the homogeneous and inhomogeneous systems of equations, and using the standard basis vectors, we can show that AB = I if and only if BA = I.
  • #1
kostas230
I'm making notes for linear algebra, and I'm using the weaker definition for the invertible matrix:

"A matrix A is called invertible if there exists a matrix B such that BA = I"

How do we prove that AB = I using only matrix theory?
 
  • #2
You mean how one would prove that [itex]AB = I[/itex] provided that [itex]BA = I[/itex]?

If you assume that [itex]BA = I[/itex] and thus that [itex]A^{-1} = B[/itex] and [itex]A = B^{-1}[/itex] you could try multiplying from the left with [itex]B^{-1}[/itex] and from the right with [itex]A^{-1}[/itex], which gives you

[itex]B^{-1}BAA^{-1} = B^{-1}IA^{-1} \ .[/itex]​
 
  • #3
Square, you are assuming what should be proved: you assume that there is a matrix ##A^{-1}## such that ##AA^{-1}=A^{-1}A=I##, but the existence of such a matrix is exactly what needs to be established.

To prove that ##BA=I## implies ##AB=I## is not entirely easy. First of all, we must assume that ##A## and/or ##B## are square matrices. Otherwise, this is false. For example, if

##B=[1\quad 1]## and ##A=[2\quad -1]^{T}##, then ##BA=[1]=I_1## and ##AB\neq I_2##.
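This counterexample can be checked directly. The sketch below is pure Python with an ad-hoc `matmul` helper (not from the thread):

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

B = [[1, 1]]        # 1x2 row vector
A = [[2], [-1]]     # 2x1 column vector

print(matmul(B, A))  # [[1]]               -- the 1x1 identity
print(matmul(A, B))  # [[2, 2], [-1, -1]]  -- not the 2x2 identity
```

So BA is the 1x1 identity while AB is a rank-one 2x2 matrix, confirming that the claim fails for non-square matrices.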

So, assume that ##A## and ##B## are ##n\times n##-matrices, for some ##n\ge 1##, such that ##BA=I##.

Now, consider the homogeneous system

##A{\bf x} = {\bf 0}## (1).

If ##\bf x## is a solution of (1), then

##{\bf x}=I{\bf x}=(BA){\bf x}=B(A{\bf x})=B{\bf 0}={\bf 0}##.

This means that the homogeneous system (1) has only the trivial solution ##{\bf x} ={\bf 0}##. Thus, if we solve (1) by elimination, transforming ##A## to reduced echelon form by a sequence of elementary row operations, the resulting reduced echelon form must be ##I## (otherwise, there would be non-pivot columns which correspond to free variables).

Now, consider the inhomogeneous system

##A{\bf x} = {\bf b}## (2),

where ##\bf b## is an arbitrary vector in ##R^n##.

If we perform the same sequence of row operations as in the solution of (1), the resulting coefficient matrix is again ##I##, since we started with the same coefficient matrix ##A##.
This means that (2) has a unique solution.

This holds for all vectors ##{\bf b}\in R^n##. In particular, it holds for the standard basis vectors ##{\bf e_1}, {\bf e_2}\dots, {\bf e_n}##. Let us denote the solutions for these vectors by ##{\bf x_1}, {\bf x_2},\dots,{\bf x_n}##, respectively.

Let ##X## be the ##n\times n##-matrix with the vectors ##{\bf x_1}, {\bf x_2},\dots,{\bf x_n}## as columns, that is, ##X=[{\bf x_1}\, {\bf x_2}\dots{\bf x_n}]##. Also, notice that ##[{\bf e_1}\,{\bf e_2}\dots{\bf e_n}]=I##.

It follows that ##AX=I##, since the ##j##-th column of ##AX## is ##A{\bf x_j}={\bf e_j}##.

Next, ##B=BI=B(AX)=(BA)X=IX=X##, that is, ##B=X##.
Hence, ##AB=I##.
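The constructive argument can also be illustrated numerically: build ##X## column by column by solving ##A{\bf x_j}={\bf e_j}##, and observe that ##X## coincides with ##B##. The matrices and the `solve`/`matmul` helpers below are my own example, not part of the thread:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

A = [[1, 1], [0, 1]]
B = [[1, -1], [0, 1]]
assert matmul(B, A) == [[1, 0], [0, 1]]  # BA = I, the hypothesis

n = len(A)
# solve A x_j = e_j for each standard basis vector e_j
cols = [solve(A, [1.0 if i == j else 0.0 for i in range(n)]) for j in range(n)]
X = [[cols[j][i] for j in range(n)] for i in range(n)]  # assemble columns into X

print(X)             # [[1.0, -1.0], [0.0, 1.0]]  -- equals B
print(matmul(A, X))  # [[1.0, 0.0], [0.0, 1.0]]   -- AX = I, hence AB = I
```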
 
  • #4
It is not true unless the matrices are square.
 
  • #5


To prove that AB = I, we can argue as follows:

1. First, A and B must be square matrices of the same size. As the counterexample above shows, the claim is false for non-square matrices.

2. Since BA = I, the homogeneous system Ax = 0 has only the trivial solution: if Ax = 0, then x = Ix = (BA)x = B(Ax) = 0. Hence the n×n matrix A has rank n.

3. A square matrix of full rank is invertible in the two-sided sense, so there exists a matrix C such that AC = CA = I.

4. Multiplying BA = I on the right by C and using associativity gives B = BI = B(AC) = (BA)C = IC = C.

5. Therefore AB = AC = I. We have shown, using only matrix theory, that a left inverse of a square matrix is automatically a right inverse as well.
 

1. What is an invertible matrix?

An invertible matrix, also known as a nonsingular matrix, is a square matrix that has a unique inverse matrix. This means that when multiplied together, the matrix and its inverse will result in the identity matrix. In simpler terms, an invertible matrix is a matrix that can be "undone" or reversed.

2. How do you determine if a matrix is invertible?

A matrix is invertible if and only if its determinant is nonzero. The determinant is a number computed from the entries of a square matrix by a specific formula. If the determinant is 0, then the matrix is not invertible. It is also important to note that only square matrices can be invertible.
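For a 2x2 matrix the determinant test is a one-line computation. A quick sketch (the example matrices are my own):

```python
def det2(M):
    """Determinant ad - bc of a 2x2 matrix given as a list of rows."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

invertible = [[2, 1], [1, 1]]   # det = 2*1 - 1*1 = 1, so invertible
singular   = [[1, 2], [2, 4]]   # rows are proportional, det = 0, not invertible

print(det2(invertible))  # 1
print(det2(singular))    # 0
```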

3. What is the purpose of an invertible matrix?

Invertible matrices are used in a variety of mathematical applications, including solving systems of linear equations, finding eigenvalues and eigenvectors, and representing transformations in linear algebra. They also have practical applications in fields such as engineering, physics, and computer science.

4. Can all matrices be inverted?

No, only square matrices can be inverted. This means that the number of rows must be equal to the number of columns. Additionally, not all square matrices are invertible. A matrix must have a non-zero determinant in order to be invertible.

5. How do you find the inverse of a matrix?

The inverse of a matrix can be found with the adjugate formula, A⁻¹ = adj(A)/det(A), which divides the adjugate (the transpose of the cofactor matrix) by the determinant. In practice, inverses are usually computed by Gauss-Jordan elimination on the augmented matrix [A | I], or via factorizations such as LU decomposition.
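For the 2x2 case the adjugate formula can be written out explicitly. A minimal sketch, with my own example matrix:

```python
def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula adj(M)/det(M)."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

A = [[2, 1], [1, 1]]          # det = 1
Ainv = inv2(A)
# multiply A by its inverse; the product should be the identity
prod = [[sum(A[i][k] * Ainv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]

print(Ainv)  # [[1.0, -1.0], [-1.0, 2.0]]
print(prod)  # [[1.0, 0.0], [0.0, 1.0]]
```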
