Prove B is invertible if AB = I

  • Thread starter: songoku
Summary
If AB = I for square matrices A and B, it is necessary to demonstrate that B is invertible by showing BA = I. The discussion emphasizes that simply assuming B is invertible is incorrect, as it leads to contradictions. The rank-nullity theorem is relevant, indicating that if the nullity of B is zero, then B is injective, which implies it is also surjective for square matrices. The conclusion drawn is that if B has full rank, it can be shown to be invertible without relying on properties not yet covered in the lesson. Ultimately, the proof hinges on demonstrating that B is both one-to-one and onto.
  • #31
songoku said:
I think I can show ##B## is onto, but using pivots: since ##B## has a pivot in each row, ##B## is onto.

How to show it without using pivot?

Thanks
I must admit I've never heard of "pivot point" in linear algebra. You can show that ##B## is onto if it is one-to-one by using a basis for the vector space, ##V##, where ##B: V \to V## is a linear mapping.

Let ##\{ e_1, e_2 \dots e_n\}## be a basis for V and consider ##\{ Be_1, Be_2 \dots Be_n\}##. By the linearity of ##B## we have:
$$a_1Be_1 + a_2Be_2 + \dots + a_nBe_n = 0 \ \Rightarrow \ B(a_1e_1 + \dots + a_ne_n) = 0$$
(And, as ##B## is one-to-one and the ##e_i## are linearly independent):
$$\Rightarrow \ a_1e_1 + \dots + a_ne_n = 0 \ \Rightarrow \ a_1 = a_2 = \dots = a_n = 0$$
This shows that the vectors ##Be_i## are linearly independent and hence a basis for ##V##.

Finally, if ##v \in V##, then for some scalars ##v_i##:
$$v = v_1Be_1 + \dots + v_nBe_n = B(v_1e_1 + \dots + v_ne_n)$$
And, as ##v## was arbitrary, we see that ##B## is onto.
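The injectivity-implies-surjectivity argument above can be checked numerically. A minimal NumPy sketch, with ##B## and ##v## chosen purely for illustration:

```python
import numpy as np

# A square B that is one-to-one (trivial kernel); values chosen for illustration.
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# The standard basis vectors e_1, e_2, and their images B e_i.
basis = np.eye(2)
images = B @ basis

# One-to-one means Bx = 0 only for x = 0, i.e. B has full rank.
assert np.linalg.matrix_rank(B) == B.shape[0]

# The images B e_i are then linearly independent, hence a basis,
# so any v is a combination of them -- B is onto.
v = np.array([3.0, -1.0])
coeffs = np.linalg.solve(images, v)   # v = coeffs[0]*B e_1 + coeffs[1]*B e_2
assert np.allclose(images @ coeffs, v)
```

The `solve` call succeeding for an arbitrary ##v## is exactly the surjectivity claim in finite dimensions.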
 
  • Like
Likes songoku
  • #32
PeroK said:
This shows that the vectors ##Be_i## are linearly independent and hence a basis for ##V##.

Finally, if ##v \in V##, then for some scalars ##v_i##:
$$v = v_1Be_1 + \dots + v_nBe_n = B(v_1e_1 + \dots + v_ne_n)$$
And, as ##v## was arbitrary, we see that ##B## is onto.
And how does this differ from the rank-nullity theorem the OP already correctly used? You use surjectivity to prove surjectivity. But I don't want to confuse the OP more than he already is. His solution might not have been perfectly phrased, nevertheless, it was correct (post #18).

The idea with the pivots is fine. In the end, we do not have enough information about what he may use according to his book, and what lies ahead.
 
  • #33
fresh_42 said:
And how does this differ from the rank-nullity theorem the OP already correctly used?
The OP asked how to prove it without using pivots, so I showed him.

fresh_42 said:
You use surjectivity to prove surjectivity.
I used injectivity to prove surjectivity. Which seems like an important concept in linear algebra and a neat solution.
 
  • #34
PeroK said:
The OP asked how to prove it without using pivots, so I showed him.

I used injectivity to prove surjectivity. Which seems like an important concept in linear algebra and a neat solution.
Yes, and it is called the rank-nullity theorem.
 
  • #35
PeroK said:
I must admit I've never heard of "pivot point" in linear algebra.
Maybe the term that you are familiar with is "pivot" or "pivot element" or "pivot position".

Thank you very much for all the help and explanation fresh_42, PeroK, Office_Shredder, Addez123, WWGD
 
  • Like
Likes Delta2, WWGD and fresh_42
  • #36
It seems to me that the thread went a little off course. A fine approach to this problem is to consider the matrices as matrices of linear transformations.

If ##A## (an ##n \times n## matrix) is the coefficient matrix of a linear transformation ##T: V_n \to V_n##, then each column of ##A## lists the coefficients, with respect to the basis of the range ##V_n##, of the image under ##T## of one basis element of the domain ##V_n##.

Prove that ##A## is invertible if and only if ##T## is invertible, and then prove that the inverse of ##A## is the coefficient matrix of ##T^{-1}##.
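This correspondence can be sketched numerically. In the NumPy snippet below, ##A## is an illustrative choice for the matrix of ##T## in the standard basis, and applying the inverse matrix is the same as applying ##T^{-1}##:

```python
import numpy as np

# Matrix of a linear transformation T: R^2 -> R^2 in the standard basis
# (example values; any invertible matrix works).
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

T = lambda x: A @ x             # T as a function
A_inv = np.linalg.inv(A)        # matrix of T^{-1} in the same basis
T_inv = lambda x: A_inv @ x

x = np.array([5.0, -3.0])
assert np.allclose(T_inv(T(x)), x)          # T^{-1} after T is the identity
assert np.allclose(A @ A_inv, np.eye(2))    # and A A^{-1} = I
```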
 
  • Like
  • Informative
Likes songoku and Delta2
  • #37
Since ##AB = I##, ##\det(A)\det(B) = \det(AB) = 1 \neq 0##, so ##\det B \neq 0## and ##B## is invertible. Then, as was already pointed out, ##B = B(AB) = (BA)B##, and multiplying on the right by ##B^{-1}## gives ##BA = I##.
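A quick numerical check of this determinant argument (the matrices are illustrative):

```python
import numpy as np

# A concrete invertible A with B = A^{-1}, so that AB = I (values chosen for illustration).
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 3.0]])
B = np.linalg.inv(A)

assert np.allclose(A @ B, np.eye(3))

# det(AB) = det(A) det(B) = det(I) = 1, so det(B) cannot be zero.
assert np.isclose(np.linalg.det(A) * np.linalg.det(B), 1.0)

# B(AB) = (BA)B = B, and since B is invertible this forces BA = I.
assert np.allclose(B @ A, np.eye(3))
```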
 
  • Like
  • Informative
Likes songoku and Delta2
  • #38
Hi everyone,
I found this discussion very interesting, so I searched whether someone had done it before.

"A is linear transformation from a finite dimensional vector space to itself. AB=I amounts to saying A is surjective, hence it is bijective and has a left inverse C, so that CA=I. Of course, B=(CA)B=C(AB)=C, i.e. BA=I."

I found this in the thread https://math.stackexchange.com/questions/152668/whats-the-short-proof-that-for-square-matrices-ab-i-implies-ba-i
(searching for "\(AB = I\) for square matrices" on SearchOnMath), which offers some other contributions.
 
  • Like
  • Informative
Likes songoku and Delta2
  • #39
You say that you learned about ranks. Did you learn the following?
  1. rank(A) ##\geq## rank(AB)?
  2. If ##A## is ##n \times n## and rank(A) = n, then ##A## is invertible?
You could prove that A is invertible using 1 and 2.
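Facts 1 and 2 can be illustrated numerically; a small NumPy sketch with example matrices (chosen so that ##AB = I##):

```python
import numpy as np

# Example square matrices with AB = I (B chosen as the inverse of A for illustration).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(A)
n = A.shape[0]

AB = A @ B
assert np.allclose(AB, np.eye(n))

# Fact 1: rank(A) >= rank(AB); here rank(AB) = rank(I) = n.
assert np.linalg.matrix_rank(A) >= np.linalg.matrix_rank(AB)

# Fact 2: rank(A) = n, so A is invertible -- and then B = A^{-1}, so B is too.
assert np.linalg.matrix_rank(A) == n
```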
 
Last edited:
  • Like
  • Informative
Likes songoku and Delta2
  • #40
Using determinants, the fact that ##\det(AB)=\det(A)\det(B)## for square matrices, and the fact that a square matrix is invertible if its determinant is nonzero, this problem is very easy. But the OP says in post #4 that they haven't covered determinants yet.
 
  • Like
Likes songoku
  • #41
Since ##AB=I##, ##A## is surjective and ##B## is injective. Thus ##A## and ##B## are both bijective, because the dimension is finite.
 
  • Like
  • Informative
Likes songoku, PeroK and Delta2
