Is AB Invertible If n < m and B Has a Non-Trivial Kernel?

Summary
If A is an m x n matrix and B is an n x m matrix with n < m, then the product AB is not invertible. This is shown by contradiction: a non-trivial solution to Bx = 0 forces AB to map a non-zero vector to zero, contradicting invertibility. The discussion also notes a preference for direct proofs over proofs by contradiction, since direct methods can be more intuitive and easier to connect with other results. Additionally, the rank of the product AB is at most n < m, so AB is rank-deficient and cannot be invertible.
Buffu
If ##A## is ##m \times n## matrix, ##B## is an ##n \times m## matrix and ##n < m##. Then show that ##AB## is not invertible.

Suppose ##AB## is invertible. Then its reduced row echelon form is the identity ##I##, so

##I = P(AB)## where ##P## is some invertible matrix.

Since ##n < m## and ##B## is ##n \times m##, there is a non-trivial solution to ##B\mathbf X = 0##; call it ##\mathbf X_0##.

##\mathbf X_0 = I\mathbf X_0 = P(AB)\mathbf X_0 = PA (B \mathbf X_0) = PA \times 0 = 0##

Which means ##\mathbf X_0 = 0##, which is a contradiction. Therefore ##AB## is not invertible.

Is this correct?
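The claim is easy to sanity-check numerically. A minimal sketch with numpy (the sizes and seed are arbitrary choices, not from the thread): for random ##A## and ##B## with ##n < m##, the product ##AB## has rank at most ##n## and a numerically zero determinant.

```python
import numpy as np

# Sketch: check numerically that AB is singular when n < m.
# Sizes and the RNG seed are arbitrary illustration choices.
rng = np.random.default_rng(0)
m, n = 5, 3                       # n < m

A = rng.standard_normal((m, n))   # m x n
B = rng.standard_normal((n, m))   # n x m
AB = A @ B                        # m x m, but rank at most n

# rank(AB) <= min(rank(A), rank(B)) <= n < m, so AB cannot be
# invertible and its determinant is (numerically) zero.
print(np.linalg.matrix_rank(AB))
print(np.linalg.det(AB))
```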
 
This looks OK, though I don't really think the proof by contradiction is necessary.

Notation-wise, I would probably have used ##\mathbf x## to denote the vector in ##B \mathbf x = 0## -- as is, I'm not sure whether the ##\mathbf X## is a matrix or a vector, and it would be customary to use a vector here. Further, I wouldn't write ##A \times 0##; that looks like a cross product.

I typically avoid proofs by contradiction if feasible. It occurs to me that once you note that ##B \mathbf x_0 = 0## for some ##\mathbf x_0 \neq 0##, this tells us right away that ##AB## is not invertible. By definition, an invertible matrix does not map any non-zero vector to the zero vector; but ##(AB)\mathbf x_0 = A(B\mathbf x_0) = A \mathbf 0 = \mathbf 0##, so ##(AB)## does exactly that and hence is not invertible.
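The direct argument can also be sketched numerically: extract any ##\mathbf x_0## from the null space of ##B## (it exists because ##B## has more columns than rows) and watch ##AB## send it to zero. A sketch with numpy; using the SVD to pull out a null vector is just one convenient choice:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 2                          # n < m
A = rng.standard_normal((m, n))      # m x n
B = rng.standard_normal((n, m))      # n equations, m unknowns

# B x = 0 has a non-trivial solution since n < m; the right
# singular vectors for the zero singular values span ker(B).
_, _, Vt = np.linalg.svd(B)
x0 = Vt[-1]                          # unit vector in the null space of B

print(np.linalg.norm(B @ x0))        # essentially zero
print(np.linalg.norm((A @ B) @ x0))  # (AB) x0 = A (B x0) = A 0 = 0
```

So ##AB## maps the non-zero vector ##\mathbf x_0## to zero and cannot be invertible.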
 
StoneTemplePython said:
This looks OK, though I don't really think the proof by contradiction is necessary.

Notation-wise, I would probably have used ##\mathbf x## to denote the vector in ##B \mathbf x = 0## -- as is, I'm not sure whether the ##\mathbf X## is a matrix or a vector, and it would be customary to use a vector here. Further, I wouldn't write ##A \times 0##; that looks like a cross product.

I typically avoid proofs by contradiction if feasible. It occurs to me that once you note that ##B \mathbf x_0 = 0## for some ##\mathbf x_0 \neq 0##, this tells us right away that ##AB## is not invertible. By definition, an invertible matrix does not map any non-zero vector to the zero vector; but ##(AB)\mathbf x_0 = A(B\mathbf x_0) = A \mathbf 0 = \mathbf 0##, so ##(AB)## does exactly that and hence is not invertible.

Thank you, I made it more complicated :(. I used bold to indicate it is a vector.

Why do you avoid proofs by contradiction?
 
I suppose it is a matter of taste, but if I can show something directly, I try to show it directly. It tends to make a lot more sense to me; I can generally attach it directly into a latticework of other ideas and theorems, it is easier to remember, and so on.

The tradeoff is that some things can only be shown by contradiction, or in some other cases, perhaps the proof by contradiction is a paragraph and a direct proof (say, a derivation) takes 3 pages. So you have to pick your spots.

Perhaps more idiosyncratically, I would add that I've become fond of proofs that have a "visual" component to them. In this case, we could note

##AB = \mathbf a_1 \widetilde{\mathbf b}_1^T + \mathbf a_2 \widetilde{\mathbf b}_2^T + \dots + \mathbf a_n \widetilde{\mathbf b}_n^T##

which is to say ##(AB)## is an ##m \times m## matrix composed of a sum of ##n## rank-one updates. When you add ##n## rank-one matrices, the result has rank at most ##n##. Hence ##(AB)## has rank at most ##n < m## and isn't invertible.
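The decomposition above is easy to verify directly: the ##i##-th rank-one term is column ##i## of ##A## times row ##i## of ##B##. A small numpy sketch (sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 3                        # n < m
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# AB as a sum of n rank-one outer products:
# (column i of A) times (row i of B).
rank_one_sum = sum(np.outer(A[:, i], B[i, :]) for i in range(n))
print(np.allclose(rank_one_sum, A @ B))   # the two agree

# A sum of n rank-one matrices has rank at most n < m.
print(np.linalg.matrix_rank(A @ B))
```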
 
StoneTemplePython said:
I suppose it is a matter of taste, but if I can show something directly, I try to show it directly. It tends to make a lot more sense to me; I can generally attach it directly into a latticework of other ideas and theorems, it is easier to remember, and so on.

The tradeoff is that some things can only be shown by contradiction, or in some other cases, perhaps the proof by contradiction is a paragraph and a direct proof (say, a derivation) takes 3 pages. So you have to pick your spots.

Perhaps more idiosyncratically, I would add that I've become fond of proofs that have a "visual" component to them. In this case, we could note

##AB = \mathbf a_1 \widetilde{\mathbf b}_1^T + \mathbf a_2 \widetilde{\mathbf b}_2^T + \dots + \mathbf a_n \widetilde{\mathbf b}_n^T##

which is to say ##(AB)## is an ##m \times m## matrix composed of a sum of ##n## rank-one updates. When you add ##n## rank-one matrices, the result has rank at most ##n##. Hence ##(AB)## has rank at most ##n < m## and isn't invertible.

I also like that type of proof, but the more elegant a proof is, the more difficult it is to come up with.
 
Buffu said:
I also like that type of proof, but the more elegant a proof is, the more difficult it is to come up with.

Don't worry about it. Your proof worked fine.
 
Buffu said:
If ##A## is ##m \times n## matrix, ##B## is an ##n \times m## matrix and ##n < m##. Then show that ##AB## is not invertible.

Suppose ##AB## is invertible. Then its reduced row echelon form is the identity ##I##, so

##I = P(AB)## where ##P## is some invertible matrix.

Since ##n < m## and ##B## is ##n \times m##, there is a non-trivial solution to ##B\mathbf X = 0##; call it ##\mathbf X_0##.

##\mathbf X_0 = I\mathbf X_0 = P(AB)\mathbf X_0 = PA (B \mathbf X_0) = PA \times 0 = 0##

Which means ##\mathbf X_0 = 0##, which is a contradiction. Therefore ##AB## is not invertible.

Is this correct?
EDIT: Otherwise, you seem to be arguing that the kernel of ##B## is contained in the kernel of ##AB##, which is true, and proves the point.
Just to nitpick a bit: you may want to put an index on your identity ##I##, to indicate whether it is the ##n \times n##, ##m \times m##, etc. version. Here it is ##m \times m##, by the matrix product rules. Especially if you are doing proofs, it is a good idea.
 
