Linear Algebra: Non-Singular Matrix and Zero Matrix

In summary: because A is non-singular, assuming B is not the zero matrix yields a non-zero vector v = Bx with Av = (AB)x = 0, contradicting the non-singularity of A. B must therefore be the zero matrix, which solves the problem.
  • #1
Tim 1234

Homework Statement



Suppose A is a non-singular (nxn) matrix. Given that AB=(zero matrix), show that B=(zero matrix).

Hint: Express A as [A1, A2, ..., An]
and Express AB as [AB1, AB2, ..., ABn]
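As a quick numerical illustration of the claim (a NumPy sketch; the particular matrix A below is an arbitrary non-singular choice, not part of the problem):

```python
import numpy as np

# An arbitrary non-singular 3x3 matrix (illustrative choice only).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Solve A X = 0 for X. Each column of X solves A x = 0, and because
# A is non-singular the unique solution for each column is the zero
# vector -- so X comes out as the zero matrix.
X = np.linalg.solve(A, np.zeros((3, 3)))
print(X)  # prints the 3x3 zero matrix
```

Swapping in any other non-singular A gives the same result; a singular A would make `np.linalg.solve` raise `LinAlgError` instead.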

Homework Equations

The Attempt at a Solution


I argued that because A is non-singular, the columns of A=[A1, A2, ..., An] are linearly independent, so Ax=(theta) has the unique solution x=(theta).

That is, for x1A1+x2A2+...+xnAn=(theta), x1=x2=...=xn=0.

Further, if A were a zero matrix, Ax=(theta) could not have the unique solution x=(theta).

Because A cannot be a zero matrix and AB=(zero matrix), B=(zero matrix) by necessity.
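The linear-independence step above can be checked numerically (a minimal NumPy sketch; the 2x2 matrix is an arbitrary illustrative choice):

```python
import numpy as np

# Columns of A are linearly independent (full column rank), mirroring
# the argument that x1*A1 + ... + xn*An = 0 forces every xi to be 0.
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
assert np.linalg.matrix_rank(A) == A.shape[1]  # columns independent

# The unique solution of A x = 0 is x = 0 (the "theta" above).
x = np.linalg.solve(A, np.zeros(2))
print(x)  # [0. 0.]
```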

Is my reasoning correct in that a non-singular matrix cannot be a zero matrix?

Thanks
 
  • #2
I don't understand your references to theta.

If x1A1+x2A2+...+xnAn = 0 and A1,...,An are linearly independent, what can we say about x1, ..., xn?

Then how can that help us make deductions about B?
 
  • #3
thus Ax=(theta) has the unique solution x=(theta).
You mean ##\vec{\vec A} \cdot \vec x = \vec \theta ## has the unique solution ##\vec x = {\vec{\vec A}} ^{-1} \cdot \vec \theta\ ## ?

And for
That is, for x1A1+x2A2+...+xnAn=(theta), x1=x2=...=xn=0
you mean

##\sum_j A_{ij} x_j = 0 ## for all ##i \ \Rightarrow\ x_j = 0 \ ## for all ##j##

 
  • #4
By theta, I mean the zero vector in R^n.

So x1,...,xn are all equal to 0 because A1...An are linearly independent.

Doesn't this imply that A is not a zero matrix?

If A were a zero matrix, x1, ..., xn would not need to equal 0 for x1A1+...+xnAn = 0 to hold; x1, ..., xn could take on any values and the equation would still be satisfied.

So a deduction we can make about B is: because AB is the nxn zero matrix and A cannot be an nxn zero matrix, B has to be an nxn zero matrix?
 
  • #5
By theta, I mean the zero vector in R^n
That helps ! Wasn't aware of that convention :smile: .
Nor of the convention that ##A_1## is the first column of ##A##. (Although, upon second reading, the given hint wants you to do this.)

But with that, your reasoning is a lot more valid. I would even say it's correct.
 
  • #6
That A is not the zero matrix follows immediately from it being non-singular, as the zero matrix is singular. You don't need to do the x1 ... xn thing to conclude that.

However, IIRC it is not the case that if the product of two matrices is zero, one of them must be zero.

Look at the second line of the hint they gave you. If we denote the elements of B1 by x1, ... , xn then what can you conclude from what you've done so far?

By the way, I assume that in this problem the definition of non-singular you are using is that A has a nonzero determinant, and you may have a theorem that tells you that's equivalent to the columns being linearly independent (and likewise for the rows). If we take non-singular to mean that A is invertible (a usage I think I have seen sometimes), or if you are allowed to use the theorem that non-singular implies invertible, then the problem is trivial.
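To make the "product of two non-zero matrices can be zero" remark concrete (an assumed example, not from the thread): this only happens when the factors are singular, which is why "A is not the zero matrix" alone does not settle the problem.

```python
import numpy as np

# Both A and B are non-zero, yet AB = 0. This is possible only
# because A (and B) are singular; a non-singular A rules it out.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [0.0, 1.0]])
print(A @ B)  # the 2x2 zero matrix
```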
 
  • #7
If we denote the elements of B1 by x1, ..., xn, does that render all elements of B1 equal to 0?

Then we could do this with every column of B, making B an nxn zero matrix?

Edit: This problem is from a section before invertibility and determinants.
 
  • #8
One can also characterize B, the "zero matrix", as the matrix such that Bx = 0 for every vector x. If AB = 0, then ABx = 0 for every x. Now, suppose B is not the zero matrix. Then there exists some x such that Bx is not 0. So we have ABx = A(Bx) = 0. Finish that.
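A small sketch of the key step here (illustrative values; any non-singular A would do): a non-singular A sends every non-zero vector to a non-zero vector, which is what makes A(Bx) = 0 with Bx ≠ 0 impossible.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # non-singular: det(A) = 1
v = np.array([1.0, -1.0])    # stand-in for a hypothetical non-zero Bx

# A cannot map a non-zero vector to zero; otherwise it would be singular.
print(A @ v)  # [1. 0.], which is non-zero
```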
 
  • #9
Assuming B is not the zero matrix, there is an x such that Bx is not 0. We know by virtue of A being non-singular that A is not the zero matrix.

Isn't it possible that there exists an x such that Bx is not 0, but A(Bx)=0?
 
  • #10
No. If Av = 0 for some non-zero v, then A is singular. That was my point.
 
  • #11
If Av=0 for some non-zero vector v, then A is singular.

Under the assumption that B is not the zero matrix, there exists an x such that v=(Bx) is not 0.

If this is the case, and A(Bx)=Av=0, then A is singular.

But the question states A is non-singular.

So when we assume B not to be the zero matrix we arrive at a contradiction.

Therefore B must be the zero matrix.

Is my reasoning correct?
 
  • #12
Okay, all of the information I can deduce from the question:

B=[B1, ..., Bn]
AB=[AB1, ..., ABn]

These were hints given in the question - I can't determine what their relevance is.

Also from the question, A is non-singular.

From this we know A is not the zero matrix (the zero matrix is singular).

I've read through the section a number of times, but I still have no idea where to go from here.

I can't see why it's necessary for B to equal the zero matrix to satisfy AB=zero matrix...
Edit: I know that because A is non-singular, it is also invertible, and because it is invertible we can just multiply both sides of AB=zero matrix by A^(-1). But invertibility and determinants are covered in the subsequent section, so I don't think we're meant to solve the problem this way.
 
  • #13
You are given that ##AB=0##, i.e. that ##\forall i:\ (AB)_i = A B_i = \vec{0}##, where ##(AB)_i## denotes the ##i##-th column of the product.

You have worked out that ##A_1x_1+...+A_nx_n=\vec{0}## implies that all the ##x_i## are zero. Can you write that sum as the multiplication of a matrix with a vector?
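For concreteness, the identity being hinted at, as a NumPy sketch (the entries of A and x are arbitrary illustrative values):

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])   # columns A1, A2
x = np.array([10.0, -1.0])

# The linear combination x1*A1 + x2*A2 is exactly the product A x.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(combo)   # [ 6. 15. 24.]
print(A @ x)   # the same vector
```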
 
  • #14
Is the hint saying that each column ABi of AB can be represented as a linear combination of the columns of A, where the coefficients are the entries of the column Bi?

And because the columns of A are linearly independent and AB=0, the only way for each combination ABi = 0 to hold is if each column Bi is composed of all zeros, thus making B the zero matrix?

Please tell me this thought process is at least on track...
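Putting the whole argument together as a NumPy sketch (the non-singular A is an arbitrary illustrative choice): since AB = 0 means A·Bi = 0 for each column Bi, and each such system has only the trivial solution, every column of B is zero.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # non-singular (illustrative)
AB = np.zeros((2, 2))        # the given product AB = 0

# Recover each column Bi by solving A x = (AB)_i; every solution is 0.
B = np.column_stack([np.linalg.solve(A, AB[:, i]) for i in range(2)])
print(B)  # the 2x2 zero matrix
```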
 

