
Linear Algebra Question

  1. Sep 11, 2015 #1
    1. The problem statement, all variables and given/known data

    Suppose A is a non-singular n×n matrix. Given that AB = 0 (the zero matrix), show that B = 0 (the zero matrix).

    Hint: Express A as [A1, A2, ..., An]
    and express AB as [AB1, AB2, ..., ABn].

    2. Relevant equations


    3. The attempt at a solution

    I argued that because A is non-singular, its columns A1, A2, ..., An are linearly independent, so Ax = (theta) has the unique solution x = (theta).

    That is, for x1A1 + x2A2 + ... + xnAn = (theta), we must have x1 = x2 = ... = xn = 0.

    Further, if A were the zero matrix, Ax = (theta) could not have the unique solution x = (theta).

    Because A cannot be the zero matrix and AB = (zero matrix), B = (zero matrix) by necessity.

    Is my reasoning correct in that a non-singular matrix cannot be the zero matrix?

    Thanks
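
    Edit: As a quick numerical sanity check (numpy, with a matrix I made up for the example; not part of the proof) that Ax = 0 has only the trivial solution when A is non-singular:

        import numpy as np

        # A made-up non-singular 3x3 matrix, just for illustration
        A = np.array([[2.0, 1.0, 0.0],
                      [0.0, 1.0, 3.0],
                      [1.0, 0.0, 1.0]])

        # Solve A x = 0; for a non-singular A the only solution is the zero vector
        x = np.linalg.solve(A, np.zeros(3))
        print(x)  # [0. 0. 0.]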
     
  2. Sep 11, 2015 #2

    andrewkirk


    I don't understand your references to theta.

    If x1A1+x2A2+...+xnAn = 0 and A1,...,An are linearly independent, what can we say about x1, ..., xn?

    Then how can that help us make deductions about B?
     
  3. Sep 11, 2015 #3

    BvU


    You mean ##\vec{\vec A} \cdot \vec x = \vec \theta ## has the unique solution ##\vec x = {\vec{\vec A}} ^{-1} \cdot \vec \theta\ ## ?

    And for your statement that x1A1 + x2A2 + ... + xnAn = (theta) forces x1 = x2 = ... = xn = 0, you mean

    ##\sum_j A_{ij} x_j = 0## for all ##i## ##\Rightarrow## ##x_j = 0## for all ##j##?

     
  4. Sep 11, 2015 #4
    By theta, I mean the zero vector in R^n.

    So x1, ..., xn are all equal to 0 because A1, ..., An are linearly independent.

    Doesn't this imply that A is not the zero matrix?

    If A were the zero matrix, x1, ..., xn would not need to equal 0 for x1A1 + ... + xnAn = 0 to hold; they could take on any values and the equation would still be satisfied.

    So a deduction we can make about B is: because AB is the n×n zero matrix and A cannot be the n×n zero matrix, B has to be the n×n zero matrix?
     
  5. Sep 11, 2015 #5

    BvU


    That helps! I wasn't aware of that convention :smile:.
    Nor of the convention that ##A_1## is the column vector with entries ##A_{j1}##. (Although, on second reading, the given hint wants you to do exactly this.)

    But with that, your reasoning makes a lot more sense. I would even say it's correct.
     
  6. Sep 11, 2015 #6

    andrewkirk


    That A is not the zero matrix follows immediately from it being non-singular, as the zero matrix is singular. You don't need to do the x1 ... xn thing to conclude that.

    However, IIRC it is not the case that if the product of two matrices is zero, one of them must be zero.

    Look at the second line of the hint they gave you. If we denote the elements of B1 by x1, ..., xn, then what can you conclude from what you've done so far?

    By the way, I assume that in this problem the definition of non-singular you are using is that A has a nonzero determinant, and that you may have a theorem telling you this is equivalent to the columns being linearly independent (and likewise for the rows). If we take non-singular to mean that A is invertible (a usage I think I have seen sometimes), or if you are allowed to use the theorem that non-singular implies invertible, then the problem is trivial.
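
    For a concrete illustration of the point about products (a quick numpy check; these particular matrices are just ones I picked for the example, and note that the A here is singular, unlike yours):

        import numpy as np

        # Two nonzero 2x2 matrices whose product is the zero matrix
        A = np.array([[1, 0],
                      [0, 0]])
        B = np.array([[0, 0],
                      [0, 1]])

        print(A @ B)  # prints the 2x2 zero matrix, even though neither A nor B is zero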
     
    Last edited: Sep 11, 2015
  7. Sep 11, 2015 #7
    Denoting the elements of B1 by x1, ..., xn would show that all the elements of B1 equal 0?

    Then we could do this with every column of B, making B the n×n zero matrix?

    Edit: This problem is from a section before invertibility and determinants.
     
    Last edited: Sep 11, 2015
  8. Sep 11, 2015 #8

    HallsofIvy


    One can also characterize B, the "zero matrix", as the matrix such that Bx = 0 for every vector x. If AB = 0, then ABx = 0 for every x. Now, suppose B is not the zero matrix. Then there exists some x such that Bx is not 0. So we have ABx = A(Bx) = 0. Finish that.
     
  9. Sep 11, 2015 #9
    Assuming B is not the zero matrix, there is an x such that Bx is not 0. We know by virtue of A being non-singular that A is not the zero matrix.

    Isn't it possible that there exists an x such that Bx is not 0, but A(Bx)=0?
     
  10. Sep 11, 2015 #10

    HallsofIvy


    No. If Av = 0 for some non-zero v, then A is singular. That was my point.
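
    A quick numerical illustration (the matrix and vector here are just ones made up for the example): if A sends a non-zero vector to zero, its determinant is zero.

        import numpy as np

        v = np.array([1.0, 2.0, -1.0])        # a non-zero vector
        A = np.array([[2.0, -1.0, 0.0],       # each row is orthogonal to v,
                      [0.0,  1.0, 2.0],       # so A @ v is the zero vector
                      [2.0,  0.0, 2.0]])

        print(A @ v)               # [0. 0. 0.]
        print(np.linalg.det(A))    # 0.0 (up to rounding) -- A is singular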
     
  11. Sep 11, 2015 #11
    If Av = 0 for some non-zero vector v, then A is singular.

    Under the assumption that B is not the zero matrix, there exists an x such that v = Bx is not 0.

    If this is the case, and A(Bx) = Av = 0, then A is singular.

    But the question states A is non-singular.

    So when we assume B not to be the zero matrix we arrive at a contradiction.

    Therefore B must be the zero matrix.
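
    In symbols: assume ##B \neq 0## and pick ##x## with ##v = Bx \neq 0##; then ##Av = A(Bx) = (AB)x = 0\,x = 0## with ##v \neq 0##, so A would be singular, contradicting the hypothesis. Hence ##B = 0##.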

    Is my reasoning correct?
     
  12. Sep 11, 2015 #12
    Okay, all of the information I can deduce from the question:

    B=[B1, ..., Bn]
    AB=[AB1, ..., ABn]

    These were hints given in the question - I can't determine what their relevance is.

    Also from the question, A is non-singular.

    From this we know A is not the zero matrix (the zero matrix is singular).

    I've read through the section a number of times, but I still have no idea where to go from here.

    I can't see why it's necessary for B to equal the zero matrix to satisfy AB = zero matrix...

    Edit: I know that because A is non-singular, it is also invertible, and because it is invertible we can just multiply both sides of AB = zero matrix by A^-1. But invertibility and determinants are covered in the subsequent section, so I don't think we're meant to solve the problem this way.
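
    Spelled out, that argument would be: ##A^{-1}(AB) = (A^{-1}A)B = IB = B##, while ##A^{-1} \cdot 0 = 0##, so ##B = 0##.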
     
    Last edited: Sep 11, 2015
  13. Sep 11, 2015 #13

    andrewkirk


    You are given that ##AB=0##, i.e. that ##\forall i:\ (A\times B)_i=A\times (B_i)=\vec{0}##.

    You have worked out that ##A_1x_1+\dots+A_nx_n=\vec{0}## implies that all the ##x_i## are zero. Can you write that sum as the multiplication of a matrix with a vector?
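
    (If it helps to see that numerically -- a quick numpy check with numbers chosen arbitrarily for the example:)

        import numpy as np

        # Arbitrary example values, just to illustrate the column picture
        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])
        x = np.array([5.0, 6.0])

        # A @ x is exactly x1*A1 + x2*A2: the columns of A weighted by the entries of x
        print(A @ x)                        # [17. 39.]
        print(x[0]*A[:, 0] + x[1]*A[:, 1])  # [17. 39.]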
     
    Last edited: Sep 11, 2015
  14. Sep 11, 2015 #14
    Is the hint saying that each column ABi of AB is a linear combination of the columns of A, with the entries of Bi as the coefficients?

    And because the columns of A are linearly independent and each ABi = 0, the only way for this to hold is if every column of B is composed entirely of zeros, thus making B the zero matrix?
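
    To check the column decomposition numerically (just a sanity check with small made-up matrices, not the proof itself):

        import numpy as np

        # Small made-up matrices, to check that the columns of AB are A times the columns of B
        A = np.array([[2.0, 1.0],
                      [0.0, 3.0]])
        B = np.array([[1.0, 4.0],
                      [5.0, 6.0]])

        AB = A @ B
        for i in range(B.shape[1]):
            print(np.allclose(AB[:, i], A @ B[:, i]))  # True for every column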

    Please tell me this thought process is at least on track....
     