
Linear Algebra Proof

  1. Mar 1, 2007 #1


    Gold Member

    1. The problem statement, all variables and given/known data
    Prove the following theorem:
    Given any n×n matrix A and any m×m matrix B, form the (n+m)×(n+m) matrix C:

    $$C = \begin{pmatrix} A & X \\ 0 & B \end{pmatrix}$$

    where the top left block of C is A, the top right block X is any n×m matrix, the bottom left block is the m×n zero matrix, and the bottom right block is B. Then
    |C| = |A|*|B| (|C| is the determinant of C...)

    I found a proof, but the book did something different and I want to check if my way is correct.
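
    As a quick numerical sanity check of the claimed identity (this is an added sketch, not part of the original post; it assumes Python with numpy, which the thread never uses):

[code]
import numpy as np

# Sanity check of |C| = |A| * |B| for the block matrix C = [[A, X], [0, B]].
rng = np.random.default_rng(0)
n, m = 3, 2
A = rng.standard_normal((n, n))                 # n x n block
B = rng.standard_normal((m, m))                 # m x m block
X = rng.standard_normal((n, m))                 # arbitrary n x m block
C = np.block([[A, X], [np.zeros((m, n)), B]])   # (n+m) x (n+m)

print(np.linalg.det(C))                      # determinant of the block matrix
print(np.linalg.det(A) * np.linalg.det(B))   # product of the block determinants
# The two printed numbers agree up to floating-point error.
[/code]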

    2. Relevant equations
    1) Any square matrix that can't be reduced to the identity matrix has determinant 0 (because it can be reduced to a matrix with a zero row or column).
    2) Multiplying a row or column by a scalar multiplies the determinant by that scalar (so the original determinant is 1 over the scalar times the new one).
    3) Swapping two rows changes the sign of the determinant.
    4) Adding a multiple of one row to another row doesn't change the determinant.
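
    A small numerical illustration of facts 2-4 above (again an added sketch assuming numpy, not something from the original discussion):

[code]
import numpy as np

# Numerical illustration of the elementary-operation facts listed above.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
d = np.linalg.det(A)

M = A.copy(); M[0] *= 5.0                        # scale a row by 5
print(np.isclose(np.linalg.det(M), 5.0 * d))     # True: det scales by the same factor

M = A.copy(); M[[0, 1]] = M[[1, 0]]              # swap two rows
print(np.isclose(np.linalg.det(M), -d))          # True: det changes sign

M = A.copy(); M[0] += 2.5 * M[1]                 # add a multiple of one row to another
print(np.isclose(np.linalg.det(M), d))           # True: det is unchanged
[/code]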

    3. The attempt at a solution

    Here's my proof:
    Let's say that both A and B can be reduced to the identity matrix. If A can be reduced, then it can be done by operating only on the columns, using the 3 basic operations. The process of the reduction will change the D of A (|A|) by some factor that we'll call 'a'. So the determinant of A is just a (|A| = a) - this is because the determinant of I is 1.
    The same argument can be made with B - this time operating on the rows - giving the determinant of B, which we'll call 'b'.
    Since we reduced A by operating on the columns and B by operating on the rows, we can do all those operations on C without the operations on one block affecting the rest of the matrix. Applying these operations to C changes the determinant by a*b, and since the resulting matrix is a triangular matrix with a diagonal of only ones, the determinant of C is a*b. And so:
    |C| = a*b = |A| * |B|.

    If either A or B can't be reduced, then one of them has a 0 determinant, so |A|*|B| = 0,
    and by an argument similar to the above, C can also be reduced to a matrix with either a 0 row or a 0 column, so |C| = 0.
    Is this a correct proof?
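
    The structural point this argument relies on - that column operations among the first n columns of C never disturb the B block, and row operations among the last m rows never disturb the top n rows - can be illustrated numerically (an added sketch assuming numpy):

[code]
import numpy as np

# The bottom-left block of C is zero, so column operations among the first n
# columns leave the B block alone, and row operations among the last m rows
# leave the top n rows alone.
rng = np.random.default_rng(2)
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))
X = rng.standard_normal((n, m))
C = np.block([[A, X], [np.zeros((m, n)), B]])

# Column operation among the first n columns: col 0 += 2 * col 1.
C[:, 0] += 2 * C[:, 1]
print(np.allclose(C[n:, :n], 0))       # True: bottom-left block is still zero
print(np.allclose(C[n:, n:], B))       # True: B block untouched

# Row operation among the last m rows: row n += 3 * row n+1.
top_before = C[:n, :].copy()
C[n, :] += 3 * C[n + 1, :]
print(np.allclose(C[:n, :], top_before))   # True: top n rows untouched
[/code]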
  2. Mar 1, 2007 #2


    Homework Helper

    Really? And what if A = I? :smile:

    Try to use induction; I found that proof more elegant, but then again, de gustibus.
  3. Mar 1, 2007 #3


    Science Advisor
    Homework Helper

    Yeah, the proof looks correct. You could clean it up a little though. There is an invertible matrix U such that AU is either the identity (if A is reducible) or has a zero column (if A is not). There is also an invertible V such that VB is either the identity or has a zero row. The matrix

    $$U' = \begin{pmatrix} U & 0 \\ 0 & I_m \end{pmatrix}$$

    which we'll call U', clearly has determinant |U|, and

    $$V' = \begin{pmatrix} I_n & 0 \\ 0 & V \end{pmatrix}$$

    which we'll call V', has determinant |V|. Block multiplication gives V'CU' with AU in the top left, X in the top right, 0 in the bottom left, and VB in the bottom right. If AU has a zero column, then V'CU' does too. If VB has a zero row, then V'CU' does too. So if either of A or B has determinant 0, V'CU' does too, so |V'CU'| = 0. Since V' and U' are invertible, this means |C| = 0, as desired. If on the other hand both A and B have non-zero determinant, then it's clear that V'CU' is just

    $$\begin{pmatrix} I_n & X \\ 0 & I_m \end{pmatrix}$$

    which has determinant 1. So

    |V'CU'| = 1
    |V'| |C| |U'| = 1
    |C| = (|V'| |U'|)^{-1}
    |C| = (|V| |U|)^{-1}

    Since AU = I = VB, U and V are just the inverses of A and B, thus have reciprocal determinants to A and B, i.e. |U| = |A|^{-1}, |V| = |B|^{-1}. This gives the desired results.
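
    The block computation above can also be checked numerically (an added sketch assuming numpy, with U = A^{-1} and V = B^{-1} as in the invertible case):

[code]
import numpy as np

# With U = inv(A) and V = inv(B):
#   U' = [[U, 0], [0, I_m]],  V' = [[I_n, 0], [0, V]],
#   V' C U' = [[AU, X], [0, VB]] = [[I, X], [0, I]],  which has determinant 1,
# so |C| = (|V| |U|)^(-1) = |A| |B|.
rng = np.random.default_rng(3)
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))
X = rng.standard_normal((n, m))
C = np.block([[A, X], [np.zeros((m, n)), B]])

U, V = np.linalg.inv(A), np.linalg.inv(B)
Up = np.block([[U, np.zeros((n, m))], [np.zeros((m, n)), np.eye(m)]])   # U'
Vp = np.block([[np.eye(n), np.zeros((n, m))], [np.zeros((m, n)), V]])   # V'

M = Vp @ C @ Up
expected = np.block([[np.eye(n), X], [np.zeros((m, n)), np.eye(m)]])
print(np.allclose(M, expected))                                   # True
print(np.isclose(np.linalg.det(M), 1.0))                          # True
print(np.isclose(np.linalg.det(C),
                 1.0 / (np.linalg.det(V) * np.linalg.det(U))))    # True
[/code]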

    Your argument was a really good one; I was just thinking it's better to clean it up instead of saying things like

    "the process of the reduction will change the D of A (|A|) by some factor that we'll call 'a'."

    which are unclear, and in fact wrong in this case (the determinant of A never changes, but row reducing A creates a new matrix with a new determinant).