
Proof regarding determinant of block matrices

  1. Feb 25, 2017 #1
    1. The problem statement, all variables and given/known data
    Let A,B,C,D be commuting n-square matrices. Consider the 2n-square block matrix ##M= \begin{bmatrix}
    A & B \\
    C & D \\
    \end{bmatrix}##. Prove that ##\left | M \right |=\left | A \right |\left | D \right |-\left | B \right |\left | C \right |##. Show that the result may not be true if the matrices do not commute.

    2. Relevant equations
##\det(M)= \det(A_1)\det(A_2)\cdots\det(A_n)## (where ##M## is an upper (lower) triangular block matrix with diagonal blocks ##A_1,A_2,\dots,A_n##.)

    3. The attempt at a solution
At first I tried using the theorems on determinants in combination with the properties of (commuting) matrices to get the desired expression, but had no success. I then tried writing out the determinant in terms of the entries of the square matrices in ##M## and splitting the expression into 4 separate determinants, but could not figure out how.
The closest I got to a solution is proving the above equation in the case ##C=0##.
    Any help would be appreciated.
     
  3. Mar 2, 2017 #2
    Thanks for the thread! This is an automated courtesy bump. Sorry you aren't generating responses at the moment. Do you have any further information, come to any new conclusions or is it possible to reword the post? The more details the better.
     
  4. Mar 3, 2017 #3
Thank you, Mr. AI bot thingy, but I said all I really have to say; it's pretty straightforward.
     
  5. Mar 3, 2017 #4

    fresh_42

    Staff: Mentor

Yes, but your attempts could have been a bit more elaborate. Unfortunately, the only solutions I can think of are either an induction or the explicit formula via diagonals. Doesn't sound like fun, though.
     
  6. Mar 3, 2017 #5
Unfortunately I cannot explain any further, simply because I don't even know how to approach this. I have never worked with commuting matrices before (I'm learning from Schaum's Outlines), let alone in the context of determinants, so I don't really know how to exploit the given information that the matrices commute.
I am certain that the proof uses induction. I managed to prove something similar for the case ##C=0##:
##M=\begin{bmatrix}
A & C \\
0 & B \\
\end{bmatrix}##
Prove that ##\det(M)=\det(A)\det(B)##.

If ##A## is ##r##-square and ##B## is ##s##-square, then ##n=r+s## and
##\det(M)=\sum_{\sigma \in S_n} (\operatorname{sgn}\,\sigma)\, m_{1\sigma(1)} m_{2\sigma(2)}\cdots m_{n\sigma(n)}##.
If ##i>r## and ##j\leq r##, then ##m_{ij}=0##. Therefore we only need to consider permutations ##\sigma## that map ##\{1,2,\dots,r\}## to itself and ##\{r+1,r+2,\dots,r+s\}## to itself.
Let ##\sigma_1(k)=\sigma(k)## for ##k\leq r## and let ##\sigma_2(k)=\sigma(k+r)-r## for ##k\leq s##. Then ##\operatorname{sgn}\,\sigma=(\operatorname{sgn}\,\sigma_1)(\operatorname{sgn}\,\sigma_2)##.
Thus, ##(\operatorname{sgn}\,\sigma)\, m_{1\sigma(1)} m_{2\sigma(2)}\cdots m_{n\sigma(n)} = (\operatorname{sgn}\,\sigma_1)\, a_{1\sigma_1(1)} a_{2\sigma_1(2)}\cdots a_{r\sigma_1(r)}\,(\operatorname{sgn}\,\sigma_2)\, b_{1\sigma_2(1)} b_{2\sigma_2(2)}\cdots b_{s\sigma_2(s)}##,
which implies ##\det(M)=\det(A)\det(B)##.

So I proved this theorem, which is the closest I got to proving the statement in this post. I tried using the raw formula with no success; perhaps there is some hidden property of commuting matrices that will allow me to solve this.
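For what it's worth, the block-triangular result can be sanity-checked numerically with the same Leibniz (permutation) formula. This is only an illustrative sketch; the blocks ##A##, ##B##, ##C## below are arbitrary examples, not from the problem:

```python
# Sanity check of det(M) = det(A)det(B) for a block upper-triangular M,
# using the Leibniz (permutation) formula directly.
# The blocks A, B, C below are arbitrary examples.
from itertools import permutations

def det(m):
    """Determinant via the Leibniz sum over all permutations."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        # sign of p via its inversion count
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if p[i] > p[j])
        term = (-1) ** inv
        for i in range(n):
            term *= m[i][p[i]]
        total += term
    return total

A = [[2, 1], [0, 3]]   # r-square block (r = 2)
B = [[1, 4], [2, 1]]   # s-square block (s = 2)
C = [[5, 6], [7, 8]]   # arbitrary upper-right block

# Assemble M = [[A, C], [0, B]] as a plain 4x4 matrix
M = [A[0] + C[0],
     A[1] + C[1],
     [0, 0] + B[0],
     [0, 0] + B[1]]

print(det(M), det(A) * det(B))  # both are -42
```

Only the permutations fixing ##\{1,2\}## and ##\{3,4\}## contribute nonzero terms, which is exactly the argument in the proof above.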
     
  7. Mar 3, 2017 #6

    fresh_42

    Staff: Mentor

    The commutativity requirement should automatically show up during the proof. I wouldn't start with it. But it may be used early in a proof. So without any brute force methods to apply, I think there could be a tricky decomposition of ##M##. However, it should be a multiplicative decomposition or a basis transformation ##TMT^{-1}##, since we don't really want to calculate something like ##\det (M+N)##. If you already have a proof for triangular matrices, maybe you can find a decomposition into factors like this.

    One can always write any matrix ##M = \frac{1}{2}(M+M^t) + \frac{1}{2}(M-M^t)## as a sum of a symmetric and a skew-symmetric matrix. Don't know whether this helps. Or apply some normal forms you know about.
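Just to illustrate that decomposition, here is a tiny Python sketch (the matrix is an arbitrary example, not from the thread) checking that ##M## splits into a symmetric plus a skew-symmetric part:

```python
# Check M = (M + M^t)/2 + (M - M^t)/2: a symmetric part plus a
# skew-symmetric part. The matrix M is an arbitrary example.
def transpose(m):
    return [list(row) for row in zip(*m)]

def add(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def scale(c, m):
    return [[c * x for x in row] for row in m]

M = [[1.0, 2.0], [3.0, 4.0]]
Mt = transpose(M)
sym = scale(0.5, add(M, Mt))                  # (M + M^t)/2, symmetric
skew = scale(0.5, add(M, scale(-1.0, Mt)))    # (M - M^t)/2, skew-symmetric

print(add(sym, skew) == M)                    # the parts sum back to M
print(transpose(sym) == sym)                  # symmetric part check
print(transpose(skew) == scale(-1.0, skew))   # skew-symmetric part check
```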
     
  8. Aug 14, 2017 at 2:33 PM #7

    StoneTemplePython

    User Avatar
    Gold Member

    This is an old thread, but it has been left open and I thought about this earlier today for some reason.

    The assertion in the problem statement is wrong.

A simple and general counterexample: consider the case where ##n## is some even natural number, and ##\mathbf A = \mathbf D = \mathbf 0_{n\times n}## and ##\mathbf B = \mathbf C = \mathbf I_{n\times n}##.

    The zero matrix and the identity matrix commute with all matrices. Yet the above formula indicates that the determinant is ##-1## when in fact it is ##+1##. (I.e. in this case we have a permutation matrix that becomes the identity matrix after an even number of pairwise column swaps and hence has determinant of 1.)

A really simple example: take ##n = 2##:

##1 =\det\Big(\begin{bmatrix}
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1 \\
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0
\end{bmatrix}\Big) = \det\Big(
\begin{bmatrix}
\mathbf 0 & \mathbf I \\
\mathbf I & \mathbf 0 \\
\end{bmatrix}\Big)
\neq \det\big(\mathbf 0\big)\det\big(\mathbf 0\big) - \det\big(\mathbf I\big)\det\big(\mathbf I\big) = 0\cdot 0 - 1\cdot 1 = -1##

    - - - -
note: the specific example I am giving is problem 6.2.5 in Meyer's Matrix Analysis and Applied Linear Algebra.
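The ##n = 2## case is small enough to verify by brute force with the Leibniz formula; a quick, purely illustrative sketch:

```python
# Verify the n = 2 counterexample: the 4x4 block matrix [[0, I], [I, 0]]
# has determinant +1, while the claimed formula |A||D| - |B||C| gives -1.
from itertools import permutations

def det(m):
    """Determinant via the Leibniz sum over all permutations."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        # sign of p via its inversion count
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if p[i] > p[j])
        term = (-1) ** inv
        for i in range(n):
            term *= m[i][p[i]]
        total += term
    return total

Z = [[0, 0], [0, 0]]   # A = D = 0 (2x2 zero block)
I = [[1, 0], [0, 1]]   # B = C = I (2x2 identity block)

M = [[0, 0, 1, 0],
     [0, 0, 0, 1],
     [1, 0, 0, 0],
     [0, 1, 0, 0]]

lhs = det(M)                               # actual determinant: +1
rhs = det(Z) * det(Z) - det(I) * det(I)    # claimed formula: -1
print(lhs, rhs)
```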
     