
Determinant of a block matrix

  1. Mar 9, 2010 #1

    I've been trying to get my head around this. [itex]\Sigma_{(j)}[/itex] is a p x p matrix given by

    [tex]\Sigma_{(j)} = \left(\begin{array}{cc}\sigma_{jj} & \boldsymbol{\sigma_{(j)}'}\\\boldsymbol{\sigma_{(j)}} & \boldsymbol{\Sigma_{(2)}}\end{array}\right)[/tex]

    where [itex]\sigma_{jj}[/itex] is a scalar, [itex]\boldsymbol{\sigma_{(j)}}[/itex] is a (p-1)x1 column vector, and [itex]\boldsymbol{\Sigma_{(2)}}[/itex] is a (p-1)x(p-1) matrix.

    The result I can't understand is

    [tex]|\Sigma_{(j)}| = |\Sigma_{(2)}|(\sigma_{jj} - \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}\sigma_{(j)}})[/tex]

    where |.| denotes the determinant. How does one get this? It seems to be consistent, but I don't 'see' how it is obvious. I searched the internet for results on determinants of block matrices, but all I found was the case [a b; c d] where a, b, c, d are all n x n blocks, where the determinant reduces to det(ad - bc) only when the blocks commute.

    Any inputs would be appreciated.

    Thanks in advance!
  3. Mar 11, 2010 #2
  4. Mar 13, 2010 #3
    Have you worked any examples? The best way to understand how something (a proof, a theorem, a process) works is to repeat it yourself. Try it for a 3x3 matrix then a 4x4 and see if you can identify the specific machinery which permits this formula.
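
    For instance, the smallest case [itex]p = 2[/itex], where every block is a scalar, already shows the pattern:

    [tex]\left|\begin{array}{cc}\sigma_{jj} & \sigma\\\sigma & \sigma_{22}\end{array}\right| = \sigma_{jj}\sigma_{22} - \sigma^2 = \sigma_{22}\left(\sigma_{jj} - \sigma\,\sigma_{22}^{-1}\,\sigma\right)[/tex]

    which is exactly the claimed formula with [itex]\Sigma_{(2)} = \sigma_{22}[/itex].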
  5. Mar 14, 2010 #4
    Hmm, I can think of one way you could prove this, but it might not be the best or most 'obvious' way. Still, better than nothing.

    Left-multiply your matrix by

    [tex]\left(\begin{array}{cc}1/\sigma_{jj} & \boldsymbol{0}\\\boldsymbol{0} & \boldsymbol{\Sigma_{(2)}^{-1}}\end{array}\right)[/tex]

    And see what you get. You can then work out the determinant using the determinant-of-products rule.
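
    To spell out where such an argument lands (a sketch, assuming [itex]\Sigma_{(2)}[/itex] is invertible): one can verify by block multiplication that

    [tex]\Sigma_{(j)} = \left(\begin{array}{cc}1 & \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}}\\\boldsymbol{0} & \boldsymbol{I}\end{array}\right)\left(\begin{array}{cc}\sigma_{jj} - \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}\sigma_{(j)}} & \boldsymbol{0'}\\\boldsymbol{\sigma_{(j)}} & \boldsymbol{\Sigma_{(2)}}\end{array}\right)[/tex]

    The first factor is unit upper triangular (determinant 1), and the second is block lower triangular, so its determinant is the product of the determinants of the diagonal blocks. The determinant-of-products rule then gives the quoted formula.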
  6. Mar 14, 2010 #5
    Thanks everyone who replied. It turns out that the thing is rather simple:

    [tex]|\Sigma_{(j)}| = \sigma_{jj}|\Sigma_{(2)}| - \boldsymbol{\sigma_{(j)}'}\,\operatorname{adj}(\Sigma_{(2)})\,\boldsymbol{\sigma_{(j)}}[/tex]

    (noting that the (1,2) 'element' is actually a row vector, and using the usual Laplace cofactor expansion of the determinant)

    Then the final step writes the adjugate (classical adjoint) as the product of the (scalar) determinant and the inverse, [itex]\operatorname{adj}(\Sigma_{(2)}) = |\Sigma_{(2)}|\Sigma_{(2)}^{-1}[/itex], and factors the determinant out. I admit, though, that this is more of a backward verification than a derivation-based forward proof.
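
    A quick numerical sanity check of the identity (a sketch in NumPy; the random 4x4 example matrix is made up for illustration, not from the thread):

    ```python
    import numpy as np

    # Build a symmetric positive-definite Sigma (p = 4), so the trailing
    # block Sigma2 is guaranteed invertible.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    Sigma = A @ A.T + 4.0 * np.eye(4)

    s_jj = Sigma[0, 0]        # scalar (1,1) entry
    s = Sigma[1:, 0]          # (p-1)-vector below it
    Sigma2 = Sigma[1:, 1:]    # (p-1) x (p-1) trailing block

    # Left side: determinant of the full matrix.
    lhs = np.linalg.det(Sigma)
    # Right side: |Sigma2| * (s_jj - s' Sigma2^{-1} s), using solve()
    # instead of forming the inverse explicitly.
    rhs = np.linalg.det(Sigma2) * (s_jj - s @ np.linalg.solve(Sigma2, s))

    rel_err = abs(lhs - rhs) / abs(lhs)
    print(rel_err)            # tiny, on the order of machine epsilon
    ```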