# Determinant of a block matrix

1. Mar 9, 2010

### maverick280857

Hi,

I've been trying to get my head around this. $\Sigma_{(j)}$ is a p x p matrix given by

$$\Sigma_{(j)} = \left(\begin{array}{cc}\sigma_{jj} & \boldsymbol{\sigma_{(j)}'}\\\boldsymbol{\sigma_{(j)}} & \boldsymbol{\Sigma_{(2)}}\end{array}\right)$$

where $\sigma_{jj}$ is a scalar, $\boldsymbol{\sigma_{(j)}}$ is a (p-1)x1 column vector, and $\boldsymbol{\Sigma_{(2)}}$ is a (p-1)x(p-1) matrix.

The result I can't understand is

$$|\Sigma_{(j)}| = |\Sigma_{(2)}|\left(\sigma_{jj} - \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}\sigma_{(j)}}\right)$$

where |.| denotes the determinant. How does one get this? It seems to be consistent, but I don't 'see' how it is obvious. I searched the internet for results on determinants of block matrices, but all I found was material for [a b; c d] where a, b, c, d are all n x n matrices, in which case the determinant is det(ad - bc) (and even that holds only when certain blocks commute).

Any inputs would be appreciated.
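For what it's worth, the identity is easy to check numerically. Here is a small sketch (the variable names mirror the post: `sigma_jj` for the scalar, `s` for $\boldsymbol{\sigma_{(j)}}$, `S2` for $\boldsymbol{\Sigma_{(2)}}$; the test matrix is an arbitrary symmetric positive-definite one so that all the determinants and inverses exist):

```python
# Numerical sanity check of |Sigma| = |Sigma_(2)| * (sigma_jj - s' Sigma_(2)^{-1} s)
import numpy as np

rng = np.random.default_rng(0)
p = 5

# Random symmetric positive-definite p x p matrix, so Sigma_(2) is invertible.
A = rng.standard_normal((p, p))
Sigma = A @ A.T + p * np.eye(p)

sigma_jj = Sigma[0, 0]   # scalar top-left entry
s = Sigma[1:, 0]         # (p-1)-vector sigma_(j)
S2 = Sigma[1:, 1:]       # (p-1) x (p-1) block Sigma_(2)

lhs = np.linalg.det(Sigma)
rhs = np.linalg.det(S2) * (sigma_jj - s @ np.linalg.solve(S2, s))
print(abs(lhs - rhs) / abs(lhs))  # relative error near machine precision
```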

2. Mar 11, 2010

### maverick280857

Someone?

3. Mar 13, 2010

### Newtime

Have you worked any examples? The best way to understand how something (a proof, a theorem, a process) works is to repeat it yourself. Try it for a 3x3 matrix then a 4x4 and see if you can identify the specific machinery which permits this formula.

4. Mar 14, 2010

### IttyBittyBit

Hmm, I can think of one way you could prove this, but it might not be the best or most 'obvious' way. Still, better than nothing. Try multiplying $\Sigma_{(j)}$ by the block-diagonal matrix

$$\left(\begin{array}{cc}1/\sigma_{jj} & \boldsymbol{0}\\\boldsymbol{0} & \boldsymbol{\Sigma_{(2)}^{-1}}\end{array}\right)$$

and see what you get. You can then work out the determinant using the determinant-of-products rule.
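For reference, one factorization that makes the determinant-of-products argument explicit (assuming $\Sigma_{(2)}$ is invertible; this is the standard Schur-complement decomposition, not necessarily the particular multiplier suggested above) is

$$\Sigma_{(j)} = \left(\begin{array}{cc}1 & \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}}\\\boldsymbol{0} & I\end{array}\right)\left(\begin{array}{cc}\sigma_{jj} - \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}\sigma_{(j)}} & \boldsymbol{0'}\\\boldsymbol{\sigma_{(j)}} & \boldsymbol{\Sigma_{(2)}}\end{array}\right)$$

The first factor is unit upper triangular, so its determinant is 1; the second is block lower triangular, so its determinant is the product of the diagonal blocks, giving $|\Sigma_{(j)}| = \left(\sigma_{jj} - \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}\sigma_{(j)}}\right)|\Sigma_{(2)}|$ directly.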

5. Mar 14, 2010

### maverick280857

Thanks everyone who replied. It turns out that the thing is rather simple:

$$|\Sigma_{(j)}| = \sigma_{jj}|\Sigma_{(2)}| - \boldsymbol{\sigma_{(j)}'}\operatorname{adj}(\Sigma_{(2)})\boldsymbol{\sigma_{(j)}}$$

(noting that the (1,2) 'element' is actually a row vector, and using the usual minor-cofactor expansion of the determinant)

The final step writes the adjugate as the product of the inverse and the (scalar) determinant, which can then be factored out. I admit, though, that this is more of a backward verification than a forward derivation.
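Spelled out, that last step (valid when $\Sigma_{(2)}$ is invertible) uses $\operatorname{adj}(\Sigma_{(2)}) = |\Sigma_{(2)}|\,\Sigma_{(2)}^{-1}$, so

$$\sigma_{jj}|\Sigma_{(2)}| - \boldsymbol{\sigma_{(j)}'}\operatorname{adj}(\Sigma_{(2)})\boldsymbol{\sigma_{(j)}} = |\Sigma_{(2)}|\left(\sigma_{jj} - \boldsymbol{\sigma_{(j)}'\Sigma_{(2)}^{-1}\sigma_{(j)}}\right)$$

which is exactly the identity from the original post.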