# Homework Help: Determinant of a Block Lower Triangular Matrix

1. Feb 24, 2014

### Vespero

1. The problem statement, all variables and given/known data

Theorem. Let A be a k by k matrix, let D have size n by n and let C have size n by k. Then

$$\det \left(\begin{array}{cc}A&0\\C&D\end{array}\right) = (\det A)\cdot (\det D).$$

Proof. First show that
$$\left(\begin{array}{cc}A&0\\0&I_{n}\end{array}\right) \cdot \left(\begin{array}{cc}I_{k}&0\\C&D\end{array}\right) = \left(\begin{array}{cc}A&0\\C&D\end{array}\right)$$

Then use the following lemma:

Let A be an n by n matrix; let b denote its entry in row i and column j.
(a) If all the entries in row i other than b vanish, then
$$\det A = b(-1)^{i+j}\det A_{ij}$$

(b) The same equation holds if all the entries in column j other than the entry b vanish.

2. Relevant equations

3. The attempt at a solution

I am attempting to show the first part of the problem, where the initial matrix is factored into two block matrices. Given the dimensions of the blocks, or even by writing out the full matrices entry by entry and multiplying (since each multiplication is in essence a block of one matrix times either an identity block or a zero block of the other), it is readily apparent that the equation holds. However, I am not sure how to write this out concisely, perhaps using sigma notation. (I haven't taken a "higher-level" math class in a while, so even when things make sense intuitively or computationally, I'm not sure how to notate them.)

On the second part (using the lemma), it is again apparent that the determinants of the two factors reduce to det A and det D: applying the lemma recursively to the diagonal entries of the identity block leaves only the determinant of the other block. But again, I'm not sure how to write this concisely.
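For example, the recursive step I have in mind for the second factor looks like this: the only nonzero entry in the first row is the 1 in position (1,1), so part (a) of the lemma gives

$$\det\begin{pmatrix} I_k & 0 \\ C & D \end{pmatrix} = 1\cdot(-1)^{1+1}\det\begin{pmatrix} I_{k-1} & 0 \\ C' & D \end{pmatrix} = \cdots = \det D,$$

where $C'$ is $C$ with its first column removed. The same argument applied to the last row of the first factor gives $\det\begin{pmatrix} A & 0 \\ 0 & I_n \end{pmatrix} = \det A$.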

In essence, I suppose my problem is less one of understanding or solving the problem than of writing my solution in a manner that isn't rambling and inelegant.
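As a sanity check (not part of the proof), I also verified the identity numerically with small made-up blocks, using a plain cofactor-expansion determinant:

```python
# Numerical check that det([[A, 0], [C, D]]) = det(A) * det(D).
# The blocks below are arbitrary examples chosen for illustration (k = n = 2).

def det(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

A = [[2, 1],
     [0, 3]]          # k x k block, det A = 6
D = [[1, 4],
     [2, 5]]          # n x n block, det D = 1*5 - 4*2 = -3
C = [[7, 8],
     [9, 6]]          # n x k block

# Assemble the block lower triangular matrix [[A, 0], [C, D]].
M = [A[0] + [0, 0],
     A[1] + [0, 0],
     C[0] + D[0],
     C[1] + D[1]]

print(det(M), det(A) * det(D))  # both equal 6 * (-3) = -18
```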

2. Feb 25, 2014

### vela

Staff Emeritus
I'd try writing it like this, though there might be a more elegant way to notate everything.

Write the product as M=QP where
$$Q_{ij} = \begin{cases} A_{ij} & \text{if } i\le k \text{ and } j \le k \\ 0 & \text{if } i\gt k \text{ and } j \le k \\ 0 & \text{if } i\le k \text{ and } j \gt k \\ \delta_{ij} & \text{if } i \gt k \text{ and } j \gt k \end{cases}$$ where $\delta_{ij}$ is the Kronecker delta. P is defined similarly (I'll leave it to you to do that). Then calculate $M_{ij}$ for the four cases separately. For example, when $i\le k$ and $j \le k$, you have
$$M_{ij} = \sum_{l=1}^{n+k} Q_{il}P_{lj} = \sum_{l=1}^{k} Q_{il}P_{lj} + \sum_{l=k+1}^{n+k} Q_{il}P_{lj} = \sum_{l=1}^{k} A_{il}\delta_{lj} + \sum_{l=k+1}^{n+k} 0\times C_{(l-k)j} = A_{ij}$$
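If you want to sanity-check the bookkeeping numerically, the case definitions translate directly into code. Here's a sketch with arbitrary example blocks (k = n = 2); Q and P are built exactly from the piecewise definitions and the product is compared entry by entry:

```python
# Build Q and P from the piecewise definitions and check that QP is the
# block matrix [[A, 0], [C, D]]. Example blocks are arbitrary.

k, n = 2, 2
A = [[2, 1], [0, 3]]
C = [[7, 8], [9, 6]]
D = [[1, 4], [2, 5]]

def delta(i, j):
    """Kronecker delta."""
    return 1 if i == j else 0

# Q_{ij}: A on the top-left block, identity on the bottom-right, zeros elsewhere.
Q = [[A[i][j] if i < k and j < k else
      delta(i, j) if i >= k and j >= k else 0
      for j in range(n + k)] for i in range(n + k)]

# P_{ij}: identity on the top-left, C bottom-left, D bottom-right.
P = [[delta(i, j) if i < k and j < k else
      C[i - k][j] if i >= k and j < k else
      D[i - k][j - k] if i >= k and j >= k else 0
      for j in range(n + k)] for i in range(n + k)]

# M_{ij} = sum over l of Q_{il} P_{lj}.
M = [[sum(Q[i][l] * P[l][j] for l in range(n + k))
      for j in range(n + k)] for i in range(n + k)]

expected = [A[0] + [0, 0],
            A[1] + [0, 0],
            C[0] + D[0],
            C[1] + D[1]]
print(M == expected)  # True
```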

3. Feb 25, 2014

### AlephZero

It might be neater to prove the more general result, that
$$\begin{pmatrix} A & B \\ C & D\end{pmatrix} \begin{pmatrix} P & Q \\ R & S\end{pmatrix} = \begin{pmatrix} AP + BR & AQ + BS \\ CP + DR & CQ + DS \end{pmatrix}$$
where the number of rows and columns in the partitioned matrices match up. Just write down the expression for a term in the product and split it into two parts.

That result is not restricted to square matrices, nor does it require any of the blocks to be square.
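A quick numerical illustration of that identity, with small made-up blocks (all 2 by 2 here, though the identity only needs the inner dimensions to match):

```python
# Check the block-multiplication identity:
# [[A, B], [C, D]] [[P, Q], [R, S]] = [[AP+BR, AQ+BS], [CP+DR, CQ+DS]].

def matmul(X, Y):
    """Plain matrix product of nested lists."""
    return [[sum(X[i][l] * Y[l][j] for l in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matadd(X, Y):
    """Entrywise sum of two matrices of the same shape."""
    return [[X[i][j] + Y[i][j] for j in range(len(X[0]))]
            for i in range(len(X))]

def stack(TL, TR, BL, BR):
    """Assemble one matrix from four blocks."""
    return ([tl + tr for tl, tr in zip(TL, TR)] +
            [bl + br for bl, br in zip(BL, BR)])

A = [[1, 2], [3, 4]]; B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]; D = [[1, 1], [1, 1]]
P = [[5, 6], [7, 8]]; Q = [[1, 0], [0, 1]]
R = [[2, 3], [4, 5]]; S = [[0, 2], [2, 0]]

left = matmul(stack(A, B, C, D), stack(P, Q, R, S))
right = stack(matadd(matmul(A, P), matmul(B, R)),
              matadd(matmul(A, Q), matmul(B, S)),
              matadd(matmul(C, P), matmul(D, R)),
              matadd(matmul(C, Q), matmul(D, S)))
print(left == right)  # True
```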