# Determinant is independent of row/column

1. Dec 6, 2012

### Bipolarity

I am curious about the proof of the fact that the value of a determinant computed using the Laplace (or cofactor) expansion is independent of along which row (or column) the expansion is performed.

Is this a very difficult proof? My textbook omits it entirely. I was curious if someone could provide a link to the proof, as I am interested in reading it. Wikipedia has a proof (http://en.wikipedia.org/wiki/Laplace_expansion), but it was too complicated for me to understand.

Does anyone know a simpler form of the proof, i.e. one that is perhaps longer but clearer in its statements for a less experienced reader?

BiP

2. Dec 6, 2012

### micromass

By the nature of the Laplace expansion, a proof is necessarily going to be ugly and technical.

HINT: it will be much easier to prove this yourself than to follow this proof.
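Before wading into the proof, it may help to see the claim verified numerically. Here is a short Python sketch (the function names `minor` and `expand` are my own, not from any library) that performs the Laplace expansion along each row of a sample matrix and confirms that all rows give the same value:

```python
def minor(B, i, j):
    """Submatrix of B with row i and column j removed (0-based indices)."""
    return [row[:j] + row[j+1:] for r, row in enumerate(B) if r != i]

def expand(B, i):
    """Laplace (cofactor) expansion of det(B) along row i (0-based)."""
    n = len(B)
    if n == 1:
        return B[0][0]
    return sum((-1) ** (i + j) * B[i][j] * expand(minor(B, i, j), 0)
               for j in range(n))

B = [[2, -1, 3, 0],
     [4, 5, -2, 1],
     [0, 3, 1, -3],
     [1, 2, 0, 4]]

# Expanding along each of the four rows gives the same number.
print([expand(B, i) for i in range(4)])
```

An analogous function expanding along a column would check the column version of the statement in the same way.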

Let me use the same notation as in Wikipedia. So take a matrix B. Let me show that expansion along the first row yields the same result as expansion along the second row; the more general statement is left to you. We prove this by induction. For the 1x1 case, the statement is trivial.

So assume that B is nxn. Expansion along the first row yields
$$b_{1,1}C_{1,1}+...+b_{1,n}C_{1,n}=b_{1,1}M_{1,1}-b_{1,2}M_{1,2}+...+(-1)^{n+1}b_{1,n}M_{1,n}$$

Expansion along the second row yields
$$b_{2,1}C_{2,1}+...+b_{2,n}C_{2,n}=-b_{2,1}M_{2,1}+b_{2,2}M_{2,2}+...+(-1)^{n+2}b_{2,n}M_{2,n}$$

We wish to calculate $M_{1,1}$. By definition, this is the determinant of the matrix that results if we remove the first row and the first column from B. By the induction hypothesis, we can calculate this determinant by taking the Laplace expansion along its first row (which consists of the entries $b_{2,2},\dots,b_{2,n}$). So we can write
$$M_{1,1}=b_{2,2}D_{1,2}^{1,2} - b_{2,3}D_{1,2}^{1,3}+...+(-1)^{2+n}b_{2,n}D_{1,2}^{1,n}$$
where $D_{a,b}^{c,d}$ is the determinant of the matrix that results from B if we remove rows a and b and columns c and d.
In general:
$$M_{1,k}=(-1)^{\delta(1,k)} b_{2,1}D_{1,2}^{1,k} +(-1)^{\delta(2,k)} b_{2,2}D_{1,2}^{2,k}+... + (-1)^{\delta(n,k)}b_{2,n}D_{1,2}^{n,k}$$

We use the following conventions: $D_{1,2}^{k,k}=0$, and $\delta(l,k)$ is the number of elements in $\{1,...,l-1\}\setminus\{k\}$.
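For concreteness, this formula for $M_{1,k}$ can be checked numerically. Below is a Python sketch (the helper names `submatrix`, `det`, and `delta` are my own; indices are 0-based internally but 1-based in the loop, to match the text) that verifies the displayed expansion for every k on a sample 4x4 matrix:

```python
def submatrix(B, rows, cols):
    """B with the given (0-based) rows and columns removed."""
    return [[x for c, x in enumerate(row) if c not in cols]
            for r, row in enumerate(B) if r not in rows]

def det(B):
    """Determinant via Laplace expansion along the first row."""
    if len(B) == 1:
        return B[0][0]
    return sum((-1) ** j * B[0][j] * det(submatrix(B, {0}, {j}))
               for j in range(len(B)))

def delta(l, k):
    """Number of elements of {1, ..., l-1} different from k (1-based, as in the text)."""
    return sum(1 for x in range(1, l) if x != k)

B = [[3, 1, -2, 5],
     [2, 0, 4, -1],
     [1, -3, 2, 2],
     [0, 5, 1, 3]]
n = len(B)

for k in range(1, n + 1):                       # 1-based column index k
    M_1k = det(submatrix(B, {0}, {k - 1}))      # minor: remove row 1 and column k
    rhs = sum((-1) ** delta(j, k) * B[1][j - 1]
              * det(submatrix(B, {0, 1}, {j - 1, k - 1}))
              for j in range(1, n + 1) if j != k)
    assert M_1k == rhs                          # the displayed formula holds
print("M_{1,k} expansion verified for k = 1 ..", n)
```

Since the matrix entries are integers, the comparison is exact, with no floating-point concerns.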

To calculate $M_{2,k}$, we again take the Laplace expansion along the first row of the corresponding submatrix (which now consists of entries from the first row of B). We get
$$M_{2,k}=(-1)^{\delta(1,k)} b_{1,1}D_{1,2}^{1,k} + (-1)^{\delta(2,k)}b_{1,2}D_{1,2}^{2,k}+...+(-1)^{\delta(n,k)}b_{1,n}D_{1,2}^{n,k}$$

We substitute these expressions for $M_{1,k}$ and $M_{2,k}$ into the two row expansions above.

By definition we know that $D_{1,2}^{j,k}=D_{1,2}^{k,j}$ (removing the same two columns in either order gives the same matrix). We wish to prove that this common minor picks up the same total coefficient in both sums.
The coefficient of $D_{1,2}^{j,k}$ in the first sum is:
$$(-1)^{k+1}b_{1,k}(-1)^{\delta(j,k)}b_{2,j}$$
The coefficient of $D_{1,2}^{k,j}$ in the first sum is:
$$(-1)^{j+1}b_{1,j}(-1)^{\delta(k,j)}b_{2,k}$$
So together, we have
$$(-1)^{k+\delta(j,k)+1}b_{1,k}b_{2,j}+ (-1)^{j+\delta(k,j)+1}b_{1,j}b_{2,k}$$

We do the same for the terms in the second sum. The coefficient of $D_{1,2}^{j,k}$ in the second sum is:
$$(-1)^{k+2}b_{2,k}(-1)^{\delta(j,k)}b_{1,j}$$
The coefficient of $D_{1,2}^{k,j}$ in the second sum is:
$$(-1)^{j+2}b_{2,j}(-1)^{\delta(k,j)}b_{1,k}$$
So together we have
$$(-1)^{k+\delta(j,k)+2}b_{1,j}b_{2,k} + (-1)^{j+\delta(k,j)+2}b_{1,k}b_{2,j}$$

For the two sums to be equal, it suffices to show that
$$(-1)^{j+\delta(k,j)+2}=(-1)^{k+\delta(j,k)+1}$$
(the $b_{1,j}b_{2,k}$ terms then match as well, by interchanging the roles of $j$ and $k$).
Assume first that $k<j$. Then $\delta(k,j)$ is the number of elements in $\{1,...,k-1\}\setminus \{j\}$ and this is k-1. So the left-hand side becomes
$$(-1)^{j+k+1}$$
If $k<j$, then $\delta(j,k)$ is the number of elements in $\{1,...,j-1\}\setminus \{k\}$ and this is j-2. So the right hand side becomes
$$(-1)^{k+j-1}$$
Clearly, the left-hand side equals the right-hand side, since the exponents $j+k+1$ and $k+j-1$ differ by 2. The case $j<k$ is completely analogous, which completes the proof.
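This sign identity is finite-checkable for any fixed range of indices. A tiny Python sketch (with `delta` my own name for the $\delta$ above) confirms it for all $j \neq k$ up to 12, covering both the case $k<j$ treated above and the symmetric case $j<k$:

```python
def delta(l, k):
    """Number of elements of {1, ..., l-1} different from k, as defined in the text."""
    return sum(1 for x in range(1, l) if x != k)

# Check (-1)^(j + delta(k,j) + 2) == (-1)^(k + delta(j,k) + 1) for all j != k.
n = 12
for j in range(1, n + 1):
    for k in range(1, n + 1):
        if j != k:
            assert (-1) ** (j + delta(k, j) + 2) == (-1) ** (k + delta(j, k) + 1)
print("sign identity holds for all j != k up to n =", n)
```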