What is the Concise Proof for the Determinant Product Rule?

AI Thread Summary
The discussion focuses on finding a concise proof for the determinant product rule, specifically that det(AB) = det(A)det(B). A detailed proof is provided using the definition of the determinant in terms of permutations and the properties of the sign function. The proof involves manipulating sums over permutations to show that the determinant of the product of two matrices can be expressed as the product of their determinants. Additionally, a more elegant, coordinate-free approach is suggested, utilizing the concept of linear maps on vector spaces. The conversation emphasizes the complexity of understanding the proof steps, indicating that careful examination is required.
etnad179
I used to know how to prove the statement for matrices
det(AB)=det(A)det(B) concisely but for the life of me I've forgotten it...

Does anyone know a concise proof for this?

Thanks!
 


Use the fact that, for matrices with a well-defined logarithm,

\det A = \exp (\operatorname{Tr} \ln A)
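
As a quick numerical sanity check of that identity, here is a minimal NumPy/SciPy sketch; it assumes SciPy's logm is available and uses a matrix close to the identity so the logarithm is well defined:

```python
# Sanity check of det(A) = exp(tr(log(A))) for a matrix with a well-defined logarithm.
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(0)
A = np.eye(3) + 0.1 * rng.standard_normal((3, 3))  # close to I, so logm is safe

lhs = np.linalg.det(A)
rhs = np.exp(np.trace(logm(A)))
assert np.isclose(lhs, np.real(rhs))  # the two numbers agree up to floating-point error
```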
 


The definition of the determinant is

\det A=\sum_\sigma (\operatorname{sgn}\sigma)A^1_{\sigma(1)}\cdots A^n_{\sigma(n)}

where the sum is over all permutations σ of the set {1,2,...,n}, sgn σ is 1 when the permutation is even and -1 when it's odd, and A^i_j denotes the entry in row i, column j. With this notation, you can do it as a fairly straightforward calculation:

\begin{align*}
\det(AB) &=\sum_\sigma (\operatorname{sgn}\sigma)(AB)^1_{\sigma(1)}\cdots (AB)^n_{\sigma(n)}=\sum_\sigma (\operatorname{sgn}\sigma)\Big(\sum_{i_1}A^1_{i_1}B^{i_1}_{\sigma(1)}\Big)\cdots \Big(\sum_{i_n}A^n_{i_n}B^{i_n}_{\sigma(n)}\Big)\\
&=\sum_{i_1,\dots,i_n}A^1_{i_1}\cdots A^n_{i_n}\underbrace{\sum_\sigma (\operatorname{sgn}\sigma) B^{i_1}_{\sigma(1)}\cdots B^{i_n}_{\sigma(n)}}_{=0\text{ unless }(i_1,\dots,i_n)\text{ is a permutation of }(1,\dots,n).}\\
&=\sum_\tau A^1_{\tau(1)}\cdots A^n_{\tau(n)}\sum_\sigma (\operatorname{sgn}\sigma) B^{\tau(1)}_{\sigma(1)}\cdots B^{\tau(n)}_{\sigma(n)}\\
&=\sum_\tau A^1_{\tau(1)}\cdots A^n_{\tau(n)}\sum_\sigma \underbrace{(\operatorname{sgn}\sigma)}_{=(\operatorname{sgn}\tau)(\operatorname{sgn}(\tau^{-1}\circ\sigma))} B^{1}_{\tau^{-1}\circ\sigma(1)}\cdots B^{n}_{\tau^{-1}\circ\sigma(n)}\\
&=\sum_\tau(\operatorname{sgn}\tau)A^1_{\tau(1)}\cdots A^n_{\tau(n)}\underbrace{\sum_{\tau^{-1}\circ\sigma}(\operatorname{sgn}(\tau^{-1}\circ\sigma))B^{1}_{\tau^{-1}\circ\sigma(1)}\cdots B^{n}_{\tau^{-1}\circ\sigma(n)}}_{=\det B}\\
&=(\det A)(\det B)
\end{align*}

but you will probably have to stare at this for a while before you understand all the steps.
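
For readers who prefer to experiment, the permutation-sum definition above translates directly into code. Here is a minimal Python sketch (using itertools.permutations, with the sign computed by counting inversions) that spot-checks \det(AB)=\det(A)\det(B) on small random matrices:

```python
# Minimal sketch of the permutation-sum (Leibniz) definition of the determinant,
# used to spot-check det(AB) = det(A) det(B) on small random matrices.
import itertools
import math
import numpy as np

def sgn(perm):
    """Sign of a permutation given as a tuple: +1 if even, -1 if odd (inversion count)."""
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def det_leibniz(M):
    """det M = sum over permutations sigma of sgn(sigma) * prod_i M[i, sigma(i)]."""
    n = M.shape[0]
    return sum(sgn(s) * math.prod(M[i, s[i]] for i in range(n))
               for s in itertools.permutations(range(n)))

rng = np.random.default_rng(1)
A, B = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
assert np.isclose(det_leibniz(A @ B), det_leibniz(A) * det_leibniz(B))
assert np.isclose(det_leibniz(A), np.linalg.det(A))  # agrees with the library determinant
```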
 






Hi, thanks for the help - I've been looking at this for a bit and I think I understand most of it.

Just to check: is \tau a permutation of the elements {i_1,\dots,i_n}? And how do we get from the permutation \tau in the row indices to its inverse \tau^{-1} appearing in the column indices of the matrix element B^{\tau(1)}_{\sigma(1)}?
 




etnad179 said:
Just to check: is \tau a permutation of the elements {i_1,\dots,i_n}?
\tau and \sigma are both permutations of {1,2,...,n}, i.e. they are bijections from that set onto itself. In the revised calculation below, \rho represents a permutation of {1,2,...,n} too.

etnad179 said:
And how do we get from the permutation \tau in the row indices to its inverse \tau^{-1} appearing in the column indices of the matrix element B^{\tau(1)}_{\sigma(1)}?
I think I did that step wrong. This is how I would like to handle the product B^{\tau(1)}_{\sigma(1)}\cdots B^{\tau(n)}_{\sigma(n)} today: First use the fact that real numbers commute, to rearrange the factors so that the row indices (the upper indices) appear in the order (1,2,...,n). I'll write asterisks instead of the column indices until we have figured out what we should write in those slots.

B^{\tau(1)}_{\sigma(1)}\cdots B^{\tau(n)}_{\sigma(n)}=B^1_*\cdots B^n_*

Now use the fact that for each k in {1,2,...,n} we have k=\tau(\tau^{-1}(k)), to rewrite this as

=B^{\tau(\tau^{-1}(1))}_*\cdots B^{\tau(\tau^{-1}(n))}_*.

Now just look at the product we started with and note that when the row index is \tau(k), the column index is \sigma(k). This tells us what the column indices are.

=B^{\tau(\tau^{-1}(1))}_{\sigma(\tau^{-1}(1))}\cdots B^{\tau(\tau^{-1}(n))}_{\sigma(\tau^{-1}(n))} =B^{1}_{\sigma(\tau^{-1}(1))}\cdots B^{n}_{\sigma(\tau^{-1}(n))}.
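
As a concrete spot check of this reindexing, here is a minimal NumPy sketch (0-based indices, random B, \sigma and \tau):

```python
# Check that prod_k B[tau(k), sigma(k)] == prod_k B[k, sigma(tau^{-1}(k))]
# for a random matrix B and random permutations sigma, tau (0-based indices).
import numpy as np

rng = np.random.default_rng(2)
n = 5
B = rng.standard_normal((n, n))
sigma, tau = rng.permutation(n), rng.permutation(n)
tau_inv = np.argsort(tau)            # tau_inv[tau[k]] == k, i.e. the inverse permutation

lhs = np.prod([B[tau[k], sigma[k]] for k in range(n)])
rhs = np.prod([B[k, sigma[tau_inv[k]]] for k in range(n)])
assert np.isclose(lhs, rhs)          # same factors, just listed in a different order
```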

So the calculation should look like this:

\begin{align*}
\det(AB) &=\sum_\sigma (\operatorname{sgn}\sigma)(AB)^1_{\sigma(1)}\cdots (AB)^n_{\sigma(n)}=\sum_\sigma (\operatorname{sgn}\sigma)\Big(\sum_{i_1}A^1_{i_1} B^{i_1}_{\sigma(1)}\Big)\cdots \Big(\sum_{i_n}A^n_{i_n}B^{i_n}_{\sigma(n)}\Big)\\
&=\sum_{i_1,\dots,i_n}A^1_{i_1}\cdots A^n_{i_n}\underbrace{\sum_\sigma (\operatorname{sgn}\sigma) B^{i_1}_{\sigma(1)}\cdots B^{i_n}_{\sigma(n)}}_{=0\text{ unless there's a permutation }\tau\text{ such that }\tau(1,\dots,n)=(i_1,\dots,i_n).}\\
&=\sum_\tau A^1_{\tau(1)}\cdots A^n_{\tau(n)}\sum_\sigma (\operatorname{sgn}\sigma) B^{\tau(1)}_{\sigma(1)}\cdots B^{\tau(n)}_{\sigma(n)}\\
&=\sum_\tau A^1_{\tau(1)}\cdots A^n_{\tau(n)}\sum_\sigma \underbrace{(\operatorname{sgn}\sigma)}_{=(\operatorname{sgn}\tau)(\operatorname{sgn}(\sigma\circ\tau^{-1}))} B^{1}_{\sigma(\tau^{-1}(1))}\cdots B^{n}_{\sigma(\tau^{-1}(n))}\\
&=\sum_\tau(\operatorname{sgn}\tau)A^1_{\tau(1)}\cdots A^n_{\tau(n)}\sum_{\rho}(\operatorname{sgn}\rho)B^{1}_{\rho(1)}\cdots B^{n}_{\rho(n)}\\
&=(\det A)(\det B)
\end{align*}

To understand the step where I introduced the symbol \rho, note that as \sigma runs over all permutations of {1,2,...,n}, \rho=\sigma\circ\tau^{-1} does too, and \operatorname{sgn}\sigma=(\operatorname{sgn}\tau)(\operatorname{sgn}\rho), so the two sums contain exactly the same terms.
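
A minimal Python sketch that spot-checks this claim for a fixed \tau with n = 4: as \sigma runs over all permutations, \rho=\sigma\circ\tau^{-1} hits every permutation exactly once, with \operatorname{sgn}\sigma=(\operatorname{sgn}\tau)(\operatorname{sgn}\rho):

```python
# Check that rho = sigma o tau^{-1} runs over every permutation exactly once as
# sigma does, and that sgn(sigma) = sgn(tau) * sgn(rho), for a fixed tau (n = 4).
import itertools

def sgn(p):
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

n = 4
tau = (2, 0, 3, 1)                                   # an arbitrary fixed permutation (0-based)
tau_inv = tuple(tau.index(k) for k in range(n))      # tau^{-1}

rhos = set()
for sigma in itertools.permutations(range(n)):
    rho = tuple(sigma[tau_inv[k]] for k in range(n)) # rho = sigma o tau^{-1}
    assert sgn(sigma) == sgn(tau) * sgn(rho)
    rhos.add(rho)

assert len(rhos) == len(list(itertools.permutations(range(n))))  # all n! permutations are hit
```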
 


The most concise proof uses the most elegant, coordinate-free definition. Namely, if A is an endomorphism of the n-dimensional vector space V, then the induced map

\hat{A}:\bigwedge^nV\to \bigwedge^nV
is a linear map between one-dimensional spaces. The scalar by which it acts is the determinant of A, so for \omega\in\bigwedge^nV we have

\hat{A}\omega=\det A\cdot\omega.
Since \widehat{AB}=\hat{A}\circ\hat{B}, it follows that
\widehat{AB}\omega=\hat{A}(\hat{B}\omega)=\hat{A}(\det B\cdot\omega)=\det A\det B\cdot\omega,
so \det(AB)=\det A\det B.
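
The same argument can be illustrated numerically: represent a top-degree wedge v_1\wedge\cdots\wedge v_n by the matrix with columns v_i, and read off the scalar by which \hat{A} acts as a ratio of signed volumes. A minimal NumPy sketch, using np.linalg.det only as a stand-in for the one-dimensional space \bigwedge^n V:

```python
# Illustrate the coordinate-free argument numerically: hat(A) acts on the
# one-dimensional space Lambda^n(V) by a single scalar, read off here as the
# ratio of signed volumes det([A v_1 ... A v_n]) / det([v_1 ... v_n]).
import numpy as np

rng = np.random.default_rng(3)
n = 4
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))

def scaling_factor(M, basis):
    """Scalar lam with (M v_1) ^ ... ^ (M v_n) = lam * (v_1 ^ ... ^ v_n), columns of basis = v_i."""
    return np.linalg.det(M @ basis) / np.linalg.det(basis)

V1, V2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))  # two random bases
# The scalar does not depend on the chosen top-degree wedge omega...
assert np.isclose(scaling_factor(A, V1), scaling_factor(A, V2))
# ...and it is multiplicative, i.e. hat(AB) = hat(A) o hat(B).
assert np.isclose(scaling_factor(A @ B, V1),
                  scaling_factor(A, V1) * scaling_factor(B, V1))
```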
 
