Proof that Determinant is Multiplicative for Commutative Rings

Site
Is there a nice way to show that ##\det(AB)=\det(A)\det(B)##, where ##A## and ##B## are ##n \times n## matrices over a commutative ring?

I'm hoping there is some analogue to the construction for vector spaces that defines the determinant in a natural way using alternating multilinear mappings...

Otherwise would you just have to bash out the identity using the Leibniz formula for the determinant?
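If you do want to bash it out, the Leibniz formula only uses the ring operations ##+## and ##\cdot##, so it can be checked mechanically over any commutative ring. Here is a small sanity check (a sketch using SymPy so the entries live in the polynomial ring ##\mathbb{Z}[x,y]##; the particular matrices are arbitrary examples, not from the thread):

```python
from itertools import permutations

import sympy as sp

def leibniz_det(M):
    """Determinant via the Leibniz formula:
    sum over permutations sigma of sgn(sigma) * prod_i M[i, sigma(i)].
    Only ring addition and multiplication are used, so the same code
    works verbatim over any commutative ring of entries."""
    n = M.rows
    total = 0
    for perm in permutations(range(n)):
        # sign of the permutation = (-1)^(number of inversions)
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        term = (-1) ** inversions
        for i in range(n):
            term = term * M[i, perm[i]]
        total = total + term
    return sp.expand(total)

# Check multiplicativity for matrices over Z[x, y]
x, y = sp.symbols('x y')
A = sp.Matrix([[x, 1], [2, y]])
B = sp.Matrix([[y, x], [3, x*y]])
assert leibniz_det(A * B) == sp.expand(leibniz_det(A) * leibniz_det(B))
```

Of course this only spot-checks the identity for particular matrices; a proof along these lines expands both sides symbolically and matches terms.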
 
##\det## is just the induced action of a linear map on the top exterior power ##\Lambda^n V##. Since ##\Lambda^n V## has rank ##1##, that action is multiplication by a scalar, and the scalar is the determinant. Taking some ##v_i \in V##, we have ##\det A : \Lambda^n V \to \Lambda^n V## given by

$$(\det A)(v_1 \wedge \ldots \wedge v_n) = A(v_1) \wedge \ldots \wedge A(v_n).$$
From here it is easy to show ##\det AB = \det A \det B## by using the usual composition of linear maps on each factor in the wedge product on the right.
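Spelling out that composition step, the whole computation is one line:

$$(AB)(v_1) \wedge \ldots \wedge (AB)(v_n) = A(Bv_1) \wedge \ldots \wedge A(Bv_n) = \det(A)\,\bigl(Bv_1 \wedge \ldots \wedge Bv_n\bigr) = \det(A)\det(B)\,(v_1 \wedge \ldots \wedge v_n).$$

The left-hand side is ##\det(AB)\,(v_1 \wedge \ldots \wedge v_n)##, so comparing coefficients on the rank-##1## module ##\Lambda^n V## gives ##\det(AB) = \det(A)\det(B)##.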
 
Thanks, Ben. To clarify, do you mean that we set ##V = R^n## so that the linear maps on ##V## are exactly the ##n \times n## matrices?

Also, do you know of a textbook that explains exterior algebra (from the module perspective) and its connections to the determinant from the ground up?
 
##V## can be any free module of rank ##n##; it doesn't have to be ##R^n## specifically.
 
Site said:
Is there a nice way to show that ##\det(AB)=\det(A)\det(B)##, where ##A## and ##B## are ##n \times n## matrices over a commutative ring?

I'm hoping there is some analogue to the construction for vector spaces that defines the determinant in a natural way using alternating multilinear mappings...

Otherwise would you just have to bash out the identity using the Leibniz formula for the determinant?

Maybe you can first show (not too hard) that the determinant is multiplicative for elementary matrices ##E_i##. Then write ##B## as a product of elementary matrices and rearrange.
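For the record, the pieces of that argument look like this (one caveat: over a general commutative ring not every matrix is a product of elementary matrices, so this route works cleanly over a field, where any invertible ##B## factors that way and singular ##B## can be handled separately). If ##B = E_1 E_2 \cdots E_k##, then

$$\det(AB) = \det(A E_1 \cdots E_k) = \det(A)\,\det(E_1) \cdots \det(E_k) = \det(A)\det(B),$$

using that right-multiplication by ##E_i## performs a column operation whose effect on the determinant is multiplication by ##\det(E_i)##: namely ##-1## for a column swap, ##c## for scaling a column by ##c##, and ##1## for adding a multiple of one column to another.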
 