Matrix determinant operator commutative?

el_llavero
I've been going through properties of determinants of matrices and found the following:

Assuming products are defined and the matrices involved are nonsingular of the same order

The determinant of the product of any number of matrices is equal to the product of the determinants of each matrix, where the order of the factors does not matter:

det(AB)=det(A)det(B)
det(AB)=det(B)det(A)
det(BA)=det(A)det(B)
det(BA)=det(B)det(A)

det(ABC)=det(C)det(A)det(B)

det(ACB)=det(A)det(B)det(C)

Is this correct? And is there a way to describe this property in terms of commutativity? I know that in general matrix multiplication is not commutative (unless, for example, the matrices involved are diagonal and of the same order). However, the determinant seems not to preserve the non-commutativity of matrix multiplication: on either side of the equality, the factors can be reordered freely. What I'm looking for here is a formal way of describing this property that captures the fact that the order of factors does not matter, and whether commutativity should be used in any part of that description.
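As a quick numeric sanity check of the claims above (a sketch using NumPy, not part of the original thread), the determinant of a product equals the product of the determinants in any order, even though the matrix products themselves differ:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# det(AB) = det(A)det(B) = det(B)det(A) = det(BA):
# the determinants are scalars, so their product commutes.
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(B @ A))

# Yet AB and BA are generally different matrices:
assert not np.allclose(A @ B, B @ A)
```

The point the check illustrates: non-commutativity lives in the matrices, while det maps them into the scalars, where multiplication is commutative.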
 
Probably multiplicativity of the determinant, together with commutativity of its target space (the scalars), are the terms you are looking for.
 
Thanks, that makes sense. Can you explain a bit more why the order of factors doesn't matter for the determinant even though matrix multiplication is not commutative?
 
Something isn't completely accurate in the second sentence of the original post:

"Assuming products are defined and the matrices involved are nonsingular of the same order"

The matrices do not need to be nonsingular for this property to hold, so a more accurate statement would be:

"Assuming products are defined and the matrices involved are square matrices of the same order"
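To back up the correction above (a small NumPy sketch, not from the thread): multiplicativity of the determinant holds even when a factor is singular; both sides of the identity are then zero.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular: rows are proportional, det(A) = 0
B = np.array([[0.0, 1.0],
              [1.0, 3.0]])   # nonsingular: det(B) = -1

# det(AB) = det(A) * det(B) still holds; both sides equal 0.
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(A @ B), 0.0)
```

Nonsingularity is only needed if one wants to divide by a determinant (e.g. det(A⁻¹) = 1/det(A)), not for the product rule itself.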
 