Determinant Question: Understanding Row Interchange & Cycles

  • Thread starter: Weather Freak
  • Tags: Determinant
Hi Folks,

I have a question about determinants that is probably quite simple. I know that if you have a matrix and you interchange two rows, the determinant changes sign. However, if you cyclically shift the rows up or down, keeping them in the same cyclic order, the determinant does not change.

What is the theorem or other rule that governs the difference between the two? Is it a fundamental property of matrices that perhaps I've missed along the way? I've searched through a variety of textbooks and websites, and seen that this is indeed true, but no one has provided an explanation as to why.

Thanks!
 
It simply follows from that very rule, i.e. det B = −det A if one obtains B from A by interchanging two rows (or columns). A cyclic shift of the rows of a 3×3 matrix is a composition of two such interchanges, so the determinant picks up a factor of (−1)² = +1. And the rule itself follows from the definition of the determinant.
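To see both effects numerically, here is a small check using NumPy (the matrix is my own example, not from the thread): swapping two rows flips the sign of the determinant, while a cyclic shift of all three rows of a 3×3 matrix, being two swaps, leaves it unchanged.

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])

# Swap rows 0 and 1: one interchange, so the determinant flips sign.
B = A[[1, 0, 2], :]
print(np.isclose(np.linalg.det(B), -np.linalg.det(A)))  # True

# Cyclic shift (bottom row moved to the top): two interchanges,
# so the sign factor is (-1)^2 = +1 and the determinant is unchanged.
C = A[[2, 0, 1], :]
print(np.isclose(np.linalg.det(C), np.linalg.det(A)))  # True
```

Note that for an n×n matrix a full cyclic shift is n−1 interchanges, so the determinant is preserved only when n is odd, as it is here.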
 
Linear algebra is not by any means my strong suit, but if you have a 3x3 matrix and you shift the rows down cyclically, you still have the same 3 vectors forming the matrix. Wouldn't that be why the determinant doesn't change?
 
The determinant is proportional to ##\epsilon_{ijk\ldots} A_{1i}A_{2j}A_{3k}\cdots##, where ##\epsilon## is the n-dimensional Levi-Civita symbol (equal to +1 if the indices are an even permutation of ##1,2,3,\ldots## and −1 if they are an odd permutation) and ##A## is an n×n matrix. In this expression the rows (or columns) appear as factors in a product. Reordering the factors themselves changes nothing, but it does permute the indices of ##\epsilon##. Therefore, interchanging rows gives a minus or a plus sign depending on whether the interchange amounts to an odd or an even permutation of the indices of ##\epsilon##.
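Written out as a sum, this is the Leibniz formula ##\det A = \sum_{\sigma} \operatorname{sgn}(\sigma) \prod_i A_{i\,\sigma(i)}##, where the sum runs over all permutations ##\sigma## of the indices. A minimal sketch of that sum (function names `sign` and `det` are mine, for illustration):

```python
from itertools import permutations

def sign(perm):
    """Sign of a permutation: (-1) raised to the number of inversions."""
    inversions = sum(1 for i in range(len(perm))
                       for j in range(i + 1, len(perm))
                       if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def det(A):
    """Leibniz formula: sum over permutations p of sign(p) * prod_i A[i][p[i]]."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        prod = 1
        for i, j in enumerate(p):
            prod *= A[i][j]
        total += sign(p) * prod
    return total

A = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]
print(det(A))  # 18
```

Swapping two rows of `A` reorders which row supplies each factor, which is exactly a transposition applied to the permutation indices, so every term in the sum flips sign.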
 