Determinants of order n

jeff1evesque
Theorem 4.3: The determinant of an n×n matrix is a linear function of each row when the remaining rows are held fixed. That is, for 1 <= r <= n, we have

det( a_1, ..., a_(r-1), u + kv, a_(r+1), ..., a_n ) = det( a_1, ..., a_(r-1), u, a_(r+1), ..., a_n )
+ k det( a_1, ..., a_(r-1), v, a_(r+1), ..., a_n )

whenever k is a scalar and u, v, and each a_i are row vectors in F^n.

Proof: The proof is by mathematical induction on n. The result is immediate if n = 1. Assume that for some integer n >= 2 the determinant of any (n-1)×(n-1) matrix is a linear function of each row when the remaining rows are held fixed. Let A be an n×n matrix with rows a_1, a_2, ..., a_n, and suppose that for some r (1 <= r <= n) we have a_r = u + kv for some u, v in F^n and some scalar k. Let u = (b_1, b_2, ..., b_n) and v = (c_1, c_2, ..., c_n), and let B and C be the matrices obtained from A by replacing row r of A by u and v, respectively. We must prove that det(A) = det(B) + k det(C). We leave the proof of this fact to the reader for the case r = 1. For r > 1 and 1 <= j <= n, the rows of A_(1,j), B_(1,j), and C_(1,j) - the minors obtained by deleting the first row and jth column, as in cofactor expansion along the first row - are the same except for row r-1. Moreover, row r-1 of A_(1,j) is

(b_1 + kc_1, ..., b_(j-1) + kc_(j-1), b_(j+1) + kc_(j+1), ..., b_n + kc_n),

which is the sum of row r-1 of B_(1,j) and k times row r-1 of C_(1,j).

Question: In the italicized part above, I have no idea what they mean by saying the rows are the same (of the matrices left after deleting the first row and a column of A) except for row r-1.
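As a sanity check on the theorem statement, linearity in a single row can be verified numerically. This is a sketch using NumPy; the matrix size, the rows u and v, and the scalar k below are chosen arbitrarily for illustration:

```python
import numpy as np

# Fix every row of a 3x3 matrix except row r, then compare
# det with row u + k*v against det with row u plus k times det with row v.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)
k = 2.5
r = 1  # 0-based index of the row being varied

M_sum, M_u, M_v = A.copy(), A.copy(), A.copy()
M_sum[r] = u + k * v
M_u[r] = u
M_v[r] = v

lhs = np.linalg.det(M_sum)
rhs = np.linalg.det(M_u) + k * np.linalg.det(M_v)
assert np.isclose(lhs, rhs)  # det is linear in row r
```

The same check passes for any choice of r, u, v, and k, which is exactly what the theorem asserts.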


Werg22
A_(1,j) refers to the (n-1)×(n-1) matrix obtained by crossing out the 1st row and jth column of A. Likewise for B_(1,j) and C_(1,j). Since A, B, and C differ only at row r, the matrices A_(1,j), B_(1,j), and C_(1,j) differ only at row r-1. Moreover, row r-1 of A_(1,j) is a linear combination of the corresponding rows of B_(1,j) and C_(1,j) - this is simple to see.
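To make the index shift concrete, here is a small sketch (the helper name first_row_minor is mine, not the textbook's) that forms A_(1,j) by deleting the first row and the jth column, and checks that when two matrices differ only at row r, their minors differ only at row r-1:

```python
import numpy as np

def first_row_minor(M, j):
    """Return M_(1,j): M with its 1st row and jth column deleted (j is 1-based)."""
    return np.delete(np.delete(M, 0, axis=0), j - 1, axis=1)

A = np.arange(16, dtype=float).reshape(4, 4)
B = A.copy()
B[2] = [9, 9, 9, 9]  # A and B differ only at row 3 (1-based), i.e. r = 3

for j in range(1, 5):
    Aj, Bj = first_row_minor(A, j), first_row_minor(B, j)
    differing = np.nonzero((Aj != Bj).any(axis=1))[0]
    # The minors differ only at 0-based row 1, i.e. row r - 1 = 2 in 1-based terms.
    assert list(differing) == [1]
```

Deleting the first row shifts every remaining row up by one, which is why row r of the original matrices becomes row r-1 of the minors.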

jeff1evesque

OK, I understand that A, B, and C differ only at row r, but shouldn't the same be true of the matrices A_(1,j), B_(1,j), and C_(1,j)?

Werg22
A_(1,j) is obtained by eliminating the 1st row of A and its jth column; the remaining entries form A_(1,j). The same goes for B_(1,j) with respect to B and C_(1,j) with respect to C.

jeff1evesque

I am not asking what the definition of a determinant is. I am asking how, in this particular proof, after eliminating the first row and its corresponding column, the resulting matrices differ at row r-1.

Thanks.

Werg22
Come up with three 4×4 matrices A, B, and C with identical entries except at one row (say row 3). See what A_(1,j), B_(1,j), and C_(1,j) are and how they differ from each other. Hopefully, that will illustrate the point clearly.

jeff1evesque

I'm claiming they differ at row r, since these new matrices contain the same entries as the n×n (or 4×4) matrix. Therefore row r of A will be u_i + kv_i, row r of B will be u_i, and row r of C will be v_i. This shows that the only row that differs is row r.

Werg22
$$A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 1 & 1 & 1 & 3 \\ 9 & 10 & 11 & 12 \end{bmatrix}$$

$$B = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 1 & 1 & 1 & 1 \\ 9 & 10 & 11 & 12 \end{bmatrix}$$

$$C = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 0 & 0 & 0 & 1 \\ 9 & 10 & 11 & 12 \end{bmatrix}$$

Row 3 of A is a linear combination of row 3 of B and row 3 of C (specifically, row 3 of A = row 3 of B + 2 times row 3 of C). The matrices differ at row 3. Also, taking j = 1, clearly:

$$A_{1,j} = \begin{bmatrix} 6 & 7 & 8 \\ 1 & 1 & 3 \\ 10 & 11 & 12 \end{bmatrix}$$

$$B_{1,j} = \begin{bmatrix} 6 & 7 & 8 \\ 1 & 1 & 1 \\ 10 & 11 & 12 \end{bmatrix}$$

$$C_{1,j} = \begin{bmatrix} 6 & 7 & 8 \\ 0 & 0 & 1 \\ 10 & 11 & 12 \end{bmatrix}$$

In which row do the last three matrices differ? Rereading your posts, I'm wondering: maybe you thought "crossing out" meant zeroing the entries?
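These specific matrices can also be checked numerically. This is a sketch using NumPy; note that here k = 2, since row 3 of A equals row 3 of B plus twice row 3 of C:

```python
import numpy as np

A = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [1, 1, 1, 3], [9, 10, 11, 12]], float)
B = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [1, 1, 1, 1], [9, 10, 11, 12]], float)
C = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [0, 0, 0, 1], [9, 10, 11, 12]], float)

# Row 3 of A = row 3 of B + 2 * row 3 of C, so the scalar k is 2 here.
assert np.allclose(A[2], B[2] + 2 * C[2])

# Minors A_(1,1), B_(1,1), C_(1,1): delete the 1st row and 1st column.
Am, Bm, Cm = (np.delete(np.delete(M, 0, axis=0), 0, axis=1) for M in (A, B, C))

# The three minors agree in their first and last rows; they differ only
# in their middle row, which is row r - 1 = 2 of each minor.
assert np.array_equal(Am[[0, 2]], Bm[[0, 2]])
assert np.array_equal(Bm[[0, 2]], Cm[[0, 2]])

# Linearity of det in row 3: det(A) = det(B) + 2 * det(C).
assert np.isclose(np.linalg.det(A), np.linalg.det(B) + 2 * np.linalg.det(C))
```

All three assertions pass, matching both the row-indexing claim and the determinant identity from the theorem.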

jeff1evesque
No, I know what a determinant is. I was just confused thinking that the new matrices A_(1,j), B_(1,j), and C_(1,j) would have row 1 considered as row 2 (of the original matrix). If the text had said they differ at row r-1 of the new matrices, I would have understood it right off the bat. For this reason I kept thinking: yes, they differ (in your example at row 2) at r-1, but this is really differing at row r, since we transplanted the entries from the original matrix. But I understand the proof now. I don't like the idea of the induction hypothesis; it's a big assumption (without proof). Thanks a lot.
