Determinant of nxn Matrix as Linear Function of Each Row

  • #1
jeff1evesque
Theorem 4.3: The determinant of an nxn matrix is a linear function of each row when the remaining rows are held fixed. That is, for 1 <= r <= n, we have

[tex] \det(a_1, \ldots, a_{r-1}, u + kv, a_{r+1}, \ldots, a_n) = \det(a_1, \ldots, a_{r-1}, u, a_{r+1}, \ldots, a_n) + k \det(a_1, \ldots, a_{r-1}, v, a_{r+1}, \ldots, a_n) [/tex]

whenever k is a scalar and u, v, and each a_i are row vectors in F^n.
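
A quick numerical sanity check of this identity (a minimal NumPy sketch; the rows, the scalar k = 3, and the choice r = 2 are arbitrary values picked just for illustration):

[code]
import numpy as np

# Minimal sketch for n = 3, r = 2: if row r of A is u + k*v, and B, C agree
# with A except that row r is u and v respectively, then
# det(A) = det(B) + k*det(C).
rng = np.random.default_rng(0)
a1, a3 = rng.integers(-5, 5, size=3), rng.integers(-5, 5, size=3)
u, v = rng.integers(-5, 5, size=3), rng.integers(-5, 5, size=3)
k = 3.0

A = np.vstack([a1, u + k * v, a3])
B = np.vstack([a1, u, a3])
C = np.vstack([a1, v, a3])

# The two sides agree up to floating-point rounding error.
assert np.isclose(np.linalg.det(A), np.linalg.det(B) + k * np.linalg.det(C))
[/code]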

Proof: The proof is by mathematical induction on n. The result is immediate if n = 1. Assume that for some integer n >= 2 the determinant of any (n-1)x(n-1) matrix is a linear function of each row when the remaining rows are held fixed. Let A be an nxn matrix with rows a_1, a_2, ..., a_n, respectively, and suppose that for some r (1 <= r <= n) we have a_r = u + kv for some u, v in F^n and some scalar k. Let u = (b_1, b_2, ..., b_n) and v = (c_1, c_2, ..., c_n), and let B and C be the matrices obtained from A by replacing row r of A by u and v, respectively. We must prove that det(A) = det(B) + k det(C). We leave the proof of this fact to the reader for the case r = 1. For r > 1 and 1 <= j <= n, the rows of A_(1,j), B_(1,j), C_(1,j) - the matrices obtained by deleting row 1 and column j, as in the cofactor expansion along the first row - are the same except for row r-1. Moreover, row r-1 of A_(1,j) is

[tex](b_1 + kc_1, \ldots, b_{j-1} + kc_{j-1}, b_{j+1} + kc_{j+1}, \ldots, b_n + kc_n),[/tex]

which is the sum of row r-1 of B_(1,j) and k times row r-1 of C_(1,j).
Question: In the italicized part above, I have no idea what they mean by the rows (of the matrices left after expanding the determinant along the first row of A) being the same except for row r-1?
 
  • #2
A_(1,j) refers to the (n-1)x(n-1) matrix obtained by crossing out the 1st row and jth column of A. Likewise for B_(1,j) and C_(1,j). Since A, B, and C differ only at row r, the matrices A_(1,j), B_(1,j), and C_(1,j) differ only at row r-1. Moreover, row r-1 of A_(1,j) is a linear combination of the corresponding rows in B_(1,j) and C_(1,j); this is simple to see.
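
To see this concretely, here is a minimal NumPy sketch (the 4x4 matrices and the choices r = 3, k = 2 are made up purely for illustration) that builds the minors and prints which row they differ at:

[code]
import numpy as np

def minor(M, i, j):
    # Delete row i and column j (0-indexed); minor(M, 0, j) is M_(1, j+1)
    # in the thread's 1-indexed notation.
    return np.delete(np.delete(M, i, axis=0), j, axis=1)

# Made-up 4x4 matrices that agree everywhere except at row r = 3:
# row 3 of A is row 3 of B plus k = 2 times row 3 of C.
B = np.array([[1, 0, 2, 1], [3, 1, 4, 1], [1, 1, 1, 1], [2, 7, 1, 8]])
C = np.array([[1, 0, 2, 1], [3, 1, 4, 1], [2, 3, 4, 5], [2, 7, 1, 8]])
A = B.copy()
A[2] = B[2] + 2 * C[2]

for j in range(4):
    Aj, Bj, Cj = minor(A, 0, j), minor(B, 0, j), minor(C, 0, j)
    # Deleting row 1 shifts the old row 3 up by one place, so the three minors
    # differ only at row r-1 = 2 of the minor (index 1).
    print(j, np.where((Aj != Bj).any(axis=1) | (Aj != Cj).any(axis=1))[0])  # always [1]
[/code]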
 
  • #3
Werg22 said:
A_(1,j) refers to the (n-1)x(n-1) matrix obtained by crossing out the 1st row and jth column of A. Likewise for B_(1,j) and C_(1,j). Since A, B, and C differ only at row r, the matrices A_(1,j), B_(1,j), and C_(1,j) differ only at row r-1. Moreover, row r-1 of A_(1,j) is a linear combination of the corresponding rows in B_(1,j) and C_(1,j); this is simple to see.

OK, I understand that A, B, C differ at row r only, but shouldn't the same be true for the matrices A_(1,j), B_(1,j), and C_(1,j)?
 
  • #4
A_(1,j) is obtained by eliminating the 1st row of A and its jth column. The remaining entries form A_(1,j). The same goes with B_(1,j) with respect to B and C_(1,j) with respect to C.
 
  • #5
Werg22 said:
A_(1,j) is obtained by eliminating the 1st row of A and its jth column. The remaining entries form A_(1,j). The same goes with B_(1,j) with respect to B and C_(1,j) with respect to C.

I am not asking what the definition of a determinant is. I am asking how, in this particular proof, the matrices differ at row r-1 after eliminating the first row and its corresponding column.

Thanks.
 
  • #6
Come up with three 4x4 matrices A, B, and C with identical entries except at one row (say row 3). See what A_(1,j), B_(1,j), and C_(1,j) are and how they differ from each other. Hopefully, that will illustrate the point clearly.
 
  • #7
Werg22 said:
Come up with three 4x4 matrices A, B, and C with identical entries except at one row (say row 3). See what A_(1,j), B_(1,j), and C_(1,j) are and how they differ from each other. Hopefully, that will illustrate the point clearly.

I'm claiming they differ at row r, since these new matrices contain the same entries as the original nxn (or 4x4) matrix. Therefore row r of A will be u_i + kv_i, row r of B will be u_i, and row r of C will be v_i. This shows that the only row that differs is row r.
 
  • #8
[tex] A = \begin{bmatrix}
1 & 2 & 3 & 4 \\
5 & 6 & 7 & 8 \\
1 & 1 & 1 & 3 \\
9 & 10 & 11 & 12
\end{bmatrix} [/tex]

[tex] B = \begin{bmatrix}
1 & 2 & 3 & 4 \\
5 & 6 & 7 & 8 \\
1 & 1 & 1 & 1 \\
9 & 10 & 11 & 12
\end{bmatrix} [/tex]

[tex] C = \begin{bmatrix}
1 & 2 & 3 & 4 \\
5 & 6 & 7 & 8 \\
0 & 0 & 0 & 1 \\
9 & 10 & 11 & 12
\end{bmatrix} [/tex]

Row 3 of A is a linear combination of row 3 of B and row 3 of C. The matrices differ at row 3. Also, taking j = 1, clearly:

[tex] A_{1,j} = \begin{bmatrix}
6 & 7 & 8 \\
1 & 1 & 3 \\
10 & 11 & 12
\end{bmatrix} [/tex]

[tex] B_{1,j} = \begin{bmatrix}
6 & 7 & 8 \\
1 & 1 & 1 \\
10 & 11 & 12
\end{bmatrix} [/tex]

[tex] C_{1,j} = \begin{bmatrix}
6 & 7 & 8 \\
0 & 0 & 1 \\
10 & 11 & 12
\end{bmatrix} [/tex]

In which row do the last three matrices differ? Rereading your posts, I'm wondering: maybe you thought "crossing out" meant zeroing the entries?
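
For anyone who wants to check this numerically, here is a minimal NumPy sketch using the matrices above (here k = 2, since row 3 of A is row 3 of B plus twice row 3 of C):

[code]
import numpy as np

# The three matrices from this post; row 3 of A is row 3 of B plus k = 2 times row 3 of C.
A = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [1, 1, 1, 3], [9, 10, 11, 12]])
B = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [1, 1, 1, 1], [9, 10, 11, 12]])
C = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [0, 0, 0, 1], [9, 10, 11, 12]])

def minor(M, i, j):
    # Delete row i and column j (0-indexed); the A_{1,1} above is minor(A, 0, 0).
    return np.delete(np.delete(M, i, axis=0), j, axis=1)

A11, B11, C11 = minor(A, 0, 0), minor(B, 0, 0), minor(C, 0, 0)

# The minors differ only at row 2 (index 1): deleting the first row shifted
# the original row 3 up to the second row of each minor.
print(np.where((A11 != B11).any(axis=1) | (A11 != C11).any(axis=1))[0])  # [1]

# By linearity in that row, det(A_{1,1}) = det(B_{1,1}) + 2*det(C_{1,1}).
print(np.linalg.det(A11), np.linalg.det(B11) + 2 * np.linalg.det(C11))  # both 8.0, up to rounding
[/code]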
 
  • #9
No, I know what a determinant is. I was just confused because I was thinking of row 1 of the new matrices A_(1,j), B_(1,j), and C_(1,j) as row 2 of the original matrix. If the text had said they differ at row r-1 of the new matrices, I would have understood it right off the bat. For that reason I kept thinking: yes, they differ at row r-1 (in your example at row 2), but that is really differing at row r, since the elements were transplanted from the original matrix. But I understand the proof now. I don't like the idea of the induction hypothesis, though; it's a big assumption (without proof). Thanks a lot.
 

1. What are determinants of order n?

Determinants of order n are mathematical values associated with square matrices of size n x n. They are used to determine certain properties of the matrix, such as whether it is invertible or singular, and also play a crucial role in solving systems of linear equations.

2. How do you calculate the determinant of a matrix of order n?

There are a few different methods for calculating determinants of order n, including cofactor expansion, Gaussian elimination, and the Leibniz formula. The specific method used depends on the individual matrix and the desired outcome.
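
As a rough illustration of cofactor expansion, here is a minimal recursive sketch in Python (expansion along the first row; it takes factorial time, so elimination-based methods are preferred for large matrices):

[code]
# Minimal recursive cofactor-expansion sketch (expansion along the first row).
def det(M):
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor obtained by deleting row 1 and column j+1 (0-indexed: row 0, column j).
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                     # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))    # 24
[/code]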

3. What is the significance of the determinant in linear algebra?

The determinant has several important applications in linear algebra. It can be used to determine the invertibility of a matrix, to find the area or volume of a parallelogram or parallelepiped, and to solve systems of linear equations. It is also used in the computation of eigenvalues and eigenvectors.

4. Can the determinant of a matrix of order n be negative?

Yes, the determinant of a matrix of order n can be negative. For example, exchanging two rows of a matrix multiplies its determinant by -1. When the determinant is computed by Gaussian elimination, the result is (-1)^s times the product of the pivots, where s is the number of row exchanges performed, so the overall sign depends on both the pivots and the number of exchanges.
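
A small NumPy illustration of how a single row exchange flips the sign of the determinant:

[code]
import numpy as np

# Swapping two rows multiplies the determinant by -1.
M = np.array([[1.0, 2.0], [3.0, 4.0]])
swapped = M[[1, 0], :]          # exchange the two rows
print(np.linalg.det(M))         # approximately -2.0
print(np.linalg.det(swapped))   # approximately  2.0
[/code]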

5. What is the relationship between the determinant and the trace of a matrix of order n?

The trace of a matrix of order n is equal to the sum of its eigenvalues, while the determinant is equal to the product of its eigenvalues. The two quantities are therefore related, but neither determines the other. For a 2x2 matrix, for instance, they appear together as coefficients of the characteristic polynomial λ² − tr(A)λ + det(A), whose roots are the eigenvalues.
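
A quick NumPy check of these two identities on an arbitrary random matrix:

[code]
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
eig = np.linalg.eigvals(M)

# trace = sum of eigenvalues, determinant = product of eigenvalues (up to rounding).
print(np.isclose(np.trace(M), eig.sum().real))          # True
print(np.isclose(np.linalg.det(M), eig.prod().real))    # True
[/code]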
