Proving ##(cof ~A)^t ~A = (det A)I##


Homework Help Overview

The discussion revolves around proving the equation ##(cof ~A)^t ~A = (det A)I##, focusing on properties of cofactors and determinants in linear algebra. Participants are exploring the implications of matrix operations and the relationships between determinants of submatrices.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants are examining the elements of the product ##(cof~A)^t ~A##, particularly focusing on diagonal and non-diagonal elements. There is an attempt to prove that non-diagonal elements are zero, with references to specific cases such as 2x2 and 3x3 matrices. Some participants suggest testing smaller matrices to identify patterns or general rules.

Discussion Status

There is an ongoing exploration of the properties of determinants, particularly in relation to matrices with repeated columns. Some participants have provided insights and interpretations that may guide further reasoning, while others express uncertainty about proving the general case.

Contextual Notes

Participants note the challenge of proving certain properties in a general context, with some suggesting that the approach taken may alter the interpretation of the expressions involved. There are mentions of notation issues affecting clarity in the discussion.

Hall
Homework Statement
##cof~A## means the cofactor matrix of ##A##, and ##(cof~A)^t## means the transpose of the cofactor matrix of ##A## (do you call it the adjoint of ##A##? I used to, but no longer). ##det~A## is the determinant of ##A##, and ##I## is the identity matrix of order compatible with the LHS.
Relevant Equations
The idea I would use is to show that all diagonal elements of ##(cof~A)^t A## are equal to ##det~A## and all the remaining elements are zero.
The i-th column of ##cof~A## =
$$
\begin{bmatrix}
(-1)^{i+1} det~A_{1i} \\
(-1)^{i+2} det~A_{2i}\\
\vdots \\
(-1)^{i+n} det~A_{ni}\\
\end{bmatrix}$$

Therefore, the i-th row of ##(cof~A)^t## = ##\big[ (-1)^{i+1} det~A_{1i}, (-1)^{i+2} det~A_{2i}, \cdots, (-1)^{i+n} det~A_{ni} \big]##

The ##(i,i)## element of ##(cof~A)^t A## is
$$
\big[ (-1)^{i+1} det~A_{1i}, (-1)^{i+2} det~A_{2i}, \cdots, (-1)^{i+n} det~A_{ni}\big] \times
\begin{bmatrix}
a_{1i}\\
a_{2i}\\
\vdots \\
a_{ni}\\
\end{bmatrix}
= \sum_{k=1}^{n} (-1)^{i+k} a_{ki} \,det~A_{ki}$$
Well, the RHS is simply ##det~A## expanded along the i-th column. Therefore, all diagonal elements of ##(cof~A)^t A## are equal to ##det~A##.
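As a quick numerical sanity check of this step (not part of the proof), here is a small Python sketch; the helper names `det` and `cof` and the sample matrix are my own choices, with `det` implemented by Laplace expansion along the first row:

```python
def det(M):
    # Determinant by Laplace expansion along the first row
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cof(M):
    # Cofactor matrix: entry (i, j) is (-1)^(i+j) times det of the (i, j) minor
    n = len(M)
    return [[(-1) ** (i + j) * det([row[:j] + row[j + 1:]
                                    for r, row in enumerate(M) if r != i])
             for j in range(n)]
            for i in range(n)]

A = [[2, 1, 3], [0, 4, 1], [5, 2, 6]]
C = cof(A)
n = len(A)
# The (i, i) entry of (cof A)^t A is sum_k C[k][i] * A[k][i]
diag = [sum(C[k][i] * A[k][i] for k in range(n)) for i in range(n)]
print(diag, det(A))  # every diagonal entry equals det A
```

Each entry of `diag` is literally the cofactor expansion of ##det~A## along the corresponding column, so the agreement is exactly the identity claimed above.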

Now, I would try to prove that all non-diagonal elements are zero. Consider the ##(i,j)## element (##i \neq j##) of ##(cof~A)^t A##:
$$
\big[ (-1)^{i+1} det~A_{1i}, (-1)^{i+2} det~A_{2i}, \cdots, (-1)^{i+n} det~A_{ni}\big] \times
\begin{bmatrix}
a_{1j}\\
a_{2j}\\
\vdots \\
a_{nj}\\
\end{bmatrix}
= \sum_{k=1}^{n} (-1)^{i+k} a_{kj} \,det~A_{ki}$$

But I'm unable to prove that RHS is equal to zero. Will you help me?
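While a numeric check is not a proof, a self-contained Python sketch (the helper names and the sample matrix are my own) does suggest that these off-diagonal sums vanish:

```python
def det(M):
    # Determinant by Laplace expansion along the first row
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cof(M):
    # Cofactor matrix: entry (i, j) is (-1)^(i+j) times det of the (i, j) minor
    n = len(M)
    return [[(-1) ** (i + j) * det([row[:j] + row[j + 1:]
                                    for r, row in enumerate(M) if r != i])
             for j in range(n)]
            for i in range(n)]

A = [[1, 4, 2], [3, 0, 5], [2, 1, 6]]
C = cof(A)
n = len(A)
# The (i, j) entry of (cof A)^t A for i != j: sum_k C[k][i] * A[k][j]
off_diag = [sum(C[k][i] * A[k][j] for k in range(n))
            for i in range(n) for j in range(n) if i != j]
print(off_diag)  # all six off-diagonal sums are zero
```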

 
The RHS coincides with the definition of ##det~A## for ##i=j##. How about trying ##n=2## in order to confirm your way? Say RHS ##= R_{ij}##; I see
##R_{11}=R_{22}=a_{11}a_{22}-a_{12}a_{21}##
##R_{12}=R_{21}=0##
Then you can go to ##n=3## to find a general rule of cancellation.
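Writing the ##n=2## case out in full with the standard cofactor signs (a worked check of the suggestion above):
$$
cof~A = \begin{bmatrix} a_{22} & -a_{21} \\ -a_{12} & a_{11} \end{bmatrix},
\qquad
(cof~A)^t A = \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}
\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}
= \begin{bmatrix} a_{11}a_{22}-a_{12}a_{21} & 0 \\ 0 & a_{11}a_{22}-a_{12}a_{21} \end{bmatrix}
$$
The off-diagonal entries cancel pairwise, e.g. ##R_{12} = a_{22}a_{12} - a_{12}a_{22} = 0##.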
 
anuttarasammyak said:
##R_{11}=R_{22}=a_{11}a_{22}-a_{12}a_{21}##
##R_{12}=R_{21}=0##
Then you can go to n=3 to find a general rule of cancellation.
I showed ##R_{i,j}=0## when ##i\neq j## for ##2\times 2## and ##3\times 3## matrices (for 3 x 3, we shall have three cases for ##i \neq j##). But proving it in general seems unattainable at the moment.
 
For ##i \neq j##, you may interpret the RHS as the determinant of a matrix which has two identical columns. Thus it is zero.
 
anuttarasammyak said:
For ##i \neq j##, you may interpret the RHS as the determinant of a matrix which has two identical columns. Thus it is zero.
Yes, I found this proof by Tom Apostol:

For any matrix ##A##
$$
\begin{bmatrix}
a_{11} & a_{12}& \cdots &a_{1k} &\cdots& a_{1n}\\
a_{21} & a_{22} & \cdots & a_{2k} &\cdots & a_{2n}\\
\vdots&\vdots&\vdots &\vdots & \vdots &\vdots\\
a_{n1}& a_{n2} & \cdots &a_{nk} &\cdots & a_{nn}\\
\end{bmatrix}
$$
Consider a new matrix ##B##, obtained from ##A## by replacing its ##k##-th column with its ##j##-th column:
$$B = \begin{bmatrix}
a_{11} & \cdots& a_{1j} &\cdots &a_{1j} &\cdots& a_{1n}\\
a_{21} & \cdots & a_{2j}& \cdots & a_{2j} &\cdots & a_{2n}\\
\vdots& &\vdots & &\vdots& &\vdots\\
a_{n1}& \cdots &a_{nj}& \cdots &a_{nj} &\cdots & a_{nn}\\
\end{bmatrix}
$$
(the second displayed copy of the ##j##-th column sits in the ##k##-th position).

That is, the ##k##-th column of ##B## is equal to the ##j##-th column of ##A##, and all other entries are the same as in ##A##. Since ##B## has two identical columns, ##det~B=0##. (We have worked with the ##k##-th column for generality; the argument applies to any column.)
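The repeated-column fact is easy to spot-check numerically; a minimal sketch (the `det` helper and the sample matrix are my own, with `det` computed by Laplace expansion):

```python
def det(M):
    # Determinant by Laplace expansion along the first row
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

B = [[1, 5, 5],
     [2, 7, 7],
     [3, 9, 9]]  # second and third columns are identical
print(det(B))  # prints 0
```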

For the expression
##\sum_{i=1}^{n} (-1)^{i+k} a_{ij} \,det~A_{ik}##
note that ##b_{ik} = a_{ij}##, and that deleting the ##k##-th column removes the only column in which ##B## differs from ##A##, so ##B_{ik} = A_{ik}##. Hence
$$
\sum_{i=1}^{n} (-1)^{i+k} a_{ij} \,det~A_{ik} = \sum_{i=1}^{n} (-1)^{i+k} b_{ik} \,det~B_{ik}$$
The RHS is simply ##det~B## expanded along its ##k##-th column, therefore
$$
\sum_{i=1}^{n} (-1)^{i+k} a_{ij} \,det~A_{ik} = det~B = 0$$

Thus, any expression of the form ##\sum_{i=1}^{n} (-1)^{i+k} a_{ij} \,det~A_{ik}## (where ##j \neq k##) is equal to zero. But this seems to me like a tyranny of mathematics: we have proved something to be zero by taking it into a completely new system. Changing the context of something must change its meaning; the expression is zero in the context of matrix ##B##, not in ##A##.
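Putting the two halves together, the whole identity can be spot-checked end to end in Python (a sketch only; the helper names and the sample matrix are mine):

```python
def det(M):
    # Determinant by Laplace expansion along the first row
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def cof(M):
    # Cofactor matrix: entry (i, j) is (-1)^(i+j) times det of the (i, j) minor
    n = len(M)
    return [[(-1) ** (i + j) * det([row[:j] + row[j + 1:]
                                    for r, row in enumerate(M) if r != i])
             for j in range(n)]
            for i in range(n)]

A = [[3, 1, 2], [0, 2, 5], [4, 1, 1]]
n = len(A)
C = cof(A)
# P = (cof A)^t A: entry (i, j) is sum_k C[k][i] * A[k][j]
P = [[sum(C[k][i] * A[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]
print(P, det(A))  # P equals det(A) times the identity
```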
 
