Proving ##(cof ~A)^t ~A = (det A)I##

SUMMARY

The discussion focuses on proving the equation ##(cof ~A)^t ~A = (det A)I##, where ##cof~A## represents the cofactor matrix of matrix A. The participants derive that the diagonal elements of the product ##(cof~A)^t ~A## equal ##det~A##, while the challenge lies in proving that all non-diagonal elements are zero. They explore specific cases for 2x2 and 3x3 matrices, ultimately referencing Tom Apostol's proof that demonstrates the determinant of a matrix with duplicate columns is zero, thereby confirming the non-diagonal elements are indeed zero.

PREREQUISITES
  • Understanding of matrix operations, specifically cofactor and transpose operations.
  • Familiarity with determinants and their properties in linear algebra.
  • Knowledge of matrix notation and indexing conventions.
  • Experience with mathematical proofs and logical reasoning in algebra.
NEXT STEPS
  • Study the properties of cofactor matrices and their applications in linear algebra.
  • Learn about the determinant's behavior under column operations and its implications.
  • Explore Tom Apostol's work on determinants for deeper insights into matrix theory.
  • Practice proving properties of determinants using various matrix sizes and configurations.
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in advanced matrix theory and proofs related to determinants and cofactors.

Hall
Homework Statement
##cof ~A## means the cofactor matrix of A, and ##(cof~ A)^t## means the transpose of the cofactor matrix of A (do you call it the adjoint of A? Well, I too used to, but no longer). ##det ~A## is the determinant of A, and ##I## is the identity matrix of order compatible with the LHS.
Relevant Equations
The idea I would use is to show that all diagonal elements of ##(cof~A)^t~A## are equal to ##det ~A## and all the remaining elements are zero.
i-th column of ##cof~A## =
$$
\begin{bmatrix}
(-1)^{i+1} det~A_{1i} \\
(-1)^{i+2} det ~A_{2i}\\
\vdots \\
(-1)^{i+n} det ~A_{ni}\\
\end{bmatrix}$$

Therefore, the i-th row of ##(cof~A)^t## = ##\big[ (-1)^{i+1} det~A_{1i}, (-1)^{i+2} det ~A_{2i}, \cdots, (-1)^{i+n} det ~A_{ni} \big]##

The (i,i)-th element of ##(cof~A)^t ~ A## is
$$
\big[ (-1)^{i+1} det~A_{1i}, (-1)^{i+2} det ~A_{2i}, \cdots, (-1)^{i+n} det ~A_{ni}\big] \times
\begin{bmatrix}
a_{1i}\\
a_{2i}\\
\vdots \\
a_{ni}\\
\end{bmatrix}
= \sum_{k=1}^{n} (-1)^{i+k} a_{ki} det~A_{ki}$$
Well, the RHS is simply ##det ~A## expanded along the i-th column. Therefore, all diagonal elements of ##(cof~A)^t ~A## are equal to ##det~A##.
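As a quick numerical sanity check of the diagonal claim (an illustration, not part of the proof; the sample matrix is arbitrary), one can compute ##(cof~A)^t A## with exact integer arithmetic:

```python
# Sanity check: for a sample integer matrix, each diagonal entry of
# (cof A)^t A equals det A. Integer arithmetic avoids floating-point noise.

def minor(M, i, j):
    """Delete row i and column j of M."""
    return [row[:j] + row[j+1:] for r, row in enumerate(M) if r != i]

def det(M):
    """Determinant by Laplace expansion along the first column."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** r * M[r][0] * det(minor(M, r, 0)) for r in range(len(M)))

def cof(M):
    """Cofactor matrix: cof(M)[i][j] = (-1)^(i+j) * det of the (i,j) minor."""
    n = len(M)
    return [[(-1) ** (i + j) * det(minor(M, i, j)) for j in range(n)]
            for i in range(n)]

A = [[2, -1, 3], [0, 4, 1], [5, 2, -2]]   # arbitrary example matrix
n, C = len(A), cof(A)
# (i,i)-th entry of (cof A)^t A is sum over k of C[k][i] * A[k][i]
diag = [sum(C[k][i] * A[k][i] for k in range(n)) for i in range(n)]
assert diag == [det(A)] * n               # every diagonal entry equals det A
```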

Now, I would try to prove that all non-diagonal elements are zero. Consider the (i,j)-th element of ##(cof~A)^t~A##:
$$
\big[ (-1)^{i+1} det~A_{1i}, (-1)^{i+2} det ~A_{2i}, \cdots, (-1)^{i+n} det ~A_{ni}\big] \times
\begin{bmatrix}
a_{1j}\\
a_{2j}\\
\vdots \\
a_{nj}\\
\end{bmatrix}
= \sum_{k=1}^{n} (-1)^{i+k} a_{kj} det~ A_{ki}$$

But I'm unable to prove that RHS is equal to zero. Will you help me?

 
The RHS coincides with the definition of ##det~A## for ##i=j##. How about trying ##n=2## in order to confirm your approach? Writing the RHS as ##R_{ij}##, I see
##R_{11}=R_{22}=a_{11}a_{22}-a_{12}a_{21}##
##R_{12}=R_{21}=0##
Then you can go to ##n=3## to find a general rule of cancellation.
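For instance, spelling out ##R_{12}## for ##n=2## with the sum above (for a ##2\times 2## matrix the minors are ##det~A_{11}=a_{22}## and ##det~A_{21}=a_{12}##):
$$
R_{12} = (-1)^{1+1} a_{12}\, det~A_{11} + (-1)^{1+2} a_{22}\, det~A_{21} = a_{12}a_{22} - a_{22}a_{12} = 0
$$
The two terms cancel in pairs; the general case works the same way.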
 
anuttarasammyak said:
##R_{11}=R_{22}=a_{11}a_{22}-a_{12}a_{21}##
##R_{12}=R_{21}=0##
Then you can go to n=3 to find a general rule of cancellation.
I showed ##R_{i,j}=0## when ##i\neq j## for ##2\times 2## and ##3\times 3## matrices (for ##3\times 3##, we shall have three cases for ##i \neq j##). But proving it in general seems unattainable at the moment.
 
For i ##\neq## j, you may interpret the RHS as the determinant of a matrix which has two identical columns. Thus it is zero.
 
anuttarasammyak said:
For i ##\neq## j, you may interpret the RHS as the determinant of a matrix which has two identical columns. Thus it is zero.
Yes, I found this proof by Tom Apostol:

For any matrix ##A##
$$
\begin{bmatrix}
a_{11} & a_{12}& \cdots &a_{1k} &\cdots& a_{1n}\\
a_{21} & a_{22} & \cdots & a_{2k} &\cdots & a_{2n}\\
\vdots&\vdots&\vdots &\vdots & \vdots &\vdots\\
a_{n1}& a_{n2} & \cdots &a_{nk} &\cdots & a_{nn}\\
\end{bmatrix}
$$
Consider a new matrix ##B##, obtained from ##A## by replacing the j-th column with the k-th column:
$$B=\begin{bmatrix}
a_{11} & a_{12}& \cdots& a_{1k} &\cdots &a_{1k} &\cdots& a_{1n}\\
a_{21} & a_{22} & \cdots & a_{2k}& \cdots & a_{2k} &\cdots & a_{2n}\\
\vdots&\vdots& &\vdots & & \vdots & &\vdots\\
a_{n1}& a_{n2} &\cdots & a_{nk}& \cdots &a_{nk} &\cdots & a_{nn}\\
\end{bmatrix}
$$
(the first ##a_{\cdot k}## column shown stands in position ##j##, the second in position ##k##).

That is, the j-th column of B is equal to the k-th column of A, and all other entries are the same. Since B has two identical columns, ##det ~B=0##. (We have taken the k-th column for generality; we could show the result for any column.)

For the expression
##\sum_{i=1}^{n} (-1)^{i +k} a_{ij} det ~A_{ik}##
we can replace ##a_{ij}## with ##a_{ik}## for all ##i##:
$$
\sum_{i=1}^{n} (-1)^{i +k} a_{ij} det ~A_{ik} = \sum_{i=1}^{n} (-1)^{i +k} a_{ik} det ~A_{ik}$$
The RHS is simply ##det~B## expanded along the k-th column, therefore
$$
\sum_{i=1}^{n} (-1)^{i +k} a_{ij} det ~A_{ik} = det~B = 0$$

Thus, any expression of the form ##\sum_{i=1}^{n} (-1)^{i +k} a_{ij} det ~A_{ik}## (where ##j \neq k##) is equal to zero. But this seems to me like a tyranny of Mathematics: we have proved something to be zero by taking it into a completely new system. Changing the context of something must change its meaning; the expression is zero in the context of matrix B, not in A.
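Apostol's duplicate-column argument can be checked concretely. A small sketch (the sample matrix and the choice of ##j, k## are arbitrary) that builds ##B## by copying one column over another and confirms both that ##det~B = 0## and that the "alien cofactor" sum equals ##det~B##:

```python
# Check the duplicate-column argument: build B from a sample matrix A by
# copying A's k-th column into its j-th column, then verify det B = 0 and
# that the sum over i of (-1)^(i+k) a_{ij} det A_{ik} equals det B.

def minor(M, i, j):
    """Delete row i and column j of M."""
    return [row[:j] + row[j+1:] for r, row in enumerate(M) if r != i]

def det(M):
    """Determinant by Laplace expansion along the first column."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** r * M[r][0] * det(minor(M, r, 0)) for r in range(len(M)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]   # arbitrary example matrix
j, k = 0, 2                               # duplicate column k into column j
B = [row[:] for row in A]
for row in B:
    row[j] = row[k]

# sum over i of (-1)^(i+k) * a_{ij} * det A_{ik}  (0-based indices give
# the same parity as the 1-based formula in the text)
S = sum((-1) ** (i + k) * A[i][j] * det(minor(A, i, k)) for i in range(len(A)))

assert det(B) == 0    # two identical columns force det B = 0
assert S == det(B)    # the alien-cofactor sum is det B expanded along column k
```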
 
