What is the connection between cofactors, determinants, and matrix inverses?

  • Context: Undergrad
  • Thread starter: gfd43tg
  • Tags: Determinants

Discussion Overview

The discussion centers on the relationships between cofactors, determinants, and matrix inverses in linear algebra. Participants explore the underlying concepts and connections, including the geometric interpretation of determinants and the process of calculating matrix inverses using cofactors.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about the concept of cofactors and their relationship to determinants and matrix inverses, questioning the underlying meaning of these concepts.
  • Another participant suggests starting with small matrices to understand the derivation of the formula for the inverse, proposing a proof by induction.
  • A participant explains that the determinant can be interpreted as the signed volume of a parallelepiped formed by the columns of a matrix, linking this to the concept of linear independence.
  • There is a request for clarification on the geometric interpretation of determinants, specifically regarding the volume of a parallelepiped and its connection to matrix inverses.
  • One participant challenges the abstract nature of the explanation provided, indicating a preference for more straightforward concepts without complex mathematics.
  • Another participant elaborates on the scalar triple product and its relation to determinants, discussing how determinants generalize this concept to higher dimensions.
  • A detailed mathematical explanation is provided regarding the relationship between determinants and cofactors, including the process of evaluating cofactors in the context of matrix operations.

Areas of Agreement / Disagreement

Participants exhibit a mix of understanding and confusion regarding the concepts discussed. While some express interest in the geometric interpretation of determinants, others find the explanations too abstract or complex. There is no consensus on the clarity of the explanations or the best approach to understanding the relationships between these concepts.

Contextual Notes

Some participants indicate a lack of familiarity with certain mathematical concepts, suggesting that the discussion may depend on varying levels of background knowledge in linear algebra. The conversation also highlights the complexity of the relationships between cofactors, determinants, and matrix inverses, which may not be fully resolved in the current exchange.

gfd43tg
I have been reviewing linear algebra for my FE exam, and I was thinking about cofactors. What are these strange things? It totally mystifies me that you can make a cofactor matrix from a matrix A (where does the alternating +/- come from??), transpose it to get the adjoint, find the determinant (I still don't understand what this thing is, just something I know how to calculate), then divide the adjoint by that determinant to find the inverse of the original matrix A. How in the heck do all these things relate to each other? What is the underlying meaning here, other than to say that you can calculate the matrix inverse? It looks like magic to me!
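For readers who want to see the recipe from the question in action, here is a minimal numerical sketch (the matrix entries and the `cofactor_matrix` helper are illustrative, not from the thread): build the cofactor matrix, transpose it to get the adjoint, and divide by the determinant.

```python
import numpy as np

def cofactor_matrix(A):
    """Cofactor C[i, j] = (-1)**(i+j) times the (i, j) minor of A."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # The (i, j) minor: delete row i and column j, take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C

# An arbitrary invertible 3x3 matrix.
A = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [0.0, 2.0, 1.0]])

adjoint = cofactor_matrix(A).T          # transpose of the cofactor matrix
inverse = adjoint / np.linalg.det(A)    # adj(A) / det(A)

print(np.allclose(inverse, np.linalg.inv(A)))  # the recipe matches the inverse
```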
 
You can start with an invertible ##n\times n## matrix ##A=(a_{ij})_{i,j}## and try to solve the equation ##A \cdot X = 1## with another invertible matrix ##X##. This gives you ##n^2## linear equations with ##n^2## unknown entries. In the end you will have the formula you mentioned. Instead, it is probably smarter to start with small ##n## and see how it goes. Then make a proof by induction, and you will end up with the same formula.
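A sketch of that suggestion for ##n=2## (matrix values are arbitrary, chosen only so that ##A## is invertible): solving ##A\cdot X = 1## column by column, and comparing against the closed form that the small-##n## computation by hand produces.

```python
import numpy as np

# An arbitrary invertible 2x2 matrix (illustrative values only).
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

# A @ X = I is n^2 linear equations in the n^2 entries of X;
# equivalently, each column x_j of X solves A @ x_j = e_j.
X = np.linalg.solve(A, np.eye(2))

# Working out those equations by hand for n = 2 yields
# A^{-1} = (1 / det A) * [[ a22, -a12],
#                         [-a21,  a11]]
detA = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
X_formula = np.array([[A[1, 1], -A[0, 1]],
                      [-A[1, 0], A[0, 0]]]) / detA

print(np.allclose(X, X_formula))  # the two routes agree
```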
 
One preliminary note--as every linear algebra student should know, the determinant of a matrix is equal to the signed volume of the parallelotope formed by its columns. When the columns are linearly dependent, the parallelotope "flattens" and has zero volume. (It helps to imagine it in two or three dimensions.)

Consider a set of ##n## vectors in ##n##-dimensional space. We will take these vectors ##\mathbf{v}_i## to be the columns of our (square) matrix.
Now, let ##\mathbf{v}## be arbitrary. Then define a map ##f: \mathbb{R}^n\rightarrow\mathbb{R}## via the equation ##f(\mathbf{v})=\det [\mathbf{v\ v}_2\ \ldots\ \mathbf{v}_n]##. Because ##\det## is linear in each column of a given matrix, ##f## is a linear functional on ##\mathbb{R}^n## with kernel ##\mathrm{span}(\{\mathbf{v}_2,\ldots,\mathbf{v}_n\})##. Also, ##f(\mathbf{v}_1)## is the determinant of the original matrix.

Finally, remember that, after choosing a basis, every linear map from ##\mathbb{R}^m## to ##\mathbb{R}^n## has a matrix associated to it. In fact, the matrix associated to ##f## is a row vector whose elements are precisely the cofactors along the first column.

This interpretation is rather abstract, and fits within the context of multilinear algebra. The early pioneers of linear algebra probably came up with cofactors as mere tools for computing determinants--but in any case, it turns out they have a very elegant interpretation in modern mathematics.
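As a concrete check of the claim about the matrix of ##f## (the 3×3 entries below are arbitrary, not from the thread): the row vector ##[f(\mathbf{e}_1)\ f(\mathbf{e}_2)\ f(\mathbf{e}_3)]## coincides with the cofactors along the first column.

```python
import numpy as np

# Arbitrary matrix whose columns play the roles of v1, v2, v3.
M = np.array([[1.0, 0.0, 2.0],
              [2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0]])

def f(v):
    # Replace the first column by v, keep v2 and v3, take the determinant.
    return np.linalg.det(np.column_stack([v, M[:, 1], M[:, 2]]))

# The matrix of the linear functional f in the standard basis is a row
# vector whose entries are f(e1), f(e2), f(e3).
row = np.array([f(e) for e in np.eye(3)])

# Cofactors along the first column: (-1)**i times the (i, 0) minor.
cofactors = np.array([
    (-1) ** i * np.linalg.det(np.delete(np.delete(M, i, axis=0), 0, axis=1))
    for i in range(3)
])

print(np.allclose(row, cofactors))               # the matrix of f is the cofactors
print(np.isclose(f(M[:, 0]), np.linalg.det(M)))  # f(v1) recovers det M
```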
 
suremarc, I have no idea what the heck you wrote. That is way over my head!
 
Maylis said:
suremarc, I have no idea what the heck you wrote. That is way over my head!
Err, oops, I must have gotten carried away. Which part is confusing you?
 
suremarc said:
Err, oops, I must have gotten carried away. Which part is confusing you?
Basically none of it. But I do find it interesting about the part you said about the determinant being equal to the volume of a parallelepiped. Maybe you can expand on that (please no crazy abstract math if possible). Just forget everything from the second paragraph on.

Also, it's weird that the volume of a parallelepiped has ANYTHING to do with an inverse of a matrix.
 
Maylis said:
Basically none of it. But I do find it interesting about the part you said about the determinant being equal to the volume of a parallelepiped. Maybe you can expand on that (please no crazy abstract math if possible). Just forget everything from the second paragraph on.
Hmm. I had figured that most of the things I mentioned are taught in linear algebra. Maybe the curriculum for engineering majors is different.

Consider a parallelepiped formed by 3 vectors, like this one:
[Figure: a parallelepiped spanned by three vectors ##\mathbf{a}##, ##\mathbf{b}##, and ##\mathbf{c}##]

Its volume is, up to a change of sign, equal to ##\mathbf{a\cdot(b\times c)}##, also known as the scalar triple product. Scaling any of ##\mathbf{a,b,}## or ##\mathbf{c}## scales the volume by the same amount.
The formula ##V=\mathbf{a\cdot(b\times c)}## mostly works, but sometimes we get negative values. This happens because our formula also depends on the orientation of ##\mathbf{a,b,}## and ##\mathbf{c}##, as per the nature of the dot and cross products. Loosely speaking, the scalar triple product tells us about the volume and the orientation of a triple of vectors.
In addition, allowing the volume to be signed gives us linearity: ##V(\mathbf{a_1+a_2, b, c})=V(\mathbf{a_1, b, c})+V(\mathbf{a_2, b, c})##. The same is not true in general when we take ##V## to be the absolute value.

The determinant works just the same way--linear in each argument, and equaling zero when the parallelotope flattens. (Hint: the 3x3 determinant is precisely the scalar triple product :wink:) Determinants are, in some sense, a generalization of the scalar triple product to an arbitrary number of dimensions.
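A quick numerical illustration of the two points above (vector values are arbitrary): the 3×3 determinant equals the scalar triple product, and the signed volume is linear in its first argument.

```python
import numpy as np

a = np.array([1.0, 2.0, 0.0])
b = np.array([0.0, 1.0, 3.0])
c = np.array([2.0, 0.0, 1.0])

# Scalar triple product a . (b x c) vs. the determinant of [a b c].
triple = a @ np.cross(b, c)
det3 = np.linalg.det(np.column_stack([a, b, c]))
print(np.isclose(triple, det3))  # the 3x3 determinant is the triple product

# Linearity of the signed volume in the first argument:
# V(a1 + a2, b, c) = V(a1, b, c) + V(a2, b, c).
a1 = np.array([1.0, 0.0, 0.0])
a2 = np.array([0.0, 2.0, 0.0])
lhs = (a1 + a2) @ np.cross(b, c)
rhs = a1 @ np.cross(b, c) + a2 @ np.cross(b, c)
print(np.isclose(lhs, rhs))
```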
 
so then ##a## is whatever element you choose, and ##b \times c## is the cofactor? I learned about spans and kernels when I took linear algebra in 2012, but I am not in the mood to learn it again right now.
 
Look at things this way: given an ##n\times n## matrix ##A##, with real coefficients for example, its determinant is the determinant of the column vectors in the canonical basis ##{\cal B}## of ##M_{n,1}(\mathbb{R})##, which you could write ##\det A = \det_{\cal B} (C_1,...,C_n)##.

With the multilinearity and alternating property of the determinant of a family of ##n## vectors in a vector space of dimension ##n##, you can write:

##\det A = \sum_{i = 1}^n C_{i,j} \det_{\cal B} (C_1,...,C_{j-1}, e_i, C_{j+1},..,C_n)##.

And the determinant in the sum is what you call the cofactor with respect to position ##(i,j)##. Now let's evaluate this cofactor:

##\begin{align*}
\det_{\cal B} (C_1,...,C_{j-1}, e_i, C_{j+1},..,C_n) =& (-1)^{n-j} \det_{\cal B} (C_1,...,C_{j-1}, C_{j+1},..,C_n,e_i) \\
=& (-1)^{n -j} \det (B_{i,j}) \quad\quad \quad (*) \\
= & (-1)^{n -j} (-1)^{n -i} \Delta_{i,j} \\
=& (-1)^{i+j}\Delta_{i,j}
\end{align*}##

##(*)##: ##B_{i,j}## is the transpose of the matrix whose columns are ##(C_1,...,C_{j-1}, C_{j+1},..,C_n,e_i)##. The last row of ##B_{i,j}## contains a single nonzero entry, equal to 1, the others being zeros.

Here ##\Delta_{i,j}## is the ##(i,j)## minor of ##A##: it equals the determinant of ##B_{i,j}## up to ##(n-i)## successive column transpositions, which bring the column containing that final 1 into the last position; expanding along that column then leaves exactly the minor. So you get the formula for the columnwise expansion of the determinant.
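The columnwise expansion derived above can be checked numerically; below is a small sketch (matrix entries and the expansion column chosen arbitrarily) computing ##\sum_i C_{i,j}(-1)^{i+j}\Delta_{i,j}## and comparing it to ##\det A##.

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 2.0],
              [0.0, 1.0, 4.0]])

n = A.shape[0]
j = 1  # expand along this column (any choice works)

total = 0.0
for i in range(n):
    # Delta_{i,j}: the minor obtained by deleting row i and column j.
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    total += A[i, j] * (-1) ** (i + j) * np.linalg.det(minor)

print(np.isclose(total, np.linalg.det(A)))  # expansion matches det A
```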

______

For the inverse, you can notice that if ##A## is invertible, the family ##{\cal C} = (C_1,...,C_n)## forms a basis of ##M_{n,1}(\mathbb{R})##. Given a vector ##U = {}^T (u_1,..,u_n)## written in the canonical basis ##{\cal B}##, you have

## \begin{align*}
\det_{\cal C} (C_1,...,C_{j-1},U,C_{j+1},...,C_n) =& \det_{\cal C}({\cal B}) \det_{\cal B}(C_1,...,C_{j-1},U,C_{j+1},...,C_n) \\
= & \frac{1}{\det_{\cal B}({\cal C})} \sum_{i=1}^n u_i (\text{cof}(A))_{i,j} \\
= & \frac{1}{\det (A) } ({}^T \text{cof}(A) U)_j \\
\end{align*}##

Replace ##U## with ##C_i={}^T(a_{1i},...,a_{ni})##: ##\frac{1}{\det (A) } ({}^T \text{cof}(A) C_i)_j = \delta_{ij}##,
and then ## \frac{1}{\det (A) } {}^T \text{cof}(A) A = I_n##
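The closing identity can be checked by hand on a 2×2 example (entries arbitrary), since for a 2×2 matrix the cofactor matrix can be written out explicitly.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])

# For a 2x2 matrix the cofactor matrix is
# cof(A) = [[ a22, -a21],
#           [-a12,  a11]]
cofA = np.array([[A[1, 1], -A[1, 0]],
                 [-A[0, 1], A[0, 0]]])

detA = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]

# (1 / det A) * transpose(cof(A)) @ A should be the identity.
lhs = cofA.T @ A / detA
print(np.allclose(lhs, np.eye(2)))
```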
 
