Proof regarding determinant of block matrices


Homework Help Overview

The discussion revolves around the determinant of a block matrix formed by commuting square matrices A, B, C, and D. The original poster seeks to prove that the determinant of the block matrix M, structured as [A B; C D], equals the product of the determinants of A and D minus the product of the determinants of B and C. There is also a note that this result may not hold if the matrices do not commute.

Discussion Character

  • Exploratory, Assumption checking

Approaches and Questions Raised

  • Participants discuss various attempts to prove the determinant formula, including using properties of determinants and exploring specific cases such as when C=0. Some suggest that induction or diagonal formulas might be relevant. Others express uncertainty about how to leverage the commutativity of the matrices in the proof.

Discussion Status

The discussion includes attempts to clarify the problem and explore different approaches. Some participants have provided insights into potential methods, while others have raised questions about the validity of the original assertion, suggesting that counterexamples exist that challenge the proposed formula.

Contextual Notes

Some participants indicate a lack of familiarity with commutative matrices and their implications for determinants, which may affect their ability to engage with the problem effectively. There is also mention of specific counterexamples that illustrate potential flaws in the original assertion.

Adgorn

Homework Statement


Let A,B,C,D be commuting n-square matrices. Consider the 2n-square block matrix ##M= \begin{bmatrix}
A & B \\
C & D \\
\end{bmatrix}##. Prove that ##\left | M \right |=\left | A \right |\left | D \right |-\left | B \right |\left | C \right |##. Show that the result may not be true if the matrices do not commute.

Homework Equations


##\det(M)= \det(A_1)\det(A_2)\cdots\det(A_n)##, where ##M## is an upper (or lower) triangular block matrix with diagonal blocks ##A_1,A_2,\ldots,A_n##.
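As a quick sanity check of this relevant equation, a numerical sketch along these lines (numpy and the randomly chosen blocks are purely illustrative) confirms the rule on one example:

```python
import numpy as np

# Check: for a block upper-triangular matrix, det(M) should equal the
# product of the determinants of its diagonal blocks.
rng = np.random.default_rng(0)
A1 = rng.standard_normal((3, 3))   # first diagonal block
A2 = rng.standard_normal((2, 2))   # second diagonal block
B = rng.standard_normal((3, 2))    # arbitrary off-diagonal block

M = np.block([[A1, B],
              [np.zeros((2, 3)), A2]])

print(np.linalg.det(M), np.linalg.det(A1) * np.linalg.det(A2))  # should agree
```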

The Attempt at a Solution


At first I tried using the theorems about determinants in combination with the properties of (commuting) matrices to obtain the desired expression, but had no success. I then tried expressing the determinant in terms of the entries of the square matrices in M and splitting the expression into 4 separate determinants, but could not figure out how.
The closest I managed to get to a solution is proving the above equation in the case C=0.
Any help would be appreciated.
 
Thank you, Mr. AI bot thingy, but I said all I really have to say; it's pretty straightforward.
 
Adgorn said:
Thank you, Mr. AI bot thingy, but I said all I really have to say; it's pretty straightforward.
Yes, but your attempts could have been a bit more elaborate. Unfortunately, the only solutions I can think of are either an induction or the explicit formula via diagonals. Neither sounds like fun, though.
 
fresh_42 said:
Yes, but your attempts could have been a bit more elaborate. Unfortunately, the only solutions I can think of are either an induction or the explicit formula via diagonals. Neither sounds like fun, though.
Unfortunately I cannot explain any further, simply because I don't even know how to approach this. I have never worked with commuting matrices before (I'm learning from Schaum's Outlines), let alone in the context of determinants, so I don't really know what to exploit in the given information that the matrices commute.
I am fairly certain that the proof uses induction. I managed to prove something similar in the case C=0:
##M=\begin{bmatrix}
A & C \\
0 & B
\end{bmatrix}##
Prove that ##\det(M)=\det(A)\det(B)##.

If A is r-square and B is s-square, then n=r+s and
##\det(M)=\sum_{\sigma \in S_n} (\operatorname{sgn}\sigma)\, m_{1\sigma(1)} m_{2\sigma(2)}\cdots m_{n\sigma(n)}##.
If ##i > r## and ##j\leq r##, then ##m_{ij}=0##. Therefore we only need to consider the permutations ##\sigma## that map ##\{1,2,\ldots,r\}## onto itself and ##\{r+1,r+2,\ldots,r+s\}## onto itself.
Let ##\sigma_1(k)=\sigma(k)## for ##k\leq r## and let ##\sigma_2(k)=\sigma(k+r)-r## for ##k\leq s##.
Then ##(\operatorname{sgn}\sigma)\, m_{1\sigma(1)} m_{2\sigma(2)}\cdots m_{n\sigma(n)} = (\operatorname{sgn}\sigma_1)\, a_{1\sigma_1(1)} a_{2\sigma_1(2)}\cdots a_{r\sigma_1(r)} \cdot (\operatorname{sgn}\sigma_2)\, b_{1\sigma_2(1)} b_{2\sigma_2(2)}\cdots b_{s\sigma_2(s)}##,
which implies ##\det(M)=\det(A)\det(B)##.

So I proved this theorem, which is the closest I got to answering the question of this post. I tried using the raw formula with no success; perhaps there is some hidden property of commuting matrices that will allow me to solve this.
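For what it's worth, the permutation-sum argument above can be checked numerically. The sketch below (the helper `leibniz_det` and the random test blocks are only illustrative) evaluates the Leibniz formula directly and compares it with ##\det(A)\det(B)## for a block upper-triangular matrix:

```python
import numpy as np
from itertools import permutations

def leibniz_det(M):
    """Determinant via the permutation (Leibniz) sum, for small matrices."""
    n = M.shape[0]
    total = 0.0
    for perm in permutations(range(n)):
        # sgn(sigma) = (-1)^(number of inversions)
        inversions = sum(perm[i] > perm[j] for i in range(n) for j in range(i + 1, n))
        sign = -1.0 if inversions % 2 else 1.0
        total += sign * np.prod([M[i, perm[i]] for i in range(n)])
    return total

# Block upper-triangular test case [A C; 0 B] with r = s = 2 and arbitrary entries.
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2))
C = rng.standard_normal((2, 2))
M = np.block([[A, C],
              [np.zeros((2, 2)), B]])

print(leibniz_det(M), np.linalg.det(A) * np.linalg.det(B))  # should agree
```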
 
The commutativity requirement should automatically show up during the proof. I wouldn't start with it. But it may be used early in a proof. So without any brute force methods to apply, I think there could be a tricky decomposition of ##M##. However, it should be a multiplicative decomposition or a basis transformation ##TMT^{-1}##, since we don't really want to calculate something like ##\det (M+N)##. If you already have a proof for triangular matrices, maybe you can find a decomposition into factors like this.

One can always write any matrix ##M = \frac{1}{2}(M+M^t) + \frac{1}{2}(M-M^t)## as a sum of a symmetric and a skew-symmetric matrix. Don't know whether this helps. Or apply some normal forms you know about.
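A sketch of one decomposition of this multiplicative kind, under the extra assumption (not part of the original problem) that ##D## is invertible:
##\begin{bmatrix} A & B \\ C & D \end{bmatrix} = \begin{bmatrix} I & BD^{-1} \\ 0 & I \end{bmatrix} \begin{bmatrix} A - BD^{-1}C & 0 \\ C & D \end{bmatrix},##
and the block-triangular determinant rule above would then give ##\det(M) = \det(A - BD^{-1}C)\,\det(D)##.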
 
Adgorn said:

Homework Statement


Let A,B,C,D be commuting n-square matrices. Consider the 2n-square block matrix ##M= \begin{bmatrix}
A & B \\
C & D \\
\end{bmatrix}##. Prove that ##\left | M \right |=\left | A \right |\left | D \right |-\left | B \right |\left | C \right |##.

This is an old thread, but it has been left open and I thought about this earlier today for some reason.

The assertion in the problem statement is wrong.

A simple and general counterexample: consider the case where ##n## is some even natural number, and ##\mathbf A = \mathbf D = \mathbf 0_{n\times n}## and ##\mathbf B = \mathbf C = \mathbf I_{n\times n}##.

The zero matrix and the identity matrix commute with all matrices. Yet the above formula gives a determinant of ##-1##, when in fact it is ##+1##. (In this case ##M## is a permutation matrix that becomes the identity matrix after an even number of pairwise column swaps and hence has determinant ##1##.)

A really simple example: consider ##n = 2##:

##1 = \det\begin{bmatrix}
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1 \\
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0
\end{bmatrix} = \det\begin{bmatrix}
\mathbf 0 & \mathbf I \\
\mathbf I & \mathbf 0
\end{bmatrix}
\neq \det(\mathbf 0)\det(\mathbf 0) - \det(\mathbf I)\det(\mathbf I) = 0\cdot 0 - 1\cdot 1 = -1##

- - - -
note: the specific example I am giving is problem 6.2.5 in Meyer's Matrix Analysis.
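For anyone who wants to verify this numerically, here is a short sketch (numpy, with the same blocks as above) comparing the actual determinant with the value the proposed formula predicts:

```python
import numpy as np

# Counterexample: M = [[0, I], [I, 0]] with n = 2, so M is the 4x4 matrix above.
n = 2
Z = np.zeros((n, n))
I = np.eye(n)
M = np.block([[Z, I],
              [I, Z]])

actual = np.linalg.det(M)                                                             # +1
claimed = np.linalg.det(Z) * np.linalg.det(Z) - np.linalg.det(I) * np.linalg.det(I)   # -1

print(actual, claimed)  # 1.0 vs -1.0 -- the proposed formula fails here
```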
 
