What Is the Significance of the Matrix Identity Involving \( S^{-1}_{ij} \)?

Discussion Overview

The discussion revolves around a matrix identity involving the inverse of the matrix defined by \( S_{ij} = 2^{-(2N - i - j + 1)} \frac{(2N - i - j)!}{(N-i)!(N-j)!} \). Participants explore why \( \sum_{i,j=1}^N S^{-1}_{ij} = 2N \) and what this result implies for maximizing a particular functional involving weights.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant numerically finds that \( \sum_{i,j=1}^N S^{-1}_{ij} = 2N \) and seeks understanding of the matrix identity's significance.
  • Another participant inquires about the method of discovery, suggesting the use of numerical computing software.
  • A different participant provides a mathematical expression for the inverse matrix elements and discusses the determinant's role in the cofactor expansion.
  • A participant shares their context of discovering the identity while attempting to maximize a specific functional involving weights and provides a numerical result that supports the identity for small values of \( N \).
  • One participant expresses difficulty in finding a closed form for the cofactor of an arbitrary \( N \times N \) matrix.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the significance of the matrix identity or the methods for deriving a closed form for the cofactor. Multiple viewpoints and approaches are presented without resolution.

Contextual Notes

The discussion leaves unresolved the mathematical steps regarding the cofactor and determinant expressions, as well as numerical stability issues encountered when evaluating the identity for larger \( N \).

madness
Hi all,

I've come across an interesting matrix identity in my work. I'll define the ##N \times N## matrix as
$$S_{ij} = 2^{-(2N - i - j + 1)} \frac{(2N - i - j)!}{(N-i)!(N-j)!}.$$
I find numerically that
$$\sum_{i,j=1}^N S^{-1}_{ij} = 2N$$
(the sum is over the elements of the matrix inverse). In fact, I expected to get ##2N## based on the problem I'm studying, but I don't know what this complicated matrix expression is doing or why it equals ##2N##. Does any of this look familiar to anyone here?

Thanks for your help!

P.S. If this is in the wrong subforum, please move it.
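For anyone who wants to reproduce the numerical observation, a quick check in Python/NumPy (rather than Matlab) might look like the sketch below; the function name `S_matrix` is mine, not from the thread:

```python
import numpy as np
from math import factorial

def S_matrix(N):
    """Build S_{ij} = 2^{-(2N-i-j+1)} (2N-i-j)! / ((N-i)!(N-j)!) for i, j = 1..N."""
    S = np.empty((N, N))
    for i in range(1, N + 1):
        for j in range(1, N + 1):
            S[i - 1, j - 1] = (2.0 ** -(2 * N - i - j + 1)
                               * factorial(2 * N - i - j)
                               / (factorial(N - i) * factorial(N - j)))
    return S

for N in range(1, 8):
    total = np.linalg.inv(S_matrix(N)).sum()
    print(N, total)  # the summed inverse entries should come out close to 2N
```

For ##N = 1## this reduces to ##S = [1/2]##, whose inverse sums to 2, matching the claim.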
 
Interesting, how did you come across this? Using some numerical computing software like Matlab?

@fresh_42 or @Mark44 might be interested in how you discovered this.
 
I haven't run through the math, but keep in mind that the inverse matrix element can be expressed as:
$$(S^{-1})_{ij} = \frac{1}{\det{S}}C_{ji}$$
where ##C_{ji}## is the element of the transposed cofactor matrix. Also remember that the determinant can be expressed as a cofactor expansion:
$$\det{S} = \sum_{i=1}^{N} S_{ij} C_{ij}$$
Also keep in mind that the cofactor expansion works along any row or any column, so that (expanding along row ##j##, with ##k## as the summation index to avoid clashing with the free index ##i##)
$$(S^{-1})_{ij} = \frac{C_{ji}}{ \sum_{k=1}^{N} S_{jk} C_{jk}}$$
I dunno, maybe that helps. It might not hurt, too, to see if you can pull out a general formula for the cofactor.
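The adjugate relation above can be spot-checked numerically. A brute-force sketch (function name and the brute-force-minors approach are mine, purely for illustration) that builds the cofactor matrix and compares against a direct inverse:

```python
import numpy as np

def cofactor_inverse(A):
    """Inverse via the adjugate: (A^{-1})_{ij} = C_{ji} / det(A)."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then attach the sign (-1)^(i+j).
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T / np.linalg.det(A)  # transposed cofactors over the determinant

# S for N = 3, worked out from the definition in the opening post.
A = np.array([[3/16, 3/16, 1/8],
              [3/16, 1/4,  1/4],
              [1/8,  1/4,  1/2]])
print(np.allclose(cofactor_inverse(A), np.linalg.inv(A)))  # True
```

This is ##O(N \cdot N!)##-flavored and useless for large ##N##, but it confirms the formula on small cases.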
 
Thanks for the help.

@jedishrfu I discovered this trying to maximise the following:

$$\frac{\left[ \int_0^\infty f(t)\, dt \right]^2}{\int_0^\infty f^2(t)\, dt }$$
where
$$f(t) = \sum_{i=1}^N w_i \frac{(ct)^{N-i}}{(N-i)!} e^{\lambda t}$$
and ##w_i## are the weights with respect to which I want to maximise. I can show that the maximum is ##\frac{1}{-\lambda} \sum_{ij} \left(S^{-1}\right)_{ij}##, and using Matlab this turns out to be ##\frac{2N}{-\lambda}## for ##N = 1, \ldots, 15## (I stopped there as it became numerically unstable).

@TeethWhitener I can see that your approach must give the right answer, but finding a closed-form expression for the cofactor seems difficult for an arbitrary ##N \times N## matrix.
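One way around the floating-point instability at larger ##N## would be exact rational arithmetic. A sketch in Python using `fractions.Fraction` and Gauss-Jordan elimination (the function names and elimination routine are mine, not from the thread):

```python
from fractions import Fraction
from math import factorial

def S_exact(N):
    """Build S_{ij} = 2^{-(2N-i-j+1)} (2N-i-j)! / ((N-i)!(N-j)!) as exact rationals."""
    return [[Fraction(factorial(2 * N - i - j),
                      2 ** (2 * N - i - j + 1) * factorial(N - i) * factorial(N - j))
             for j in range(1, N + 1)]
            for i in range(1, N + 1)]

def inverse_sum(N):
    """Sum of all entries of S^{-1}, via exact Gauss-Jordan on [S | I]."""
    A = S_exact(N)
    M = [row + [Fraction(int(i == j)) for j in range(N)]
         for i, row in enumerate(A)]
    for col in range(N):
        # S is invertible, so a nonzero pivot exists in this column.
        piv = next(r for r in range(col, N) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(N):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    # Right-hand block now holds S^{-1} exactly.
    return sum(M[i][N + j] for i in range(N) for j in range(N))

for N in range(1, 16):
    print(N, inverse_sum(N))  # if the identity holds exactly, each line shows N and 2N
```

Since every step is exact, this either confirms ##\sum_{ij} S^{-1}_{ij} = 2N## as an exact rational identity at each ##N## or exposes a counterexample, with no round-off to worry about.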
 
