# Solution to a linear equation of matrices

Hi,
How can I solve the equation below for M?

G*inv(A + G'*inv(M)*G)*G' + F + M = 0

G' is the transpose of G and inv(.) is the inverse of a matrix.

Thanks

Hi again,

This looks a lot like some of the formulas you get for propagators in QFT in background-field calculations. (Combined with your derivative-of-a-determinant question, the evidence is even stronger.) Of course, in QFT the inner products are infinite-dimensional, not finite.

I think you'll find that it's impossible to get a closed form for this (though I'd like to be proven wrong). If you assume various commutativity properties, you might be able to do better.

Assuming the G's are invertible, I'd do something like

0 = G (A + G' M^-1 G)^-1 G' + F + M
0 = G G^-1 (G'^-1 A G^-1 + M^-1)^-1 G'^-1 G' + F + M
0 = M (1 + G'^-1 A G^-1 M)^-1 + F + M

M = -F (1 + (1 + G'^-1 A G^-1 M)^-1)^-1

This gives a continued-fraction expansion for M that can be taken to any order you need. You could also write down the exact answer using a noncommutative continued-fraction notation.
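The rearranged form M = -F (1 + (1 + G'^-1 A G^-1 M)^-1)^-1 can also be iterated numerically as a fixed point. A minimal sketch in Python/NumPy; the matrices here are made-up test data (small A, well-conditioned G, positive definite F) chosen so the map is a contraction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical test matrices: G well-conditioned, A a small symmetric
# perturbation, F symmetric positive definite so M stays invertible.
# As A -> 0 the exact solution is M = -F/2.
G = np.eye(n) + 0.1 * rng.standard_normal((n, n))
S = rng.standard_normal((n, n))
A = 0.1 * (S + S.T)
T = rng.standard_normal((n, n))
F = np.eye(n) + 0.1 * (T + T.T)

I = np.eye(n)
X = np.linalg.inv(G.T) @ A @ np.linalg.inv(G)   # X = G'^-1 A G^-1

# Iterate M <- -F (1 + (1 + X M)^-1)^-1, i.e. the continued fraction
# truncated at ever deeper levels.
M = -F / 2.0            # starting guess: the exact A = 0 solution
for _ in range(200):
    M = -F @ np.linalg.inv(I + np.linalg.inv(I + X @ M))

# Residual of the original equation G (A + G' M^-1 G)^-1 G' + F + M = 0
R = G @ np.linalg.inv(A + G.T @ np.linalg.inv(M) @ G) @ G.T + F + M
print(np.linalg.norm(R))
```

For matrices this far from the commuting/ill-conditioned regime the residual drops to machine precision; there is no general convergence guarantee for arbitrary A, G, F.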

In QFT if you expand the continued fraction, you get an expression that looks something like mass insertions:

perturbative propagator = --    = -F
mass 2-point function   = x     = G'^-1 A G^-1
exact propagator        = --o-- = M = -- + --x-- + --x--x-- + --x--x--x-- + ...

Thanks man. Actually I'm just trying to maximize the marginal likelihood function by setting the first derivative to zero. The log of the marginal likelihood is (I'm just writing down the terms containing the parameter that should be estimated):

-ln(det(A + O'*inv(G)*O)) + ln(det(inv(G))) + f'*inv(G)*f

f is, let's say, an m×1 vector. Unfortunately O is not invertible (it is full row rank, though). I then take the first derivative of this expression with respect to inv(G) and set it to zero to find the estimate of G.
It seems that finding a closed-form solution for it is impossible, right?
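The basic ingredient in that stationarity condition is the derivative of a log-determinant, d/dX ln det X = X^-T for an unconstrained matrix argument (Jacobi's formula; the symmetric-constrained case picks up factors of 2 on the off-diagonal entries). A quick numerical sanity check of the unconstrained version, with a made-up positive definite X:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Random positive definite X so ln det X is well defined.
B = rng.standard_normal((n, n))
X = B @ B.T + n * np.eye(n)

# Jacobi's formula (unconstrained argument): d/dX ln det X = X^{-T}
grad = np.linalg.inv(X).T

# Entry-by-entry central finite differences as an independent check.
eps = 1e-6
num = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        Xp = X.copy(); Xp[i, j] += eps
        Xm = X.copy(); Xm[i, j] -= eps
        num[i, j] = (np.log(np.linalg.det(Xp))
                     - np.log(np.linalg.det(Xm))) / (2 * eps)

print(np.max(np.abs(num - grad)))
```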

Now that I know where it comes from, I can rederive your original equation (though I get a - sign in front of the first term). And yeah, for general G, A, etc., I think the recursive-type solution above (similar things can be done without taking various inverses) is the best you can probably do...

You can bump the ln(det())s up into a single block-matrix determinant (http://en.wikipedia.org/wiki/Determinant#Block_matrices), which is probably where they came from.
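The relevant identity from that page is the Schur-complement factorization det([[A, B], [C, D]]) = det(A) det(D - C A^-1 B), which is what fuses two ln(det())s into one. A quick numerical check with made-up, diagonally loaded blocks (so the determinants stay well away from zero):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 4

# Hypothetical blocks; the 5*I shift just keeps A and D well-conditioned.
A = rng.standard_normal((n, n)) + 5 * np.eye(n)
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = rng.standard_normal((m, m)) + 5 * np.eye(m)

# Schur-complement identity:
#   det([[A, B], [C, D]]) = det(A) * det(D - C A^-1 B)
full = np.block([[A, B], [C, D]])
lhs = np.linalg.det(full)
rhs = np.linalg.det(A) * np.linalg.det(D - C @ np.linalg.inv(A) @ B)
print(lhs, rhs)
```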

And write the difference of logs in terms of the integral
$$\int_0^\infty s^{-1} (e^{(-s M_1)} - e^{(-s M_2)}) ds$$

but none of this really helps you...
Unless there is more structure to use, I don't think it's going anywhere.
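For the record, that integral representation of the difference of logs can be checked numerically: for positive definite M1, M2 the scalar Frullani identity applied eigenvalue by eigenvalue gives ln det M2 - ln det M1 = ∫₀^∞ s⁻¹ tr(e^{-s M1} - e^{-s M2}) ds. A sketch with made-up SPD matrices (note the integrand is finite at s = 0, since the matrix difference is O(s)):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad

rng = np.random.default_rng(3)
n = 3

def spd():
    # Random symmetric positive definite matrix (hypothetical test data).
    B = rng.standard_normal((n, n))
    return B @ B.T + n * np.eye(n)

M1, M2 = spd(), spd()

# Integrand of the Frullani-type identity; tr(e^{-s M1} - e^{-s M2}) ~ s
# near s = 0, so the 1/s singularity cancels.
def integrand(s):
    return np.trace(expm(-s * M1) - expm(-s * M2)) / s

val, _ = quad(integrand, 0, np.inf, limit=200)
exact = np.log(np.linalg.det(M2)) - np.log(np.linalg.det(M1))
print(val, exact)
```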

Good luck with it.

Thank you man. That was a good help.
I'm just concerned about the derivative with respect to G. Actually, in my formulation inv(G) is a symmetric matrix; do you think the derivation will change in this case?

No probs.

And no, I don't think that G being symmetric will change much...