# Solutions of Matrix Equation A X B^T = C


## Summary:

For given matrices ##A##, ##B## and ##C##, solve the equation $$AXB^T = C$$ for ##X## in terms of the LU decompositions of ##A## and ##B##. When are there no solutions?

## Main Question or Discussion Point

I know that every ##m×n## matrix ##D## can be expressed in the form

$$P_1DP_2 = LU$$

where ##P_1## and ##P_2## are permutation matrices, ##L## is a unit lower triangular matrix, and ##U## has the form

$$\begin{bmatrix} U_1 & U_2 \newline 0 & 0 \newline \end{bmatrix}$$

where ##U_1## is an invertible upper triangular matrix (i.e. ##U_1## has nonzero diagonal entries) with the same rank as ##D##.

Furthermore, some useful properties of permutation matrices include ##P_1^{-1}=P_1^T##.
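
As a quick numerical sanity check of these facts, here is a sketch using SciPy. Note that `scipy.linalg.lu` uses partial (row) pivoting only, i.e. ##P_2 = I## in the notation above, and returns ##A = PLU## (so ##P_1 = P^T## here).

```python
# Sketch: verify the LU factorization facts with scipy (row pivoting only,
# so P2 = I; scipy returns A = P @ L @ U, i.e. P1 = P^T in the text's notation).
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

P, L, U = lu(A)                             # A = P @ L @ U
assert np.allclose(P @ L @ U, A)            # reconstruction
assert np.allclose(P.T, np.linalg.inv(P))   # permutation: P^{-1} = P^T
assert np.allclose(L, np.tril(L))           # L is lower triangular
assert np.allclose(np.diag(L), 1.0)         # ... with unit diagonal
assert np.allclose(U, np.triu(U))           # U is upper trapezoidal
```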

Let the LU decompositions of ##A## and ##B## be

$$P_1 A P_2 = L_A U_A, \quad Q_1 B Q_2 = L_B U_B.$$

Then we can write the matrix equation that we are supposed to solve as follows:

$$P_1^T L_A U_A P_2^T X Q_2 U_B^T L_B^T Q_1 = C.$$

My plan was to try writing this equation in the form ##GY=H## for known ##G## and ##H##, and unknown ##Y## (without assuming that ##U_A## and ##U_B## are invertible). This idea leads to

$$U_A P_2^T X Q_2 U_B^T = L_A^{-1} P_1 C Q_1^T (L_B^T)^{-1} .$$

$$G=U_A, \quad H = L_A^{-1} P_1 C Q_1^T (L_B^T)^{-1}, \quad Y=P_2^T X Q_2 U_B^T .$$

Am I on the right track? Where do I go from here?

StoneTemplePython
I guess it's ok.

1.) If it were me, I'd clear out the permutation matrices from the start (or WLOG assume they aren't needed) and use associativity of matrix multiplication, i.e. in effect consider

##P_1^T L_A U_A P_2^T X Q_2 U_B^T L_B^T Q_1 = C##
## L_A U_A \big(Y\big)U_B^T L_B^T = L_A U_A \big(P_2^T X Q_2\big)U_B^T L_B^T =\big(P_1 C Q_1^T\big) = Z##
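
This reduction is easy to check numerically. A minimal sketch using `scipy.linalg.lu`, which pivots rows only (so ##P_2 = Q_2 = I## and ##Y = X## here) and returns ##A = PLU##, matching ##P_1 A P_2 = L_A U_A## with ##P_1 = P^T##:

```python
# Sketch: numerically verify L_A U_A X U_B^T L_B^T = P_1 C Q_1^T = Z,
# using scipy's LU (row pivoting only, so P2 = Q2 = I and Y = X).
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
X = rng.standard_normal((4, 3))
C = A @ X @ B.T                      # build a consistent equation

PA, LA, UA = lu(A)                   # A = PA @ LA @ UA, so P1 = PA^T
QB, LB, UB = lu(B)                   # B = QB @ LB @ UB, so Q1 = QB^T

Z = PA.T @ C @ QB                    # Z = P1 C Q1^T
assert np.allclose(LA @ UA @ X @ UB.T @ LB.T, Z)
```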

2.) I'm not sure whether this is a homework problem. If it is, you are probably constrained in what you can do here algebraically, and the problem would belong in the homework forum.

If it is not a homework problem, then since this thread is tagged A, I'd strongly suggest using the Kronecker product at this stage, and in particular the identity

##\text{vec}\big(\mathbf {AXB}^T\big) = \big(\mathbf B \otimes \mathbf A\big)\text{vec}\big(\mathbf {X}\big)=\text{vec}\big(\mathbf {C}\big)##

where the vec operator stacks a matrix's columns one at a time on top of each other.
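
In NumPy, column-stacking is flattening in Fortran order, so the identity can be checked directly:

```python
# Numerical check of vec(A X B^T) = (B kron A) vec(X), where vec stacks
# columns (NumPy: order='F').
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))   # shapes chosen so A @ X @ B.T is defined
X = rng.standard_normal((4, 5))
B = rng.standard_normal((2, 5))

def vec(M):
    return M.flatten(order='F')   # column-stacking vec operator

lhs = vec(A @ X @ B.T)
rhs = np.kron(B, A) @ vec(X)
assert np.allclose(lhs, rhs)
```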

From here, if your field is ##\mathbb R## or ##\mathbb C##, you can use the SVD and how well Kronecker products interact with it. (Admittedly, this basically deprecates the LU factorization.)
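
A sketch of the SVD route: the Moore-Penrose pseudoinverse (computed from the SVD) gives the minimum-norm least-squares solution ##X = A^+ C (B^+)^T##, which is an exact solution whenever the equation is consistent:

```python
# Sketch: over R or C the pseudoinverse gives X = A^+ C (B^+)^T as the
# minimum-norm least-squares solution of A X B^T = C; when the equation
# is consistent, this is an exact solution.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((5, 2))
X0 = rng.standard_normal((3, 2))
C = A @ X0 @ B.T                     # consistent right-hand side

X = np.linalg.pinv(A) @ C @ np.linalg.pinv(B).T
assert np.allclose(A @ X @ B.T, C)   # exact solution of the consistent system
```

If instead ##C## is not in the appropriate range, no exact solution exists, and the same formula returns the best least-squares fit.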

Alternatively, if you are willing to play around a bit with dimensions and zero-pad some of the matrices so that the matrix multiplications are legal, I'd finish with the mixed-product property of Kronecker products, i.e. use the fact that
##\big(\mathbf A \otimes \mathbf B \big)\big(\mathbf M \otimes \mathbf N\big) = \big(\mathbf{AM} \big)\otimes \big(\mathbf{BN}\big)##
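
The mixed-product property is also easy to confirm numerically, for any shapes where ##\mathbf{AM}## and ##\mathbf{BN}## are defined:

```python
# Numerical check of the mixed-product property
# (A kron B)(M kron N) = (A M) kron (B N).
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3)); M = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 2)); N = rng.standard_normal((2, 5))

lhs = np.kron(A, B) @ np.kron(M, N)
rhs = np.kron(A @ M, B @ N)
assert np.allclose(lhs, rhs)
```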

then use the fact that a lower triangular matrix ##\otimes## a lower triangular matrix gives a lower triangular matrix (and similarly for upper triangular matrices),
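
A quick check of this triangularity fact:

```python
# Check: the Kronecker product of two lower triangular matrices is lower
# triangular (and similarly for upper triangular matrices).
import numpy as np

rng = np.random.default_rng(4)
L1 = np.tril(rng.standard_normal((3, 3)))
L2 = np.tril(rng.standard_normal((4, 4)))
K = np.kron(L1, L2)
assert np.allclose(K, np.tril(K))     # still lower triangular

U1 = np.triu(rng.standard_normal((3, 3)))
U2 = np.triu(rng.standard_normal((4, 4)))
KU = np.kron(U1, U2)
assert np.allclose(KU, np.triu(KU))   # still upper triangular
```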

i.e. the final equation should become
##\text{vec}\big(\mathbf {AXB}^T\big) = \big(\mathbf L_\text{Big} \mathbf U_\text{Big}\big)\text{vec}\big(\mathbf {X}\big)=\text{vec}\big(\mathbf {C}\big)##

which is how I'd like to read the problem off.
- - - -
Note that the Kronecker product is very natural here, since it is a (tractable) form of tensor product between matrices; what I'm suggesting is, in effect, to linearize a bilinear map and convert your problem from fragments of different LU decompositions into one clean big LU decomposition.

For more info on Kronecker products (and the closely related Kronecker sum), see this post:
https://www.physicsforums.com/threa...vector-subspace-of-r-2-2.929266/#post-5868072

which includes an (indirect) link to a free chapter on them.