Solutions of Matrix Equation A X B^T = C

  • Context: Graduate 
  • Thread starter: quark002
  • Tags: Matrix
SUMMARY

The discussion focuses on solving the matrix equation A X B^T = C using LU decomposition and the Kronecker product. The participants outline the process of expressing matrices in the form P_1DP_2 = LU, where P_1 and P_2 are permutation matrices. They suggest leveraging the Kronecker product identity, vec(AXB^T) = (B ⊗ A)vec(X), to simplify the problem and propose using SVD for further analysis. The conversation emphasizes the importance of understanding the properties of permutation matrices and the application of Kronecker products in matrix equations.

PREREQUISITES
  • Understanding of LU decomposition and its components (L and U matrices).
  • Familiarity with permutation matrices and their properties.
  • Knowledge of the Kronecker product and its applications in linear algebra.
  • Experience with the vectorization operator (vec) and its role in matrix equations.
NEXT STEPS
  • Study the properties and applications of the Kronecker product in matrix equations.
  • Learn about the singular value decomposition (SVD) and its relationship with Kronecker products.
  • Explore advanced topics in linear algebra, particularly focusing on tensor products and their implications.
  • Investigate the use of permutation matrices in solving linear systems and their impact on computational efficiency.
USEFUL FOR

Mathematicians, data scientists, and engineers working with linear algebra, particularly those involved in solving matrix equations and optimizing computational methods in numerical analysis.

quark002

TL;DR
For given matrices ##A##, ##B## and ##C##, solve the equation $$AXB^T = C$$ for ##X## in terms of the LU decompositions of ##A## and ##B##. When are there no solutions?
I know that every ##m×n## matrix ##D## can be expressed in the form

$$P_1DP_2 = LU$$

where ##P_1## and ##P_2## are permutation matrices, ##L## is a unit lower triangular matrix, and ##U## has the form

$$\begin{bmatrix}
U_1 & U_2 \\
0 & 0
\end{bmatrix}$$

where ##U_1## is an invertible upper triangular matrix (i.e. ##U_1## has nonzero diagonal entries) with the same rank as ##D##.

Furthermore, a useful property of permutation matrices is that ##P_1^{-1}=P_1^T##.
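As a quick numerical sanity check of this setup, here is a minimal NumPy/SciPy sketch. Note that `scipy.linalg.lu` uses only partial (row) pivoting, i.e. effectively ##P_2 = I##, rather than the full row-and-column pivoting of the rank-revealing form above, so it only illustrates the idea.

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(0)
D = rng.standard_normal((4, 3))          # an arbitrary m x n matrix

# scipy.linalg.lu returns D = P @ L @ U (row pivoting only),
# which matches the form above with P_1 = P^T and P_2 = I.
P, L, U = lu(D)
P1 = P.T

print(np.allclose(P1 @ D, L @ U))          # P_1 D = L U
print(np.allclose(P1 @ P1.T, np.eye(4)))   # P_1^{-1} = P_1^T
```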

Let the LU decompositions of ##A## and ##B## be

$$P_1 A P_2 = L_A U_A, \quad Q_1 B Q_2 = L_B U_B.$$

Then we can write the matrix equation that we are supposed to solve as follows:

$$P_1^T L_A U_A P_2^T X Q_2 U_B^T L_B^T Q_1 = C.$$

My plan was to try writing this equation in the form ##GY=H## for known ##G## and ##H##, and unknown ##Y## (without assuming that ##U_A## and ##U_B## are invertible). This idea leads to

$$U_A P_2^T X Q_2 U_B^T = L_A^{-1} P_1 C Q_1^T (L_B^T)^{-1} .$$

Then we can read off
$$G=U_A, \quad H = L_A^{-1} P_1 C Q_1^T (L_B^T)^{-1}, \quad Y=P_2^T X Q_2 U_B^T .$$
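Here is a minimal numerical check of that rearrangement (a sketch only, assuming NumPy; the triangular and permutation factors are generated at random rather than coming from an actual LU routine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4  # keep everything n x n for simplicity

unit_lower = lambda k: np.tril(rng.standard_normal((k, k)), -1) + np.eye(k)
perm = lambda k: np.eye(k)[rng.permutation(k)]

# Random factors playing the roles of the LU pieces above
LA, LB = unit_lower(n), unit_lower(n)
UA, UB = np.triu(rng.standard_normal((n, n))), np.triu(rng.standard_normal((n, n)))
P1, P2, Q1, Q2 = perm(n), perm(n), perm(n), perm(n)

# Reassemble A and B from P1 A P2 = LA UA and Q1 B Q2 = LB UB
A = P1.T @ LA @ UA @ P2.T
B = Q1.T @ LB @ UB @ Q2.T

X = rng.standard_normal((n, n))
C = A @ X @ B.T

# Proposed rearrangement: U_A (P_2^T X Q_2 U_B^T) = L_A^{-1} P_1 C Q_1^T (L_B^T)^{-1}
H = np.linalg.solve(LA, P1 @ C @ Q1.T) @ np.linalg.inv(LB.T)
Y = P2.T @ X @ Q2 @ UB.T
print(np.allclose(UA @ Y, H))   # True: G Y = H holds with G = U_A
```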

Am I on the right track? Where do I go from here?
 
I guess it's ok.

1.) If it were me, I'd clear out the permutation matrices from the get-go (or WLOG assume they aren't needed) and use associativity of matrix multiplication, i.e. in effect consider

##P_1^T L_A U_A P_2^T X Q_2 U_B^T L_B^T Q_1 = C##
or instead
## L_A U_A \big(Y\big)U_B^T L_B^T = L_A U_A \big(P_2^T X Q_2\big)U_B^T L_B^T =\big(P_1 C Q_1^T\big) = Z##

2.) I'm not sure whether this is a homework problem. If it is, you are probably constrained in what you can do here algebraically, and the problem would belong in the homework forum.

If it is not a homework problem, then since this thread is tagged "A" I'd strongly suggest using the Kronecker product at this stage, in particular the identity

##\text{vec}\big(\mathbf {AXB}^T\big) = \big(\mathbf B \otimes \mathbf A\big)\text{vec}\big(\mathbf {X}\big)=\text{vec}\big(\mathbf {C}\big)##

where the vec operator stacks the columns of a matrix on top of one another to form a single column vector.
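A quick numerical illustration of this identity (a minimal NumPy sketch, with vec implemented as column-major flattening):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))
X = rng.standard_normal((4, 5))
B = rng.standard_normal((2, 5))

vec = lambda M: M.flatten(order="F")   # stack the columns into one long vector

# vec(A X B^T) == (B kron A) vec(X)
print(np.allclose(vec(A @ X @ B.T), np.kron(B, A) @ vec(X)))   # True
```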

From here, if your field is ##\mathbb R## or ##\mathbb C##, you can use the SVD and exploit how well Kronecker products interact with it. (Admittedly, this basically deprecates the LU factorization.)
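For example, here is a hedged sketch of that route using NumPy's SVD-based pseudoinverse (the helper `solve_axbt` is just an illustrative name): by the standard pseudoinverse criterion, the candidate ##X_0 = A^+ C (B^T)^+## solves the equation exactly when a solution exists, so checking ##A X_0 B^T = C## doubles as a solvability test.

```python
import numpy as np

def solve_axbt(A, B, C, tol=1e-10):
    """Candidate solution of A X B^T = C via SVD-based pseudoinverses.

    Returns (X, consistent): X = A^+ C (B^T)^+, plus a flag saying whether
    X actually satisfies the equation (if not, no exact solution exists).
    """
    X = np.linalg.pinv(A) @ C @ np.linalg.pinv(B.T)   # np.linalg.pinv uses the SVD
    consistent = np.allclose(A @ X @ B.T, C, atol=tol)
    return X, consistent

# Build a consistent right-hand side, then recover a solution
rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))
B = rng.standard_normal((4, 3))
C = A @ rng.standard_normal((3, 3)) @ B.T

X, ok = solve_axbt(A, B, C)
print(ok)   # True: an exact solution was found
```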

Alternatively, if you are willing to play around a bit with dimensions and zero-pad some of the matrices so that all the matrix multiplications are legal, I'd finish with the mixed-product property of Kronecker products, i.e. use the fact that
##\big(\mathbf A \otimes \mathbf B \big)\big(\mathbf M \otimes \mathbf N\big) = \big(\mathbf{AM} \big)\otimes \big(\mathbf{BN}\big)##

then use the fact that a lower triangular matrix ##\otimes## a lower triangular matrix gives a lower triangular matrix (and the same goes for upper triangular matrices),

i.e. the final equation should become
##\text{vec}\big(\mathbf {AXB}^T\big) = \big(\mathbf L_\text{Big} \mathbf U_\text{Big}\big)\text{vec}\big(\mathbf {X}\big)=\text{vec}\big(\mathbf {C}\big)##

which is how I'd like to read the problem off.
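To make those last two steps concrete, here is a minimal NumPy sketch (ignoring the permutation matrices, as in 1.) above) showing that ##\mathbf B \otimes \mathbf A = (\mathbf L_B \otimes \mathbf L_A)(\mathbf U_B \otimes \mathbf U_A)##, with a lower triangular first factor and an upper triangular second factor, i.e. one big LU decomposition:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
unit_lower = lambda k: np.tril(rng.standard_normal((k, k)), -1) + np.eye(k)

# WLOG (as in 1.) assume no pivoting is needed: A = L_A U_A, B = L_B U_B
LA, LB = unit_lower(n), unit_lower(n)
UA, UB = np.triu(rng.standard_normal((n, n))), np.triu(rng.standard_normal((n, n)))
A, B = LA @ UA, LB @ UB

# Mixed-product property: B kron A = (L_B kron L_A)(U_B kron U_A)
L_big, U_big = np.kron(LB, LA), np.kron(UB, UA)
print(np.allclose(np.kron(B, A), L_big @ U_big))   # True

# The Kronecker product preserves triangularity, so this really is an LU factorization
print(np.allclose(L_big, np.tril(L_big)))          # L_big is unit lower triangular
print(np.allclose(U_big, np.triu(U_big)))          # U_big is upper triangular

# Hence vec(A X B^T) = L_big U_big vec(X) = vec(C)
X = rng.standard_normal((n, n))
C = A @ X @ B.T
vec = lambda M: M.flatten(order="F")
print(np.allclose(L_big @ U_big @ vec(X), vec(C)))  # True
```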
- - - -
Note that the Kronecker product is very natural here, since it is a (tractable) form of tensor product between matrices: what I'm suggesting you do, in effect, is linearize a bilinear map and convert your problem from fragments of different LU decompositions into one nice, clean, big LU decomposition. For more info on Kronecker products (and the closely related Kronecker sum), see this post:
https://www.physicsforums.com/threa...vector-subspace-of-r-2-2.929266/#post-5868072

which includes an (indirect) link to a free chapter on them
 
