# Linear Transformation to Block-wise Stack Matrix

by weetabixharry
Tags: linear, matrix, reshape, transformation, transpose
P: 108 I have a matrix of the form X = [A B], where A and B are matrices of equal dimensions (M x N). I am looking for an elegant transformation to obtain Y = [A; B], i.e. with the blocks now stacked vertically. Normally, I'd look for a solution of the form Y = VXW, where V is (2M x M) and W is (N x N/2). However, I feel that one may not exist in this case. For example, in the simplest case where A and B are scalars, XW is scalar and so, in general, no V will exist which can give Y as required. In that straightforward example, of course, we just use the transpose: Y = X^T. However, I cannot see how to generalise this to the case where A and B are matrices. (Actually, I'm hoping eventually to find a solution for stacking many matrices [A, B, C, ...].) Any help would be greatly appreciated!
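As an illustration (an editorial sketch, not part of the original exchange), the operation being requested is exactly this, in NumPy terms; the block sizes M and N are arbitrary:

```python
import numpy as np

M, N = 2, 3
A = np.arange(M * N).reshape(M, N)         # an arbitrary M x N block
B = 10 + np.arange(M * N).reshape(M, N)    # another M x N block

X = np.hstack([A, B])    # X = [A B], shape (M, 2N)
Y = np.vstack([A, B])    # desired Y = [A; B], shape (2M, N)

# Y can be recovered from X alone by splitting off its column blocks:
Y2 = np.vstack([X[:, :N], X[:, N:]])
print(np.array_equal(Y, Y2))   # True
```

The question in the thread is whether this splitting can be expressed as a single product Y = VXW with constant matrices V and W.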
 Math Emeritus Sci Advisor P: 39,348 Use a "block diagonal" matrix with blocks of the form $$\begin{bmatrix}1 & 1 \\ 1 & 1 \end{bmatrix}$$ For example, if $$\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}$$ then $$\begin{bmatrix}1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1\end{bmatrix}\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}= \begin{bmatrix} a & b \\ c & d \\ u & v \\ w & x \end{bmatrix}$$
 P: 108 Thank you for the reply. I'm not sure if I understand correctly, but it seems that you are taking a (2 x 4) matrix and left-multiplying it by a (4 x 4) matrix. Perhaps I'm missing something, but in general I cannot see how this is possible. If a solution exists by straightforward matrix multiplication, I would expect us to require 2 matrices (one to left-multiply X, one to right-multiply X): Y = VXW (2M x N) = (2M x M)(M x 2N)(2N x N) {Note: My original post lists the wrong dimensions. These should now be correct.}
P: 128

 Quote by HallsofIvy Use a "block diagonal" matrix with blocks of the form $$\begin{bmatrix}1 & 1 \\ 1 & 1 \end{bmatrix}$$ For example, if $$\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}$$ then $$\begin{bmatrix}1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1\end{bmatrix}\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}= \begin{bmatrix} a & b \\ c & d \\ u & v \\ w & x \end{bmatrix}$$
I'm left scratching my head by this one. Wouldn't your 4x4 matrix give elements like $(a+c)$?

More seriously: isn't it impossible (as the original poster rightly suggested) to get what he wants with a single matrix? Here's the simple proof: matrix multiplication makes an $m \times n$ matrix from an $m \times k$ matrix and a $k \times n$ one. $k$ can be anything, as long as it's the same for both matrices. If we could get by with a single transformation matrix, then for this example, the original matrix tells us $k=2$ and $n=4$, while the final matrix tells us $m=4$, $n=2$. That's a contradiction in $n$.

I don't know how to do what the OP wants, but he is perfectly correct that he'll need at least two matrices to do it.
 P: 4,572 I think he might have meant to put identity matrices instead of having all 1's.
P: 108
 Quote by chiro I think he might have meant to put identity matrices instead of having all 1's.
A block-wise identity matrix in that way would just be an identity matrix.

Either way, the fact remains that a (2M x N) matrix cannot be obtained by a single multiplication of an (M x 2N) matrix with one other matrix.

It's quite interesting and surprising to me that such a seemingly straightforward operation is so tricky to implement! I feel like I must be missing a trick...
 P: 128 Well, I now think there's no way to do it. Consider taking a 1x2 matrix into a 2x1 matrix. Then your two matrices must be 2x1 and 1x2. In terms of dimensions: $$(2,1) = (2,1)(1,2)(2,1)$$ (Since it has to begin and end with a 2 and 1, and have a 1 and 2 in the middle.) So we have: $$\left[\begin{array}{c} x\\ y\end{array}\right] = \left[\begin{array}{c} a\\ b\end{array}\right] \left[\begin{array}{cc} x & y\end{array}\right] \left[\begin{array}{c} c\\ d\end{array}\right]$$ Do the rightmost multiplication first: $$\left[\begin{array}{c} x\\ y\end{array}\right] = \left[\begin{array}{c} a\\ b\end{array}\right] (cx + dy)$$ There's no way that could be true for all $x,y$ if the other matrices are constant. So unless I missed something, I don't think you can do what you're trying to do.
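chogg's contradiction can also be checked mechanically. Here is a SymPy sketch (an editorial illustration, assuming SymPy is available): plugging the two basis inputs [1 0] and [0 1] into the required identity [x; y] = [a; b](cx + dy) gives four polynomial constraints with no common solution.

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')

# Require [x; y] == [a; b] * (c*x + d*y) at the two basis inputs:
#   X = [1, 0]  ->  a*c = 1 and b*c = 0
#   X = [0, 1]  ->  a*d = 0 and b*d = 1
eqs = [a*c - 1, b*c, a*d, b*d - 1]

# a*c = 1 forces c != 0, so b*c = 0 forces b = 0, contradicting b*d = 1.
print(sp.solve(eqs, [a, b, c, d]))   # empty list: no constant a, b, c, d works
```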
P: 108
 Quote by chogg Well, I now think there's no way to do it. Consider taking a 1x2 matrix into a 2x1 matrix. Then your two matrices must be 2x1 and 1x2. In terms of dimensions: $$(2,1) = (2,1)(1,2)(2,1)$$ (Since it has to begin and end with a 2 and 1, and have a 1 and 2 in the middle.) So we have: $$\left[\begin{array}{c} x\\ y\end{array}\right] = \left[\begin{array}{c} a\\ b\end{array}\right] \left[\begin{array}{cc} x & y\end{array}\right] \left[\begin{array}{c} c\\ d\end{array}\right]$$ Do the rightmost multiplication first: $$\left[\begin{array}{c} x\\ y\end{array}\right] = \left[\begin{array}{c} a\\ b\end{array}\right] (cx + dy)$$ There's no way that could be true for all $x,y$ if the other matrices are constant. So unless I missed something, I don't think you can do what you're trying to do.
As I wrote in my original post, for a (1 x 2) input, the elegant solution is to simply use the matrix transpose. However, it is not clear how to generalise this for larger block structures (e.g. a solution that perhaps involves both matrix multiplications and transposes).
 P: 9 Well, the transpose of $$\begin{bmatrix}A & B \end{bmatrix}$$ would actually be $$\begin{bmatrix}A^{T} \\ B^{T} \end{bmatrix}$$ if I am not mistaken, so you would also have to require A and B to be symmetric for the transpose to work. I don't think this is what you were looking for, but after playing around for a little while, I found that: $$\begin{bmatrix}I_{m} \\ 0_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}I_{n} \\ 0_{n} \end{bmatrix} + \begin{bmatrix}0_{m} \\ I_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}0_{n} \\ I_{n} \end{bmatrix} = \begin{bmatrix} A \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ B \end{bmatrix} = \begin{bmatrix} A \\ B \end{bmatrix}$$ I'm not sure if the above can be simplified to something of the form VXW, but it can easily be generalized for matrices of the form [A | B | C | etc. ].
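Stylish's identity is easy to check numerically. A NumPy sketch (an editorial illustration; block sizes and contents are arbitrary):

```python
import numpy as np

M, N = 2, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((M, N))
B = rng.standard_normal((M, N))
X = np.hstack([A, B])      # X = [A B], shape (M, 2N)

I_m, Z_m = np.eye(M), np.zeros((M, M))
I_n, Z_n = np.eye(N), np.zeros((N, N))

# [I; 0] X [I; 0]  +  [0; I] X [0; I]  =  [A; 0] + [0; B]  =  [A; B]
Y = (np.vstack([I_m, Z_m]) @ X @ np.vstack([I_n, Z_n])
     + np.vstack([Z_m, I_m]) @ X @ np.vstack([Z_n, I_n]))

print(np.allclose(Y, np.vstack([A, B])))   # True
```

Note that this is a *sum* of two products VXW, which sidesteps the dimension contradiction that rules out a single product.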
P: 128
 Quote by weetabixharry As I wrote in my original post, for a (1 x 2) input, the elegant solution is to simply use the matrix transpose. However, it is not clear how to generalise this for larger block structures (e.g. a solution that perhaps involves both matrix multiplications and transposes).
Okay; same argument as above, but now $x, y, a, b, c, d$ are block matrices with the required dimensions. It's still just as impossible (except that in my previous post, I clumsily transposed the order of the products $xc$ and $yd$).

Doing it right, you see that we need to simultaneously satisfy two equations:
• $x = a(xc + yd)$
• $y = b(xc + yd)$
We're only assuming that multiplication is associative here (not necessarily commutative), which is true for matrix multiplication, so it doesn't matter whether these are scalars or block matrices. The first equation forces $d = 0$ (to kill the $y$-dependence) together with $c \ne 0$, while the second forces $c = 0$ together with $d \ne 0$. That's a contradiction, so it can't be done in this case; therefore it can't be done in general.

Stylish's solution works, of course, but I'm not sure if it's quite like what you had in mind. Actually, his solution reminds me of a similar thread we had recently. See here:
http://www.physicsforums.com/showthread.php?t=511243
for how to generate matrices that "pick out" an individual component of a matrix (along with a boneheaded mistake by yours truly).

Cheers,
Chip

PS On rereading your original post, I noticed you basically gave exactly the argument which I gave later (except where I pointed out it would apply to block matrices as well). Somehow I missed that on the first go'round -- sorry! :)
P: 108
 Quote by Stylish Well, the transpose of $$\begin{bmatrix}A & B \end{bmatrix}$$ would actually be $$\begin{bmatrix}A^{T} \\ B^{T} \end{bmatrix}$$ if I am not mistaken, so you would also have to require A and B to be symmetric for the transpose to work. I don't think this is what you were looking for, but after playing around for a little while, I found that: $$\begin{bmatrix}I_{m} \\ 0_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}I_{n} \\ 0_{n} \end{bmatrix} + \begin{bmatrix}0_{m} \\ I_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}0_{n} \\ I_{n} \end{bmatrix} = \begin{bmatrix} A \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ B \end{bmatrix} = \begin{bmatrix} A \\ B \end{bmatrix}$$ I'm not sure if the above can be simplified to something of the form VXW, but it can easily be generalized for matrices of the form [A | B | C | etc. ].
That's a very good point regarding the requirement of symmetry if I want to use transposes. That rules that idea out!

Thanks for figuring out something that works. My hunch is that this is about as 'nice' a solution as we're going to get (of course, implementing this in a programming language is straightforward, but it would be nice to be able to express it elegantly in mathematical terms too).
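The "straightforward in a programming language" version, generalized to any number of equal-width blocks, might look like this (an editorial sketch; the helper name `stack_blocks` is invented for illustration):

```python
import numpy as np

def stack_blocks(X, k):
    """Split X = [A1 | A2 | ... | Ak] into k equal-width column blocks
    and stack them vertically as [A1; A2; ...; Ak]."""
    M, kN = X.shape
    assert kN % k == 0, "number of columns must be divisible by k"
    N = kN // k
    return np.vstack([X[:, i * N:(i + 1) * N] for i in range(k)])

# Example with three 2 x 2 blocks:
X = np.hstack([np.full((2, 2), v) for v in (1, 2, 3)])   # shape (2, 6)
print(stack_blocks(X, 3).shape)   # (6, 2)
```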

Thanks to all for helping out. Still, 10 points to anyone who can provide (or disprove the existence of) a general solution of the form Y = VXW.
 P: 128 I am happy to collect the 10 points for my post above, which disproves the existence of such a transformation in a particular case. If a particular case fails, there can be no general formula. (Let me know if I've missed anything!)
