Linear Transformation to Blockwise Stack Matrix 
#1
Jul 13, 2011, 04:05 PM

P: 108

I have a matrix of the form X = [A B], where A and B are matrices of equal dimensions (M x N). I am looking for an elegant transformation to obtain Y = [A; B]. That is, the blocks are now stacked vertically.
Normally, I'd look for a solution of the form Y = VXW, where V is (2M x M) and W is (N x N/2). However, I feel that one may not exist in this case. For example, in the simplest case where A and B are scalars, then XW is scalar and so, in general, no V will exist which can give Y as required. In that straightforward example, of course, we just use the transpose: Y = X^{T}. However, I cannot see how to generalise this for when A and B are matrices. (Or, actually, I'm hoping eventually to find a solution for stacking many matrices [A,B,C,...]). Any help would be greatly appreciated! 
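For concreteness, here is the operation I'm after, sketched in plain Python (hypothetical helper names; nested lists standing in for matrices):

```python
# Sketch of the desired operation: X = [A B] -> Y = [A; B].
# A and B are M x N; X is M x 2N; Y is 2M x N.

def hstack(A, B):
    """Form X = [A B] by concatenating each row of A with the same row of B."""
    return [ra + rb for ra, rb in zip(A, B)]

def split_stack(X):
    """Recover Y = [A; B] from X = [A B]: split each row in half,
    then stack the left halves above the right halves."""
    n = len(X[0]) // 2
    left = [row[:n] for row in X]
    right = [row[n:] for row in X]
    return left + right

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
X = hstack(A, B)        # [[1, 2, 5, 6], [3, 4, 7, 8]]
Y = split_stack(X)      # [[1, 2], [3, 4], [5, 6], [7, 8]]
```

The question is whether this index shuffle can be written as a product of constant matrices.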


#2
Jul 14, 2011, 08:04 AM

Math Emeritus, Sci Advisor, PF Gold
P: 39,348

Use a "block diagonal" matrix with blocks of the form
[tex]\begin{bmatrix}1 & 1 \\ 1 & 1 \end{bmatrix}[/tex] For example, if [tex]\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}[/tex] then [tex]\begin{bmatrix}1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1\end{bmatrix}\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}= \begin{bmatrix} a & b \\ c & d \\ u & v \\ w & x \end{bmatrix}[/tex] 


#3
Jul 14, 2011, 08:37 AM

P: 108

Thank you for the reply.
I'm not sure I understand correctly, but it seems you are taking a (2 x 4) matrix and left-multiplying it by a (4 x 4) matrix. Perhaps I'm missing something, but in general I cannot see how this is possible: the product is not even conformable. If a solution exists by straightforward matrix multiplication, I would expect us to require two matrices (one to left-multiply X, one to right-multiply X): Y = VXW, with dimensions (2M x N) = (2M x M)(M x 2N)(2N x N). {Note: My original post lists the wrong dimensions. These should now be correct.}
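The conformability issue can be checked mechanically (a quick sketch, with shapes written as (rows, cols) tuples):

```python
# Conformability check: a product P Q is defined only when
# P's column count equals Q's row count.

def can_multiply(s1, s2):
    """True if the matrix product s1 x s2 is defined."""
    return s1[1] == s2[0]

def product_shape(s1, s2):
    assert can_multiply(s1, s2)
    return (s1[0], s2[1])

# The single left-multiplication above: a (4 x 4) times a (2 x 4) matrix.
print(can_multiply((4, 4), (2, 4)))   # False: inner dimensions 4 != 2

# The two-sided form Y = V X W with the dimensions listed in this post.
M, N = 3, 5
V, X, W = (2 * M, M), (M, 2 * N), (2 * N, N)
assert can_multiply(V, X) and can_multiply(product_shape(V, X), W)
print(product_shape(product_shape(V, X), W))   # (6, 5), i.e. (2M x N)
```

So the two-sided form at least has consistent dimensions, which is why the question of whether suitable V and W exist is worth pursuing.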


#4
Jul 14, 2011, 09:48 AM

P: 128

More seriously: isn't it impossible (as the original poster rightly suggested) to get what he wants with a single matrix? Here's the simple proof: matrix multiplication makes an [itex]m \times n[/itex] matrix from an [itex]m \times k[/itex] matrix and a [itex]k \times n[/itex] one. [itex]k[/itex] can be anything, as long as it's the same for both matrices. If we could get by with a single transformation matrix, then for this example, the original matrix tells us [itex]k=2[/itex] and [itex]n=4[/itex], while the final matrix tells us [itex]m=4[/itex], [itex]n=2[/itex]. That's a contradiction in [itex]n[/itex]. I don't know how to do what the OP wants, but he is perfectly correct that he'll need at least two matrices to do it. 


#5
Jul 15, 2011, 01:02 AM

P: 4,572

I think he might have meant to put identity matrices instead of having all 1's.



#6
Jul 15, 2011, 04:22 AM

P: 108

Either way, the fact remains that a (2M x N) matrix cannot be obtained from an (M x 2N) matrix by a single multiplication with one other matrix. It's quite interesting and surprising to me that such a seemingly straightforward operation is so tricky to implement! I feel like I must be missing a trick...


#7
Jul 15, 2011, 06:38 AM

P: 128

Well, I now think there's no way to do it. Consider taking a 1x2 matrix into a 2x1 matrix. Then your two matrices must be 2x1 and 1x2. In terms of dimensions:
[tex](2,1) = (2,1)(1,2)(2,1)[/tex] (Since it has to begin and end with a 2 and 1, and have a 1 and 2 in the middle.) So we have: [tex] \left[\begin{array}{c} x\\ y\end{array}\right] = \left[\begin{array}{c} a\\ b\end{array}\right] \left[\begin{array}{cc} x & y\end{array}\right] \left[\begin{array}{c} c\\ d\end{array}\right] [/tex] Do the rightmost multiplication first: [tex] \left[\begin{array}{c} x\\ y\end{array}\right] = \left[\begin{array}{c} a\\ b\end{array}\right] (cx + dy) [/tex] There's no way that could be true for all [itex]x,y[/itex] if the other matrices are constant. So unless I missed something, I don't think you can do what you're trying to do. 


#8
Jul 15, 2011, 10:43 AM

P: 108




#9
Jul 15, 2011, 11:28 AM

P: 9

Well, the transpose of
[tex]\begin{bmatrix}A & B \end{bmatrix}[/tex] would actually be [tex]\begin{bmatrix}A^{T} \\ B^{T} \end{bmatrix}[/tex] if I am not mistaken, so you would have to require A and B to be symmetric as well for the transpose to work. I don't think this is what you were looking for, but after playing around for a little while, I found that: [tex]\begin{bmatrix}I_{m} \\ 0_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}I_{n} \\ 0_{n} \end{bmatrix} + \begin{bmatrix}0_{m} \\ I_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}0_{n} \\ I_{n} \end{bmatrix} = \begin{bmatrix} A \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ B \end{bmatrix} = \begin{bmatrix} A \\ B \end{bmatrix}[/tex] I'm not sure if the above can be simplified to something of the form AXB, but it can easily be generalized for matrices of the form [A  B  C  etc. ]. 
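That two-term construction checks out numerically. Here is a sketch in plain Python (nested lists, a minimal matmul, and the selector blocks [I; 0] and [0; I]):

```python
# Verify: [I_m; 0] X [I_n; 0] + [0; I_m] X [0; I_n] = [A; B]
# for X = [A B], with A and B both m x n.

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def eye(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def zeros(m, n):
    return [[0] * n for _ in range(m)]

def add(P, Q):
    return [[p + q for p, q in zip(rp, rq)] for rp, rq in zip(P, Q)]

m, n = 2, 2
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
X = [ra + rb for ra, rb in zip(A, B)]      # X = [A B], (m x 2n)

top = eye(m) + zeros(m, m)                 # [I_m; 0_m], (2m x m)
bot = zeros(m, m) + eye(m)                 # [0_m; I_m], (2m x m)
left = eye(n) + zeros(n, n)                # [I_n; 0_n], (2n x n)
right = zeros(n, n) + eye(n)               # [0_n; I_n], (2n x n)

Y = add(matmul(matmul(top, X), left),      # = [A; 0]
        matmul(matmul(bot, X), right))     # + [0; B]
print(Y)   # [[1, 2], [3, 4], [5, 6], [7, 8]], i.e. [A; B]
```

The first term places A in the top block and kills B; the second does the reverse for B.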


#10
Jul 15, 2011, 11:48 AM

P: 128

Stylish's solution works, of course, but I'm not sure it's quite what you had in mind. Actually, his solution reminds me of a similar thread we had recently. See here: http://www.physicsforums.com/showthread.php?t=511243 for how to generate matrices that "pick out" an individual component of a matrix (along with a boneheaded mistake by yours truly).

Cheers, Chip

PS: On rereading your original post, I noticed you basically gave exactly the argument I gave later (except where I pointed out it would apply to block matrices as well). Somehow I missed that on the first go-round. Sorry! :)


#11
Jul 15, 2011, 01:27 PM

P: 108

Thanks for figuring out something that works. My hunch is that this is about as 'nice' a solution as we're going to get (of course, implementing this in a programming language is straightforward, but it would be nice to be able to express it elegantly in mathematical terms too). Thanks to all for helping out. Still, 10 points to anyone who can provide (or disprove the existence of) a general solution of the form Y = AXB.
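Stylish's construction does generalise to many blocks, as he mentioned. A sketch (hypothetical helper names): for X = [A_1 A_2 ... A_k], take Y = sum_i E_i X F_i, where E_i is the block column with I in slot i (placing block i vertically) and F_i selects the i-th column block.

```python
# Stack k horizontal blocks vertically via Y = sum_i E_i X F_i,
# where E_i and F_i are block columns [0; ...; I; ...; 0].

def matmul(P, Q):
    return [[sum(P[i][t] * Q[t][j] for t in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def selector(k, i, n):
    """(k*n x n) block column: I_n in slot i, zeros elsewhere."""
    S = [[0] * n for _ in range(k * n)]
    for j in range(n):
        S[i * n + j][j] = 1
    return S

def block_stack(X, k):
    """X = [A_1 ... A_k] (m x k*n)  ->  Y = [A_1; ...; A_k] (k*m x n)."""
    m, n = len(X), len(X[0]) // k
    Y = [[0] * n for _ in range(k * m)]
    for i in range(k):
        Ei = selector(k, i, m)                 # (k*m x m)
        Fi = selector(k, i, n)                 # (k*n x n)
        term = matmul(matmul(Ei, X), Fi)       # [0; ...; A_i; ...; 0]
        Y = [[a + b for a, b in zip(ry, rt)] for ry, rt in zip(Y, term)]
    return Y

X = [[1, 2, 3, 4, 5, 6]]                       # three 1x2 blocks
print(block_stack(X, 3))   # [[1, 2], [3, 4], [5, 6]]
```

So the price of avoiding a single-product formula is a sum of k products, one per block.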


#12
Jul 15, 2011, 05:32 PM

P: 128

I am happy to collect the 10 points for my above post, which disproves the existence of such a transformation in a particular case.
If there's a particular case that doesn't work, then there can't be a general formula. (Let me know if I've missed anything!) 

