Linear Transformation to Block-wise Stack Matrix


by weetabixharry
Tags: linear, matrix, reshape, transformation, transpose
weetabixharry
#1
Jul13-11, 04:05 PM
P: 96
I have a matrix of the form X = [A B], where A and B are matrices of equal dimensions (M x N). I am looking for an elegant transformation to obtain Y = [A; B]. That is, the blocks are now stacked vertically.

Normally, I'd look for a solution of the form Y = VXW, where V is (2M x M) and W is (N x N/2). However, I feel that one may not exist in this case. For example, in the simplest case where A and B are scalars, then XW is scalar and so, in general, no V will exist which can give Y as required.

In that straightforward example, of course, we just use the transpose: Y = XT. However, I cannot see how to generalise this for when A and B are matrices. (Or, actually, I'm hoping eventually to find a solution for stacking many matrices [A,B,C,...]).

Any help would be greatly appreciated!
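For concreteness, here is what I mean in NumPy (illustrative sizes; this is just the operation itself, not the linear-transformation form I'm after):

```python
import numpy as np

# Illustrative only: small made-up blocks A and B of equal size (M x N).
M, N = 2, 3
A = np.arange(M * N).reshape(M, N)
B = A + 100
X = np.hstack([A, B])            # X = [A B], shape (M, 2N)

# Split the columns back into the two blocks, then stack them vertically.
Y = np.vstack(np.hsplit(X, 2))   # Y = [A; B], shape (2M, N)

# The same thing as a reshape/axis-swap, with no explicit splitting:
Y2 = X.reshape(M, 2, N).swapaxes(0, 1).reshape(2 * M, N)
```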
HallsofIvy
#2
Jul14-11, 08:04 AM
P: 38,898
Use a "block diagonal" matrix with blocks of the form
[tex]\begin{bmatrix}1 & 1 \\ 1 & 1 \end{bmatrix}[/tex]

For example, if
[tex]\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}[/tex]
then
[tex]\begin{bmatrix}1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1\end{bmatrix}\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}= \begin{bmatrix} a & b \\ c & d \\ u & v \\ w & x \end{bmatrix}[/tex]
weetabixharry
#3
Jul14-11, 08:37 AM
P: 96
Thank you for the reply.

I'm not sure if I understand correctly, but it seems that you are taking a (2 x 4) matrix and left-multiplying it by a (4 x 4) matrix. Perhaps I'm missing something, but in general I cannot see how this is possible.

If a solution exists by straightforward matrix multiplication, I would expect us to require 2 matrices (one to left-multiply X, one to right-multiply X):

Y = VXW
(2M x N) = (2M x M)(M x 2N)(2N x N)

{Note: My original post lists the wrong dimensions. These should now be correct.}

chogg
#4
Jul14-11, 09:48 AM
P: 117

Quote by HallsofIvy:
Use a "block diagonal" matrix with blocks of the form
[tex]\begin{bmatrix}1 & 1 \\ 1 & 1 \end{bmatrix}[/tex]

For example, if
[tex]\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}[/tex]
then
[tex]\begin{bmatrix}1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1\end{bmatrix}\begin{bmatrix}a & b & u & v \\ c & d & w & x\end{bmatrix}= \begin{bmatrix} a & b \\ c & d \\ u & v \\ w & x \end{bmatrix}[/tex]
I'm left scratching my head by this one. Wouldn't your 4x4 matrix give elements like [itex](a+c)[/itex]?

More seriously: isn't it impossible (as the original poster rightly suggested) to get what he wants with a single matrix? Here's a simple proof: matrix multiplication makes an [itex]m \times n[/itex] matrix from an [itex]m \times k[/itex] matrix and a [itex]k \times n[/itex] one, where [itex]k[/itex] can be anything as long as it's the same for both factors. If a single transformation matrix (multiplying on the left, say) sufficed, then for this example the original matrix tells us [itex]k=2[/itex] and [itex]n=4[/itex], while the final matrix tells us [itex]m=4[/itex] and [itex]n=2[/itex]. That's a contradiction in [itex]n[/itex]. (Multiplying on the right alone fails the same way, this time on the row count.)

I don't know how to do what the OP wants, but he is perfectly correct that he'll need at least two matrices to do it.
chiro
#5
Jul15-11, 01:02 AM
P: 4,570
I think he might have meant to put identity matrices instead of having all 1's.
weetabixharry
#6
Jul15-11, 04:22 AM
P: 96
Quote by chiro:
I think he might have meant to put identity matrices instead of having all 1's.
A block-diagonal matrix with identity blocks in that way would just be the identity matrix.

Either way, the fact remains that a (2M x N) matrix cannot be obtained from an (M x 2N) matrix by a single multiplication with one other matrix.

It's quite interesting and surprising to me that such a seemingly straightforward operation is so tricky to implement! I feel like I must be missing a trick...
chogg
#7
Jul15-11, 06:38 AM
P: 117
Well, I now think there's no way to do it. Consider taking a 1x2 matrix into a 2x1 matrix. Then the two transformation matrices must both be 2x1. In terms of dimensions:
[tex](2,1) = (2,1)(1,2)(2,1)[/tex]
(Since it has to begin and end with a 2 and 1, and have a 1 and 2 in the middle.)

So we have:
[tex]
\left[\begin{array}{c} x\\ y\end{array}\right] =
\left[\begin{array}{c} a\\ b\end{array}\right]
\left[\begin{array}{cc} x & y\end{array}\right]
\left[\begin{array}{c} c\\ d\end{array}\right]
[/tex]
Do the rightmost multiplication first:
[tex]
\left[\begin{array}{c} x\\ y\end{array}\right] =
\left[\begin{array}{c} a\\ b\end{array}\right]
(cx + dy)
[/tex]
There's no way that could be true for all [itex]x,y[/itex] if the other matrices are constant. So unless I missed something, I don't think you can do what you're trying to do.
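Here's that collapse checked numerically in NumPy, with made-up values: V @ X @ W is always a scalar multiple of the fixed column V, whatever X is.

```python
import numpy as np

# Made-up values: for a 1x2 input X, the product V @ X @ W with V (2x1)
# and W (2x1) is always a scalar multiple of the fixed column V, so it
# cannot reproduce an arbitrary [x; y].
rng = np.random.default_rng(0)
V = rng.standard_normal((2, 1))
W = rng.standard_normal((2, 1))
X = rng.standard_normal((1, 2))

Y = V @ X @ W                 # shape (2, 1)
scalar = (X @ W).item()       # the right-hand product collapses to a scalar
ok = np.allclose(Y, scalar * V)
```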
weetabixharry
#8
Jul15-11, 10:43 AM
P: 96
Quote by chogg:
Well, I now think there's no way to do it. Consider taking a 1x2 matrix into a 2x1 matrix. Then the two transformation matrices must both be 2x1. In terms of dimensions:
[tex](2,1) = (2,1)(1,2)(2,1)[/tex]
(Since it has to begin and end with a 2 and 1, and have a 1 and 2 in the middle.)

So we have:
[tex]
\left[\begin{array}{c} x\\ y\end{array}\right] =
\left[\begin{array}{c} a\\ b\end{array}\right]
\left[\begin{array}{cc} x & y\end{array}\right]
\left[\begin{array}{c} c\\ d\end{array}\right]
[/tex]
Do the rightmost multiplication first:
[tex]
\left[\begin{array}{c} x\\ y\end{array}\right] =
\left[\begin{array}{c} a\\ b\end{array}\right]
(cx + dy)
[/tex]
There's no way that could be true for all [itex]x,y[/itex] if the other matrices are constant. So unless I missed something, I don't think you can do what you're trying to do.
As I wrote in my original post, for a (1 x 2) input, the elegant solution is to simply use the matrix transpose. However, it is not clear how to generalise this for larger block structures (e.g. a solution that perhaps involves both matrix multiplications and transposes).
Stylish
#9
Jul15-11, 11:28 AM
P: 9
Well, the transpose of
[tex]\begin{bmatrix}A & B \end{bmatrix}[/tex]
would actually be
[tex]\begin{bmatrix}A^{T} \\ B^{T} \end{bmatrix}[/tex]
if I am not mistaken, so you would have to require A and B to be symmetric as well for the transpose to work.

I don't think this is what you were looking for, but after playing around for a little while, I found that:
[tex]\begin{bmatrix}I_{m} \\ 0_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}I_{n} \\ 0_{n} \end{bmatrix} + \begin{bmatrix}0_{m} \\ I_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}0_{n} \\ I_{n} \end{bmatrix} = \begin{bmatrix} A \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ B \end{bmatrix} = \begin{bmatrix} A \\ B \end{bmatrix}[/tex]

I'm not sure if the above can be simplified to something of the form AXB, but it can easily be generalized for matrices of the form [A | B | C | etc. ].
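For what it's worth, here is a quick NumPy check of that two-term identity with random blocks (sizes assumed for illustration):

```python
import numpy as np

# Checking the two-term identity above with random blocks (sizes assumed).
m, n = 3, 4
rng = np.random.default_rng(1)
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, n))
X = np.hstack([A, B])                                      # m x 2n

I_m, Z_m = np.eye(m), np.zeros((m, m))
I_n, Z_n = np.eye(n), np.zeros((n, n))

top = np.vstack([I_m, Z_m]) @ X @ np.vstack([I_n, Z_n])    # = [A; 0]
bottom = np.vstack([Z_m, I_m]) @ X @ np.vstack([Z_n, I_n]) # = [0; B]
Y = top + bottom                                           # = [A; B]
```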
chogg
#10
Jul15-11, 11:48 AM
P: 117
Quote by weetabixharry:
As I wrote in my original post, for a (1 x 2) input, the elegant solution is to simply use the matrix transpose. However, it is not clear how to generalise this for larger block structures (e.g. a solution that perhaps involves both matrix multiplications and transposes).
Okay; same argument as above, but now [itex]x, y, a, b, c, d[/itex] are block matrices with the required dimensions. It's still just as impossible (except that in my previous post, I clumsily transposed the order of the products [itex]xc[/itex] and [itex]yd[/itex]).

Doing it right, you see that we need to simultaneously satisfy two equations:
  • [itex] x = a(xc + yd)[/itex]
  • [itex] y = b(xc + yd)[/itex]
We're only assuming that multiplication is associative here (not necessarily commutative), which is true for matrix multiplication, so it doesn't matter if these guys are scalars or block matrices. The first requires [itex]c \ne 0[/itex] and [itex]d=0[/itex], while the second requires [itex]d \ne 0[/itex] and [itex]c=0[/itex]. Can't be done in this case; therefore, can't be done in general.

Stylish's solution works, of course, but I'm not sure if it's quite like what you had in mind. Actually, his solution reminds me of a similar thread we had recently. See here:
http://www.physicsforums.com/showthread.php?t=511243
for how to generate matrices that "pick out" an individual component of a matrix (along with a boneheaded mistake by yours truly).

Cheers,
Chip

PS On rereading your original post, I noticed you basically gave exactly the argument which I gave later (except where I pointed out it would apply to block matrices as well). Somehow I missed that on the first go'round -- sorry! :)
weetabixharry
#11
Jul15-11, 01:27 PM
P: 96
Quote by Stylish:
Well, the transpose of
[tex]\begin{bmatrix}A & B \end{bmatrix}[/tex]
would actually be
[tex]\begin{bmatrix}A^{T} \\ B^{T} \end{bmatrix}[/tex]
if I am not mistaken, so you would have to require A and B to be symmetric as well for the transpose to work.

I don't think this is what you were looking for, but after playing around for a little while, I found that:
[tex]\begin{bmatrix}I_{m} \\ 0_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}I_{n} \\ 0_{n} \end{bmatrix} + \begin{bmatrix}0_{m} \\ I_{m} \end{bmatrix} \begin{bmatrix}A & B \end{bmatrix} \begin{bmatrix}0_{n} \\ I_{n} \end{bmatrix} = \begin{bmatrix} A \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ B \end{bmatrix} = \begin{bmatrix} A \\ B \end{bmatrix}[/tex]

I'm not sure if the above can be simplified to something of the form AXB, but it can easily be generalized for matrices of the form [A | B | C | etc. ].
That's a very good point regarding the requirement of symmetry if I want to use transposes. That rules that idea out!

Thanks for figuring out something that works. My hunch is that this is about as 'nice' a solution as we're going to get (of course, implementing this in a programming language is straightforward, but it would be nice to be able to express it elegantly in mathematical terms too).

Thanks to all for helping out. Still, 10 points to anyone who can provide (or disprove the existence of) a general solution of the form Y = AXB.
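For example, the sum-of-products solution generalizes to k blocks using selector matrices built as Kronecker products with standard basis vectors. A NumPy sketch (the helper name is my own, nothing standard):

```python
import numpy as np

# Sketch of the k-block generalization of the sum-of-products solution:
# write the stack as sum_i V_i X W_i, where V_i = e_i (x) I_m and
# W_i = e_i (x) I_n are Kronecker products with the i-th basis vector.
def blockwise_stack(X, k):
    m, kn = X.shape
    n = kn // k
    Y = np.zeros((k * m, n))
    for i in range(k):
        e = np.zeros((k, 1))
        e[i, 0] = 1.0
        V = np.kron(e, np.eye(m))   # (k*m) x m: places block i in row-block i
        W = np.kron(e, np.eye(n))   # (k*n) x n: selects block i's columns
        Y += V @ X @ W
    return Y

# Usage: stack three 2x2 blocks laid out side by side.
blocks = [np.full((2, 2), float(i)) for i in range(3)]
stacked = blockwise_stack(np.hstack(blocks), 3)
```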
chogg
#12
Jul15-11, 05:32 PM
P: 117
I am happy to collect the 10 points for my above post, which disproves the existence of such a transformation in a particular case.

If there's a particular case that doesn't work, then there can't be a general formula.

(Let me know if I've missed anything!)

