# Difficult linear algebra problem

1. Aug 5, 2011

### Klandhee

Hi, my problem is simple enough to write down but (to me) seems quite difficult to solve.

My equation is as follows

A[x1 x2] = I.

Here I is a known matrix, and A is an operator that applies shifting matrices and sums. That is, A[x1 x2] = s1x1 + s2x2, where s1 and s2 are two shifting matrices (in the continuous setting this can be thought of as convolving with a shifted delta function). x1 and x2 are two unknown matrices with the same dimensions as I. Ultimately I wish to find a matrix form for A so that I can invert it and obtain x1 and x2.

So as you can see, [x1 x2] can be thought of as a "stack" of matrices, or a 3D array (a tensor?). However, I'm very unfamiliar with the mathematics of tensors, so one idea I had was to convert x1 and x2 into long columns (i.e., chopping each matrix into column slices and stacking them one on top of the other). That way [x1 x2] would be an ordinary matrix, and I would have lost no information.
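A minimal NumPy sketch of this column-stacking idea, assuming s1 and s2 act by left multiplication and choosing small illustrative shift matrices (the size n and the particular s1, s2 below are just example choices). With vec denoting column-major stacking, vec(s @ x) = kron(eye(n), s) @ vec(x), so the whole system becomes one block matrix acting on the stacked unknowns:

```python
import numpy as np

n = 3  # illustrative size (assumption)

# One example choice of shift matrices: first subdiagonal and superdiagonal
s1 = np.diag(np.ones(n - 1), k=-1)  # lower shift
s2 = np.diag(np.ones(n - 1), k=1)   # upper shift

# If s acts by left multiplication, vec(s @ x) = kron(eye(n), s) @ vec(x),
# where vec stacks columns (matches x.flatten(order="F")).
M = np.hstack([np.kron(np.eye(n), s1), np.kron(np.eye(n), s2)])  # n^2 x 2n^2

# Some known right-hand side I (random here, just for illustration)
rng = np.random.default_rng(0)
I_mat = rng.standard_normal((n, n))
b = I_mat.flatten(order="F")

# n^2 equations in 2n^2 unknowns: the system is underdetermined, so there is
# no unique inverse; lstsq returns the minimum-norm solution when consistent.
sol, _, rank, _ = np.linalg.lstsq(M, b, rcond=None)
x1 = sol[:n * n].reshape((n, n), order="F")
x2 = sol[n * n:].reshape((n, n), order="F")

print(rank)  # 9 == n**2 for this choice: every right-hand side is reachable
print(np.allclose(s1 @ x1 + s2 @ x2, I_mat))  # True: the system is satisfied
```

Note that M has at most n^2 independent rows against 2n^2 columns, so even when a solution exists it is never unique; the best one can do is pick a particular solution, e.g. the minimum-norm one above.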

From here, however, I am very confused and not sure where to go.

If anyone has any ideas on what to do (or if this problem is impossible) it would be GREATLY appreciated, thanks!

2. Aug 5, 2011

### rasmhop

This doesn't sound like homework so I will assume it's not (and therefore not feel bad about providing a "solution").

I'm not sure exactly what you mean by a shifting matrix. The only definition I know of is a matrix that is 0 everywhere except on a single diagonal immediately below or above the main diagonal, where it is 1. For instance, in the 2x2 case
$$\left[\begin{array}{cc} 0 & 0 \\ 1 & 0 \end{array} \right], \quad\left[\begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array} \right]$$
are the shift matrices and in the 3x3 case we have:
$$\left[\begin{array}{ccc} 0 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{array} \right], \quad\left[\begin{array}{ccc} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{array} \right]$$
If this is the case, then we can easily see that a unique solution is impossible in general. For example, define:
$$A[x_1,x_2] = \left[\begin{array}{cc} 0 & 0 \\ 1 & 0 \end{array} \right] x_1 +\left[\begin{array}{cc} 0 & 0 \\ 1 & 0 \end{array} \right]x_2$$
$$I = \left[\begin{array}{cc} 0 & 0 \\ 0 & 0 \end{array} \right]$$
Then we have infinitely many solutions of the form
$$x_1 = \left[\begin{array}{cc} a & b \\ c & d \end{array} \right] \qquad x_2 = \left[\begin{array}{cc} -a & -b \\ e & f \end{array} \right]$$
for arbitrary reals a,b,c,d,e,f.