Hi, since no unique analytical solution exists for my problem (as another poster pointed out), I have taken to solving it with a least squares method. My objective is:

minimize sum(sum((s1*x1 + s2*x2 - I).^2))

Here s1 and s2 are shift matrices (both known), and I is a known matrix of size nxm. x1 and x2 are the two unknown matrices (also of size nxm). The .^2 is MATLAB's elementwise square, so the objective is the sum of the squared entries of the residual, just as in ordinary least squares.

I could solve this iteratively, but the matrices are very large and I expect convergence would take many iterations, which makes that approach very computationally intensive. So I would rather set it up directly as a least squares problem. I understand least squares for a set of scalar values, but I'm not sure how to extend it to matrix unknowns (if that's even possible).

If anyone has any guidance on this, it would be greatly appreciated. Thanks very much!
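To make the question concrete, here is the kind of setup I have in mind, sketched in NumPy notation (I normally work in MATLAB; kron and lstsq correspond to MATLAB's kron and backslash/pinv). The specific shift matrices and sizes below are just illustrative placeholders, and since the stacked system has 2nm unknowns but only nm equations, lstsq returns the minimum-norm least squares solution:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3

# Placeholder shift matrices (assumption: they act on the left, S @ X):
# S1 shifts rows up by one, S2 shifts rows down by one.
S1 = np.eye(n, k=1)
S2 = np.eye(n, k=-1)
I_data = rng.standard_normal((n, m))  # the known n-by-m matrix "I"

# Vectorize using vec(S X) = (I_m kron S) vec(X), so that
# vec(S1 X1 + S2 X2) = [I_m kron S1, I_m kron S2] [vec(X1); vec(X2)].
Im = np.eye(m)
A = np.hstack([np.kron(Im, S1), np.kron(Im, S2)])  # (n*m) x (2*n*m)
b = I_data.flatten(order="F")                      # column-major, like I(:)

# Minimum-norm least squares solution of the underdetermined system.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
X1 = x[: n * m].reshape((n, m), order="F")
X2 = x[n * m :].reshape((n, m), order="F")

residual = np.linalg.norm(S1 @ X1 + S2 @ X2 - I_data)
```

The point of the vectorization is that the matrix problem becomes an ordinary linear least squares problem in the stacked unknown [vec(x1); vec(x2)], at the cost of forming (or at least applying) a large Kronecker-structured matrix.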